Sample records for formal statistical analysis

  1. An experiment on the impact of a neonicotinoid pesticide on honeybees: the value of a formal analysis of the data.

    PubMed

    Schick, Robert S; Greenwood, Jeremy J D; Buckland, Stephen T

    2017-01-01

    We assess the analysis of the data resulting from a field experiment conducted by Pilling et al. (PLoS ONE, doi: 10.1371/journal.pone.0077193) on the potential effects of thiamethoxam on honeybees. The experiment had low levels of replication, so Pilling et al. concluded that formal statistical analysis would be misleading. This would be true if such an analysis merely comprised tests of statistical significance and if the investigators concluded that lack of significance meant little or no effect. However, an analysis that includes estimation of the size of any effects, with confidence limits, allows one to reach conclusions that are not misleading and that produce useful insights. For the data of Pilling et al., we use straightforward statistical analysis to show that the confidence limits are generally so wide that any effects of thiamethoxam could have been large without being statistically significant. Instead of formal analysis, Pilling et al. simply inspected the data and concluded that they provided no evidence of detrimental effects and from this that thiamethoxam poses a "low risk" to bees. Conclusions derived from the inspection of the data were not just misleading in this case but also are unacceptable in principle, for if data are inadequate for a formal analysis (or only good enough to provide estimates with wide confidence intervals), then they are bound to be inadequate as a basis for reaching any sound conclusions. Given that the data in this case are largely uninformative with respect to the treatment effect, any conclusions reached from such informal approaches can do little more than reflect the prior beliefs of those involved.
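
    An illustrative sketch of the kind of estimation-with-uncertainty the authors advocate: a 95% confidence interval for a treatment effect from a small, hypothetical two-group comparison (the data and group sizes below are made up, not from Pilling et al.).

      # Hedged sketch: effect estimate with a Welch 95% confidence interval for a
      # small hypothetical two-group experiment (not the authors' data or analysis).
      import numpy as np
      from scipy import stats

      control = np.array([12.1, 10.8, 13.5, 11.9])   # hypothetical colony-level responses
      treated = np.array([10.2, 11.7, 9.4, 12.0])

      diff = treated.mean() - control.mean()
      v1, v2 = treated.var(ddof=1) / treated.size, control.var(ddof=1) / control.size
      se = np.sqrt(v1 + v2)
      df = (v1 + v2) ** 2 / (v1 ** 2 / (treated.size - 1) + v2 ** 2 / (control.size - 1))
      t_crit = stats.t.ppf(0.975, df)
      print(f"effect = {diff:.2f}, 95% CI = ({diff - t_crit * se:.2f}, {diff + t_crit * se:.2f})")
      # With so few replicates the interval is wide: a non-significant result remains
      # compatible with large effects, which is the paper's central point.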

  2. The Development of Introductory Statistics Students' Informal Inferential Reasoning and Its Relationship to Formal Inferential Reasoning

    ERIC Educational Resources Information Center

    Jacob, Bridgette L.

    2013-01-01

    The difficulties introductory statistics students have with formal statistical inference are well known in the field of statistics education. "Informal" statistical inference has been studied as a means to introduce inferential reasoning well before and without the formalities of formal statistical inference. This mixed methods study…

  3. Aspects of First Year Statistics Students' Reasoning When Performing Intuitive Analysis of Variance: Effects of Within- and Between-Group Variability

    ERIC Educational Resources Information Center

    Trumpower, David L.

    2015-01-01

    Making inferences about population differences based on samples of data, that is, performing intuitive analysis of variance (IANOVA), is common in everyday life. However, the intuitive reasoning of individuals when making such inferences (even following statistics instruction), often differs from the normative logic of formal statistics. The…

  4. Nonlinear estimation of parameters in biphasic Arrhenius plots.

    PubMed

    Puterman, M L; Hrboticky, N; Innis, S M

    1988-05-01

    This paper presents a formal procedure for the statistical analysis of data on the thermotropic behavior of membrane-bound enzymes generated using the Arrhenius equation and compares the analysis to several alternatives. The data are modeled by a bent hyperbola. Nonlinear regression is used to obtain estimates and standard errors of the intersection of the line segments, defined as the transition temperature, and of the slopes, defined as the energies of activation of the enzyme reaction. The methodology allows formal tests of the adequacy of a biphasic model rather than either a single straight line or a curvilinear model. Examples using data on the thermotropic behavior of pig brain synaptosomal acetylcholinesterase are given. The data support the biphasic temperature dependence of this enzyme. The methodology represents a formal procedure for statistical validation of any biphasic data and allows calculation of all line parameters with estimates of precision.
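
    A hedged sketch of a segmented ("bent hyperbola" style) Arrhenius fit by nonlinear regression; the smoothed two-segment parameterization, starting values and simulated data below are illustrative assumptions, not the paper's exact model.

      # ln(rate) versus 1/T modeled as two line segments joined smoothly; curve_fit
      # returns the break point (transition temperature) and slopes with standard errors.
      import numpy as np
      from scipy.optimize import curve_fit

      def bent(x, a, b1, b2, x0, g):
          # slope b1 below the break at x0, slope b2 above it; g controls the smoothing
          return a + b1 * (x - x0) + (b2 - b1) * 0.5 * ((x - x0) + np.sqrt((x - x0) ** 2 + g ** 2))

      rng = np.random.default_rng(0)
      invT = 1.0 / np.linspace(278.0, 318.0, 25)                    # 1/T over roughly 5-45 C
      lnk = bent(invT, -2.0, -4000.0, -9000.0, 1 / 298.0, 1e-4) + rng.normal(0.0, 0.05, invT.size)

      popt, pcov = curve_fit(bent, invT, lnk, p0=[-2.0, -3000.0, -8000.0, 1 / 298.0, 1e-4])
      perr = np.sqrt(np.diag(pcov))                                 # standard errors of the estimates
      print(f"transition temperature ~ {1.0 / popt[3]:.1f} K (x0 = {popt[3]:.5f} +/- {perr[3]:.5f})")
      # Activation energies follow from the two slopes via Ea = -slope * R.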

  5. Formative Use of Intuitive Analysis of Variance

    ERIC Educational Resources Information Center

    Trumpower, David L.

    2013-01-01

    Students' informal inferential reasoning (IIR) is often inconsistent with the normative logic underlying formal statistical methods such as Analysis of Variance (ANOVA), even after instruction. In two experiments reported here, students' IIR was assessed using an intuitive ANOVA task at the beginning and end of a statistics course. In both…

  6. Applied Statistics: From Bivariate through Multivariate Techniques [with CD-ROM]

    ERIC Educational Resources Information Center

    Warner, Rebecca M.

    2007-01-01

    This book provides a clear introduction to widely used topics in bivariate and multivariate statistics, including multiple regression, discriminant analysis, MANOVA, factor analysis, and binary logistic regression. The approach is applied and does not require formal mathematics; equations are accompanied by verbal explanations. Students are asked…

  7. Two Paradoxes in Linear Regression Analysis.

    PubMed

    Feng, Ge; Peng, Jing; Tu, Dongke; Zheng, Julia Z; Feng, Changyong

    2016-12-25

    Regression is one of the favorite tools in applied statistics. However, misuse and misinterpretation of results from regression analysis are common in biomedical research. In this paper we use statistical theory and simulation studies to clarify some paradoxes around this popular statistical method. In particular, we show that a widely used model selection procedure employed in many publications in top medical journals is wrong. Formal procedures based on solid statistical theory should be used in model selection.

  8. A Survey of Logic Formalisms to Support Mishap Analysis

    NASA Technical Reports Server (NTRS)

    Johnson, Chris; Holloway, C. M.

    2003-01-01

    Mishap investigations provide important information about adverse events and near miss incidents. They are intended to help avoid any recurrence of previous failures. Over time, they can also yield statistical information about incident frequencies that helps to detect patterns of failure and can validate risk assessments. However, the increasing complexity of many safety critical systems is posing new challenges for mishap analysis. Similarly, the recognition that many failures have complex, systemic causes has helped to widen the scope of many mishap investigations. These two factors have combined to pose new challenges for the analysis of adverse events. A new generation of formal and semi-formal techniques has been proposed to help investigators address these problems. We introduce the term mishap logics to collectively describe these notations that might be applied to support the analysis of mishaps. The proponents of these notations have argued that they can be used to formally prove that certain events created the necessary and sufficient causes for a mishap to occur. These proofs can be used to reduce the bias that is often perceived to affect the interpretation of adverse events. Others have argued that one cannot use logic formalisms to prove causes in the same way that one might prove propositions or theorems. Such mechanisms cannot accurately capture the wealth of inductive, deductive and statistical forms of inference that investigators must use in their analysis of adverse events. This paper provides an overview of these mishap logics. It also identifies several additional classes of logic that might also be used to support mishap analysis.

  9. Data Acquisition and Preprocessing in Studies on Humans: What Is Not Taught in Statistics Classes?

    PubMed

    Zhu, Yeyi; Hernandez, Ladia M; Mueller, Peter; Dong, Yongquan; Forman, Michele R

    2013-01-01

    The aim of this paper is to address issues in research that may be missing from statistics classes and important for (bio-)statistics students. In the context of a case study, we discuss data acquisition and preprocessing steps that fill the gap between research questions posed by subject matter scientists and statistical methodology for formal inference. Issues include participant recruitment, data collection training and standardization, variable coding, data review and verification, data cleaning and editing, and documentation. Despite the critical importance of these details in research, most of these issues are rarely discussed in an applied statistics program. One reason for the lack of more formal training is the difficulty in addressing the many challenges that can possibly arise in the course of a study in a systematic way. This article can help to bridge this gap between research questions and formal statistical inference by using an illustrative case study for a discussion. We hope that reading and discussing this paper and practicing data preprocessing exercises will sensitize statistics students to these important issues and achieve optimal conduct, quality control, analysis, and interpretation of a study.

  10. A formal framework of scenario creation and analysis of extreme hydrological events

    NASA Astrophysics Data System (ADS)

    Lohmann, D.

    2007-12-01

    We present a formal framework for hydrological risk analysis. Different measures of risk are introduced, such as average annual loss and occurrence exceedance probability. These are important measures for, e.g., insurance companies to determine the cost of insurance. One key aspect of investigating the potential consequences of extreme hydrological events (floods and droughts) is the creation of meteorological scenarios that reflect realistic spatial and temporal patterns of precipitation and also have correct local statistics. 100,000 years of these meteorological scenarios are used in a calibrated rainfall-runoff-flood-loss-risk model to produce flood and drought events that have never been observed. The results of this hazard model are statistically analyzed and linked to socio-economic data and vulnerability functions to show the impact of severe flood events. We show results from the Risk Management Solutions (RMS) Europe Flood Model to introduce this formal framework.
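
    A minimal sketch (assumed, simplified definitions) of the two risk measures named above, computed from simulated annual event losses; the event-frequency and loss distributions are hypothetical.

      # Average annual loss (AAL) and an occurrence exceedance probability (OEP) summary
      # from a simple simulated event set: Poisson event counts, lognormal event losses.
      import numpy as np

      rng = np.random.default_rng(1)
      n_years = 100_000
      losses_per_year = [rng.lognormal(14.0, 1.2, rng.poisson(0.3)) for _ in range(n_years)]

      aal = np.mean([x.sum() for x in losses_per_year])                    # average annual loss
      max_event = np.array([x.max() if x.size else 0.0 for x in losses_per_year])

      # OEP: probability that the largest single event in a year exceeds a threshold;
      # the 100-year occurrence loss is the threshold exceeded with probability 1/100.
      loss_100yr = np.quantile(max_event, 1.0 - 1.0 / 100.0)
      print(f"AAL = {aal:,.0f}, 100-year occurrence loss = {loss_100yr:,.0f}")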

  11. Two Paradoxes in Linear Regression Analysis

    PubMed Central

    FENG, Ge; PENG, Jing; TU, Dongke; ZHENG, Julia Z.; FENG, Changyong

    2016-01-01

    Summary Regression is one of the favorite tools in applied statistics. However, misuse and misinterpretation of results from regression analysis are common in biomedical research. In this paper we use statistical theory and simulation studies to clarify some paradoxes around this popular statistical method. In particular, we show that a widely used model selection procedure employed in many publications in top medical journals is wrong. Formal procedures based on solid statistical theory should be used in model selection. PMID:28638214

  12. Landau's statistical mechanics for quasi-particle models

    NASA Astrophysics Data System (ADS)

    Bannur, Vishnu M.

    2014-04-01

    Landau's formalism of statistical mechanics [following L. D. Landau and E. M. Lifshitz, Statistical Physics (Pergamon Press, Oxford, 1980)] is applied to the quasi-particle model of quark-gluon plasma. Here, one starts from the expression for the pressure and develops all of thermodynamics. It is a general formalism and is consistent with our earlier studies [V. M. Bannur, Phys. Lett. B647, 271 (2007)] based on Pathria's formalism [following R. K. Pathria, Statistical Mechanics (Butterworth-Heinemann, Oxford, 1977)], in which one starts from the expression for the energy density and develops thermodynamics. Both formalisms are consistent with thermodynamics and statistical mechanics. Under certain conditions, which are incorrectly called the thermodynamic consistency relation, we recover the other formalism for quasi-particle systems, as in M. I. Gorenstein and S. N. Yang, Phys. Rev. D52, 5206 (1995), widely studied in quark-gluon plasma.
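
    For orientation, a minimal sketch (standard textbook relations at zero chemical potential, not specific to this paper) of how all of thermodynamics follows once the pressure P(T) is given, in contrast to the Pathria-style route that starts from the energy density.

      % Assuming mu = 0: entropy and energy densities from a given P(T)
      \begin{align}
        s(T) &= \frac{\partial P}{\partial T}, \\
        \varepsilon(T) &= T\,s(T) - P(T) = T\,\frac{\partial P}{\partial T} - P(T).
      \end{align}
      % The Pathria-style route instead starts from \varepsilon(T) and recovers P by integration.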

  13. Effectiveness of groundwater governance structures and institutions in Tanzania

    NASA Astrophysics Data System (ADS)

    Gudaga, J. L.; Kabote, S. J.; Tarimo, A. K. P. R.; Mosha, D. B.; Kashaigili, J. J.

    2018-05-01

    This paper examines the effectiveness of groundwater governance structures and institutions in Mbarali District, Mbeya Region. The paper adopts an exploratory sequential research design to collect quantitative and qualitative data. A random sample of 90 groundwater users, 50% of them women, was involved in the survey. Descriptive statistics, the Kruskal-Wallis H test and the Mann-Whitney U test were used to compare the differences in responses between groups, while qualitative data were subjected to content analysis. The results show that the Village Councils and Community Water Supply Organizations (COWSOs) were effective in governing groundwater. The results also show a statistically significant difference in the overall extent of effectiveness of the Village Councils in governing groundwater between villages (P = 0.0001), yet there was no significant difference (P > 0.05) between male and female responses on the effectiveness of Village Councils, village water committees and COWSOs. The Mann-Whitney U test showed a statistically significant difference between male and female responses on the effectiveness of formal and informal institutions (P = 0.0001), with informal institutions being effective relative to formal institutions. The Kruskal-Wallis H test also showed statistically significant differences (P ≤ 0.05) in the extent of effectiveness of formal institutions, norms and values between the low, medium and high categories. The paper concludes that COWSOs were more effective in governing groundwater than other groundwater governance structures. Similarly, norms and values were more effective than formal institutions. The paper recommends sensitization and awareness creation on formal institutions so that they can influence water users' behaviour to govern groundwater.
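
    An illustrative use of the two nonparametric tests named in the study; the effectiveness ratings below are hypothetical, not the survey data.

      # Kruskal-Wallis H test across three groups and Mann-Whitney U test between two groups.
      from scipy import stats

      village_a = [4, 5, 3, 4, 5, 2, 4]      # hypothetical effectiveness ratings
      village_b = [2, 3, 2, 1, 3, 2, 3]
      village_c = [3, 4, 3, 4, 2, 3, 4]
      H, p_kw = stats.kruskal(village_a, village_b, village_c)
      print(f"Kruskal-Wallis H = {H:.2f}, p = {p_kw:.4f}")

      male = [3, 4, 2, 4, 3, 5, 4]
      female = [4, 3, 4, 5, 3, 4, 4]
      U, p_mw = stats.mannwhitneyu(male, female, alternative="two-sided")
      print(f"Mann-Whitney U = {U:.1f}, p = {p_mw:.4f}")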

  14. A Comparative Study of Pre-Service Education for Preschool Teachers in China and the United States

    ERIC Educational Resources Information Center

    Gong, Xin; Wang, Pengcheng

    2017-01-01

    This study provides a comparative analysis of the pre-service education system for preschool educators in China and the United States. Based on collected data and materials (literature, policy documents, and statistical data), we compare two areas of pre-service training: (1) the formal system; (2) the informal system. In the formal system, most…

  15. A Random Variable Approach to Nuclear Targeting and Survivability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Undem, Halvor A.

    We demonstrate a common mathematical formalism for analyzing problems in nuclear survivability and targeting. This formalism, beginning with a random variable approach, can be used to interpret past efforts in nuclear-effects analysis, including targeting analysis. It can also be used to analyze new problems brought about by the post-Cold War era, such as the potential effects of yield degradation in a permanently untested nuclear stockpile. In particular, we illustrate the formalism through four natural case studies or illustrative problems, linking these to actual past data, modeling, and simulation, and suggesting future uses. In the first problem, we illustrate the case of a deterministically modeled weapon used against a deterministically responding target. Classic "Cookie Cutter" damage functions result. In the second problem, we illustrate, with actual target test data, the case of a deterministically modeled weapon used against a statistically responding target. This case matches many of the results of current nuclear targeting modeling and simulation tools, including the result of distance damage functions as complementary cumulative lognormal functions in the range variable. In the third problem, we illustrate the case of a statistically behaving weapon used against a deterministically responding target. In particular, we show the dependence of target damage on weapon yield for an untested nuclear stockpile experiencing yield degradation. Finally, and using actual unclassified weapon test data, we illustrate in the fourth problem the case of a statistically behaving weapon used against a statistically responding target.
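
    A hedged sketch of the distance-damage function form mentioned for the second case, a complementary cumulative lognormal in the range variable; the median range and log-spread below are hypothetical.

      # P(damage | range r) modeled as the lognormal survival function in r.
      import numpy as np
      from scipy import stats

      r50, sigma = 1000.0, 0.3                       # hypothetical 50%-damage range (m) and log-spread
      ranges = np.array([250.0, 500.0, 1000.0, 2000.0, 4000.0])
      p_damage = stats.lognorm.sf(ranges, s=sigma, scale=r50)
      for r, p in zip(ranges, p_damage):
          print(f"range {r:6.0f} m -> damage probability {p:.3f}")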

  16. Used battery collection in central Mexico: metal content, legislative/management situation and statistical analysis.

    PubMed

    Guevara-García, José Antonio; Montiel-Corona, Virginia

    2012-03-01

    A statistical analysis of a used battery collection campaign in the state of Tlaxcala, Mexico, is presented. This included a study of the metal composition of spent batteries from formal and informal markets, and a critical discussion about the management of spent batteries in Mexico with respect to legislation. A six-month collection campaign was statistically analyzed: 77% of the battery types were "AA" and 30% of the batteries were from the informal market. A substantial percentage (36%) of batteries had residual voltage in the range 1.2-1.4 V, and 70% had more than 1.0 V; this may reflect underutilization. Metal content analysis and recovery experiments were performed with the five formal and the four most frequent informal trademarks. The analysis of Hg, Cd and Pb showed there is no significant difference in content between formally and informally commercialized batteries. All of the analyzed trademarks were under the permissible limit levels of the proposed Mexican Official Norm (NOM) NMX-AA-104-SCFI-2006 and would be classified as not dangerous residues (they can be thrown into domestic rubbish); however, compared with the EU directive 2006/66/EC, 8 out of 9 of the selected battery trademarks would be rejected, since the Mexican Norm content limit is 20-, 7.5- and 5-fold higher in Hg, Cd and Pb, respectively, than the EU directive. These results outline the necessity for better regulatory criteria in the proposed Mexican NOM in order to minimize the impact of this type of residue on human health and the environment.

  17. Determining the Number of Component Clusters in the Standard Multivariate Normal Mixture Model Using Model-Selection Criteria.

    DTIC Science & Technology

    1983-06-16

    has been advocated by Gnanadesikan and Wilk (1969), and others in the literature. This suggests that, if we use the formal significance test type… American Statistical Asso., 62, 1159-1178. Gnanadesikan, R., and Wilk, M. B. (1969). Data Analytic Methods in Multivariate Statistical Analysis. In

  18. The boundary is mixed

    NASA Astrophysics Data System (ADS)

    Bianchi, Eugenio; Haggard, Hal M.; Rovelli, Carlo

    2017-08-01

    We show that in Oeckl's boundary formalism the boundary vectors that do not have a tensor form represent, in a precise sense, statistical states. Therefore the formalism incorporates quantum statistical mechanics naturally. We formulate general-covariant quantum statistical mechanics in this language. We illustrate the formalism by showing how it accounts for the Unruh effect. We observe that the distinction between pure and mixed states weakens in the general covariant context, suggesting that local gravitational processes are naturally statistical without a sharp quantal versus probabilistic distinction.

  19. LSD Now: 1973

    ERIC Educational Resources Information Center

    Chunko, John A.

    1973-01-01

    LSD NOW is a nationwide, statistical survey and analysis of hallucinogenic drug use by individuals presently in formal educational surroundings. Analysis, concentrating on the extent and rationale related to the use of such drugs, now offers a deeper and more meaningful understanding of a particular facet of the drug culture. This understanding…

  20. Expected values and variances of Bragg peak intensities measured in a nanocrystalline powder diffraction experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Öztürk, Hande; Noyan, I. Cevdet

    A rigorous study of sampling and intensity statistics applicable to a powder diffraction experiment as a function of crystallite size is presented. Our analysis yields approximate equations for the expected value, variance and standard deviation of both the number of diffracting grains and the corresponding diffracted intensity for a given Bragg peak. The classical formalism published in 1948 by Alexander, Klug & Kummer [J. Appl. Phys. (1948), 19, 742-753] appears here as a special case, limited to large crystallite sizes. It is observed that both the Lorentz probability expression and the statistics equations used in the classical formalism are inapplicable for nanocrystalline powder samples.

  1. Expected values and variances of Bragg peak intensities measured in a nanocrystalline powder diffraction experiment

    DOE PAGES

    Öztürk, Hande; Noyan, I. Cevdet

    2017-08-24

    A rigorous study of sampling and intensity statistics applicable to a powder diffraction experiment as a function of crystallite size is presented. Our analysis yields approximate equations for the expected value, variance and standard deviation of both the number of diffracting grains and the corresponding diffracted intensity for a given Bragg peak. The classical formalism published in 1948 by Alexander, Klug & Kummer [J. Appl. Phys. (1948), 19, 742-753] appears here as a special case, limited to large crystallite sizes. It is observed that both the Lorentz probability expression and the statistics equations used in the classical formalism are inapplicable for nanocrystalline powder samples.
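
    For intuition only, a minimal sketch of the grain-counting statistics involved, under the simplifying assumption that each crystallite independently satisfies the diffraction condition with a small probability p; the paper's full treatment refines and corrects this picture for nanocrystalline samples.

      # If M ~ Binomial(N, p) grains diffract, then E[M] = N*p, Var[M] = N*p*(1-p),
      # and the relative intensity scatter grows as 1/sqrt(N*p) when few grains diffract.
      import numpy as np

      N, p = 1_000_000, 5e-5                 # hypothetical grain count and diffraction probability
      mean_M = N * p
      std_M = np.sqrt(N * p * (1 - p))
      print(f"E[M] = {mean_M:.0f}, sd = {std_M:.1f}, relative scatter ~ {std_M / mean_M:.1%}")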

  2. Orchestrating high-throughput genomic analysis with Bioconductor

    PubMed Central

    Huber, Wolfgang; Carey, Vincent J.; Gentleman, Robert; Anders, Simon; Carlson, Marc; Carvalho, Benilton S.; Bravo, Hector Corrada; Davis, Sean; Gatto, Laurent; Girke, Thomas; Gottardo, Raphael; Hahne, Florian; Hansen, Kasper D.; Irizarry, Rafael A.; Lawrence, Michael; Love, Michael I.; MacDonald, James; Obenchain, Valerie; Oleś, Andrzej K.; Pagès, Hervé; Reyes, Alejandro; Shannon, Paul; Smyth, Gordon K.; Tenenbaum, Dan; Waldron, Levi; Morgan, Martin

    2015-01-01

    Bioconductor is an open-source, open-development software project for the analysis and comprehension of high-throughput data in genomics and molecular biology. The project aims to enable interdisciplinary research, collaboration and rapid development of scientific software. Based on the statistical programming language R, Bioconductor comprises 934 interoperable packages contributed by a large, diverse community of scientists. Packages cover a range of bioinformatic and statistical applications. They undergo formal initial review and continuous automated testing. We present an overview for prospective users and contributors. PMID:25633503

  3. Formal Operations and Learning Style Predict Success in Statistics and Computer Science Courses.

    ERIC Educational Resources Information Center

    Hudak, Mary A.; Anderson, David E.

    1990-01-01

    Studies 94 undergraduate students in introductory statistics and computer science courses. Applies the Formal Operations Reasoning Test (FORT) and Kolb's Learning Style Inventory (LSI). Finds that substantial numbers of students have not achieved the formal operation level of cognitive maturity. Emphasizes the need to examine students' learning styles and…

  4. Differential gene expression detection and sample classification using penalized linear regression models.

    PubMed

    Wu, Baolin

    2006-02-15

    Differential gene expression detection and sample classification using microarray data have received much research interest recently. Owing to the large number of genes p and small number of samples n (p >> n), microarray data analysis poses big challenges for statistical analysis. An obvious problem owing to the 'large p, small n' setting is over-fitting: just by chance, we are likely to find some non-differentially expressed genes that can classify the samples very well. The idea of shrinkage is to regularize the model parameters to reduce the effects of noise and produce reliable inferences. Shrinkage has been successfully applied in microarray data analysis. The SAM statistic proposed by Tusher et al. and the 'nearest shrunken centroid' proposed by Tibshirani et al. are ad hoc shrinkage methods. Both methods are simple, intuitive and have proved useful in empirical studies. Recently Wu proposed penalized t/F-statistics with shrinkage, derived formally from L1-penalized linear regression models for two-class microarray data, showing good performance. In this paper we systematically discuss the use of penalized regression models for analyzing microarray data. We generalize the two-class penalized t/F-statistics proposed by Wu to multi-class microarray data. We formally derive the ad hoc shrunken centroid used by Tibshirani et al. using L1-penalized regression models. And we show that the penalized linear regression models provide a rigorous and unified statistical framework for sample classification and differential gene expression detection.
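
    A hedged sketch, not the paper's exact estimator, of how an L1 penalty couples gene selection and classification in a "large p, small n" problem: most coefficients are shrunk exactly to zero.

      # L1-penalized logistic regression on simulated microarray-like data.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)
      n, p = 40, 2000                                # 40 samples, 2000 genes (hypothetical)
      X = rng.normal(size=(n, p))
      y = np.repeat([0, 1], n // 2)
      X[y == 1, :10] += 1.0                          # only the first 10 genes carry signal

      clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.1).fit(X, y)
      selected = np.flatnonzero(clf.coef_[0])
      print(f"{selected.size} genes kept, training accuracy = {clf.score(X, y):.2f}")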

  5. Quality assurance software inspections at NASA Ames: Metrics for feedback and modification

    NASA Technical Reports Server (NTRS)

    Wenneson, G.

    1985-01-01

    Software inspections, a set of formal technical review procedures held at selected key points during software development in order to find defects in software documents, are described in terms of history, participants, tools, procedures, statistics, and database analysis.

  6. Probability density function formalism for optical coherence tomography signal analysis: a controlled phantom study.

    PubMed

    Weatherbee, Andrew; Sugita, Mitsuro; Bizheva, Kostadinka; Popov, Ivan; Vitkin, Alex

    2016-06-15

    The distribution of backscattered intensities as described by the probability density function (PDF) of tissue-scattered light contains information that may be useful for tissue assessment and diagnosis, including characterization of its pathology. In this Letter, we examine the PDF description of the light scattering statistics in a well characterized tissue-like particulate medium using optical coherence tomography (OCT). It is shown that for low scatterer density, the governing statistics depart considerably from a Gaussian description and follow the K distribution for both OCT amplitude and intensity. The PDF formalism is shown to be independent of the scatterer flow conditions; this is expected from theory, and suggests robustness and motion independence of the OCT amplitude (and OCT intensity) PDF metrics in the context of potential biomedical applications.
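
    A sketch of one standard way to generate K-distributed intensities, via the compound (gamma-mixed exponential) representation; the shape parameter and sample size are hypothetical, not the phantom values used in the Letter.

      # Conditional on a gamma-distributed local mean s, fully developed speckle intensity
      # is exponential with mean s; marginally the intensity is K distributed, with heavier
      # tails (normalized variance > 1) as the effective scatterer number nu decreases.
      import numpy as np

      rng = np.random.default_rng(2)
      nu, mean_I, n = 1.5, 1.0, 200_000
      s = rng.gamma(shape=nu, scale=mean_I / nu, size=n)
      I = rng.exponential(scale=s)
      print(f"mean = {I.mean():.3f}, normalized variance = {I.var() / I.mean() ** 2:.3f}")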

  7. A brief history of numbers and statistics with cytometric applications.

    PubMed

    Watson, J V

    2001-02-15

    A brief history of numbers and statistics traces the development of numbers from prehistory to completion of our current system of numeration with the introduction of the decimal fraction by Viete, Stevin, Burgi, and Galileo at the turn of the 16th century. This was followed by the development of what we now know as probability theory by Pascal, Fermat, and Huygens in the mid-17th century, which arose in connection with questions in gambling with dice and can be regarded as the origin of statistics. The three main probability distributions on which statistics depend were introduced and/or formalized between the mid-17th and early 19th centuries: the binomial distribution by Pascal; the normal distribution by de Moivre, Gauss, and Laplace; and the Poisson distribution by Poisson. The formal discipline of statistics commenced with the works of Pearson, Yule, and Gosset at the turn of the 19th century, when the first statistical tests were introduced. Elementary descriptions of the statistical tests most likely to be used in conjunction with cytometric data are given, and it is shown how these can be applied to the analysis of difficult immunofluorescence distributions when there is overlap between the labeled and unlabeled cell populations.

  8. The Evolution of Organization Analysis in ASQ, 1959-1979.

    ERIC Educational Resources Information Center

    Daft, Richard L.

    1980-01-01

    During the period 1959-1979, a sharp trend toward low-variety statistical languages has taken place, which may represent an organizational mapping phase in which simple, quantifiable relationships have been formally defined and measured. A broader scope of research languages will be needed in the future. (Author/IRT)

  9. Not so Fast My Friend: The Rush to R and the Need for Rigorous Evaluation of Data Analysis and Software in Education

    ERIC Educational Resources Information Center

    Harwell, Michael

    2014-01-01

    Commercial data analysis software has been a fixture of quantitative analyses in education for more than three decades. Despite its apparent widespread use there is no formal evidence cataloging what software is used in educational research and educational statistics classes, by whom and for what purpose, and whether some programs should be…

  10. Programs for Children with Specific Learning Disabilities. P.L. 91-230, Title VI-G Formal Final Evaluation. (Statistical Analysis of Data).

    ERIC Educational Resources Information Center

    Murphy, Philip J.

    The paper reports the final evaluation of a program for approximately 143 learning disabled (LD) students (grades 6 to 12) from six school districts. A number of test instruments were used to evaluate student progress during the program, including the Wide Range Achievement Test (WRAT), the Durrell Analysis of Reading Difficulty, and the…

  11. An analysis on intersectional collaboration on non-communicable chronic disease prevention and control in China: a cross-sectional survey on main officials of community health service institutions.

    PubMed

    Li, Xing-Ming; Rasooly, Alon; Peng, Bo; Wang, Jian; Xiong, Shu-Yu

    2017-11-10

    Our study aimed to design a tool for evaluating intersectional collaboration on non-communicable chronic disease (NCD) prevention and control, and further to understand the current status of intersectional collaboration in community health service institutions of China. We surveyed 444 main officials of community health service institutions in Beijing, Tianjin, Hubei and Ningxia regions of China in 2014 by using a questionnaire. A model of collaboration measurement, including four relational dimensions of governance, shared goals and vision, formalization and internalization, was used to compare the scores of the evaluation scale in NCD management procedures across community healthcare institutions and other ones. Reliability and validity of the evaluation tool on inter-organizational collaboration on NCD prevention and control were verified. The test on the tool evaluating inter-organizational collaboration in community NCD management revealed good reliability and validity (Cronbach's alpha = 0.89, split-half reliability = 0.84, variance contribution rate of an extracted principal component = 49.70%). The results of inter-organizational collaboration of different departments and management segments showed statistically significant differences in the formalization dimension for physical examination (p = 0.01). There were statistically significant differences in the governance dimension, formalization dimension and total score of the collaboration scale for the health record sector (p = 0.01, 0.00, 0.00). Statistical differences were found in the formalization dimension for the exercise and nutrition health education segment (p = 0.01). There were no statistically significant differences in the formalization dimension of medication guidance for psychological consultation, medical referral service and rehabilitation guidance (all p > 0.05). The multi-department collaboration mechanism of NCD prevention and control has been rudimentarily established. Community management institutions and general hospitals are more active in participating in community NCD management, with better collaboration scores, whereas the CDC shows relatively poor collaboration in China. Xing-Ming Li and Alon Rasooly contributed equally to the paper and are listed as joint first authors.
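
    An illustrative computation (hypothetical item responses, not the survey data) of Cronbach's alpha, the internal-consistency statistic reported for the collaboration scale.

      # Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of the total score).
      import numpy as np

      def cronbach_alpha(items):
          items = np.asarray(items, dtype=float)      # respondents x items
          k = items.shape[1]
          item_vars = items.var(axis=0, ddof=1).sum()
          total_var = items.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1 - item_vars / total_var)

      responses = np.array([[4, 5, 4, 4],
                            [3, 3, 4, 3],
                            [5, 5, 5, 4],
                            [2, 3, 2, 3],
                            [4, 4, 5, 4]])
      print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")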

  12. Formalizing the definition of meta-analysis in Molecular Ecology.

    PubMed

    ArchMiller, Althea A; Bauer, Eric F; Koch, Rebecca E; Wijayawardena, Bhagya K; Anil, Ammu; Kottwitz, Jack J; Munsterman, Amelia S; Wilson, Alan E

    2015-08-01

    Meta-analysis, the statistical synthesis of pertinent literature to develop evidence-based conclusions, is relatively new to the field of molecular ecology, with the first meta-analysis published in the journal Molecular Ecology in 2003 (Slate & Phua 2003). The goal of this article is to formalize the definition of meta-analysis for the authors, editors, reviewers and readers of Molecular Ecology by completing a review of the meta-analyses previously published in this journal. We also provide a brief overview of the many components required for meta-analysis with a more specific discussion of the issues related to the field of molecular ecology, including the use and statistical considerations of Wright's FST and its related analogues as effect sizes in meta-analysis. We performed a literature review to identify articles published as 'meta-analyses' in Molecular Ecology, which were then evaluated by at least two reviewers. We specifically targeted Molecular Ecology publications because as a flagship journal in this field, meta-analyses published in Molecular Ecology have the potential to set the standard for meta-analyses in other journals. We found that while many of these reviewed articles were strong meta-analyses, others failed to follow standard meta-analytical techniques. One of these unsatisfactory meta-analyses was in fact a secondary analysis. Other studies attempted meta-analyses but lacked the fundamental statistics that are considered necessary for an effective and powerful meta-analysis. By drawing attention to the inconsistency of studies labelled as meta-analyses, we emphasize the importance of understanding the components of traditional meta-analyses to fully embrace the strengths of quantitative data synthesis in the field of molecular ecology.

  13. Twenty-five years of maximum-entropy principle

    NASA Astrophysics Data System (ADS)

    Kapur, J. N.

    1983-04-01

    The strengths and weaknesses of the maximum entropy principle (MEP) are examined and some challenging problems that remain outstanding at the end of the first quarter century of the principle are discussed. The original formalism of the MEP is presented and its relationship to statistical mechanics is set forth. The use of MEP for characterizing statistical distributions, in statistical inference, nonlinear spectral analysis, transportation models, population density models, models for brand-switching in marketing and vote-switching in elections is discussed. Its application to finance, insurance, image reconstruction, pattern recognition, operations research and engineering, biology and medicine, and nonparametric density estimation is considered.

  14. Cost implications of organizing nursing home workforce in teams.

    PubMed

    Mukamel, Dana B; Cai, Shubing; Temkin-Greener, Helena

    2009-08-01

    To estimate the costs associated with formal and self-managed daily practice teams in nursing homes. Medicaid cost reports for 135 nursing homes in New York State in 2006 and survey data for 6,137 direct care workers. A retrospective statistical analysis: We estimated hybrid cost functions that include team penetration variables. Inference was based on robust standard errors. Formal and self-managed team penetration (i.e., percent of staff working in a team) were calculated from survey responses. Annual variable costs, beds, case mix-adjusted days, admissions, home care visits, outpatient clinic visits, day care days, wages, and ownership were calculated from the cost reports. Formal team penetration was significantly associated with costs, while self-managed team penetration was not. Costs declined with increasing penetration up to 13 percent of formal teams, and increased above this level. Formal teams in nursing homes in the upward sloping range of the curve were more diverse, with a larger number of participating disciplines and more likely to include physicians. Organization of workforce in formal teams may offer nursing homes a cost-saving strategy. More research is required to understand the relationship between team composition and costs.

  15. Forecasts of non-Gaussian parameter spaces using Box-Cox transformations

    NASA Astrophysics Data System (ADS)

    Joachimi, B.; Taylor, A. N.

    2011-09-01

    Forecasts of statistical constraints on model parameters using the Fisher matrix abound in many fields of astrophysics. The Fisher matrix formalism involves the assumption of Gaussianity in parameter space and hence fails to predict complex features of posterior probability distributions. Combining the standard Fisher matrix with Box-Cox transformations, we propose a novel method that accurately predicts arbitrary posterior shapes. The Box-Cox transformations are applied to parameter space to render it approximately multivariate Gaussian, performing the Fisher matrix calculation on the transformed parameters. We demonstrate that, after the Box-Cox parameters have been determined from an initial likelihood evaluation, the method correctly predicts changes in the posterior when varying various parameters of the experimental setup and the data analysis, with marginally higher computational cost than a standard Fisher matrix calculation. We apply the Box-Cox-Fisher formalism to forecast cosmological parameter constraints by future weak gravitational lensing surveys. The characteristic non-linear degeneracy between the matter density parameter and the normalization of matter density fluctuations is reproduced for several cases, and the capability of weak-lensing three-point statistics to break this degeneracy is investigated. Possible applications of Box-Cox transformations of posterior distributions are discussed, including the prospects for performing statistical data analysis steps in the transformed Gaussianized parameter space.
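
    A minimal one-dimensional sketch of the idea (hypothetical skewed sample, not a lensing likelihood): Box-Cox-transform the parameter so that a Gaussian, Fisher-matrix-like summary becomes accurate in the transformed space.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(3)
      theta = rng.lognormal(mean=0.0, sigma=0.5, size=50_000)   # skewed "posterior" samples

      z, lam = stats.boxcox(theta)        # z = (theta**lam - 1)/lam, lambda fit by maximum likelihood
      print(f"Box-Cox lambda = {lam:.2f}")
      print(f"skewness before = {stats.skew(theta):.2f}, after = {stats.skew(z):.2f}")
      # A mean and covariance (Fisher-style) summary in z, mapped back through the inverse
      # transform, reproduces the curved non-Gaussian contours in the original parameter.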

  16. The SACE Review Panel's Final Report: Significant Flaws in the Analysis of Statistical Data

    ERIC Educational Resources Information Center

    Gregory, Kelvin

    2006-01-01

    The South Australian Certificate of Education (SACE) is a credential and formal qualification within the Australian Qualifications Framework. A recent review of the SACE outlined a number of recommendations for significant changes to this certificate. These recommendations were the result of a process that began with the review panel…

  17. Introduction of Digital Storytelling in Preschool Education: A Case Study from Croatia

    ERIC Educational Resources Information Center

    Preradovic, Nives Mikelic; Lesin, Gordana; Boras, Damir

    2016-01-01

    Our case study from Croatia showed the benefits of digital storytelling in a preschool as a basis for the formal ICT education. The statistical analysis revealed significant differences between children aged 6-7 who learned mathematics by traditional storytelling compared to those learning through digital storytelling. The experimental group that…

  18. The predictive ability of the CHADS2 and CHA2DS2-VASc scores for bleeding risk in atrial fibrillation: the MAQI(2) experience.

    PubMed

    Barnes, Geoffrey D; Gu, Xiaokui; Haymart, Brian; Kline-Rogers, Eva; Almany, Steve; Kozlowski, Jay; Besley, Dennis; Krol, Gregory D; Froehlich, James B; Kaatz, Scott

    2014-08-01

    Guidelines recommend the assessment of stroke and bleeding risk before initiating warfarin anticoagulation in patients with atrial fibrillation. Many of the elements used to predict stroke also overlap with bleeding risk in atrial fibrillation patients, and it is tempting to use stroke risk scores to efficiently estimate bleeding risk. A comparison of stroke risk scores with bleeding risk scores for predicting bleeding has not been thoroughly assessed. 2600 patients at seven anticoagulation clinics were followed from October 2009 to May 2013. Five risk models (CHADS2, CHA2DS2-VASc, HEMORR2HAGES, HAS-BLED and ATRIA) were retrospectively applied to each patient. The primary outcome was the first major bleeding event. Areas under the ROC curves were compared using the C statistic, and net reclassification improvement (NRI) analysis was performed. 110 patients experienced a major bleeding event in 2581.6 patient-years (4.5%/year). Mean follow-up was 1.0 ± 0.8 years. All of the formal bleeding risk scores had a modest predictive value for first major bleeding events (C statistic 0.66-0.69), performing better than the CHADS2 and CHA2DS2-VASc scores (C statistic difference 0.10-0.16). NRI analysis demonstrated a 52-69% and 47-64% improvement of the formal bleeding risk scores over the CHADS2 score and CHA2DS2-VASc score, respectively. The CHADS2 and CHA2DS2-VASc scores did not perform as well as formal bleeding risk scores for prediction of major bleeding in non-valvular atrial fibrillation patients treated with warfarin. All three bleeding risk scores (HAS-BLED, ATRIA and HEMORR2HAGES) performed moderately well.
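
    For reference, a tiny illustration of the C statistic (area under the ROC curve) used above to compare risk scores; the outcomes and score values are made up.

      from sklearn.metrics import roc_auc_score

      bleed = [0, 0, 1, 0, 1, 0, 0, 1, 0, 0]      # 1 = major bleeding event (hypothetical)
      score = [1, 2, 4, 2, 3, 1, 0, 5, 2, 1]      # hypothetical risk-score values
      print(f"C statistic = {roc_auc_score(bleed, score):.2f}")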

  19. Bedside Ultrasound in the Emergency Department to Detect Hydronephrosis for the Evaluation of Suspected Ureteric Colic.

    PubMed

    Shrestha, R; Shakya, R M; Khan, A A

    2016-01-01

    Background Renal colic is a common emergency department presentation. Hydronephrosis is an indirect sign of urinary obstruction, which may be due to an obstructing ureteric calculus, and can be detected easily by bedside ultrasound with minimal training. Objective To compare the accuracy of detection of hydronephrosis performed by the emergency physician with that of the radiologist in suspected renal colic cases. Method This was a prospective observational study performed over a period of 6 months. Patients >8 years old with a provisional diagnosis of renal colic who had both a bedside ultrasound and a formal ultrasound performed were included. The presence of hydronephrosis in both ultrasounds, and the size and location of any ureteric stone seen on formal ultrasound, were recorded. The accuracy of the emergency physician's detection of hydronephrosis was determined using the scan reported by the radiologists as the "gold standard", as computed tomography was unavailable. Statistical analysis was executed using SPSS 17.0. Result Among the 111 included patients, 56.7% had a ureteric stone detected on formal ultrasound. The overall sensitivity, specificity, positive predictive value and negative predictive value of bedside ultrasound performed by the emergency physician for detection of hydronephrosis, compared with formal ultrasound performed by the radiologist, were 90.8%, 78.3%, 85.5% and 85.7%, respectively. Bedside ultrasound and formal ultrasound both detected hydronephrosis more often in patients with larger stones, and the difference was statistically significant (p = .000). Conclusion Bedside ultrasound can potentially be used as an important tool for detecting clinically significant hydronephrosis in the emergency department when evaluating suspected ureteric colic. Focused training in ultrasound could greatly improve the emergency management of these patients.
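
    A short sketch of how the reported accuracy measures follow from a 2x2 table of bedside ultrasound against the radiologist's ("gold standard") reading; the counts below are hypothetical, not the study's data.

      tp, fp, fn, tn = 59, 9, 6, 37           # hypothetical true/false positives and negatives
      sensitivity = tp / (tp + fn)
      specificity = tn / (tn + fp)
      ppv = tp / (tp + fp)
      npv = tn / (tn + fn)
      print(f"sens {sensitivity:.1%}, spec {specificity:.1%}, PPV {ppv:.1%}, NPV {npv:.1%}")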

  20. Statistical Irreversible Thermodynamics in the Framework of Zubarev's Nonequilibrium Statistical Operator Method

    NASA Astrophysics Data System (ADS)

    Luzzi, R.; Vasconcellos, A. R.; Ramos, J. G.; Rodrigues, C. G.

    2018-01-01

    We describe the formalism of statistical irreversible thermodynamics constructed based on Zubarev's nonequilibrium statistical operator (NSO) method, which is a powerful and universal tool for investigating the most varied physical phenomena. We present brief overviews of the statistical ensemble formalism and statistical irreversible thermodynamics. The first can be constructed either based on a heuristic approach or in the framework of information theory in the Jeffreys-Jaynes scheme of scientific inference; Zubarev and his school used both approaches in formulating the NSO method. We describe the main characteristics of statistical irreversible thermodynamics and discuss some particular considerations of several authors. We briefly describe how Rosenfeld, Bohr, and Prigogine proposed to derive a thermodynamic uncertainty principle.

  1. Using Informal Inferential Reasoning to Develop Formal Concepts: Analyzing an Activity

    ERIC Educational Resources Information Center

    Weinberg, Aaron; Wiesner, Emilie; Pfaff, Thomas J.

    2010-01-01

    Inferential reasoning is a central component of statistics. Researchers have suggested that students should develop an informal understanding of the ideas that underlie inference before learning the concepts formally. This paper presents a hands-on activity that is designed to help students in an introductory statistics course draw informal…

  2. Experience and Explanation: Using Videogames to Prepare Students for Formal Instruction in Statistics

    ERIC Educational Resources Information Center

    Arena, Dylan A.; Schwartz, Daniel L.

    2014-01-01

    Well-designed digital games can deliver powerful experiences that are difficult to provide through traditional instruction, while traditional instruction can deliver formal explanations that are not a natural fit for gameplay. Combined, they can accomplish more than either can alone. An experiment tested this claim using the topic of statistics,…

  3. Statistical analysis of Turbine Engine Diagnostic (TED) field test data

    NASA Astrophysics Data System (ADS)

    Taylor, Malcolm S.; Monyak, John T.

    1994-11-01

    During the summer of 1993, a field test of turbine engine diagnostic (TED) software, developed jointly by the U.S. Army Research Laboratory and the U.S. Army Ordnance Center and School, was conducted at Fort Stewart, GA. The data were collected in conformance with a cross-over design, some of whose considerations are detailed. The initial analysis of the field test data was exploratory, followed by a more formal investigation. Technical aspects of the data analysis and the insights that were elicited are reported.

  4. Does Formal Research Training Lead to Academic Success in Plastic Surgery? A Comprehensive Analysis of U.S. Academic Plastic Surgeons.

    PubMed

    Lopez, Joseph; Ameri, Afshin; Susarla, Srinivas M; Reddy, Sashank; Soni, Ashwin; Tong, J W; Amini, Neda; Ahmed, Rizwan; May, James W; Lee, W P Andrew; Dorafshar, Amir

    2016-01-01

    It is currently unknown whether formal research training has an influence on academic advancement in plastic surgery. The purpose of this study was to determine whether formal research training was associated with higher research productivity, academic rank, and procurement of extramural National Institutes of Health (NIH) funding in plastic surgery, comparing academic surgeons who completed such research training with those who did not. This was a cross-sectional study of full-time academic plastic surgeons in the United States. The main predictor variable was formal research training, defined as completion of a postdoctoral research fellowship or attainment of a Doctor of Philosophy (PhD). The primary outcome was scientific productivity measured by the Hirsch index (h-index: the largest number h such that an author has h publications with at least h citations each). The secondary outcomes were academic rank and NIH funding. Descriptive, bivariate, and multiple regression statistics were computed. A total of 607 academic surgeons were identified from 94 Accreditation Council for Graduate Medical Education-accredited plastic surgery training programs. In all, 179 (29.5%) surgeons completed formal research training. The mean h-index was 11.7 ± 9.9, and 58 (9.6%) surgeons successfully procured NIH funding. The distribution of academic rank was as follows: endowed professor (5.4%), professor (23.9%), associate professor (23.4%), assistant professor (46.0%), and instructor (1.3%). In a multiple regression analysis, completion of formal research training was significantly predictive of a higher h-index and successful procurement of NIH funding. Current evidence demonstrates that formal research training is associated with higher scientific productivity and an increased likelihood of future NIH funding.
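
    The h-index used as the primary outcome, computed exactly as defined above; the citation counts are hypothetical.

      def h_index(citations):
          # largest h such that there are h papers with at least h citations each
          h = 0
          for i, c in enumerate(sorted(citations, reverse=True), start=1):
              if c >= i:
                  h = i
              else:
                  break
          return h

      print(h_index([25, 17, 12, 9, 8, 5, 3, 1]))   # prints 5: five papers with >= 5 citations each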

  5. How Can We Enhance Enjoyment of Secondary School? The Student View

    ERIC Educational Resources Information Center

    Gorard, Stephen; See, Beng Huat

    2011-01-01

    This paper considers enjoyment of formal education for young people aged 14 to 16, largely from their own perspective, based on the view of around 3000 students in England. The data include documentary analysis, official statistics, interviews and surveys with staff and students. Enjoyment of school tends to be promoted by factors such as…

  6. Empirical and Genealogical Analysis of Non-Vocational Adult Education in Europe

    ERIC Educational Resources Information Center

    Manninen, Jyri

    2017-01-01

    Non-formal, non-vocational adult education (NFNVAE) is a low-cost, low-threshold learning activity that generates many benefits for individuals and society, and it should play a more central role in educational policy. NFNVAE's challenge is that it lacks clear concepts and definitions and is, therefore, less systematically covered in statistics,…

  7. Peer Coaching as an Institutionalised Tool for Professional Development: The Perceptions of Tutors in a Nigerian College

    ERIC Educational Resources Information Center

    Aderibigbe, Semiyu Adejare; Ajasa, Folorunso Adekemi

    2013-01-01

    Purpose: The purpose of this paper is to explore the perceptions of college tutors on peer coaching as a tool for professional development to determine its formal institutionalisation. Design/methodology/approach: A survey questionnaire was used for data collection, while analysis of data was done using descriptive statistics. Findings: The…

  8. The changing landscape of astrostatistics and astroinformatics

    NASA Astrophysics Data System (ADS)

    Feigelson, Eric D.

    2017-06-01

    The history and current status of the cross-disciplinary fields of astrostatistics and astroinformatics are reviewed. Astronomers need a wide range of statistical methods for both data reduction and science analysis. With the proliferation of high-throughput telescopes, efficient large scale computational methods are also becoming essential. However, astronomers receive only weak training in these fields during their formal education. Interest in the fields is rapidly growing with conferences organized by scholarly societies, textbooks and tutorial workshops, and research studies pushing the frontiers of methodology. R, the premier language of statistical computing, can provide an important software environment for the incorporation of advanced statistical and computational methodology into the astronomical community.

  9. Statistical science: a grammar for research.

    PubMed

    Cox, David R

    2017-06-01

    I greatly appreciate the invitation to give this lecture with its century-long history. The title is a warning that the lecture is rather discursive and not highly focused and technical. The theme is simple: statistical thinking provides a unifying set of general ideas and specific methods relevant whenever appreciable natural variation is present. To be most fruitful, these ideas should merge seamlessly with subject-matter considerations. By contrast, there is sometimes a temptation to regard formal statistical analysis as a ritual to be added after the serious work has been done, a ritual to satisfy convention, referees, and regulatory agencies. I want implicitly to refute that idea.

  10. Cost Implications of Organizing Nursing Home Workforce in Teams

    PubMed Central

    Mukamel, Dana B; Cai, Shubing; Temkin-Greener, Helena

    2009-01-01

    Objective To estimate the costs associated with formal and self-managed daily practice teams in nursing homes. Data Sources/Study Setting Medicaid cost reports for 135 nursing homes in New York State in 2006 and survey data for 6,137 direct care workers. Study Design A retrospective statistical analysis: We estimated hybrid cost functions that include team penetration variables. Inference was based on robust standard errors. Data Collection Formal and self-managed team penetration (i.e., percent of staff working in a team) were calculated from survey responses. Annual variable costs, beds, case mix-adjusted days, admissions, home care visits, outpatient clinic visits, day care days, wages, and ownership were calculated from the cost reports. Principal Findings Formal team penetration was significantly associated with costs, while self-managed team penetration was not. Costs declined with increasing penetration up to 13 percent of formal teams, and increased above this level. Formal teams in nursing homes in the upward sloping range of the curve were more diverse, with a larger number of participating disciplines and more likely to include physicians. Conclusions Organization of workforce in formal teams may offer nursing homes a cost-saving strategy. More research is required to understand the relationship between team composition and costs. PMID:19486181

  11. Small sample estimation of the reliability function for technical products

    NASA Astrophysics Data System (ADS)

    Lyamets, L. L.; Yakimenko, I. V.; Kanishchev, O. A.; Bliznyuk, O. A.

    2017-12-01

    It is demonstrated that, in the absence of large statistical samples obtained from testing complex technical products to failure, statistical estimation of the reliability function of the initial elements can be made by the method of moments. A formal description of the method of moments is given and its advantages in the analysis of small censored samples are discussed. A modified algorithm is proposed for implementing the method of moments with the use of only the moments at which failures of the initial elements occur.
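
    A hedged sketch of a method-of-moments reliability estimate from a small sample of failure times, under an assumed exponential life model; the paper's model and its handling of censoring may differ.

      import numpy as np

      failure_times = np.array([120.0, 340.0, 95.0, 410.0, 230.0])   # hypothetical hours to failure
      lam_hat = 1.0 / failure_times.mean()        # first moment matched to the exponential mean 1/lambda

      def reliability(t):
          # estimated probability of surviving beyond time t
          return np.exp(-lam_hat * t)

      print(f"lambda-hat = {lam_hat:.5f} per hour, R(200 h) = {reliability(200.0):.3f}")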

  12. Assessment of long-term impact of formal certified cardiopulmonary resuscitation training program among nurses

    PubMed Central

    Saramma, P. P.; Raj, L. Suja; Dash, P. K.; Sarma, P. S.

    2016-01-01

    Context: Cardiopulmonary resuscitation (CPR) and emergency cardiovascular care guidelines are periodically renewed and published by the American Heart Association. Formal training programs are conducted based on these guidelines. Despite widespread training, CPR is often poorly performed. Hospital educators spend a significant amount of time and money in training health professionals and maintaining basic life support (BLS) and advanced cardiac life support (ACLS) skills among them. However, very little data are available in the literature highlighting the long-term impact of this training. Aims: To evaluate the impact of a formal certified CPR training program on the knowledge and skill of CPR among nurses, and to identify self-reported outcomes of attempted CPR and training needs of nurses. Setting and Design: Tertiary care hospital; prospective, repeated-measures design. Subjects and Methods: A series of certified BLS and ACLS training programs were conducted during 2010 and 2011. Written and practical performance tests were done. Final testing was undertaken 3-4 years after training. The sample included all available, willing CPR-certified nurses and experience-matched CPR-noncertified nurses. Statistical Analysis Used: SPSS for Windows version 21.0. Results: The majority of the 206 nurses (93 CPR certified and 113 noncertified) were female. There was a statistically significant increase in mean knowledge level and overall performance from before to after the formal certified CPR training program (P = 0.000). However, the mean knowledge scores were equivalent among the CPR-certified and noncertified nurses, although the certified nurses scored a higher mean score (P = 0.140). Conclusions: A formal certified CPR training program increases CPR knowledge and skill. However, significant long-term effects could not be found. There is a need for regular and periodic recertification. PMID:27303137

  13. Statistics of Smoothed Cosmic Fields in Perturbation Theory. I. Formulation and Useful Formulae in Second-Order Perturbation Theory

    NASA Astrophysics Data System (ADS)

    Matsubara, Takahiko

    2003-02-01

    We formulate a general method for perturbative evaluations of statistics of smoothed cosmic fields and provide useful formulae for application of the perturbation theory to various statistics. This formalism is an extensive generalization of the method used by Matsubara, who derived a weakly nonlinear formula of the genus statistic in a three-dimensional density field. After describing the general method, we apply the formalism to a series of statistics, including genus statistics, level-crossing statistics, Minkowski functionals, and a density extrema statistic, regardless of the dimensions in which each statistic is defined. The relation between the Minkowski functionals and other geometrical statistics is clarified. These statistics can be applied to several cosmic fields, including three-dimensional density field, three-dimensional velocity field, two-dimensional projected density field, and so forth. The results are detailed for second-order theory of the formalism. The effect of the bias is discussed. The statistics of smoothed cosmic fields as functions of rescaled threshold by volume fraction are discussed in the framework of second-order perturbation theory. In CDM-like models, their functional deviations from linear predictions plotted against the rescaled threshold are generally much smaller than that plotted against the direct threshold. There is still a slight meatball shift against rescaled threshold, which is characterized by asymmetry in depths of troughs in the genus curve. A theory-motivated asymmetry factor in the genus curve is proposed.

  14. Granular statistical mechanics - Building on the legacy of Sir Sam Edwards

    NASA Astrophysics Data System (ADS)

    Blumenfeld, Raphael

    When Sir Sam Edwards laid down the foundations for the statistical mechanics of jammed granular materials, he opened a new field in soft condensed matter and many followed. In this presentation we briefly review the Edwards formalism and some of its less discussed consequences. We point out that the formalism is useful for other classes of systems - cellular and porous materials. A certain shortcoming of the original formalism is then discussed and a modification to overcome it is proposed. Finally, a derivation of an equation of state with the new formalism is presented; the equation of state is analogous to the PVT relation for thermal gases, relating the volume, the boundary stress and measures of the structural and stress fluctuations.

  15. Participation Trends and Patterns in Adult Education: 1991-1999. Statistical Analysis Report.

    ERIC Educational Resources Information Center

    Creighton, Sean; Hudson, Lisa

    Participation of U.S. adults in formal learning activities during the 1990s was examined by analyzing data from the 1991, 1995, and 1999 Adult Education Surveys that were part of the National Household Education Surveys Program. Overall, participation in adult education between 1991 and 1999 increased among all but one age group (35-44 years), all…

  16. Geomatic Methods for the Analysis of Data in the Earth Sciences: Lecture Notes in Earth Sciences, Vol. 95

    NASA Astrophysics Data System (ADS)

    Pavlis, Nikolaos K.

    Geomatics is a trendy term that has been used in recent years to describe academic departments that teach and research theories, methods, algorithms, and practices used in processing and analyzing data related to the Earth and other planets. Naming trends aside, geomatics could be considered as the mathematical and statistical “toolbox” that allows Earth scientists to extract information about physically relevant parameters from the available data and accompany such information with some measure of its reliability. This book is an attempt to present the mathematical-statistical methods used in data analysis within various disciplines—geodesy, geophysics, photogrammetry and remote sensing—from a unifying perspective that inverse problem formalism permits. At the same time, it allows us to stretch the relevance of statistical methods in achieving an optimal solution.

  17. Social and Spill-Over Benefits as Motivating Factors to Investment in Formal Education in Africa: A Reflection around Ghanaian, Kenyan and Rwandan Contexts

    ERIC Educational Resources Information Center

    Ampofo, S. Y.; Bizimana, B.; Ndayambaje, I.; Karongo, V.; Lawrence, K. Lyn; Orodho, J. A.

    2015-01-01

    This study examined the social and spill-over benefits as motivating factors to investment in formal education in selected countries in Africa. The paper had three objectives, namely i) to profile the key statistics of formal schooling; ii) to examine the formal education; and iii) to link national goals of education with expectations in Ghana, Kenya and…

  18. Mapping of polycrystalline films of biological fluids utilizing the Jones-matrix formalism

    NASA Astrophysics Data System (ADS)

    Ushenko, Vladimir A.; Dubolazov, Alexander V.; Pidkamin, Leonid Y.; Sakchnovsky, Michael Yu; Bodnar, Anna B.; Ushenko, Yuriy A.; Ushenko, Alexander G.; Bykov, Alexander; Meglinski, Igor

    2018-02-01

    Utilizing a polarized light approach, we reconstruct the spatial distribution of birefringence and optical activity in polycrystalline films of biological fluids. The Jones-matrix formalism is used for an accessible quantitative description of these types of optical anisotropy. We demonstrate that differentiation of polycrystalline films of biological fluids can be performed based on a statistical analysis of the distribution of rotation angles and phase shifts associated with the optical activity and birefringence, respectively. Finally, practical operational characteristics, such as sensitivity, specificity and accuracy of the Jones-matrix reconstruction of optical anisotropy, were identified with special emphasis on biomedical application, specifically for differentiation of bile films taken from healthy donors and from patients with cholelithiasis.

  19. Great Computational Intelligence in the Formal Sciences via Analogical Reasoning

    DTIC Science & Technology

    2017-05-08

    computational harnessing of traditional mathematical statistics (as e.g. covered in Hogg, Craig & McKean 2005) is used to power statistical learning techniques... (Final performance report AFRL-AFOSR-VA-TR-2017-0099, Selmer Bringsjord, Rensselaer Polytechnic, covering 15 Oct 2011 to 31 Dec 2016.)

  20. Statistical assessment of the learning curves of health technologies.

    PubMed

    Ramsay, C R; Grant, A M; Wallace, S A; Garthwaite, P H; Monk, A F; Russell, I T

    2001-01-01

    (1) To describe systematically studies that directly assessed the learning curve effect of health technologies. (2) Systematically to identify 'novel' statistical techniques applied to learning curve data in other fields, such as psychology and manufacturing. (3) To test these statistical techniques in data sets from studies of varying designs to assess health technologies in which learning curve effects are known to exist. METHODS - STUDY SELECTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): For a study to be included, it had to include a formal analysis of the learning curve of a health technology using a graphical, tabular or statistical technique. METHODS - STUDY SELECTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): For a study to be included, it had to include a formal assessment of a learning curve using a statistical technique that had not been identified in the previous search. METHODS - DATA SOURCES: Six clinical and 16 non-clinical biomedical databases were searched. A limited amount of handsearching and scanning of reference lists was also undertaken. METHODS - DATA EXTRACTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): A number of study characteristics were abstracted from the papers such as study design, study size, number of operators and the statistical method used. METHODS - DATA EXTRACTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): The new statistical techniques identified were categorised into four subgroups of increasing complexity: exploratory data analysis; simple series data analysis; complex data structure analysis, generic techniques. METHODS - TESTING OF STATISTICAL METHODS: Some of the statistical methods identified in the systematic searches for single (simple) operator series data and for multiple (complex) operator series data were illustrated and explored using three data sets. The first was a case series of 190 consecutive laparoscopic fundoplication procedures performed by a single surgeon; the second was a case series of consecutive laparoscopic cholecystectomy procedures performed by ten surgeons; the third was randomised trial data derived from the laparoscopic procedure arm of a multicentre trial of groin hernia repair, supplemented by data from non-randomised operations performed during the trial. RESULTS - HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW: Of 4571 abstracts identified, 272 (6%) were later included in the study after review of the full paper. Some 51% of studies assessed a surgical minimal access technique and 95% were case series. The statistical method used most often (60%) was splitting the data into consecutive parts (such as halves or thirds), with only 14% attempting a more formal statistical analysis. The reporting of the studies was poor, with 31% giving no details of data collection methods. RESULTS - NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH: Of 9431 abstracts assessed, 115 (1%) were deemed appropriate for further investigation and, of these, 18 were included in the study. All of the methods for complex data sets were identified in the non-clinical literature. These were discriminant analysis, two-stage estimation of learning rates, generalised estimating equations, multilevel models, latent curve models, time series models and stochastic parameter models. In addition, eight new shapes of learning curves were identified. RESULTS - TESTING OF STATISTICAL METHODS: No one particular shape of learning curve performed significantly better than another. 
The performance of 'operation time' as a proxy for learning differed between the three procedures. Multilevel modelling using the laparoscopic cholecystectomy data demonstrated and measured surgeon-specific and confounding effects. The inclusion of non-randomised cases, despite the possible limitations of the method, enhanced the interpretation of learning effects. CONCLUSIONS - HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW: The statistical methods used for assessing learning effects in health technology assessment have been crude and the reporting of studies poor. CONCLUSIONS - NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH: A number of statistical methods for assessing learning effects were identified that had not hitherto been used in health technology assessment. There was a hierarchy of methods for the identification and measurement of learning, and the more sophisticated methods for both have had little if any use in health technology assessment. This demonstrated the value of considering fields outside clinical research when addressing methodological issues in health technology assessment. CONCLUSIONS - TESTING OF STATISTICAL METHODS: It has been demonstrated that the portfolio of techniques identified can enhance investigations of learning curve effects. (ABSTRACT TRUNCATED)
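
    To make the two simplest techniques mentioned above concrete (splitting a case series into consecutive parts, and a more formal curve fit), here is a minimal sketch; the single-surgeon operation-time series and the power-law learning-curve form are simulated assumptions, not the trial data analysed in the report.

    ```python
    # Learning-curve sketch: compare early vs late cases and fit a power-law
    # operation-time curve, time = a * case_number^b. Simulated single-surgeon series.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    case_no = np.arange(1, 191)                                            # 190 consecutive procedures
    op_time = 120 * case_no**-0.15 * rng.lognormal(0, 0.10, case_no.size)  # minutes

    # (1) Crude split-half comparison, as in most published assessments
    first, second = op_time[:95], op_time[95:]
    t_stat, p_val = stats.ttest_ind(first, second, equal_var=False)
    print(f"first half {first.mean():.1f} min, second half {second.mean():.1f} min, p = {p_val:.3g}")

    # (2) Power-law fit on log-log scale: log(time) = log(a) + b*log(case_no)
    slope, intercept, r, p, se = stats.linregress(np.log(case_no), np.log(op_time))
    print(f"learning-rate exponent b = {slope:.3f} (se {se:.3f})")
    ```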

  1. Bayesian Sensitivity Analysis of Statistical Models with Missing Data

    PubMed Central

    ZHU, HONGTU; IBRAHIM, JOSEPH G.; TANG, NIANSHENG

    2013-01-01

    Methods for handling missing data depend strongly on the mechanism that generated the missing values, such as missing completely at random (MCAR) or missing at random (MAR), as well as other distributional and modeling assumptions at various stages. It is well known that the resulting estimates and tests may be sensitive to these assumptions as well as to outlying observations. In this paper, we introduce various perturbations to modeling assumptions and individual observations, and then develop a formal sensitivity analysis to assess these perturbations in the Bayesian analysis of statistical models with missing data. We develop a geometric framework, called the Bayesian perturbation manifold, to characterize the intrinsic structure of these perturbations. We propose several intrinsic influence measures to perform sensitivity analysis and quantify the effect of various perturbations to statistical models. We use the proposed sensitivity analysis procedure to systematically investigate the tenability of the non-ignorable missing at random (NMAR) assumption. Simulation studies are conducted to evaluate our methods, and a dataset is analyzed to illustrate the use of our diagnostic measures. PMID:24753718

  2. Studies in Non-Equilibrium Statistical Mechanics.

    DTIC Science & Technology

    1982-09-01

    ...in the formalism, and this is used to simulate the effects of rotational states and collisions. At each stochastic step the energy changes in the... uses of this method. A Scaling Theoretical Analysis of Vibrational Relaxation Experiments: Rotational Effects and Long-Range Collisions: ...include rotational effects through the rotational energy gaps and the rotational distributions. The variables in this theory are a fundamental set...

  3. Labor Force Participation in Formal Work-Related Education in 2000-01. Statistical Analysis Report. NCES 2005-048

    ERIC Educational Resources Information Center

    Hudson, Lisa; Bhandari, Rajika; Peter, Katharin; Bills, David B.

    2005-01-01

    Of the many purposes education serves in society, one of the most important is to prepare people for work. In today's economy, education is important not just to help adults enter the labor market, but also to ensure that adults remain marketable throughout their working lives. This report examines how adults in the labor force use formal…

  4. Annotating spatio-temporal datasets for meaningful analysis in the Web

    NASA Astrophysics Data System (ADS)

    Stasch, Christoph; Pebesma, Edzer; Scheider, Simon

    2014-05-01

    More and more environmental datasets that vary in space and time are available on the Web. This brings the advantage that the data can be used for purposes other than originally foreseen, but also the danger that users may apply inappropriate analysis procedures because they are unaware of important assumptions made during the data collection process. In order to guide towards a meaningful (statistical) analysis of spatio-temporal datasets available on the Web, we have developed, in our previous work [1], a Higher-Order-Logic formalism that captures some relevant assumptions. It allows proofs about meaningful spatial prediction and aggregation to be carried out in a semi-automated fashion. In this poster presentation, we will present a concept for annotating spatio-temporal datasets available on the Web with concepts defined in our formalism. Therefore, we have defined a subset of the formalism as a Web Ontology Language (OWL) pattern. It allows capturing the distinction between the different spatio-temporal variable types, i.e. point patterns, fields, lattices and trajectories, which in turn determine whether a particular dataset can be interpolated or aggregated in a meaningful way using a certain procedure. The actual annotations that link spatio-temporal datasets with the concepts in the ontology pattern are provided as Linked Data. In order to allow data producers to add the annotations to their datasets, we have implemented a Web portal that uses a triple store at the backend to store the annotations and to make them available in the Linked Data cloud. Furthermore, we have implemented functions in the statistical environment R to retrieve the RDF annotations and, based on these annotations, to support a stronger typing of spatio-temporal datatypes guiding towards a meaningful analysis in R. [1] Stasch, C., Scheider, S., Pebesma, E., Kuhn, W. (2014): "Meaningful spatial prediction and aggregation", Environmental Modelling & Software, 51, 149-165.

  5. Discrete Mathematical Approaches to Graph-Based Traffic Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joslyn, Cliff A.; Cowley, Wendy E.; Hogan, Emilie A.

    2014-04-01

    Modern cyber defense and analytics require general, formal models of cyber systems. Multi-scale network models are prime candidates for such formalisms, using discrete mathematical methods based on hierarchically structured directed multigraphs which also include rich sets of labels. An exemplar of an application of such an approach is traffic analysis, that is, observing and analyzing connections between clients, servers, hosts, and actors within IP networks, over time, to identify characteristic or suspicious patterns. Towards that end, NetFlow (or more generically, IPFLOW) data are available from routers and servers which summarize coherent groups of IP packets flowing through the network. In this paper, we consider traffic analysis of Netflow using both basic graph statistics and two new mathematical measures involving labeled degree distributions and time interval overlap measures. We do all of this over the VAST test data set of 96M synthetic Netflow graph edges, against which we can identify characteristic patterns of simulated ground-truth network attacks.
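
    A minimal sketch of one of the basic graph statistics referred to above, a degree distribution computed from NetFlow-style source-destination records; the toy edge list is invented for illustration and is not drawn from the VAST data set.

    ```python
    # Degree-distribution sketch over a toy NetFlow-like edge list (src -> dst).
    from collections import Counter

    flows = [
        ("10.0.0.1", "10.0.0.7"), ("10.0.0.1", "10.0.0.9"),
        ("10.0.0.2", "10.0.0.7"), ("10.0.0.3", "10.0.0.7"),
        ("10.0.0.1", "10.0.0.7"),           # repeated flow between the same pair
    ]

    out_degree = Counter(src for src, _ in flows)   # connections initiated per host
    in_degree = Counter(dst for _, dst in flows)    # connections received per host

    # Distribution of in-degrees: how many hosts receive k flows
    in_degree_distribution = Counter(in_degree.values())
    print("out-degree:", dict(out_degree))
    print("in-degree distribution:", dict(in_degree_distribution))
    ```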

  6. Statistical analysis of the determinations of the Sun's Galactocentric distance

    NASA Astrophysics Data System (ADS)

    Malkin, Zinovy

    2013-02-01

    Based on several tens of R0 measurements made during the past two decades, several studies have been performed to derive the best estimate of R0. Some used just simple averaging to derive a result, whereas others provided comprehensive analyses of possible errors in published results. In either case, detailed statistical analyses of data used were not performed. However, a computation of the best estimates of the Galactic rotation constants is not only an astronomical but also a metrological task. Here we perform an analysis of 53 R0 measurements (published in the past 20 years) to assess the consistency of the data. Our analysis shows that they are internally consistent. It is also shown that any trend in the R0 estimates from the last 20 years is statistically negligible, which renders the presence of a bandwagon effect doubtful. On the other hand, the formal errors in the published R0 estimates improve significantly with time.
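
    The consistency checks described above can be illustrated with an inverse-variance weighted mean of published R0 values and a trend test against publication year; the values below are invented placeholders, not the 53 measurements analysed in the paper.

    ```python
    # Weighted-mean and trend-test sketch for a set of R0 estimates (kpc).
    import numpy as np
    from scipy import stats

    year = np.array([1995, 1999, 2003, 2007, 2011, 2013])
    r0 = np.array([7.9, 8.2, 8.0, 8.4, 8.3, 8.2])        # hypothetical estimates, kpc
    sigma = np.array([0.8, 0.6, 0.5, 0.4, 0.35, 0.3])    # their formal errors

    w = 1.0 / sigma**2
    r0_mean = np.sum(w * r0) / np.sum(w)
    r0_err = np.sqrt(1.0 / np.sum(w))
    print(f"weighted mean R0 = {r0_mean:.2f} +/- {r0_err:.2f} kpc")

    # Trend with publication time (a bandwagon effect would show as a significant slope)
    slope, intercept, r, p, se = stats.linregress(year, r0)
    print(f"slope = {slope:.4f} kpc/yr, p = {p:.2f}")
    ```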

  7. The Influence of 16-year-old Students' Gender, Mental Abilities, and Motivation on their Reading and Drawing Submicrorepresentations Achievements

    NASA Astrophysics Data System (ADS)

    Devetak, Iztok; Aleksij Glažar, Saša

    2010-08-01

    Submicrorepresentations (SMRs) are a powerful tool for identifying misconceptions of chemical concepts and for generating proper mental models of chemical phenomena in students' long-term memory during chemical education. The main purpose of the study was to determine which independent variables (gender, formal reasoning abilities, visualization abilities, and intrinsic motivation for learning chemistry) have the maximum influence on students' reading and drawing SMRs. A total of 386 secondary school students (aged 16.3 years) participated in the study. The instruments used in the study were: test of Chemical Knowledge, Test of Logical Thinking, two tests of visualization abilities Patterns and Rotations, and questionnaire on Intrinsic Motivation for Learning Science. The results show moderate, but statistically significant correlations between students' intrinsic motivation, formal reasoning abilities and chemical knowledge at submicroscopic level based on reading and drawing SMRs. Visualization abilities are not statistically significantly correlated with students' success on items that comprise reading or drawing SMRs. It can be also concluded that there is a statistically significant difference between male and female students in solving problems that include reading or drawing SMRs. Based on these statistical results and content analysis of the sample problems, several educational strategies can be implemented for students to develop adequate mental models of chemical concepts on all three levels of representations.

  8. The inner mass power spectrum of galaxies using strong gravitational lensing: beyond linear approximation

    NASA Astrophysics Data System (ADS)

    Chatterjee, Saikat; Koopmans, Léon V. E.

    2018-02-01

    In the last decade, the detection of individual massive dark matter sub-haloes has been possible using potential correction formalism in strong gravitational lens imaging. Here, we propose a statistical formalism to relate strong gravitational lens surface brightness anomalies to the lens potential fluctuations arising from dark matter distribution in the lens galaxy. We consider these fluctuations as a Gaussian random field in addition to the unperturbed smooth lens model. This is very similar to weak lensing formalism and we show that in this way we can measure the power spectrum of these perturbations to the potential. We test the method by applying it to simulated mock lenses of different geometries and by performing an MCMC analysis of the theoretical power spectra. This method can measure density fluctuations in early type galaxies on scales of 1-10 kpc at typical rms levels of a per cent, using a single lens system observed with the Hubble Space Telescope with typical signal-to-noise ratios obtained in a single orbit.
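
    Because the potential fluctuations above are modelled as a Gaussian random field with a specified power spectrum, the sketch below shows one standard way to realise such a field on a grid with the FFT; the power-law spectrum, grid size, and normalisation are arbitrary assumptions for illustration, not the paper's lensing model.

    ```python
    # Sketch: generate a 2-D Gaussian random field with power spectrum P(k) ~ k^-2.
    import numpy as np

    rng = np.random.default_rng(2)
    n = 256                                        # grid size (pixels)

    kx = np.fft.fftfreq(n)
    ky = np.fft.fftfreq(n)
    k = np.sqrt(kx[None, :]**2 + ky[:, None]**2)
    k[0, 0] = 1.0                                  # avoid division by zero at k = 0

    power = k**-2.0                                # assumed power-law spectrum
    power[0, 0] = 0.0                              # remove the mean mode

    # Colour white Gaussian noise in Fourier space by sqrt(P(k)), then transform back
    white = np.fft.fft2(rng.normal(size=(n, n)))
    field = np.fft.ifft2(white * np.sqrt(power)).real

    print("field rms:", field.std())
    ```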

  9. Analysis and meta-analysis of single-case designs with a standardized mean difference statistic: a primer and applications.

    PubMed

    Shadish, William R; Hedges, Larry V; Pustejovsky, James E

    2014-04-01

    This article presents a d-statistic for single-case designs that is in the same metric as the d-statistic used in between-subjects designs such as randomized experiments and offers some reasons why such a statistic would be useful in SCD research. The d has a formal statistical development, is accompanied by appropriate power analyses, and can be estimated using user-friendly SPSS macros. We discuss both advantages and disadvantages of d compared to other approaches such as previous d-statistics, overlap statistics, and multilevel modeling. It requires at least three cases for computation and assumes normally distributed outcomes and stationarity, assumptions that are discussed in some detail. We also show how to test these assumptions. The core of the article then demonstrates in depth how to compute d for one study, including estimation of the autocorrelation and the ratio of between case variance to total variance (between case plus within case variance), how to compute power using a macro, and how to use the d to conduct a meta-analysis of studies using single-case designs in the free program R, including syntax in an appendix. This syntax includes how to read data, compute fixed and random effect average effect sizes, prepare a forest plot and a cumulative meta-analysis, estimate various influence statistics to identify studies contributing to heterogeneity and effect size, and do various kinds of publication bias analyses. This d may prove useful for both the analysis and meta-analysis of data from SCDs. Copyright © 2013 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.
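
    The pooling step described above can be sketched with the standard inverse-variance formulas for fixed-effect and DerSimonian-Laird random-effects meta-analysis; the effect sizes and variances below are invented, and this plain Python version merely stands in for the R syntax supplied in the article's appendix.

    ```python
    # Inverse-variance meta-analysis sketch: fixed-effect and DerSimonian-Laird pooling.
    import numpy as np

    d = np.array([0.45, 0.80, 0.30, 0.65, 0.52])     # hypothetical d per study
    v = np.array([0.04, 0.09, 0.05, 0.06, 0.03])     # their sampling variances

    # Fixed-effect pooling
    w = 1.0 / v
    d_fixed = np.sum(w * d) / np.sum(w)
    se_fixed = np.sqrt(1.0 / np.sum(w))

    # DerSimonian-Laird estimate of between-study variance tau^2
    q = np.sum(w * (d - d_fixed)**2)
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (d.size - 1)) / c)

    # Random-effects pooling
    w_re = 1.0 / (v + tau2)
    d_random = np.sum(w_re * d) / np.sum(w_re)
    se_random = np.sqrt(1.0 / np.sum(w_re))

    print(f"fixed effect:   d = {d_fixed:.3f} (SE {se_fixed:.3f})")
    print(f"random effects: d = {d_random:.3f} (SE {se_random:.3f}), tau^2 = {tau2:.3f}")
    ```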

  10. Illicit and pharmaceutical drug consumption estimated via wastewater analysis. Part B: placing back-calculations in a formal statistical framework.

    PubMed

    Jones, Hayley E; Hickman, Matthew; Kasprzyk-Hordern, Barbara; Welton, Nicky J; Baker, David R; Ades, A E

    2014-07-15

    Concentrations of metabolites of illicit drugs in sewage water can be measured with great accuracy and precision, thanks to the development of sensitive and robust analytical methods. Based on assumptions about factors including the excretion profile of the parent drug, routes of administration and the number of individuals using the wastewater system, the level of consumption of a drug can be estimated from such measured concentrations. When presenting results from these 'back-calculations', the multiple sources of uncertainty are often discussed, but are not usually explicitly taken into account in the estimation process. In this paper we demonstrate how these calculations can be placed in a more formal statistical framework by assuming a distribution for each parameter involved, based on a review of the evidence underpinning it. Using a Monte Carlo simulation approach, it is then straightforward to propagate uncertainty in each parameter through the back-calculations, producing a distribution for, instead of a single estimate of, daily or average consumption. This can be summarised, for example, by a median and credible interval. To demonstrate this approach, we estimate cocaine consumption in a large urban UK population, using measured concentrations of two of its metabolites, benzoylecgonine and norbenzoylecgonine. We also demonstrate a more sophisticated analysis, implemented within a Bayesian statistical framework using Markov chain Monte Carlo simulation. Our model allows the two metabolites to simultaneously inform estimates of daily cocaine consumption and explicitly allows for variability between days. After accounting for this variability, the resulting credible interval for average daily consumption is appropriately wider, representing additional uncertainty. We discuss possibilities for extensions to the model, and whether analysis of wastewater samples has potential to contribute to a prevalence model for illicit drug use. Copyright © 2014. Published by Elsevier B.V.
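
    The Monte Carlo propagation described above reduces to a few lines of code; every input value and distribution below (metabolite concentration, flow, excretion fraction, molar-mass correction, population served) is a made-up placeholder standing in for the evidence-based distributions reviewed in the paper.

    ```python
    # Monte Carlo sketch: propagate parameter uncertainty through a wastewater
    # back-calculation of drug consumption. All inputs are hypothetical.
    import numpy as np

    rng = np.random.default_rng(3)
    n_sim = 100_000

    conc_ng_per_l = rng.normal(800, 60, n_sim)        # metabolite concentration, ng/L
    flow_l_per_day = rng.normal(3.0e8, 2.0e7, n_sim)  # daily wastewater flow, L/day
    excretion_frac = rng.uniform(0.25, 0.45, n_sim)   # fraction excreted as metabolite
    mass_ratio = 303.35 / 289.33                      # parent/metabolite molar-mass ratio
    population = rng.normal(9.0e5, 5.0e4, n_sim)      # people served by the plant

    load_mg_per_day = conc_ng_per_l * flow_l_per_day * 1e-6       # ng/day -> mg/day
    consumption_mg = load_mg_per_day / excretion_frac * mass_ratio
    per_1000 = consumption_mg / (population / 1000.0)             # mg/day/1000 people

    median = np.median(per_1000)
    lo, hi = np.percentile(per_1000, [2.5, 97.5])
    print(f"median {median:.0f} mg/day/1000 inhabitants (95% interval {lo:.0f}-{hi:.0f})")
    ```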

  11. Illicit and pharmaceutical drug consumption estimated via wastewater analysis. Part B: Placing back-calculations in a formal statistical framework

    PubMed Central

    Jones, Hayley E.; Hickman, Matthew; Kasprzyk-Hordern, Barbara; Welton, Nicky J.; Baker, David R.; Ades, A.E.

    2014-01-01

    Concentrations of metabolites of illicit drugs in sewage water can be measured with great accuracy and precision, thanks to the development of sensitive and robust analytical methods. Based on assumptions about factors including the excretion profile of the parent drug, routes of administration and the number of individuals using the wastewater system, the level of consumption of a drug can be estimated from such measured concentrations. When presenting results from these ‘back-calculations’, the multiple sources of uncertainty are often discussed, but are not usually explicitly taken into account in the estimation process. In this paper we demonstrate how these calculations can be placed in a more formal statistical framework by assuming a distribution for each parameter involved, based on a review of the evidence underpinning it. Using a Monte Carlo simulation approach, it is then straightforward to propagate uncertainty in each parameter through the back-calculations, producing a distribution for, instead of a single estimate of, daily or average consumption. This can be summarised, for example, by a median and credible interval. To demonstrate this approach, we estimate cocaine consumption in a large urban UK population, using measured concentrations of two of its metabolites, benzoylecgonine and norbenzoylecgonine. We also demonstrate a more sophisticated analysis, implemented within a Bayesian statistical framework using Markov chain Monte Carlo simulation. Our model allows the two metabolites to simultaneously inform estimates of daily cocaine consumption and explicitly allows for variability between days. After accounting for this variability, the resulting credible interval for average daily consumption is appropriately wider, representing additional uncertainty. We discuss possibilities for extensions to the model, and whether analysis of wastewater samples has potential to contribute to a prevalence model for illicit drug use. PMID:24636801

  12. The Schrödinger–Langevin equation with and without thermal fluctuations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katz, R., E-mail: roland.katz@subatech.in2p3.fr; Gossiaux, P.B., E-mail: Pol-Bernard.Gossiaux@subatech.in2p3.fr

    2016-05-15

    The Schrödinger–Langevin equation (SLE) is considered as an effective open quantum system formalism suitable for phenomenological applications involving a quantum subsystem interacting with a thermal bath. We focus on two open issues concerning its solutions: the stationarity of the excited states of the non-interacting subsystem when one considers the dissipation only, and the thermal relaxation toward asymptotic distributions with the additional stochastic term. We first show that a proper application of the Madelung/polar transformation of the wave function leads to a nonzero damping of the excited states of the quantum subsystem. We then study analytically and numerically the ability of the SLE to bring a quantum subsystem to the thermal equilibrium of statistical mechanics. To do so, concepts about statistical mixed states and quantum noises are discussed and a detailed analysis is carried out with two kinds of noise and potential. We show that within our assumptions the use of the SLE as an effective open quantum system formalism is possible and discuss some of its limitations.

  13. Spatial analysis on future housing markets: economic development and housing implications.

    PubMed

    Liu, Xin; Wang, Lizhe

    2014-01-01

    A coupled projection method combining formal modelling and other statistical techniques was developed to delineate the relationship between economic and social drivers for net new housing allocations. Using the example of employment growth in Tyne and Wear, UK, until 2016, the empirical analysis yields housing projections at the macro- and microspatial levels (e.g., region to subregion to elected ward levels). The results have important implications for the strategic planning of locations for housing and employment, demonstrating both intuitively and quantitatively how local economic developments affect housing demand.

  14. Spatial Analysis on Future Housing Markets: Economic Development and Housing Implications

    PubMed Central

    Liu, Xin; Wang, Lizhe

    2014-01-01

    A coupled projection method combining formal modelling and other statistical techniques was developed to delineate the relationship between economic and social drivers for net new housing allocations. Using the example of employment growth in Tyne and Wear, UK, until 2016, the empirical analysis yields housing projections at the macro- and microspatial levels (e.g., region to subregion to elected ward levels). The results have important implications for the strategic planning of locations for housing and employment, demonstrating both intuitively and quantitatively how local economic developments affect housing demand. PMID:24892097

  15. Variation in reaction norms: Statistical considerations and biological interpretation.

    PubMed

    Morrissey, Michael B; Liefting, Maartje

    2016-09-01

    Analysis of reaction norms, the functions by which the phenotype produced by a given genotype depends on the environment, is critical to studying many aspects of phenotypic evolution. Different techniques are available for quantifying different aspects of reaction norm variation. We examine what biological inferences can be drawn from some of the more readily applicable analyses for studying reaction norms. We adopt a strongly biologically motivated view, but draw on statistical theory to highlight strengths and drawbacks of different techniques. In particular, consideration of some formal statistical theory leads to revision of some recently, and forcefully, advocated opinions on reaction norm analysis. We clarify what simple analysis of the slope between mean phenotype in two environments can tell us about reaction norms, explore the conditions under which polynomial regression can provide robust inferences about reaction norm shape, and explore how different existing approaches may be used to draw inferences about variation in reaction norm shape. We show how mixed model-based approaches can provide more robust inferences than more commonly used multistep statistical approaches, and derive new metrics of the relative importance of variation in reaction norm intercepts, slopes, and curvatures. © 2016 The Author(s). Evolution © 2016 The Society for the Study of Evolution.

  16. Multi-objective calibration and uncertainty analysis of hydrologic models; A comparative study between formal and informal methods

    NASA Astrophysics Data System (ADS)

    Shafii, M.; Tolson, B.; Matott, L. S.

    2012-04-01

    Hydrologic modeling has benefited from significant developments over the past two decades. This has resulted in building of higher levels of complexity into hydrologic models, which eventually makes the model evaluation process (parameter estimation via calibration and uncertainty analysis) more challenging. In order to avoid unreasonable parameter estimates, many researchers have suggested implementation of multi-criteria calibration schemes. Furthermore, for predictive hydrologic models to be useful, proper consideration of uncertainty is essential. Consequently, recent research has emphasized comprehensive model assessment procedures in which multi-criteria parameter estimation is combined with statistically-based uncertainty analysis routines such as Bayesian inference using Markov Chain Monte Carlo (MCMC) sampling. Such a procedure relies on the use of formal likelihood functions based on statistical assumptions, and moreover, the Bayesian inference structured on MCMC samplers requires a considerably large number of simulations. Due to these issues, especially in complex non-linear hydrological models, a variety of alternative informal approaches have been proposed for uncertainty analysis in the multi-criteria context. This study aims at exploring a number of such informal uncertainty analysis techniques in multi-criteria calibration of hydrological models. The informal methods addressed in this study are (i) Pareto optimality which quantifies the parameter uncertainty using the Pareto solutions, (ii) DDS-AU which uses the weighted sum of objective functions to derive the prediction limits, and (iii) GLUE which describes the total uncertainty through identification of behavioral solutions. The main objective is to compare such methods with MCMC-based Bayesian inference with respect to factors such as computational burden, and predictive capacity, which are evaluated based on multiple comparative measures. The measures for comparison are calculated both for calibration and evaluation periods. The uncertainty analysis methodologies are applied to a simple 5-parameter rainfall-runoff model, called HYMOD.
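
    As a hedged illustration of the GLUE procedure named above (sample parameters, score each set with an informal likelihood, retain the behavioural sets, and read prediction limits off the weighted ensemble), here is a minimal sketch on a toy one-parameter runoff model rather than the 5-parameter HYMOD used in the study.

    ```python
    # GLUE sketch: behavioural parameter sets and likelihood-weighted prediction limits
    # for a toy one-parameter runoff model (not HYMOD). Illustrative only.
    import numpy as np

    rng = np.random.default_rng(4)
    rain = rng.uniform(0, 20, 50)                   # synthetic forcing
    q_obs = 0.6 * rain + rng.normal(0, 1.0, 50)     # synthetic "observed" runoff

    # 1) Monte Carlo sampling of the runoff coefficient
    coeffs = rng.uniform(0.0, 1.5, 5000)
    sims = coeffs[:, None] * rain[None, :]          # one simulated series per sample

    # 2) Informal likelihood: Nash-Sutcliffe efficiency of each parameter set
    sse = np.sum((sims - q_obs) ** 2, axis=1)
    nse = 1.0 - sse / np.sum((q_obs - q_obs.mean()) ** 2)

    # 3) Retain behavioural sets and normalise their likelihoods into weights
    keep = nse > 0.7
    weights = nse[keep] / nse[keep].sum()

    # 4) Likelihood-weighted 5-95% prediction limits at each time step
    def weighted_quantile(values, q, w):
        order = np.argsort(values)
        cw = np.cumsum(w[order])
        return np.interp(q, cw / cw[-1], values[order])

    lower = np.array([weighted_quantile(sims[keep, i], 0.05, weights) for i in range(rain.size)])
    upper = np.array([weighted_quantile(sims[keep, i], 0.95, weights) for i in range(rain.size)])
    print(f"behavioural sets: {keep.sum()}, mean 5-95% band width: {(upper - lower).mean():.2f}")
    ```

    The behavioural threshold (here a Nash-Sutcliffe efficiency of 0.7) is a subjective choice, which is exactly the kind of informality the study contrasts with formal Bayesian likelihoods.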

  17. The redefinition of the familialist home care model in France: the complex formalization of care through cash payment.

    PubMed

    Le Bihan, Blanche

    2012-05-01

    This article investigates the impact of policy measures on the organisation of home-based care for older people in France, by examining the balance between formal and informal care and the redefinition of the initial familialist model. It focuses on the specific cash for care scheme (the Allocation personnalisée d'autonomie - Personalised allowance for autonomy) which is at the core of the French home-based care policy. The author argues that in a redefined context of 'welfare mix', the French public strategy for supporting home-based care in France is articulated around two major objectives, which can appear contradictory. It aims to formalise a professional care sector, with respect to the employment policy while allowing the development of new forms of informal care, which cannot be considered to be formal employment. The data collection is two-fold. Firstly, a detailed analysis was made of different policy documents and public reports, together with a systematic review of existing studies. Secondly, statistical analysis on home-based care resources were collected, which was not easy, as home-care services for older people in France are part of a larger sector of activity, 'personal services' (services à la personne). The article exposes three main findings. First, it highlights the complexity of the formalisation process related to the introduction of the French care allowance and demonstrates that formalisation, which facilitates the recognition of care as work, does not necessarily mean professionalisation. Second, it outlines the diversity of the resources available: heterogeneous professional care, semi-formal forms of care work with the possibility to employ a relative and informal family care. Finally, the analysis outlines the importance of the regulation of cash payments on the reshaping of formal and informal care and comments on its impact on the redefinition of informal caring activities. © 2012 Blackwell Publishing Ltd.

  18. Master equation theory applied to the redistribution of polarized radiation in the weak radiation field limit. V. The two-term atom

    NASA Astrophysics Data System (ADS)

    Bommier, Véronique

    2017-11-01

    Context. In previous papers of this series, we presented a formalism able to account for both statistical equilibrium of a multilevel atom and coherent and incoherent scattering (partial redistribution). Aims: This paper provides theoretical expressions of the redistribution function for the two-term atom. This redistribution function includes both coherent (RII) and incoherent (RIII) scattering contributions with their branching ratios. Methods: The expressions were derived by applying the formalism outlined above. The statistical equilibrium equation for the atomic density matrix is first formally solved in the case of the two-term atom with unpolarized and infinitely sharp lower levels. Then the redistribution function is derived by substituting this solution into the expression for the emissivity. Results: Expressions are provided for both magnetic and non-magnetic cases. Atomic fine structure is taken into account. Expressions are also separately provided under zero and non-zero hyperfine structure. Conclusions: Redistribution functions are widely used in radiative transfer codes. In our formulation, collisional transitions between Zeeman sublevels within an atomic level (the depolarizing collision effect) are taken into account when possible (i.e., in the non-magnetic case). However, the need for a formal solution of the statistical equilibrium as a preliminary step prevents us from taking into account collisional transfers between the levels of the upper term. Accounting for these collisional transfers could be done via a numerical solution of the statistical equilibrium equation system.

  19. Influence of Culture on Secondary School Students' Understanding of Statistics: A Fijian Perspective

    ERIC Educational Resources Information Center

    Sharma, Sashi

    2014-01-01

    Although we use statistical notions daily in making decisions, research in statistics education has focused mostly on formal statistics. Further, everyday culture may influence informal ideas of statistics. Yet, there appears to be minimal literature that deals with the educational implications of the role of culture. This paper will discuss the…

  20. An astronomer's guide to period searching

    NASA Astrophysics Data System (ADS)

    Schwarzenberg-Czerny, A.

    2003-03-01

    We concentrate on analysis of unevenly sampled time series, interrupted by periodic gaps, as often encountered in astronomy. While some of our conclusions may appear surprising, all are based on classical statistical principles of Fisher and his successors. Except for discussion of the resolution issues, it is best for the reader to forget temporarily about Fourier transforms and to concentrate on problems of fitting a time series with a model curve. According to their statistical content we divide the issues into several sections, consisting of: (ii) statistical and numerical aspects of model fitting, (iii) evaluation of fitted models as hypothesis testing, (iv) the role of orthogonal models in signal detection, (v) conditions for equivalence of periodograms, and (vi) rating sensitivity by test power. An experienced observer working with individual objects would benefit little from a formalized statistical approach. However, we demonstrate the usefulness of this approach in evaluating the performance of periodograms and in the quantitative design of large variability surveys.
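
    A minimal illustration of period searching in an unevenly sampled series, using the Lomb-Scargle periodogram from scipy as a stand-in for the periodograms discussed above; the sampling pattern, true period, and noise level are assumptions.

    ```python
    # Period-search sketch: Lomb-Scargle periodogram of an unevenly sampled light curve.
    import numpy as np
    from scipy.signal import lombscargle

    rng = np.random.default_rng(5)
    t = np.sort(rng.uniform(0, 100, 300))            # irregular observation times (days)
    true_period = 7.3
    y = 0.5 * np.sin(2 * np.pi * t / true_period) + rng.normal(0, 0.2, t.size)

    periods = np.linspace(2.0, 20.0, 4000)
    ang_freqs = 2 * np.pi / periods                  # lombscargle expects angular frequencies
    power = lombscargle(t, y - y.mean(), ang_freqs)

    best = periods[np.argmax(power)]
    print(f"best-fitting period: {best:.2f} d (true {true_period} d)")
    ```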

  1. Meta-analysis of diagnostic test data: a bivariate Bayesian modeling approach.

    PubMed

    Verde, Pablo E

    2010-12-30

    In recent decades, the amount of published results on clinical diagnostic tests has expanded very rapidly. The counterpart to this development has been the formal evaluation and synthesis of diagnostic results. However, published results present substantial heterogeneity and can be regarded as so far removed from the classical domain of meta-analysis that they provide a rather severe test of classical statistical methods. Recently, bivariate random effects meta-analytic methods, which model the pairs of sensitivities and specificities, have been presented from the classical point of view. In this work a bivariate Bayesian modeling approach is presented. This approach substantially extends the scope of classical bivariate methods by allowing the structural distribution of the random effects to depend on multiple sources of variability. Meta-analysis is summarized by the predictive posterior distributions for sensitivity and specificity. This new approach also allows substantial model checking, model diagnostics, and model selection. Statistical computations are implemented in public domain statistical software (WinBUGS and R) and illustrated with real data examples. Copyright © 2010 John Wiley & Sons, Ltd.

  2. University of California Conference on Statistical Mechanics (4th) Held March 26-28, 1990

    DTIC Science & Technology

    1990-03-28

    ...and S. Lago, Chem. Phys. (1983). Shear Viscosity Calculation via Equilibrium Molecular Dynamics: Einsteinian vs. Green-Kubo Formalism, by Adel A... ...through the application of the Green-Kubo approach. Although the theoretical equivalence between both formalisms was demonstrated by Helfand [3], their... ...like equations and of different expressions based on the Green-Kubo formalism. In contrast to Hoheisel and Vogelsang's conclusions [2], we find that...

  3. Statistical mechanics of few-particle systems: exact results for two useful models

    NASA Astrophysics Data System (ADS)

    Miranda, Enrique N.

    2017-11-01

    The statistical mechanics of small clusters (n ~ 10-50 elements) of harmonic oscillators and two-level systems is studied exactly, following the microcanonical, canonical and grand canonical formalisms. For clusters with several hundred particles, the results from the three formalisms coincide with those found in the thermodynamic limit. However, for clusters formed by a few tens of elements, the three ensembles yield different results. For a cluster with a few tens of harmonic oscillators, when the heat capacity per oscillator is evaluated within the canonical formalism, it reaches a limit value equal to k_B, as in the thermodynamic case, while within the microcanonical formalism the limit value is k_B(1 - 1/n). This difference could be measured experimentally. For a cluster with a few tens of two-level systems, the heat capacity evaluated within the canonical and microcanonical ensembles also presents differences that could be detected experimentally. Both the microcanonical and grand canonical formalisms show that the entropy is non-additive for systems this small, while the canonical ensemble reaches the opposite conclusion. These results suggest that the microcanonical ensemble is the most appropriate for dealing with systems with tens of particles.
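
    The canonical branch of the comparison above can be sketched directly from the partition function: the heat capacity per oscillator of independent quantum harmonic oscillators, computed from energy fluctuations, tends to k_B at high temperature. The microcanonical k_B(1 - 1/n) result quoted above requires the exact microcanonical entropy and is not reproduced here; units and the level truncation are arbitrary choices.

    ```python
    # Canonical-ensemble sketch: heat capacity per oscillator of independent quantum
    # harmonic oscillators, from energy fluctuations. Units: k_B = hbar*omega = 1.
    import numpy as np

    levels = np.arange(0, 400)          # oscillator quantum numbers m
    energy = levels + 0.5               # E_m = m + 1/2 in units of hbar*omega

    def heat_capacity_per_oscillator(T):
        w = np.exp(-energy / T)
        w /= w.sum()                    # Boltzmann probabilities
        e_mean = np.sum(w * energy)
        e2_mean = np.sum(w * energy**2)
        return (e2_mean - e_mean**2) / T**2   # C/k_B from energy fluctuations

    for T in (0.5, 1.0, 5.0, 20.0):
        print(f"T = {T:5.1f}: C per oscillator = {heat_capacity_per_oscillator(T):.3f} k_B")
    # In the canonical ensemble this tends to 1 k_B at high T regardless of n;
    # the abstract's microcanonical result for a small cluster is k_B * (1 - 1/n) instead.
    ```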

  4. Extending Working Life: Which Competencies are Crucial in Near-Retirement Age?

    PubMed

    Wiktorowicz, Justyna

    2018-01-01

    Nowadays, one of the most important economic and social phenomena is population ageing. Due to the low activity rate of older people, one of the most important challenges is to take various actions involving active ageing, which are supposed to extend working life and, along with it, improve the competencies of older people. The aim of this paper is to evaluate the relevance of different competencies for extending working life, limiting the analysis to Poland. The paper also assesses the competencies of mature Polish people (aged 50+, but still of working age). In the statistical analysis, I used logistic regression, as well as descriptive statistics and appropriate statistical tests. The results show that among the actions aimed at extending working life, the most important are those related to lifelong learning, targeted at improving the competencies of the older generation. The competencies (both soft and hard) of people aged 50+ are more important than their formal education.

  5. In Search of Rationality: The Purposes behind the Use of Formal Analysis in Organizations.

    ERIC Educational Resources Information Center

    Langley, Ann

    1989-01-01

    Examines how formal analysis is actually practiced in 3 different organizations. Identifies 4 main groups of purposes for formal analysis and relates them to various hierarchical relationships. Formal analysis and social interaction seem inextricably linked in organizational decision-making. Different structural configurations may generate…

  6. A New Statistic for Evaluating Item Response Theory Models for Ordinal Data. CRESST Report 839

    ERIC Educational Resources Information Center

    Cai, Li; Monroe, Scott

    2014-01-01

    We propose a new limited-information goodness of fit test statistic C[subscript 2] for ordinal IRT models. The construction of the new statistic lies formally between the M[subscript 2] statistic of Maydeu-Olivares and Joe (2006), which utilizes first and second order marginal probabilities, and the M*[subscript 2] statistic of Cai and Hansen…

  7. Statistical analysis plan of the head position in acute ischemic stroke trial pilot (HEADPOST pilot).

    PubMed

    Olavarría, Verónica V; Arima, Hisatomi; Anderson, Craig S; Brunser, Alejandro; Muñoz-Venturelli, Paula; Billot, Laurent; Lavados, Pablo M

    2017-02-01

    Background The HEADPOST Pilot is a proof-of-concept, open, prospective, multicenter, international, cluster randomized, phase IIb controlled trial, with masked outcome assessment. The trial will test if lying flat head position initiated in patients within 12 h of onset of acute ischemic stroke involving the anterior circulation increases cerebral blood flow in the middle cerebral arteries, as measured by transcranial Doppler. The study will also assess the safety and feasibility of patients lying flat for ≥24 h. The trial was conducted in centers in three countries, with ability to perform early transcranial Doppler. A feature of this trial was that patients were randomized to a certain position according to the month of admission to hospital. Objective To outline in detail the predetermined statistical analysis plan for HEADPOST Pilot study. Methods All data collected by participating researchers will be reviewed and formally assessed. Information pertaining to the baseline characteristics of patients, their process of care, and the delivery of treatments will be classified, and for each item, appropriate descriptive statistical analyses are planned with comparisons made between randomized groups. For the outcomes, statistical comparisons to be made between groups are planned and described. Results This statistical analysis plan was developed for the analysis of the results of the HEADPOST Pilot study to be transparent, available, verifiable, and predetermined before data lock. Conclusions We have developed a statistical analysis plan for the HEADPOST Pilot study which is to be followed to avoid analysis bias arising from prior knowledge of the study findings. Trial registration The study is registered under HEADPOST-Pilot, ClinicalTrials.gov Identifier NCT01706094.

  8. Sequential Least-Squares Using Orthogonal Transformations. [spacecraft communication/spacecraft tracking-data smoothing

    NASA Technical Reports Server (NTRS)

    Bierman, G. J.

    1975-01-01

    Square root information estimation, starting from its beginnings in least-squares parameter estimation, is considered. Special attention is devoted to discussions of sensitivity and perturbation matrices, computed solutions and their formal statistics, consider-parameters and consider-covariances, and the effects of a priori statistics. The constant-parameter model is extended to include time-varying parameters and process noise, and the error analysis capabilities are generalized. Efficient and elegant smoothing results are obtained as easy consequences of the filter formulation. The value of the techniques is demonstrated by the navigation results that were obtained for the Mariner Venus-Mercury (Mariner 10) multiple-planetary space probe and for the Viking Mars space mission.
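
    The core idea above (sequential least squares by orthogonal transformations on a square-root information array) can be sketched with a QR re-triangularization at each measurement batch; this toy example with an invented measurement model is illustrative only and is not the Mariner or Viking navigation filter.

    ```python
    # Square-root information sketch: sequential least squares by QR re-triangularization.
    import numpy as np

    rng = np.random.default_rng(6)
    x_true = np.array([2.0, -1.0, 0.5])            # unknown constant parameters
    n_par = x_true.size

    # Information array [R | z]; start with a weak a priori (large variance).
    R = np.eye(n_par) * 1e-3
    z = np.zeros(n_par)

    for batch in range(5):                          # process observations in batches
        H = rng.normal(size=(10, n_par))            # measurement matrix for this batch
        y = H @ x_true + rng.normal(0, 0.1, 10)     # noisy measurements (sigma = 0.1)
        Hw, yw = H / 0.1, y / 0.1                   # whiten by the noise sigma

        stacked = np.vstack([np.column_stack([R, z]),
                             np.column_stack([Hw, yw])])
        Q, upper = np.linalg.qr(stacked)            # orthogonal transformation
        R, z = upper[:n_par, :n_par], upper[:n_par, n_par]

        x_hat = np.linalg.solve(R, z)               # current least-squares estimate
        print(f"after batch {batch + 1}: estimate = {np.round(x_hat, 3)}")

    cov = np.linalg.inv(R.T @ R)                    # formal covariance of the estimate
    print("formal std devs:", np.round(np.sqrt(np.diag(cov)), 4))
    ```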

  9. Analysis of Yb3+/Er3+-codoped microring resonator cross-grid matrices

    NASA Astrophysics Data System (ADS)

    Vallés, Juan A.; Gǎlǎtuş, Ramona

    2014-09-01

    An analytic model of the scattering response of a highly Yb3+/Er3+-codoped phosphate glass microring resonator matrix is considered to obtain the transfer functions of an M x N cross-grid microring resonator structure. Then a detailed model is used to calculate the pump and signal propagation, including a microscopic statistical formalism to describe the high-concentration induced energy-transfer mechanisms and passive and active features are combined to realistically simulate the performance as a wavelength-selective amplifier or laser. This analysis allows the optimization of these structures for telecom or sensing applications.

  10. The Ontology of Biological and Clinical Statistics (OBCS) for standardized and reproducible statistical analysis.

    PubMed

    Zheng, Jie; Harris, Marcelline R; Masci, Anna Maria; Lin, Yu; Hero, Alfred; Smith, Barry; He, Yongqun

    2016-09-14

    Statistics play a critical role in biological and clinical research. However, most reports of scientific results in the published literature make it difficult for the reader to reproduce the statistical analyses performed in achieving those results because they provide inadequate documentation of the statistical tests and algorithms applied. The Ontology of Biological and Clinical Statistics (OBCS) is put forward here as a step towards solving this problem. The terms in OBCS including 'data collection', 'data transformation in statistics', 'data visualization', 'statistical data analysis', and 'drawing a conclusion based on data', cover the major types of statistical processes used in basic biological research and clinical outcome studies. OBCS is aligned with the Basic Formal Ontology (BFO) and extends the Ontology of Biomedical Investigations (OBI), an OBO (Open Biological and Biomedical Ontologies) Foundry ontology supported by over 20 research communities. Currently, OBCS comprehends 878 terms, representing 20 BFO classes, 403 OBI classes, 229 OBCS specific classes, and 122 classes imported from ten other OBO ontologies. We discuss two examples illustrating how the ontology is being applied. In the first (biological) use case, we describe how OBCS was applied to represent the high throughput microarray data analysis of immunological transcriptional profiles in human subjects vaccinated with an influenza vaccine. In the second (clinical outcomes) use case, we applied OBCS to represent the processing of electronic health care data to determine the associations between hospital staffing levels and patient mortality. Our case studies were designed to show how OBCS can be used for the consistent representation of statistical analysis pipelines under two different research paradigms. Other ongoing projects using OBCS for statistical data processing are also discussed. The OBCS source code and documentation are available at: https://github.com/obcs/obcs . The Ontology of Biological and Clinical Statistics (OBCS) is a community-based open source ontology in the domain of biological and clinical statistics. OBCS is a timely ontology that represents statistics-related terms and their relations in a rigorous fashion, facilitates standard data analysis and integration, and supports reproducible biological and clinical research.

  11. Load Model Verification, Validation and Calibration Framework by Statistical Analysis on Field Data

    NASA Astrophysics Data System (ADS)

    Jiao, Xiangqing; Liao, Yuan; Nguyen, Thai

    2017-11-01

    Accurate load models are critical for power system analysis and operation. A large amount of research work has been done on load modeling. Most of the existing research focuses on developing load models, while little has been done on developing formal load model verification and validation (V&V) methodologies or procedures. Most of the existing load model validation is based on qualitative rather than quantitative analysis. In addition, not all aspects of model V&V problem have been addressed by the existing approaches. To complement the existing methods, this paper proposes a novel load model verification and validation framework that can systematically and more comprehensively examine load model's effectiveness and accuracy. Statistical analysis, instead of visual check, quantifies the load model's accuracy, and provides a confidence level of the developed load model for model users. The analysis results can also be used to calibrate load models. The proposed framework can be used as a guidance to systematically examine load models for utility engineers and researchers. The proposed method is demonstrated through analysis of field measurements collected from a utility system.

  12. Preferences for and Barriers to Formal and Informal Athletic Training Continuing Education Activities

    PubMed Central

    Armstrong, Kirk J.; Weidner, Thomas G.

    2011-01-01

    Context: Our previous research determined the frequency of participation and perceived effect of formal and informal continuing education (CE) activities. However, actual preferences for and barriers to CE must be characterized. Objective: To determine the types of formal and informal CE activities preferred by athletic trainers (ATs) and barriers to their participation in these activities. Design: Cross-sectional study. Setting: Athletic training practice settings. Patients or Other Participants: Of a geographically stratified random sample of 1000 ATs, 427 ATs (42.7%) completed the survey. Main Outcome Measure(s): As part of a larger study, the Survey of Formal and Informal Athletic Training Continuing Education Activities (FIATCEA) was developed and administered electronically. The FIATCEA consists of demographic characteristics and Likert scale items (1 = strongly disagree, 5 = strongly agree) about preferred CE activities and barriers to these activities. Internal consistency of survey items, as determined by Cronbach α, was 0.638 for preferred CE activities and 0.860 for barriers to these activities. Descriptive statistics were computed for all items. Differences between respondent demographic characteristics and preferred CE activities and barriers to these activities were determined via analysis of variance and dependent t tests. The α level was set at .05. Results: Hands-on clinical workshops and professional networking were the preferred formal and informal CE activities, respectively. The most frequently reported barriers to formal CE were the cost of attending and travel distance, whereas the most frequently reported barriers to informal CE were personal and job-specific factors. Differences were noted between both the cost of CE and travel distance to CE and all other barriers to CE participation (F1,411 = 233.54, P < .001). Conclusions: Overall, ATs preferred formal CE activities. The same barriers (eg, cost, travel distance) to formal CE appeared to be universal to all ATs. Informal CE was highly valued by ATs because it could be individualized. PMID:22488195

  13. Formal and Informal Continuing Education Activities and Athletic Training Professional Practice

    PubMed Central

    Armstrong, Kirk J.; Weidner, Thomas G.

    2010-01-01

    Abstract Context: Continuing education (CE) is intended to promote professional growth and, ultimately, to enhance professional practice. Objective: To determine certified athletic trainers' participation in formal (ie, approved for CE credit) and informal (ie, not approved for CE credit) CE activities and the perceived effect these activities have on professional practice with regard to improving knowledge, clinical skills and abilities, attitudes toward patient care, and patient care itself. Design: Cross-sectional study. Setting: Athletic training practice settings. Patients or Other Participants: Of a geographic, stratified random sample of 1000 athletic trainers, 427 (42.7%) completed the survey. Main Outcome Measure(s): The Survey of Formal and Informal Athletic Training Continuing Education Activities was developed and administered electronically. The survey consisted of demographic characteristics and Likert-scale items regarding CE participation and perceived effect of CE on professional practice. Internal consistency of survey items was determined using the Cronbach α (α  =  0.945). Descriptive statistics were computed for all items. An analysis of variance and dependent t tests were calculated to determine differences among respondents' demographic characteristics and their participation in, and perceived effect of, CE activities. The α level was set at .05. Results: Respondents completed more informal CE activities than formal CE activities. Participation in informal CE activities included reading athletic training journals (75.4%), whereas formal CE activities included attending a Board of Certification–approved workshop, seminar, or professional conference not conducted by the National Athletic Trainers' Association or affiliates or committees (75.6%). Informal CE activities were perceived to improve clinical skills or abilities and attitudes toward patient care. Formal CE activities were perceived to enhance knowledge. Conclusions: More respondents completed informal CE activities than formal CE activities. Both formal and informal CE activities were perceived to enhance athletic training professional practice. Informal CE activities should be explored and considered for CE credit. PMID:20446842

  14. Investigation of pore size and energy distributions by statistical physics formalism applied to agriculture products

    NASA Astrophysics Data System (ADS)

    Aouaini, Fatma; Knani, Salah; Yahia, Manel Ben; Bahloul, Neila; Ben Lamine, Abdelmottaleb; Kechaou, Nabil

    2015-12-01

    In this paper, we present a new investigation that allows the pore size distribution (PSD) in a porous medium to be determined. The PSD is obtained from the desorption isotherms of four varieties of olive leaves by means of a statistical physics formalism and Kelvin's law. The results are compared with those obtained by scanning electron microscopy. The effect of temperature on the pore distribution function has been studied, and the influence of each parameter on the PSD is interpreted. A similar function, the adsorption energy distribution (AED), is deduced from the PSD.
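
    The Kelvin relation that links a desorption isotherm to pore radii is, in its usual textbook form (the paper's exact parameterization is not reproduced here):

        \ln\frac{p}{p_0} = -\,\frac{2\gamma V_m \cos\theta}{r_K\, R\, T}
        \quad\Longrightarrow\quad
        r_K = -\,\frac{2\gamma V_m \cos\theta}{R\, T\, \ln(p/p_0)},

    where γ is the surface tension, V_m the molar volume of the adsorbate (water), θ the contact angle, R the gas constant, and T the temperature; each relative pressure on the desorption branch thus maps to a pore radius, from which the PSD follows.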

  15. Concept similarity and related categories in information retrieval using formal concept analysis

    NASA Astrophysics Data System (ADS)

    Eklund, P.; Ducrou, J.; Dau, F.

    2012-11-01

    The application of formal concept analysis to the problem of information retrieval has been shown useful but has lacked any real analysis of the idea of relevance ranking of search results. SearchSleuth is a program developed to experiment with the automated local analysis of Web search using formal concept analysis. SearchSleuth extends a standard search interface to include a conceptual neighbourhood centred on a formal concept derived from the initial query. This neighbourhood of the concept derived from the search terms is decorated with its upper and lower neighbours representing more general and special concepts, respectively. SearchSleuth is in many ways an archetype of search engines based on formal concept analysis with some novel features. In SearchSleuth, the notion of related categories - which are themselves formal concepts - is also introduced. This allows the retrieval focus to shift to a new formal concept called a sibling. This movement across the concept lattice needs to relate one formal concept to another in a principled way. This paper presents the issues concerning exploring, searching, and ordering the space of related categories. The focus is on understanding the use and meaning of proximity and semantic distance in the context of information retrieval using formal concept analysis.

  16. Strategies Used by Students to Compare Two Data Sets

    ERIC Educational Resources Information Center

    Reaburn, Robyn

    2012-01-01

    One of the common tasks of inferential statistics is to compare two data sets. Long before formal statistical procedures, however, students can be encouraged to make comparisons between data sets and therefore build up intuitive statistical reasoning. Such tasks also give meaning to the data collection students may do. This study describes the…

  17. For a statistical interpretation of Helmholtz' thermal displacement

    NASA Astrophysics Data System (ADS)

    Podio-Guidugli, Paolo

    2016-11-01

    Starting from the classic papers by Einstein and Langevin on Brownian motion, two consistent statistical interpretations are given for the thermal displacement, a scalar field formally introduced by Helmholtz whose time derivative is, by definition, the absolute temperature.
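
    Written out, the definition referred to is simply (denoting the thermal displacement by α, a common but not universal notation):

        \dot{\alpha}(x,t) = T(x,t), \qquad \alpha(x,t) = \alpha(x,0) + \int_0^t T(x,s)\,\mathrm{d}s .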

  18. Redshift data and statistical inference

    NASA Technical Reports Server (NTRS)

    Newman, William I.; Haynes, Martha P.; Terzian, Yervant

    1994-01-01

    Frequency histograms and the 'power spectrum analysis' (PSA) method, the latter developed by Yu & Peebles (1969), have been widely employed as techniques for establishing the existence of periodicities. We provide a formal analysis of these two classes of methods, including controlled numerical experiments, to better understand their proper use and application. In particular, we note that typical published applications of frequency histograms commonly employ far greater numbers of class intervals, or bins, than is advisable by statistical theory, sometimes giving rise to the appearance of spurious patterns. The PSA method generates a sequence of random numbers from observational data which, it is claimed, is exponentially distributed with unit mean and variance, essentially independent of the distribution of the original data. We show that the derived random process is nonstationary and produces a small but systematic bias in the usual estimate of the mean and variance. Although the derived variable may be reasonably described by an exponential distribution, the tail of the distribution is far removed from that of an exponential, thereby rendering statistical inference and confidence testing based on the tail of the distribution completely unreliable. Finally, we examine a number of astronomical examples wherein these methods have been used, giving rise to widespread acceptance of statistically unconfirmed conclusions.
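
    The over-binning problem can be illustrated with a rule of thumb such as Sturges' rule, k ≈ 1 + log2(n) (one guideline among several; the paper does not prescribe a particular rule). The sketch below, on synthetic featureless data, contrasts a sensible bin count with an excessive one:

        # Illustration of over-binning, not the paper's analysis: compare a
        # rule-of-thumb bin count (Sturges) with a deliberately excessive one.
        import numpy as np

        rng = np.random.default_rng(0)
        redshifts = rng.uniform(0.0, 0.3, size=200)              # featureless synthetic "redshifts"

        k_sturges = int(np.ceil(1 + np.log2(len(redshifts))))    # about 9 bins for n = 200
        counts_ok, _ = np.histogram(redshifts, bins=k_sturges)
        counts_over, _ = np.histogram(redshifts, bins=100)        # far too many bins

        # With 100 bins the counts fluctuate wildly (many empty or near-empty bins),
        # which can suggest spurious "periodic" structure in perfectly smooth data.
        print(counts_ok)
        print(counts_over)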

  19. On the statistical distribution in a deformed solid

    NASA Astrophysics Data System (ADS)

    Gorobei, N. N.; Luk'yanenko, A. S.

    2017-09-01

    A modification of the Gibbs distribution for a thermally insulated, mechanically deformed solid is proposed, in which the linear dimensions (shape parameters) are excluded from statistical averaging and included among the macroscopic parameters of state alongside the temperature. Formally, this modification reduces to corresponding additional conditions when calculating the statistical sum. The shape parameters and the temperature themselves are found from the conditions of mechanical and thermal equilibrium of a body, and their change is determined using the first law of thermodynamics. Known thermodynamic phenomena are analyzed within the proposed formalism for a simple model of a solid, an ensemble of anharmonic oscillators, to first order in the anharmonicity constant. The modification of the distribution is considered separately for the classical and quantum temperature regions.

  20. The Statistical Consulting Center for Astronomy (SCCA)

    NASA Technical Reports Server (NTRS)

    Akritas, Michael

    2001-01-01

    The process by which raw astronomical data acquisition is transformed into scientifically meaningful results and interpretation typically involves many statistical steps. Traditional astronomy limits itself to a narrow range of old and familiar statistical methods: means and standard deviations; least-squares methods like chi-squared minimization; and simple nonparametric procedures such as the Kolmogorov-Smirnov tests. These tools are often inadequate for the complex problems and datasets under investigation, and recent years have witnessed an increased usage of maximum-likelihood, survival analysis, multivariate analysis, wavelet and advanced time-series methods. The Statistical Consulting Center for Astronomy (SCCA) assisted astronomers with the use of sophisticated tools and with matching these tools to specific problems. The SCCA operated with two professors of statistics and a professor of astronomy working together. Questions were received by e-mail, and were discussed in detail with the questioner. Summaries of those questions and answers leading to new approaches were posted on the Web (www.state.psu.edu/ mga/SCCA). In addition to serving individual astronomers, the SCCA established a Web site for general use that provides hypertext links to selected on-line public-domain statistical software and services. The StatCodes site (www.astro.psu.edu/statcodes) provides over 200 links in the areas of: Bayesian statistics; censored and truncated data; correlation and regression; density estimation and smoothing; general statistics packages and information; image analysis; interactive Web tools; multivariate analysis; multivariate clustering and classification; nonparametric analysis; software written by astronomers; spatial statistics; statistical distributions; time series analysis; and visualization tools. StatCodes has received a remarkably high and constant hit rate of 250 hits/week (over 10,000/year) since its inception in mid-1997. It is of interest to scientists both within and outside of astronomy. The most popular sections are multivariate techniques, image analysis, and time series analysis. Hundreds of copies of the ASURV, SLOPES and CENS-TAU codes developed by SCCA scientists were also downloaded from the StatCodes site. In addition to formal SCCA duties, SCCA scientists continued a variety of related activities in astrostatistics, including refereeing of statistically oriented papers submitted to the Astrophysical Journal, talks at meetings including Feigelson's talk to science journalists entitled "The reemergence of astrostatistics" at the American Association for the Advancement of Science meeting, and published papers of astrostatistical content.

  1. Putting it all together: Exhumation histories from a formal combination of heat flow and a suite of thermochronometers

    USGS Publications Warehouse

    d'Alessio, M. A.; Williams, C.F.

    2007-01-01

    A suite of new techniques in thermochronometry allow analysis of the thermal history of a sample over a broad range of temperature sensitivities. New analysis tools must be developed that fully and formally integrate these techniques, allowing a single geologic interpretation of the rate and timing of exhumation and burial events consistent with all data. We integrate a thermal model of burial and exhumation, (U-Th)/He age modeling, and fission track age and length modeling. We then use a genetic algorithm to efficiently explore possible time-exhumation histories of a vertical sample profile (such as a borehole), simultaneously solving for exhumation and burial rates as well as changes in background heat flow. We formally combine all data in a rigorous statistical fashion. By parameterizing the model in terms of exhumation rather than time-temperature paths (as traditionally done in fission track modeling), we can ensure that exhumation histories result in a sedimentary basin whose thickness is consistent with the observed basin, a physically based constraint that eliminates otherwise acceptable thermal histories. We apply the technique to heat flow and thermochronometry data from the 2.1-km-deep San Andreas Fault Observatory at Depth pilot hole near the San Andreas fault, California. We find that the site experienced <1 km of exhumation or burial since the onset of San Andreas fault activity ~30 Ma.

  2. Gauging Skills of Hospital Security Personnel: a Statistically-driven, Questionnaire-based Approach.

    PubMed

    Rinkoo, Arvind Vashishta; Mishra, Shubhra; Rahesuddin; Nabi, Tauqeer; Chandra, Vidha; Chandra, Hem

    2013-01-01

    This study aims to gauge the technical and soft skills of the hospital security personnel so as to enable prioritization of their training needs. A cross-sectional, questionnaire-based study was conducted in December 2011. Two separate predesigned and pretested questionnaires were used for gauging soft skills and technical skills of the security personnel. Extensive statistical analysis, including Multivariate Analysis (Pillai-Bartlett trace along with Multi-factorial ANOVA) and Post-hoc Tests (Bonferroni Test), was applied. The 143 participants performed better on the soft skills front with an average score of 6.43 and standard deviation of 1.40. The average technical skills score was 5.09 with a standard deviation of 1.44. The study avowed a need for formal hands-on training with greater emphasis on technical skills. Multivariate analysis of the available data further helped in identifying 20 security personnel who should be prioritized for soft skills training and a group of 36 security personnel who should receive maximum attention during technical skills training. This statistically driven approach can be used as a prototype by healthcare delivery institutions worldwide, after situation-specific customizations, to identify the training needs of any category of healthcare staff.
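
    A MANOVA reporting Pillai's trace of the kind described can be run with standard statistical software; the sketch below uses the statsmodels MANOVA interface with hypothetical column names and invented scores, purely to show the mechanics (it is not the study's data or code):

        # Hedged sketch of a MANOVA reporting Pillai's trace; column names and values are invented.
        import pandas as pd
        from statsmodels.multivariate.manova import MANOVA

        df = pd.DataFrame({
            "soft_score": [6.1, 7.0, 5.8, 6.9, 4.9, 6.4],
            "tech_score": [5.2, 4.8, 5.5, 6.0, 3.9, 5.1],
            "duty_area":  ["ward", "ward", "gate", "gate", "ot", "ot"],
        })

        # Two dependent variables (soft and technical skill scores) against one factor.
        fit = MANOVA.from_formula("soft_score + tech_score ~ duty_area", data=df)
        print(fit.mv_test())   # output table includes Pillai's trace, Wilks' lambda, etc.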

  3. Gauging Skills of Hospital Security Personnel: a Statistically-driven, Questionnaire-based Approach

    PubMed Central

    Rinkoo, Arvind Vashishta; Mishra, Shubhra; Rahesuddin; Nabi, Tauqeer; Chandra, Vidha; Chandra, Hem

    2013-01-01

    Objectives This study aims to gauge the technical and soft skills of the hospital security personnel so as to enable prioritization of their training needs. Methodology A cross-sectional, questionnaire-based study was conducted in December 2011. Two separate predesigned and pretested questionnaires were used for gauging soft skills and technical skills of the security personnel. Extensive statistical analysis, including Multivariate Analysis (Pillai-Bartlett trace along with Multi-factorial ANOVA) and Post-hoc Tests (Bonferroni Test), was applied. Results The 143 participants performed better on the soft skills front with an average score of 6.43 and standard deviation of 1.40. The average technical skills score was 5.09 with a standard deviation of 1.44. The study avowed a need for formal hands-on training with greater emphasis on technical skills. Multivariate analysis of the available data further helped in identifying 20 security personnel who should be prioritized for soft skills training and a group of 36 security personnel who should receive maximum attention during technical skills training. Conclusion This statistically driven approach can be used as a prototype by healthcare delivery institutions worldwide, after situation-specific customizations, to identify the training needs of any category of healthcare staff. PMID:23559904

  4. Considerations in the statistical analysis of clinical trials in periodontitis.

    PubMed

    Imrey, P B

    1986-05-01

    Adult periodontitis has been described as a chronic infectious process exhibiting sporadic, acute exacerbations which cause quantal, localized losses of dental attachment. Many analytic problems of periodontal trials are similar to those of other chronic diseases. However, the episodic, localized, infrequent, and relatively unpredictable behavior of exacerbations, coupled with measurement error difficulties, causes some specific problems. Considerable controversy exists as to the proper selection and treatment of multiple site data from the same patient for group comparisons for epidemiologic or therapeutic evaluative purposes. This paper comments, with varying degrees of emphasis, on several issues pertinent to the analysis of periodontal trials. Considerable attention is given to the ways in which measurement variability may distort analytic results. Statistical treatments of multiple site data for descriptive summaries are distinguished from treatments for formal statistical inference to validate therapeutic effects. Evidence suggesting that sites behave independently is contested. For inferential analyses directed at therapeutic or preventive effects, analytic models based on site independence are deemed unsatisfactory. Methods of summarization that may yield more powerful analyses than all-site mean scores, while retaining appropriate treatment of inter-site associations, are suggested. Brief comments and opinions on an assortment of other issues in clinical trial analysis are offered.

  5. Towards Principled Experimental Study of Autonomous Mobile Robots

    NASA Technical Reports Server (NTRS)

    Gat, Erann

    1995-01-01

    We review the current state of research in autonomous mobile robots and conclude that there is an inadequate basis for predicting the reliability and behavior of robots operating in unengineered environments. We present a new approach to the study of autonomous mobile robot performance based on formal statistical analysis of independently reproducible experiments conducted on real robots. Simulators serve as models rather than experimental surrogates. We demonstrate three new results: 1) Two commonly used performance metrics (time and distance) are not as well correlated as is often tacitly assumed. 2) The probability distributions of these performance metrics are exponential rather than normal, and 3) a modular, object-oriented simulation accurately predicts the behavior of the real robot in a statistically significant manner.

  6. A Bayesian Interpretation of First-Order Phase Transitions

    NASA Astrophysics Data System (ADS)

    Davis, Sergio; Peralta, Joaquín; Navarrete, Yasmín; González, Diego; Gutiérrez, Gonzalo

    2016-03-01

    In this work we review the formalism used in describing the thermodynamics of first-order phase transitions from the point of view of maximum entropy inference. We present the concepts of transition temperature, latent heat and entropy difference between phases as emergent from the more fundamental concept of internal energy, after a statistical inference analysis. We explicitly demonstrate this point of view by making inferences on a simple game, resulting in the same formalism as in thermodynamical phase transitions. We show that analogous quantities will inevitably arise in any problem of inferring the result of a yes/no question, given two different states of knowledge and information in the form of expectation values. This exposition may help to clarify the role of these thermodynamical quantities in the context of different first-order phase transitions such as the case of magnetic Hamiltonians (e.g. the Potts model).

  7. Stata companion.

    PubMed

    Brennan, Jennifer Sousa

    2010-01-01

    This chapter is an introductory reference guide highlighting some of the most common statistical topics, broken down into both command-line syntax and graphical interface point-and-click commands. This chapter serves to supplement more formal statistics lessons and expedite using Stata to compute basic analyses.

  8. A Management Information System Model for Program Management. Ph.D. Thesis - Oklahoma State Univ.; [Computerized Systems Analysis

    NASA Technical Reports Server (NTRS)

    Shipman, D. L.

    1972-01-01

    The development of a model to simulate the information system of a program management type of organization is reported. The model statistically determines the following parameters: type of messages, destinations, delivery durations, type of processing, processing durations, communication channels, outgoing messages, and priorities. The total management information system of the program management organization is considered, including formal and informal information flows and both facilities and equipment. The model is written in the General Purpose System Simulation 2 computer programming language for use on the Univac 1108, Executive 8 computer. The model is simulated on a daily basis and collects queue and resource utilization statistics for each decision point. The statistics are then used by management to evaluate proposed resource allocations, to evaluate proposed changes to the system, and to identify potential problem areas. The model employs both empirical and theoretical distributions which are adjusted to simulate the information flow being studied.

  9. Learning and understanding the Kruskal-Wallis one-way analysis-of-variance-by-ranks test for differences among three or more independent groups.

    PubMed

    Chan, Y; Walmsley, R P

    1997-12-01

    When several treatment methods are available for the same problem, many clinicians are faced with the task of deciding which treatment to use. Many clinicians may have conducted informal "mini-experiments" on their own to determine which treatment is best suited for the problem. These results are usually not documented or reported in a formal manner because many clinicians feel that they are "statistically challenged." Another reason may be because clinicians do not feel they have controlled enough test conditions to warrant analysis. In this update, a statistic is described that does not involve complicated statistical assumptions, making it a simple and easy-to-use statistical method. This update examines the use of two statistics and does not deal with other issues that could affect clinical research such as issues affecting credibility. For readers who want a more in-depth examination of this topic, references have been provided. The Kruskal-Wallis one-way analysis-of-variance-by-ranks test (or H test) is used to determine whether three or more independent groups are the same or different on some variable of interest when an ordinal level of data or an interval or ratio level of data is available. A hypothetical example will be presented to explain when and how to use this statistic, how to interpret results using the statistic, the advantages and disadvantages of the statistic, and what to look for in a written report. This hypothetical example will involve the use of ratio data to demonstrate how to choose between using the nonparametric H test and the more powerful parametric F test.
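
    A minimal sketch of the H test as it might be run today (the data are invented; scipy's kruskal implements the test described in this update):

        # Kruskal-Wallis H test for three independent groups; the numbers are made up.
        from scipy import stats

        treatment_a = [27, 31, 29, 35, 30]     # e.g. range-of-motion gains (degrees)
        treatment_b = [22, 25, 28, 24, 26]
        treatment_c = [33, 36, 32, 38, 34]

        h_stat, p_value = stats.kruskal(treatment_a, treatment_b, treatment_c)
        print(f"H = {h_stat:.2f}, p = {p_value:.4f}")
        # A small p-value indicates at least one group differs; post hoc pairwise
        # comparisons are then needed to locate the difference.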

  10. Social capital and social support on the web: the case of an internet mother site.

    PubMed

    Drentea, Patricia; Moren-Cross, Jennifer L

    2005-11-01

    Do virtual communities in cyberspace foster social capital and social support? Using participant observation and discourse analysis, we examine a mothering board on a parent's website and investigate whether social capital was present, and if so, how it was developed and used. We find three main types of communication emerge from our analysis: emotional support, instrumental support--both formal and informal, and community building/protection, all of which contribute to the creation and maintenance of social capital. Additionally, using sampling with replacement, we created a final data set of 180 mothers and report descriptive statistics to identify characteristics of those on the board.

  11. SPSS and SAS procedures for estimating indirect effects in simple mediation models.

    PubMed

    Preacher, Kristopher J; Hayes, Andrew F

    2004-11-01

    Researchers often conduct mediation analysis in order to indirectly assess the effect of a proposed cause on some outcome through a proposed mediator. The utility of mediation analysis stems from its ability to go beyond the merely descriptive to a more functional understanding of the relationships among variables. A necessary component of mediation is a statistically and practically significant indirect effect. Although mediation hypotheses are frequently explored in psychological research, formal significance tests of indirect effects are rarely conducted. After a brief overview of mediation, we argue the importance of directly testing the significance of indirect effects and provide SPSS and SAS macros that facilitate estimation of the indirect effect with a normal theory approach and a bootstrap approach to obtaining confidence intervals, as well as the traditional approach advocated by Baron and Kenny (1986). We hope that this discussion and the macros will enhance the frequency of formal mediation tests in the psychology literature. Electronic copies of these macros may be downloaded from the Psychonomic Society's Web archive at www.psychonomic.org/archive/.
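
    The indirect effect discussed is the product a·b of the path from cause to mediator and the path from mediator to outcome (controlling for the cause); the sketch below shows the percentile-bootstrap idea in Python rather than the published SPSS/SAS macros, assuming NumPy arrays x, m and y:

        # Percentile-bootstrap CI for the indirect effect a*b in a simple mediation model.
        # This mirrors the general idea, not the published macros themselves.
        import numpy as np

        def indirect_effect(x, m, y):
            a = np.polyfit(x, m, 1)[0]                     # a: slope of the mediator on the cause
            X = np.column_stack([np.ones_like(x), x, m])   # b: coefficient of m in y ~ x + m
            coef, *_ = np.linalg.lstsq(X, y, rcond=None)
            return a * coef[2]

        def bootstrap_ci(x, m, y, n_boot=5000, alpha=0.05, seed=0):
            rng = np.random.default_rng(seed)
            n = len(x)
            estimates = np.empty(n_boot)
            for i in range(n_boot):
                idx = rng.integers(0, n, size=n)           # resample cases with replacement
                estimates[i] = indirect_effect(x[idx], m[idx], y[idx])
            return np.percentile(estimates, [100 * alpha / 2, 100 * (1 - alpha / 2)])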

  12. Experience report: Using formal methods for requirements analysis of critical spacecraft software

    NASA Technical Reports Server (NTRS)

    Lutz, Robyn R.; Ampo, Yoko

    1994-01-01

    Formal specification and analysis of requirements continues to gain support as a method for producing more reliable software. However, the introduction of formal methods to a large software project is difficult, due in part to the unfamiliarity of the specification languages and the lack of graphics. This paper reports results of an investigation into the effectiveness of formal methods as an aid to the requirements analysis of critical, system-level fault-protection software on a spacecraft currently under development. Our experience indicates that formal specification and analysis can enhance the accuracy of the requirements and add assurance prior to design development in this domain. The work described here is part of a larger, NASA-funded research project whose purpose is to use formal-methods techniques to improve the quality of software in space applications. The demonstration project described here is part of the effort to evaluate experimentally the effectiveness of supplementing traditional engineering approaches to requirements specification with the more rigorous specification and analysis available with formal methods.

  13. Concepts of formal concept analysis

    NASA Astrophysics Data System (ADS)

    Žáček, Martin; Homola, Dan; Miarka, Rostislav

    2017-07-01

    The aim of this article is to apply Formal Concept Analysis to the concept of the world. Formal concept analysis (FCA), as a methodology for data analysis, information management and knowledge representation, has the potential to be applied to a variety of linguistic problems. FCA is a mathematical theory of concepts and concept hierarchies that reflects an understanding of what a concept is. Formal concept analysis explicitly formalizes the extension and intension of a concept and their mutual relationship. A distinguishing feature of FCA is an inherent integration of three components of conceptual processing of data and knowledge, namely, the discovery of and reasoning with concepts in data, the discovery of and reasoning with dependencies in data, and the visualization of data, concepts, and dependencies with folding/unfolding capabilities.
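
    The extension/intension pairing can be made concrete with the two derivation operators; the toy sketch below enumerates the formal concepts of a tiny invented object-attribute context, purely to illustrate the definitions (it is not the article's example):

        # Toy illustration of formal concepts: pairs (extent, intent) with extent' = intent
        # and intent' = extent. The context below is invented for the example.
        from itertools import combinations

        context = {
            "sparrow": {"flies", "has_feathers"},
            "penguin": {"has_feathers", "swims"},
            "trout":   {"swims"},
        }
        attributes = set().union(*context.values())

        def common_attributes(objects):      # derivation operator ' on object sets
            return set.intersection(*(context[o] for o in objects)) if objects else set(attributes)

        def common_objects(attrs):           # derivation operator ' on attribute sets
            return {o for o, a in context.items() if attrs <= a}

        # Enumerate concepts by closing every attribute subset (fine for tiny contexts).
        concepts = set()
        for r in range(len(attributes) + 1):
            for subset in combinations(sorted(attributes), r):
                extent = common_objects(set(subset))
                intent = common_attributes(extent)
                concepts.add((frozenset(extent), frozenset(intent)))

        for extent, intent in concepts:
            print(sorted(extent), sorted(intent))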

  14. Robust inference from multiple test statistics via permutations: a better alternative to the single test statistic approach for randomized trials.

    PubMed

    Ganju, Jitendra; Yu, Xinxin; Ma, Guoguang Julie

    2013-01-01

    Formal inference in randomized clinical trials is based on controlling the type I error rate associated with a single pre-specified statistic. The deficiency of using just one method of analysis is that it depends on assumptions that may not be met. For robust inference, we propose pre-specifying multiple test statistics and relying on the minimum p-value for testing the null hypothesis of no treatment effect. The null hypothesis associated with the various test statistics is that the treatment groups are indistinguishable. The critical value for hypothesis testing comes from permutation distributions. Rejection of the null hypothesis when the smallest p-value is less than the critical value controls the type I error rate at its designated value. Even if one of the candidate test statistics has low power, the adverse effect on the power of the minimum p-value statistic is small. Its use is illustrated with examples. We conclude that it is better to rely on the minimum p-value rather than on a single statistic, particularly when that single statistic is the logrank test, because of the cost and complexity of many survival trials. Copyright © 2013 John Wiley & Sons, Ltd.
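
    A bare-bones sketch of the approach follows: several pre-specified statistics are computed, the smallest p-value is taken, and that minimum is referred to its own permutation distribution. The two candidate statistics used here (t test and rank-sum test) are illustrative choices, not necessarily those of the paper:

        # Minimum p-value over multiple test statistics, calibrated by permutation.
        import numpy as np
        from scipy import stats

        def min_p(x, y):
            p_t = stats.ttest_ind(x, y).pvalue
            p_rank = stats.mannwhitneyu(x, y, alternative="two-sided").pvalue
            return min(p_t, p_rank)

        def permutation_min_p_test(x, y, n_perm=2000, seed=0):
            rng = np.random.default_rng(seed)
            observed = min_p(x, y)
            pooled = np.concatenate([x, y])
            count = 0
            for _ in range(n_perm):
                rng.shuffle(pooled)                       # relabel treatment groups
                if min_p(pooled[:len(x)], pooled[len(x):]) <= observed:
                    count += 1
            return (count + 1) / (n_perm + 1)             # permutation p-value of the min-p statistic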

  15. Exact Solution of the Two-Level System and the Einstein Solid in the Microcanonical Formalism

    ERIC Educational Resources Information Center

    Bertoldi, Dalia S.; Bringa, Eduardo M.; Miranda, E. N.

    2011-01-01

    The two-level system and the Einstein model of a crystalline solid are taught in every course of statistical mechanics and they are solved in the microcanonical formalism because the number of accessible microstates can be easily evaluated. However, their solutions are usually presented using the Stirling approximation to deal with factorials. In…
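
    For the two-level system mentioned, the microcanonical multiplicity and entropy can be evaluated exactly; these are the quantities usually approximated via Stirling's formula:

        \Omega(N,n) = \binom{N}{n} = \frac{N!}{n!\,(N-n)!}, \qquad S(N,n) = k_B \ln \Omega(N,n),

    whereas the Stirling route replaces \ln N! by N\ln N - N, which is accurate only for large N.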

  16. Secure and scalable deduplication of horizontally partitioned health data for privacy-preserving distributed statistical computation.

    PubMed

    Yigzaw, Kassaye Yitbarek; Michalas, Antonis; Bellika, Johan Gustav

    2017-01-03

    Techniques have been developed to compute statistics on distributed datasets without revealing private information except the statistical results. However, duplicate records in a distributed dataset may lead to incorrect statistical results. Therefore, to increase the accuracy of the statistical analysis of a distributed dataset, secure deduplication is an important preprocessing step. We designed a secure protocol for the deduplication of horizontally partitioned datasets with deterministic record linkage algorithms. We provided a formal security analysis of the protocol in the presence of semi-honest adversaries. The protocol was implemented and deployed across three microbiology laboratories located in Norway, and we ran experiments on the datasets in which the number of records for each laboratory varied. Experiments were also performed on simulated microbiology datasets and data custodians connected through a local area network. The security analysis demonstrated that the protocol protects the privacy of individuals and data custodians under a semi-honest adversarial model. More precisely, the protocol remains secure with the collusion of up to N - 2 corrupt data custodians. The total runtime for the protocol scales linearly with the addition of data custodians and records. One million simulated records distributed across 20 data custodians were deduplicated within 45 s. The experimental results showed that the protocol is more efficient and scalable than previous protocols for the same problem. The proposed deduplication protocol is efficient and scalable for practical uses while protecting the privacy of patients and data custodians.

  17. Inference as Prediction

    ERIC Educational Resources Information Center

    Watson, Jane

    2007-01-01

    Inference, or decision making, is seen in curriculum documents as the final step in a statistical investigation. For a formal statistical enquiry this may be associated with sophisticated tests involving probability distributions. For young students without the mathematical background to perform such tests, it is still possible to draw informal…

  18. Robust inference for group sequential trials.

    PubMed

    Ganju, Jitendra; Lin, Yunzhi; Zhou, Kefei

    2017-03-01

    For ethical reasons, group sequential trials were introduced to allow trials to stop early in the event of extreme results. Endpoints in such trials are usually mortality or irreversible morbidity. For a given endpoint, the norm is to use a single test statistic and to use that same statistic for each analysis. This approach is risky because the test statistic has to be specified before the study is unblinded, and there is loss in power if the assumptions that ensure optimality for each analysis are not met. To minimize the risk of moderate to substantial loss in power due to a suboptimal choice of a statistic, a robust method was developed for nonsequential trials. The concept is analogous to diversification of financial investments to minimize risk. The method is based on combining P values from multiple test statistics for formal inference while controlling the type I error rate at its designated value. This article evaluates the performance of 2 P value combining methods for group sequential trials. The emphasis is on time to event trials although results from less complex trials are also included. The gain or loss in power with the combination method relative to a single statistic is asymmetric in its favor. Depending on the power of each individual test, the combination method can give more power than any single test or give power that is closer to the test with the most power. The versatility of the method is that it can combine P values from different test statistics for analysis at different times. The robustness of results suggests that inference from group sequential trials can be strengthened with the use of combined tests. Copyright © 2017 John Wiley & Sons, Ltd.

  19. The parental alienation debate belongs in the courtroom, not in DSM-5.

    PubMed

    Houchin, Timothy M; Ranseen, John; Hash, Phillip A K; Bartnicki, Daniel J

    2012-01-01

    The DSM-5 Task Force is presently considering whether to adopt parental alienation disorder (PAD) as a mental illness. Although controversy has surrounded PAD since its inception in 1985, pro-PAD groups and individuals have breathed new life into the push to establish it as a mental health diagnosis. In this analysis, we argue that it would be a serious mistake to adopt parental alienation disorder as a formal mental illness in the Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5).

  20. Measuring effectiveness of drugs in observational databanks: promises and perils

    PubMed Central

    Krishnan, Eswar; Fries, James F

    2004-01-01

    Observational databanks have inherent strengths and shortcomings. As in randomized controlled trials, poor design of these databanks can either exaggerate or reduce estimates of drug effectiveness and can limit generalizability. This commentary highlights selected aspects of study design, data collection and statistical analysis that can help overcome many of these inadequacies. An international metaRegister and a formal mechanism for standardizing and sharing drug data could help improve the utility of databanks. Medical journals have a vital role in enforcing a quality checklist that improves reporting. PMID:15059263

  1. A stylistic classification of Russian-language texts based on the random walk model

    NASA Astrophysics Data System (ADS)

    Kramarenko, A. A.; Nekrasov, K. A.; Filimonov, V. V.; Zhivoderov, A. A.; Amieva, A. A.

    2017-09-01

    A formal approach to text analysis based on the random walk model is suggested. The frequencies and mutual positions of the vowel letters are mapped onto a process of quasi-particle migration. A statistically significant difference in the migration parameters is found between texts of different functional styles, demonstrating that texts can be classified with the suggested method. Five groups of texts are singled out that can be distinguished from one another by the parameters of the quasi-particle migration process.

  2. Combining statistical inference and decisions in ecology

    USGS Publications Warehouse

    Williams, Perry J.; Hooten, Mevin B.

    2016-01-01

    Statistical decision theory (SDT) is a sub-field of decision theory that formally incorporates statistical investigation into a decision-theoretic framework to account for uncertainties in a decision problem. SDT provides a unifying analysis of three types of information: statistical results from a data set, knowledge of the consequences of potential choices (i.e., loss), and prior beliefs about a system. SDT links the theoretical development of a large body of statistical methods including point estimation, hypothesis testing, and confidence interval estimation. The theory and application of SDT have mainly been developed and published in the fields of mathematics, statistics, operations research, and other decision sciences, but have had limited exposure in ecology. Thus, we provide an introduction to SDT for ecologists and describe its utility for linking the conventionally separate tasks of statistical investigation and decision making in a single framework. We describe the basic framework of both Bayesian and frequentist SDT, its traditional use in statistics, and discuss its application to decision problems that occur in ecology. We demonstrate SDT with two types of decisions: Bayesian point estimation, and an applied management problem of selecting a prescribed fire rotation for managing a grassland bird species. Central to SDT, and decision theory in general, are loss functions. Thus, we also provide basic guidance and references for constructing loss functions for an SDT problem.
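
    As a standard SDT example (generic, not specific to the grassland-bird application above), the Bayes decision minimizes posterior expected loss, and under squared-error loss the Bayes point estimate reduces to the posterior mean:

        a^{*} = \arg\min_{a}\, \mathbb{E}\!\left[L(\theta,a)\mid \text{data}\right], \qquad
        L(\theta,a) = (\theta-a)^2 \;\Rightarrow\; a^{*} = \mathbb{E}[\theta\mid \text{data}].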

  3. A Linguistic Truth-Valued Temporal Reasoning Formalism and Its Implementation

    NASA Astrophysics Data System (ADS)

    Lu, Zhirui; Liu, Jun; Augusto, Juan C.; Wang, Hui

    Temporality and uncertainty are important features of many real world systems. Solving problems in such systems requires the use of formal mechanism such as logic systems, statistical methods or other reasoning and decision-making methods. In this paper, we propose a linguistic truth-valued temporal reasoning formalism to enable the management of both features concurrently using a linguistic truth valued logic and a temporal logic. We also provide a backward reasoning algorithm which allows the answering of user queries. A simple but realistic scenario in a smart home application is used to illustrate our work.

  4. An analysis of the cognitive deficit of schizophrenia based on the Piaget developmental theory.

    PubMed

    Torres, Alejandro; Olivares, Jose M; Rodriguez, Angel; Vaamonde, Antonio; Berrios, German E

    2007-01-01

    The objective of the study was to evaluate from the perspective of the Piaget developmental model the cognitive functioning of a sample of patients diagnosed with schizophrenia. Fifty patients with schizophrenia (Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition) and 40 healthy matched controls were evaluated by means of the Longeot Logical Thought Evaluation Scale. Only 6% of the subjects with schizophrenia reached the "formal period," and 70% remained at the "concrete operations" stage. The corresponding figures for the control sample were 25% and 15%, respectively. These differences were statistically significant. The samples were specifically differentiable on the permutation, probabilities, and pendulum tests of the scale. The Longeot Logical Thought Evaluation Scale can discriminate between subjects with schizophrenia and healthy controls.

  5. Effects of a finite outer scale on the measurement of atmospheric-turbulence statistics with a Hartmann wave-front sensor.

    PubMed

    Feng, Shen; Wenhan, Jiang

    2002-06-10

    Phase-structure and aperture-averaged slope-correlation functions with a finite outer scale are derived based on the Taylor hypothesis and a generalized spectrum, such as the von Kármán model. The effects of the finite outer scale on measuring and determining the character of atmospheric-turbulence statistics are shown, especially for an approximately 4-m-class telescope and subaperture. The phase structure function and atmospheric coherence length based on the Kolmogorov model are approximations of the formalism we have derived. The analysis shows that it cannot be determined whether the deviation from the power-law parameter of Kolmogorov turbulence is caused by real variations of the spectrum or by the effect of the finite outer scale.
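
    The von Kármán model referred to modifies the Kolmogorov power law with an outer-scale cutoff; in its common refractive-index form (the standard expression, not necessarily the paper's exact parameterization):

        \Phi_n(\kappa) = 0.033\, C_n^2 \left(\kappa^2 + \kappa_0^2\right)^{-11/6}, \qquad \kappa_0 = \frac{2\pi}{L_0},

    which recovers the Kolmogorov spectrum \Phi_n(\kappa) = 0.033\,C_n^2\,\kappa^{-11/3} as the outer scale L_0 \to \infty.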

  6. MOOC & B-Learning: Students' Barriers and Satisfaction in Formal and Non-Formal Learning Environments

    ERIC Educational Resources Information Center

    Gutiérrez-Santiuste, Elba; Gámiz-Sánchez, Vanesa-M.; Gutiérrez-Pérez, Jose

    2015-01-01

    The study presents a comparative analysis of two virtual learning formats: one non-formal through a Massive Open Online Course (MOOC) and the other formal through b-learning. We compare the communication barriers and the satisfaction perceived by the students (N = 249) by developing a qualitative analysis using semi-structured questionnaires and…

  7. Statistical inference and Aristotle's Rhetoric.

    PubMed

    Macdonald, Ranald R

    2004-11-01

    Formal logic operates in a closed system where all the information relevant to any conclusion is present, whereas this is not the case when one reasons about events and states of the world. Pollard and Richardson drew attention to the fact that the reasoning behind statistical tests does not lead to logically justifiable conclusions. In this paper statistical inferences are defended not by logic but by the standards of everyday reasoning. Aristotle invented formal logic, but argued that people mostly get at the truth with the aid of enthymemes--incomplete syllogisms which include arguing from examples, analogies and signs. It is proposed that statistical tests work in the same way--in that they are based on examples, invoke the analogy of a model and use the size of the effect under test as a sign that the chance hypothesis is unlikely. Of existing theories of statistical inference only a weak version of Fisher's takes this into account. Aristotle anticipated Fisher by producing an argument of the form that there were too many cases in which an outcome went in a particular direction for that direction to be plausibly attributed to chance. We can therefore conclude that Aristotle would have approved of statistical inference and there is a good reason for calling this form of statistical inference classical.

  8. Potential-of-mean-force description of ionic interactions and structural hydration in biomolecular systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hummer, G.; Garcia, A.E.; Soumpasis, D.M.

    1994-10-01

    To understand the functioning of living organisms on a molecular level, it is crucial to dissect the intricate interplay of the immense number of biological molecules. Most of the biochemical processes in cells occur in a liquid environment formed mainly by water and ions. This solvent environment plays an important role in biological systems. The potential-of-mean-force (PMF) formalism attempts to describe quantitatively the interactions of the solvent with biological macromolecules on the basis of an approximate statistical-mechanical representation. At its current stage of development, it deals with ionic effects on the biomolecular structure and with the structural hydration of biomolecules. The underlying idea of the PMF formalism is to identify the dominant sources of interactions and incorporate these interactions into the theoretical formalism using PMFs (or particle correlation functions) extracted from bulk-liquid systems. In the following, the authors shall briefly outline the statistical-mechanical foundation of the PMF formalism and introduce the PMF expansion formalism, which is intimately linked to superposition approximations for higher-order particle correlation functions. The authors shall then sketch applications which describe the effects of the ionic environment on nucleic-acid structure. Finally, the authors shall present the more recent extension of the PMF idea to describe quantitatively the structural hydration of biomolecules. Results for the interface of ice and water and for the hydration of deoxyribonucleic acid (DNA) will be discussed.
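
    The superposition approximations mentioned are usually illustrated by the Kirkwood closure, which writes the triplet correlation as a product of pair correlations, or equivalently the three-body potential of mean force as a sum of pair terms (generic form, not a quotation from the report):

        g^{(3)}(\mathbf r_1,\mathbf r_2,\mathbf r_3) \approx g(r_{12})\,g(r_{13})\,g(r_{23}), \qquad
        W^{(3)} \approx w(r_{12}) + w(r_{13}) + w(r_{23}),

    where w(r) = -k_B T \ln g(r) is the pair potential of mean force extracted from the bulk liquid.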

  9. Correcting for population structure and kinship using the linear mixed model: theory and extensions.

    PubMed

    Hoffman, Gabriel E

    2013-01-01

    Population structure and kinship are widespread confounding factors in genome-wide association studies (GWAS). It has been standard practice to include principal components of the genotypes in a regression model in order to account for population structure. More recently, the linear mixed model (LMM) has emerged as a powerful method for simultaneously accounting for population structure and kinship. The statistical theory underlying the differences in empirical performance between modeling principal components as fixed versus random effects has not been thoroughly examined. We undertake an analysis to formalize the relationship between these widely used methods and elucidate the statistical properties of each. Moreover, we introduce a new statistic, effective degrees of freedom, that serves as a metric of model complexity and a novel low rank linear mixed model (LRLMM) to learn the dimensionality of the correction for population structure and kinship, and we assess its performance through simulations. A comparison of the results of LRLMM and a standard LMM analysis applied to GWAS data from the Multi-Ethnic Study of Atherosclerosis (MESA) illustrates how our theoretical results translate into empirical properties of the mixed model. Finally, the analysis demonstrates the ability of the LRLMM to substantially boost the strength of an association for HDL cholesterol in Europeans.
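
    The linear mixed model in question is conventionally written with fixed covariate effects (optionally including genotype principal components) and a random polygenic term whose covariance is the kinship matrix (the standard GWAS formulation; the low rank LRLMM variant is not reproduced here):

        \mathbf y = X\boldsymbol\beta + \mathbf u + \boldsymbol\varepsilon, \qquad
        \mathbf u \sim \mathcal N(\mathbf 0,\, \sigma_g^2 K), \quad
        \boldsymbol\varepsilon \sim \mathcal N(\mathbf 0,\, \sigma_e^2 I),

    where K is the kinship (genetic relatedness) matrix; treating principal components as fixed effects corresponds to placing them in X, whereas the LMM absorbs relatedness through the random effect u.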

  10. Statistical analysis plan for evaluating low- vs. standard-dose alteplase in the ENhanced Control of Hypertension and Thrombolysis strokE stuDy (ENCHANTED).

    PubMed

    Anderson, Craig S; Woodward, Mark; Arima, Hisatomi; Chen, Xiaoying; Lindley, Richard I; Wang, Xia; Chalmers, John

    2015-12-01

    The ENhanced Control of Hypertension And Thrombolysis strokE stuDy trial is a 2 × 2 quasi-factorial active-comparison, prospective, randomized, open, blinded endpoint clinical trial that is evaluating in thrombolysis-eligible acute ischemic stroke patients whether: (1) low-dose (0·6 mg/kg body weight) intravenous alteplase has noninferior efficacy and lower risk of symptomatic intracerebral hemorrhage compared with standard-dose (0·9 mg/kg body weight) intravenous alteplase; and (2) early intensive blood pressure lowering (systolic target 130-140 mmHg) has superior efficacy and lower risk of any intracerebral hemorrhage compared with guideline-recommended blood pressure control (systolic target <180 mmHg). To outline in detail the predetermined statistical analysis plan for the 'alteplase dose arm' of the study. All data collected by participating researchers will be reviewed and formally assessed. Information pertaining to the baseline characteristics of patients, their process of care, and the delivery of treatments will be classified, and for each item, appropriate descriptive statistical analyses are planned with appropriate comparisons made between randomized groups. For the trial outcomes, the most appropriate statistical comparisons to be made between groups are planned and described. A statistical analysis plan was developed for the results of the alteplase dose arm of the study that is transparent, available to the public, verifiable, and predetermined before completion of data collection. We have developed a predetermined statistical analysis plan for the ENhanced Control of Hypertension And Thrombolysis strokE stuDy alteplase dose arm which is to be followed to avoid analysis bias arising from prior knowledge of the study findings. © 2015 The Authors. International Journal of Stroke published by John Wiley & Sons Ltd on behalf of World Stroke Organization.

  11. Open and Distance Learning and Information and Communication Technologies--Implications for Formal and Non-Formal Education: A Kenyan Case

    ERIC Educational Resources Information Center

    Situma, David Barasa

    2015-01-01

    The female population in Kenya was reported at 50.05% in 2011, according to a World Bank report published in 2012. Despite this slightly higher percentage over males, women in Kenya are not well represented in education and training compared to their male counterparts (Kenya National Bureau of Statistics, 2012). The need to empower girls and women…

  12. Convolutionless Nakajima-Zwanzig equations for stochastic analysis in nonlinear dynamical systems.

    PubMed

    Venturi, D; Karniadakis, G E

    2014-06-08

    Determining the statistical properties of stochastic nonlinear systems is of major interest across many disciplines. Currently, there are no general efficient methods to deal with this challenging problem that involves high dimensionality, low regularity and random frequencies. We propose a framework for stochastic analysis in nonlinear dynamical systems based on goal-oriented probability density function (PDF) methods. The key idea stems from techniques of irreversible statistical mechanics, and it relies on deriving evolution equations for the PDF of quantities of interest, e.g. functionals of the solution to systems of stochastic ordinary and partial differential equations. Such quantities could be low-dimensional objects in infinite dimensional phase spaces. We develop the goal-oriented PDF method in the context of the time-convolutionless Nakajima-Zwanzig-Mori formalism. We address the question of approximation of reduced-order density equations by multi-level coarse graining, perturbation series and operator cumulant resummation. Numerical examples are presented for stochastic resonance and stochastic advection-reaction problems.

  13. Upside/Downside statistical mechanics of nonequilibrium Brownian motion. I. Distributions, moments, and correlation functions of a free particle.

    PubMed

    Craven, Galen T; Nitzan, Abraham

    2018-01-28

    Statistical properties of Brownian motion that arise by analyzing, separately, trajectories over which the system energy increases (upside) or decreases (downside) with respect to a threshold energy level are derived. This selective analysis is applied to examine transport properties of a nonequilibrium Brownian process that is coupled to multiple thermal sources characterized by different temperatures. Distributions, moments, and correlation functions of a free particle that occur during upside and downside events are investigated for energy activation and energy relaxation processes and also for positive and negative energy fluctuations from the average energy. The presented results are sufficiently general and can be applied without modification to the standard Brownian motion. This article focuses on the mathematical basis of this selective analysis. In subsequent articles in this series, we apply this general formalism to processes in which heat transfer between thermal reservoirs is mediated by activated rate processes that take place in a system bridging them.

  14. Upside/Downside statistical mechanics of nonequilibrium Brownian motion. I. Distributions, moments, and correlation functions of a free particle

    NASA Astrophysics Data System (ADS)

    Craven, Galen T.; Nitzan, Abraham

    2018-01-01

    Statistical properties of Brownian motion that arise by analyzing, separately, trajectories over which the system energy increases (upside) or decreases (downside) with respect to a threshold energy level are derived. This selective analysis is applied to examine transport properties of a nonequilibrium Brownian process that is coupled to multiple thermal sources characterized by different temperatures. Distributions, moments, and correlation functions of a free particle that occur during upside and downside events are investigated for energy activation and energy relaxation processes and also for positive and negative energy fluctuations from the average energy. The presented results are sufficiently general and can be applied without modification to the standard Brownian motion. This article focuses on the mathematical basis of this selective analysis. In subsequent articles in this series, we apply this general formalism to processes in which heat transfer between thermal reservoirs is mediated by activated rate processes that take place in a system bridging them.

  15. Convolutionless Nakajima–Zwanzig equations for stochastic analysis in nonlinear dynamical systems

    PubMed Central

    Venturi, D.; Karniadakis, G. E.

    2014-01-01

    Determining the statistical properties of stochastic nonlinear systems is of major interest across many disciplines. Currently, there are no general efficient methods to deal with this challenging problem that involves high dimensionality, low regularity and random frequencies. We propose a framework for stochastic analysis in nonlinear dynamical systems based on goal-oriented probability density function (PDF) methods. The key idea stems from techniques of irreversible statistical mechanics, and it relies on deriving evolution equations for the PDF of quantities of interest, e.g. functionals of the solution to systems of stochastic ordinary and partial differential equations. Such quantities could be low-dimensional objects in infinite dimensional phase spaces. We develop the goal-oriented PDF method in the context of the time-convolutionless Nakajima–Zwanzig–Mori formalism. We address the question of approximation of reduced-order density equations by multi-level coarse graining, perturbation series and operator cumulant resummation. Numerical examples are presented for stochastic resonance and stochastic advection–reaction problems. PMID:24910519

  16. Informalisation of women's work: consequence for fertility and child schooling in urban Pakistan.

    PubMed

    Kazi, S; Sathar, Z A

    1993-01-01

    The preliminary analysis of data from the 1990-91 Pakistan Household Survey (PIHS) for urban areas yields a profile of working urban women by educational level, sector of the economy, and child's educational activities. Between 1971 and 1988 labor force participation rates (LFPR) for women ranged between 3% and 5%. The hiring of women in temporary positions allows for lower costs, less benefits, and freedom from restrictive legislation. The PIHS data on 4711 households and 2513 urban, ever married women aged 15-49 years indicates a LFPR for women of 17%. Under 20% work in the formal sector. Most work in their homes as unpaid family workers or home-based income earning producers. Many official statistics exclude these women. Informal sector workers in the PIHS data, such as low status domestic workers, receive average wages of 609 rupees monthly compared to home-based workers wages of 240 rupees. Formal sector female workers have completed an average of 11.4 years of schooling, while informal workers have received only 6.5 years. 77% of informal workers have had no formal education compared to 62% of at home mothers and 28% of formal sector workers. Many employed women are single household heads or with an unemployed spouse. Formal sector working women marry 3.4 years later than informal sector women and 2.6 years later than nonworking women. Nonworking women have the lowest contraceptive use followed by informal sector women. Most women regardless of work status desire four children, but achieved fertility was lower among professional and white collar workers. Informal sector women had higher fertility than nonworking women. Preliminary multivariate analyses supported this pattern of work status related fertility. The chances of children attending school was higher among formal sector workers. Girls with nonworking mothers had better chances of gaining an education.

  17. Extending local canonical correlation analysis to handle general linear contrasts for FMRI data.

    PubMed

    Jin, Mingwu; Nandy, Rajesh; Curran, Tim; Cordes, Dietmar

    2012-01-01

    Local canonical correlation analysis (CCA) is a multivariate method that has been proposed to more accurately determine activation patterns in fMRI data. In its conventional formulation, CCA has several drawbacks that limit its usefulness in fMRI. A major drawback is that, unlike the general linear model (GLM), a test of general linear contrasts of the temporal regressors has not been incorporated into the CCA formalism. To overcome this drawback, a novel directional test statistic was derived using the equivalence of multivariate multiple regression (MVMR) and CCA. This extension will allow CCA to be used for inference of general linear contrasts in more complicated fMRI designs without reparameterization of the design matrix and without reestimating the CCA solutions for each particular contrast of interest. With the proper constraints on the spatial coefficients of CCA, this test statistic can yield a more powerful test on the inference of evoked brain regional activations from noisy fMRI data than the conventional t-test in the GLM. The quantitative results from simulated and pseudoreal data and activation maps from fMRI data were used to demonstrate the advantage of this novel test statistic.

  18. Extending Local Canonical Correlation Analysis to Handle General Linear Contrasts for fMRI Data

    PubMed Central

    Jin, Mingwu; Nandy, Rajesh; Curran, Tim; Cordes, Dietmar

    2012-01-01

    Local canonical correlation analysis (CCA) is a multivariate method that has been proposed to more accurately determine activation patterns in fMRI data. In its conventional formulation, CCA has several drawbacks that limit its usefulness in fMRI. A major drawback is that, unlike the general linear model (GLM), a test of general linear contrasts of the temporal regressors has not been incorporated into the CCA formalism. To overcome this drawback, a novel directional test statistic was derived using the equivalence of multivariate multiple regression (MVMR) and CCA. This extension will allow CCA to be used for inference of general linear contrasts in more complicated fMRI designs without reparameterization of the design matrix and without reestimating the CCA solutions for each particular contrast of interest. With the proper constraints on the spatial coefficients of CCA, this test statistic can yield a more powerful test on the inference of evoked brain regional activations from noisy fMRI data than the conventional t-test in the GLM. The quantitative results from simulated and pseudoreal data and activation maps from fMRI data were used to demonstrate the advantage of this novel test statistic. PMID:22461786

  19. Deep first formal concept search.

    PubMed

    Zhang, Tao; Li, Hui; Hong, Wenxue; Yuan, Xiamei; Wei, Xinyu

    2014-01-01

    The calculation of formal concepts is a very important part of the theory of formal concept analysis (FCA); however, within the framework of FCA, computing all formal concepts is the main challenge because of its exponential complexity and the difficulty of visualizing the calculation process. Building on the basic idea of depth-first search, this paper presents a visualization algorithm based on the attribute topology of the formal context. Constrained by the calculation rules, all concepts are obtained by a global visual search for formal concepts over the topology, degenerated with fixed start and end points, without repetition or omission. This method makes the calculation of formal concepts precise and easy to operate, and its completeness makes it suitable for visualization analysis.
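
    For readers unfamiliar with FCA, the brute-force computation that such attribute-topology algorithms aim to improve upon can be written in a few lines: close every attribute subset and keep the distinct (extent, intent) pairs. The sketch below is that naive enumeration on a toy context, not the visualization algorithm of the paper; the context itself is a made-up example.

```python
from itertools import combinations

# Toy formal context: each object is mapped to the set of attributes it has.
context = {
    "g1": {"a", "b"},
    "g2": {"b", "c"},
    "g3": {"a", "b", "c"},
}
attributes = set().union(*context.values())

def extent(attrs):
    """Objects possessing every attribute in attrs (the 'prime' operator)."""
    return {g for g, atts in context.items() if attrs <= atts}

def intent(objs):
    """Attributes common to every object in objs."""
    if not objs:
        return set(attributes)
    return set.intersection(*(context[g] for g in objs))

def formal_concepts():
    """Naive enumeration: close every attribute subset (exponential cost)."""
    seen = set()
    for r in range(len(attributes) + 1):
        for combo in combinations(sorted(attributes), r):
            b = intent(extent(set(combo)))        # closure of the attribute set
            a = extent(b)
            key = (frozenset(a), frozenset(b))
            if key not in seen:
                seen.add(key)
                yield a, b

for ext, inten in formal_concepts():
    print(sorted(ext), sorted(inten))
```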

  20. Universal calculational recipe for solvent-mediated potential: based on a combination of integral equation theory and density functional theory

    NASA Astrophysics Data System (ADS)

    Zhou, Shiqi

    2004-07-01

    A universal formalism is proposed that enables calculation of the solvent-mediated potential (SMP) between two equal or unequal solute particles of any shape immersed in a solvent reservoir consisting of atomic particles and/or polymer chains or their mixture, by importing a density functional theory externally into the Ornstein-Zernike (OZ) equation system. Provided that the size asymmetry of the solvent bath components is moderate, the present formalism can calculate the SMP in any complex fluid at the present stage of development of statistical mechanics, and therefore avoids the limitations of previous approaches for the SMP. Preliminary calculations indicate the reliability of the present formalism.

  1. Statistical mechanics of the Huxley-Simmons model

    NASA Astrophysics Data System (ADS)

    Caruel, M.; Truskinovsky, L.

    2016-06-01

    The chemomechanical model of Huxley and Simmons (HS) [A. F. Huxley and R. M. Simmons, Nature 233, 533 (1971), 10.1038/233533a0] provides a paradigmatic description of mechanically induced collective conformational changes relevant in a variety of biological contexts, from the muscle power stroke and hair-cell gating to integrin binding and hairpin unzipping. We develop a statistical mechanical perspective on the HS model by exploiting a formal analogy with a paramagnetic Ising model. We first study the equilibrium HS model with a finite number of elements and compute explicitly its mechanical and thermal properties. To model kinetics, we derive a master equation and solve it for several loading protocols. The developed formalism is applicable to a broad range of allosteric systems with mean-field interactions.

  2. Variation in Lithic Technological Strategies among the Neanderthals of Gibraltar

    PubMed Central

    Shipton, Ceri; Clarkson, Christopher; Bernal, Marco Antonio; Boivin, Nicole; Finlayson, Clive; Finlayson, Geraldine; Fa, Darren; Pacheco, Francisco Giles; Petraglia, Michael

    2013-01-01

    The evidence for Neanderthal lithic technology is reviewed and summarized for four caves on The Rock of Gibraltar: Vanguard, Beefsteak, Ibex and Gorham’s. Some of the observed patterns in technology are statistically tested including raw material selection, platform preparation, and the use of formal and expedient technological schemas. The main parameters of technological variation are examined through detailed analysis of the Gibraltar cores and comparison with samples from the classic Mousterian sites of Le Moustier and Tabun C. The Gibraltar Mousterian, including the youngest assemblage from Layer IV of Gorham’s Cave, spans the typical Middle Palaeolithic range of variation from radial Levallois to unidirectional and multi-platform flaking schemas, with characteristic emphasis on the former. A diachronic pattern of change in the Gorham’s Cave sequence is documented, with the younger assemblages utilising more localized raw material and less formal flaking procedures. We attribute this change to a reduction in residential mobility as the climate deteriorated during Marine Isotope Stage 3 and the Neanderthal population contracted into a refugium. PMID:23762312

  3. On the statistical significance of excess events: Remarks of caution and the need for a standard method of calculation

    NASA Technical Reports Server (NTRS)

    Staubert, R.

    1985-01-01

    Methods for calculating the statistical significance of excess events and the interpretation of the formally derived values are discussed. It is argued that a simple formula for a conservative estimate should generally be used in order to provide a common understanding of quoted values.
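
    A common way to illustrate such a simple, conservative estimate is the standard on/off counting setup: quote the excess over the scaled background divided by its propagated Poisson error, and cross-check it against an exact Poisson tail probability. The counts and the particular formula below are illustrative assumptions and are not necessarily the recipe recommended in the note.

```python
import math
from scipy import stats

def excess_significance(n_on, n_off, alpha):
    """Simple signal-to-noise estimate for an excess of counts.

    n_on  -- counts in the on-source region
    n_off -- counts in the off-source (background) region
    alpha -- ratio of on-source to off-source exposure
    """
    excess = n_on - alpha * n_off
    sigma = math.sqrt(n_on + alpha**2 * n_off)   # propagated Poisson errors
    return excess / sigma

def poisson_tail_p(n_on, n_off, alpha):
    """Probability of observing >= n_on counts from background alone."""
    background = alpha * n_off
    return stats.poisson.sf(n_on - 1, background)

n_on, n_off, alpha = 37, 100, 0.25
print(f"simple significance estimate: {excess_significance(n_on, n_off, alpha):.2f} sigma")
print(f"Poisson tail p-value:         {poisson_tail_p(n_on, n_off, alpha):.3g}")
```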

  4. Blended particle filters for large-dimensional chaotic dynamical systems

    PubMed Central

    Majda, Andrew J.; Qi, Di; Sapsis, Themistoklis P.

    2014-01-01

    A major challenge in contemporary data science is the development of statistically accurate particle filters to capture non-Gaussian features in large-dimensional chaotic dynamical systems. Blended particle filters that capture non-Gaussian features in an adaptively evolving low-dimensional subspace through particles interacting with evolving Gaussian statistics on the remaining portion of phase space are introduced here. These blended particle filters are constructed in this paper through a mathematical formalism involving conditional Gaussian mixtures combined with statistically nonlinear forecast models compatible with this structure developed recently with high skill for uncertainty quantification. Stringent test cases for filtering involving the 40-dimensional Lorenz 96 model with a 5-dimensional adaptive subspace for nonlinear blended filtering in various turbulent regimes with at least nine positive Lyapunov exponents are used here. These cases demonstrate the high skill of the blended particle filter algorithms in capturing both highly non-Gaussian dynamical features as well as crucial nonlinear statistics for accurate filtering in extreme filtering regimes with sparse infrequent high-quality observations. The formalism developed here is also useful for multiscale filtering of turbulent systems and a simple application is sketched below. PMID:24825886

  5. Fundamental Statistical Descriptions of Plasma Turbulence in Magnetic Fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    John A. Krommes

    2001-02-16

    A pedagogical review of the historical development and current status (as of early 2000) of systematic statistical theories of plasma turbulence is undertaken. Emphasis is on conceptual foundations and methodology, not practical applications. Particular attention is paid to equations and formalism appropriate to strongly magnetized, fully ionized plasmas. Extensive reference to the literature on neutral-fluid turbulence is made, but the unique properties and problems of plasmas are emphasized throughout. Discussions are given of quasilinear theory, weak-turbulence theory, resonance-broadening theory, and the clump algorithm. Those are developed independently, then shown to be special cases of the direct-interaction approximation (DIA), which provides a central focus for the article. Various methods of renormalized perturbation theory are described, then unified with the aid of the generating-functional formalism of Martin, Siggia, and Rose. A general expression for the renormalized dielectric function is deduced and discussed in detail. Modern approaches such as decimation and PDF methods are described. Derivations of DIA-based Markovian closures are discussed. The eddy-damped quasinormal Markovian closure is shown to be nonrealizable in the presence of waves, and a new realizable Markovian closure is presented. The test-field model and a realizable modification thereof are also summarized. Numerical solutions of various closures for some plasma-physics paradigms are reviewed. The variational approach to bounds on transport is developed. Miscellaneous topics include Onsager symmetries for turbulence, the interpretation of entropy balances for both kinetic and fluid descriptions, self-organized criticality, statistical interactions between disparate scales, and the roles of both mean and random shear. Appendices are provided on Fourier transform conventions, dimensional and scaling analysis, the derivations of nonlinear gyrokinetic and gyrofluid equations, stochasticity criteria for quasilinear theory, formal aspects of resonance-broadening theory, Novikov's theorem, the treatment of weak inhomogeneity, the derivation of the Vlasov weak-turbulence wave kinetic equation from a fully renormalized description, some features of a code for solving the direct-interaction approximation and related Markovian closures, the details of the solution of the EDQNM closure for a solvable three-wave model, and the notation used in the article.

  6. Vaccine stability study design and analysis to support product licensure.

    PubMed

    Schofield, Timothy L

    2009-11-01

    Stability evaluation supporting vaccine licensure includes studies of bulk intermediates as well as final container product. Long-term and accelerated studies are performed to support shelf life and to determine release limits for the vaccine. Vaccine shelf life is best determined utilizing a formal statistical evaluation outlined in the ICH guidelines, while minimum release is calculated to help assure adequate potency through handling and storage of the vaccine. In addition to supporting release potency determination, accelerated stability studies may be used to support a strategy to recalculate product expiry after an unintended temperature excursion such as a cold storage unit failure or mishandling during transport. Appropriate statistical evaluation of vaccine stability data promotes strategic stability study design, in order to reduce the uncertainty associated with the determination of the degradation rate, and the associated risk to the customer.
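
    A minimal sketch of the ICH-style shelf-life calculation mentioned above: regress potency on storage time and take the earliest time at which the one-sided 95% confidence bound for the mean potency crosses the specification limit. The data, the specification value, and the linear (rather than nonlinear) degradation model are all illustrative assumptions.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical long-term stability data: months of storage vs. measured potency.
months = np.array([0, 3, 6, 9, 12, 18, 24], dtype=float)
potency = np.array([6.30, 6.24, 6.18, 6.15, 6.05, 5.95, 5.86])
spec_limit = 5.7                      # hypothetical minimum potency specification

fit = sm.OLS(potency, sm.add_constant(months)).fit()

# One-sided 95% lower confidence bound for the mean potency on a time grid;
# the supported shelf life is where that bound first drops below the specification.
grid = np.linspace(0, 48, 481)
pred = fit.get_prediction(sm.add_constant(grid))
lower = pred.conf_int(alpha=0.10)[:, 0]   # 90% two-sided = 95% one-sided lower bound

below = np.where(lower < spec_limit)[0]
shelf_life = grid[below[0]] if below.size else grid[-1]
print(f"estimated degradation rate: {fit.params[1]:.4f} potency units/month")
print(f"supported shelf life:       {shelf_life:.1f} months")
```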

  7. Analysis of scattering statistics and governing distribution functions in optical coherence tomography.

    PubMed

    Sugita, Mitsuro; Weatherbee, Andrew; Bizheva, Kostadinka; Popov, Ivan; Vitkin, Alex

    2016-07-01

    The probability density function (PDF) of light scattering intensity can be used to characterize the scattering medium. We have recently shown that in optical coherence tomography (OCT), a PDF formalism can be sensitive to the number of scatterers in the probed scattering volume and can be represented by the K-distribution, a functional descriptor for non-Gaussian scattering statistics. Expanding on this initial finding, here we examine polystyrene microsphere phantoms with different sphere sizes and concentrations, and also human skin and fingernail in vivo. It is demonstrated that the K-distribution offers an accurate representation for the measured OCT PDFs. The behavior of the shape parameter of K-distribution that best fits the OCT scattering results is investigated in detail, and the applicability of this methodology for biological tissue characterization is demonstrated and discussed.

  8. Discrete approach to stochastic parametrization and dimension reduction in nonlinear dynamics.

    PubMed

    Chorin, Alexandre J; Lu, Fei

    2015-08-11

    Many physical systems are described by nonlinear differential equations that are too complicated to solve in full. A natural way to proceed is to divide the variables into those that are of direct interest and those that are not, formulate solvable approximate equations for the variables of greater interest, and use data and statistical methods to account for the impact of the other variables. In the present paper we consider time-dependent problems and introduce a fully discrete solution method, which simplifies both the analysis of the data and the numerical algorithms. The resulting time series are identified by a NARMAX (nonlinear autoregression moving average with exogenous input) representation familiar from engineering practice. The connections with the Mori-Zwanzig formalism of statistical physics are discussed, as well as an application to the Lorenz 96 system.
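
    As a much-reduced illustration of the kind of discrete-time construction described above, a nonlinear autoregressive model can be fitted to a scalar series by least squares; the full NARMAX representation in the paper additionally carries moving-average noise terms and exogenous inputs. The generating model, lag order, and polynomial features below are illustrative assumptions.

```python
import numpy as np

# Hypothetical scalar series standing in for a resolved variable of a larger system.
rng = np.random.default_rng(0)
n = 2000
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.9 * x[t - 1] - 0.2 * x[t - 1] ** 3 + 0.1 * rng.standard_normal()

# Polynomial nonlinear autoregression (a stripped-down NARMAX without the
# moving-average terms), fitted by ordinary least squares.
lags = 2
rows, targets = [], []
for t in range(lags, n):
    past = x[t - lags:t]
    rows.append(np.concatenate([past, past**2, past**3, [1.0]]))
    targets.append(x[t])
Phi, y = np.array(rows), np.array(targets)

coef, *_ = np.linalg.lstsq(Phi, y, rcond=None)
residual = Phi @ coef - y
print("fitted coefficients:", np.round(coef, 3))
print("one-step RMS error: ", round(float(np.sqrt(np.mean(residual**2))), 4))
```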

  9. INFORMATION: THEORY, BRAIN, AND BEHAVIOR

    PubMed Central

    Jensen, Greg; Ward, Ryan D.; Balsam, Peter D.

    2016-01-01

    In the 65 years since its formal specification, information theory has become an established statistical paradigm, providing powerful tools for quantifying probabilistic relationships. Behavior analysis has begun to adopt these tools as a novel means of measuring the interrelations between behavior, stimuli, and contingent outcomes. This approach holds great promise for making more precise determinations about the causes of behavior and the forms in which conditioning may be encoded by organisms. In addition to providing an introduction to the basics of information theory, we review some of the ways that information theory has informed the studies of Pavlovian conditioning, operant conditioning, and behavioral neuroscience. In addition to enriching each of these empirical domains, information theory has the potential to act as a common statistical framework by which results from different domains may be integrated, compared, and ultimately unified. PMID:24122456
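
    The core quantity behind this approach, the mutual information between two discrete variables, is simple to compute from a contingency table of stimuli and responses. The sketch below uses a made-up count table purely for illustration.

```python
import numpy as np

def mutual_information(counts):
    """Mutual information (in bits) from a stimulus-by-response count table."""
    p = counts / counts.sum()
    px = p.sum(axis=1, keepdims=True)     # marginal over responses
    py = p.sum(axis=0, keepdims=True)     # marginal over stimuli
    nz = p > 0                            # avoid log(0) for empty cells
    return float(np.sum(p[nz] * np.log2(p[nz] / (px @ py)[nz])))

# Hypothetical counts: rows = stimulus conditions, columns = response categories.
table = np.array([[40, 10],
                  [12, 38]])
print(f"I(stimulus; response) = {mutual_information(table):.3f} bits")
```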

  10. Hunting Solomonoff's Swans: Exploring the Boundary Between Physics and Statistics in Hydrological Modeling

    NASA Astrophysics Data System (ADS)

    Nearing, G. S.

    2014-12-01

    Statistical models consistently out-perform conceptual models in the short term; however, to account for a nonstationary future (or an unobserved past) scientists prefer to base predictions on unchanging and commutable properties of the universe - i.e., physics. The problem with physically-based hydrology models is, of course, that they aren't really based on physics - they are based on statistical approximations of physical interactions, and we almost uniformly lack an understanding of the entropy associated with these approximations. Thermodynamics is successful precisely because entropy statistics are computable for homogeneous (well-mixed) systems, and ergodic arguments explain the success of Newton's laws to describe systems that are fundamentally quantum in nature. Unfortunately, similar arguments do not hold for systems like watersheds that are heterogeneous at a wide range of scales. Ray Solomonoff formalized the situation in 1968 by showing that given infinite evidence, simultaneously minimizing model complexity and entropy in predictions always leads to the best possible model. The open question in hydrology is about what happens when we don't have infinite evidence - for example, when the future will not look like the past, or when one watershed does not behave like another. How do we isolate stationary and commutable components of watershed behavior? I propose that one possible answer to this dilemma lies in a formal combination of physics and statistics. In this talk I outline my recent analogue (Solomonoff's theorem was digital) of Solomonoff's idea that allows us to quantify the complexity/entropy tradeoff in a way that is intuitive to physical scientists. I show how to formally combine "physical" and statistical methods for model development in a way that allows us to derive the theoretically best possible model given any physics approximation(s) and the available observations. Finally, I apply an analogue of Solomonoff's theorem to evaluate the tradeoff between model complexity and prediction power.

  11. Combining statistical inference and decisions in ecology.

    PubMed

    Williams, Perry J; Hooten, Mevin B

    2016-09-01

    Statistical decision theory (SDT) is a sub-field of decision theory that formally incorporates statistical investigation into a decision-theoretic framework to account for uncertainties in a decision problem. SDT provides a unifying analysis of three types of information: statistical results from a data set, knowledge of the consequences of potential choices (i.e., loss), and prior beliefs about a system. SDT links the theoretical development of a large body of statistical methods, including point estimation, hypothesis testing, and confidence interval estimation. The theory and application of SDT have mainly been developed and published in the fields of mathematics, statistics, operations research, and other decision sciences, but have had limited exposure in ecology. Thus, we provide an introduction to SDT for ecologists and describe its utility for linking the conventionally separate tasks of statistical investigation and decision making in a single framework. We describe the basic framework of both Bayesian and frequentist SDT, its traditional use in statistics, and discuss its application to decision problems that occur in ecology. We demonstrate SDT with two types of decisions: Bayesian point estimation and an applied management problem of selecting a prescribed fire rotation for managing a grassland bird species. Central to SDT, and decision theory in general, are loss functions. Thus, we also provide basic guidance and references for constructing loss functions for an SDT problem. © 2016 by the Ecological Society of America.
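
    The role of the loss function is easy to demonstrate with a posterior sample: squared-error loss leads to the posterior mean, absolute-error loss to the posterior median, and an asymmetric loss shifts the Bayes decision toward a quantile. The posterior and the 4:1 cost ratio below are illustrative assumptions, not values from the paper's fire-rotation example.

```python
import numpy as np

# Hypothetical posterior draws for a parameter (e.g., an occupancy probability),
# standing in for the output of a Bayesian model fit.
rng = np.random.default_rng(1)
posterior = rng.beta(3, 7, size=20_000)

# Bayes estimators under two standard losses.
print("posterior mean   (squared-error loss):", round(float(posterior.mean()), 3))
print("posterior median (absolute-error loss):", round(float(np.median(posterior)), 3))

# Asymmetric loss: underestimation penalized 4x more than overestimation,
# which pushes the optimal decision toward the 0.8 posterior quantile.
def expected_loss(estimate, draws, under_cost=4.0, over_cost=1.0):
    err = draws - estimate
    return float(np.mean(np.where(err > 0, under_cost * err, -over_cost * err)))

grid = np.linspace(0, 1, 501)
best = grid[int(np.argmin([expected_loss(g, posterior) for g in grid]))]
print("estimate minimizing the asymmetric loss:", round(float(best), 3))
```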

  12. An Analysis of the Formal Features of "Reality-Based" Television Programs.

    ERIC Educational Resources Information Center

    Neapolitan, D. M.

    Reality-based television programs showcase actual footage or recreate actual events, and include programs such as "America's Most Wanted" and "Rescue 911." To identify the features that typify reality-based television programs, this study conducted an analysis of formal features used in reality-based programs. Formal features…

  13. A round-robin gamma stereotactic radiosurgery dosimetry interinstitution comparison of calibration protocols

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Drzymala, R. E., E-mail: drzymala@wustl.edu; Alvarez, P. E.; Bednarz, G.

    2015-11-15

    Purpose: Absorbed dose calibration for gamma stereotactic radiosurgery is challenging due to the unique geometric conditions, dosimetry characteristics, and nonstandard field size of these devices. Members of the American Association of Physicists in Medicine (AAPM) Task Group 178 on Gamma Stereotactic Radiosurgery Dosimetry and Quality Assurance have participated in a round-robin exchange of calibrated measurement instrumentation and phantoms exploring two approved and two proposed calibration protocols or formalisms on ten gamma radiosurgery units. The objectives of this study were to benchmark and compare new formalisms to existing calibration methods, while maintaining traceability to U.S. primary dosimetry calibration laboratory standards. Methods: Nine institutions made measurements using ten gamma stereotactic radiosurgery units in three different 160 mm diameter spherical phantoms [acrylonitrile butadiene styrene (ABS) plastic, Solid Water, and liquid water] and in air using a positioning jig. Two calibrated miniature ionization chambers and one calibrated electrometer were circulated for all measurements. Reference dose-rates at the phantom center were determined using the well-established AAPM TG-21 or TG-51 dose calibration protocols and using two proposed dose calibration protocols/formalisms: an in-air protocol and a formalism proposed by the International Atomic Energy Agency (IAEA) working group for small and nonstandard radiation fields. Each institution’s results were normalized to the dose-rate determined at that institution using the TG-21 protocol in the ABS phantom. Results: Percentages of dose-rates within 1.5% of the reference dose-rate (TG-21 + ABS phantom) for the eight chamber-protocol-phantom combinations were the following: 88% for TG-21, 70% for TG-51, 93% for the new IAEA nonstandard-field formalism, and 65% for the new in-air protocol. Averages and standard deviations for dose-rates over all measurements relative to the TG-21 + ABS dose-rate were 0.999 ± 0.009 (TG-21), 0.991 ± 0.013 (TG-51), 1.000 ± 0.009 (IAEA), and 1.009 ± 0.012 (in-air). There were no statistically significant differences (i.e., p > 0.05) between the two ionization chambers for the TG-21 protocol applied to all dosimetry phantoms. The mean results using the TG-51 protocol were notably lower than those for the other dosimetry protocols, with a standard deviation 2–3 times larger. The in-air protocol was not statistically different from TG-21 for the A16 chamber in the liquid water or ABS phantoms (p = 0.300 and p = 0.135) but was statistically different from TG-21 for the PTW chamber in all phantoms (p = 0.006 for Solid Water, 0.014 for liquid water, and 0.020 for ABS). Results of IAEA formalism were statistically different from TG-21 results only for the combination of the A16 chamber with the liquid water phantom (p = 0.017). In the latter case, dose-rates measured with the two protocols differed by only 0.4%. For other phantom-ionization-chamber combinations, the new IAEA formalism was not statistically different from TG-21. Conclusions: Although further investigation is needed to validate the new protocols for other ionization chambers, these results can serve as a reference to quantitatively compare different calibration protocols and ionization chambers if a particular method is chosen by a professional society to serve as a standardized calibration protocol.

  14. VBA: A Probabilistic Treatment of Nonlinear Models for Neurobiological and Behavioural Data

    PubMed Central

    Daunizeau, Jean; Adam, Vincent; Rigoux, Lionel

    2014-01-01

    This work is in line with an on-going effort tending toward a computational (quantitative and refutable) understanding of human neuro-cognitive processes. Many sophisticated models for behavioural and neurobiological data have flourished during the past decade. Most of these models are partly unspecified (i.e. they have unknown parameters) and nonlinear. This makes them difficult to pair with a formal statistical data analysis framework. In turn, this compromises the reproducibility of model-based empirical studies. This work exposes a software toolbox that provides generic, efficient and robust probabilistic solutions to the three problems of model-based analysis of empirical data: (i) data simulation, (ii) parameter estimation/model selection, and (iii) experimental design optimization. PMID:24465198

  15. Forming a joint dialogue among faith healers, traditional healers and formal health workers in mental health in a Kenyan setting: towards common grounds.

    PubMed

    Musyimi, Christine W; Mutiso, Victoria N; Nandoya, Erick S; Ndetei, David M

    2016-01-07

    Qualitative evidence on dialogue formation and collaboration is very scanty in Kenya. This study thus aimed at the formation of dialogue and the establishment of collaboration among informal (faith and traditional healers) and formal health workers (clinicians) in enhancing community-based mental health in rural Kenya. A qualitative approach was used to identify barriers and solutions for dialogue formation by conducting nine Focus Group Discussions, each consisting of 8-10 participants. Information on age, gender and role in the health care setting, as well as practitioners' (henceforth used to mean informal (faith and traditional healers) and formal health workers) perceptions on dialogue, was collected to evaluate dialogue formation. Qualitative and quantitative data analysis was performed using thematic content analysis and Statistical Package for the Social Sciences (SPSS) software, respectively. We identified four dominant themes: (i) basic understanding about mental illnesses, (ii) interaction and treatment skills of the respondents towards mentally ill persons, (iii) referral gaps and mistrust among the practitioners and (iv) dialogue formation among the practitioners. Although participants were conversant with the definition of mental illness and had interacted with a mentally ill person in their routine practice, they had only basic information on the causes and types of mental illness. Traditional and faith healers felt demeaned by the clinicians, who disregarded their mode of treatment and stereotyped them as "dirty". After various discussions, the majority of practitioners showed interest in collaborating with each other and stated that they had joined the dialogue in order to interact with people committed to improving the lives of patients. Dialogue formation between the formal and the informal health workers is crucial in establishing trust and respect between the two groups of practitioners and in improving mental health care in Kenya. This approach could be scaled up among all the registered traditional and faith healers in Kenya.

  16. Who serves the urban poor? A geospatial and descriptive analysis of health services in slum settlements in Dhaka, Bangladesh

    PubMed Central

    Adams, Alayne M; Islam, Rubana; Ahmed, Tanvir

    2015-01-01

    In Bangladesh, the health risks of unplanned urbanization are disproportionately shouldered by the urban poor. At the same time, affordable formal primary care services are scarce, and what exists is almost exclusively provided by non-government organizations (NGOs) working on a project basis. So where do the poor go for health care? A health facility mapping of six urban slum settlements in Dhaka was undertaken to explore the configuration of healthcare services proximate to where the poor reside. Three methods were employed: (1) Social mapping and listing of all Health Service Delivery Points (HSDPs); (2) Creation of a geospatial map including Global Positioning System (GPS) co-ordinates of all HSDPs in the six study areas and (3) Implementation of a facility survey of all HSDPs within the six study areas. Descriptive statistics are used to examine the number, type and concentration of service provider types, as well as indicators of their accessibility in terms of location and hours of service. A total of 1041 HSDPs were mapped, of which 80% are privately operated and the rest by NGOs and the public sector. Pharmacies and non-formal or traditional doctors make up 75% of the private sector, while consultation chambers account for 20%. Most NGO and Urban Primary Health Care Project (UPHCP) static clinics are open 5–6 days/week but close by 4–5 pm. Evening services are almost exclusively offered by private HSDPs; however, only 37% of private sector health staff possess some kind of formal medical qualification. This spatial analysis of health service supply in poor urban settlements emphasizes the importance of taking the informal private sector into account in efforts to increase effective coverage of quality services. Features of informal private sector service provision that have facilitated market penetration may be relevant in designing formal services that better meet the needs of the urban poor. PMID:25759453

  17. How Framing Statistical Statements Affects Subjective Veracity: Validation and Application of a Multinomial Model for Judgments of Truth

    ERIC Educational Resources Information Center

    Hilbig, Benjamin E.

    2012-01-01

    Extending the well-established negativity bias in human cognition to truth judgments, it was recently shown that negatively framed statistical statements are more likely to be considered true than formally equivalent statements framed positively. However, the underlying processes responsible for this effect are insufficiently understood.…

  18. Using a Discussion about Scientific Controversy to Teach Central Concepts in Experimental Design

    ERIC Educational Resources Information Center

    Bennett, Kimberley Ann

    2015-01-01

    Students may need explicit training in informal statistical reasoning in order to design experiments or use formal statistical tests effectively. By using scientific scandals and media misinterpretation, we can explore the need for good experimental design in an informal way. This article describes the use of a paper that reviews the measles mumps…

  19. Statistics of primordial density perturbations from discrete seed masses

    NASA Technical Reports Server (NTRS)

    Scherrer, Robert J.; Bertschinger, Edmund

    1991-01-01

    The statistics of density perturbations for general distributions of seed masses with arbitrary matter accretion is examined. Formal expressions for the power spectrum, the N-point correlation functions, and the density distribution function are derived. These results are applied to the case of uncorrelated seed masses, and power spectra are derived for accretion of both hot and cold dark matter plus baryons. The reduced moments (cumulants) of the density distribution are computed and used to obtain a series expansion for the density distribution function. Analytic results are obtained for the density distribution function in the case of a distribution of seed masses with a spherical top-hat accretion pattern. More generally, the formalism makes it possible to give a complete characterization of the statistical properties of any random field generated from a discrete linear superposition of kernels. In particular, the results can be applied to density fields derived by smoothing a discrete set of points with a window function.

  20. Experience and Explanation: Using Videogames to Prepare Students for Formal Instruction in Statistics

    NASA Astrophysics Data System (ADS)

    Arena, Dylan A.; Schwartz, Daniel L.

    2014-08-01

    Well-designed digital games can deliver powerful experiences that are difficult to provide through traditional instruction, while traditional instruction can deliver formal explanations that are not a natural fit for gameplay. Combined, they can accomplish more than either can alone. An experiment tested this claim using the topic of statistics, where people's everyday experiences often conflict with normative statistical theories and a videogame might provide an alternate set of experiences for students to draw upon. The research used a game called Stats Invaders!, a variant of the classic videogame Space Invaders. In Stats Invaders!, the locations of descending alien invaders follow probability distributions, and players need to infer the shape of the distributions to play well. The experiment tested whether the game developed participants' intuitions about the structure of random events and thereby prepared them for future learning from a subsequent written passage on probability distributions. Community-college students who played the game and then read the passage learned more than participants who only read the passage.

  1. Superstatistics with different kinds of distributions in the deformed formalism

    NASA Astrophysics Data System (ADS)

    Sargolzaeipor, S.; Hassanabadi, H.; Chung, W. S.

    2018-03-01

    In this article, after first introducing superstatistics, the effective Boltzmann factor in a deformed formalism is derived for modified Dirac delta, uniform, two-level and Gamma distributions. We then apply superstatistics to four important problems in physics and calculate the thermodynamic properties of each system. In the limiting case, all results reduce to ordinary statistical mechanics. Furthermore, the effects of all parameters in the problems are calculated and shown graphically.

  2. Sub-grid scale models for discontinuous Galerkin methods based on the Mori-Zwanzig formalism

    NASA Astrophysics Data System (ADS)

    Parish, Eric; Duraisamy, Karthik

    2017-11-01

    The optimal prediction framework of Chorin et al., which is a reformulation of the Mori-Zwanzig (M-Z) formalism of non-equilibrium statistical mechanics, provides a framework for the development of mathematically-derived closure models. The M-Z formalism provides a methodology to reformulate a high-dimensional Markovian dynamical system as a lower-dimensional, non-Markovian (non-local) system. In this lower-dimensional system, the effects of the unresolved scales on the resolved scales are non-local and appear as a convolution integral. The non-Markovian system is an exact statement of the original dynamics and is used as a starting point for model development. In this work, we investigate the development of M-Z-based closure models within the context of the Variational Multiscale Method (VMS). The method relies on a decomposition of the solution space into two orthogonal subspaces. The impact of the unresolved subspace on the resolved subspace is shown to be non-local in time and is modeled through the M-Z formalism. The models are applied to hierarchical discontinuous Galerkin discretizations. Commonalities between the M-Z closures and conventional flux schemes are explored. This work was supported in part by AFOSR under the project "LES Modeling of Non-local effects using Statistical Coarse-graining" with Dr. Jean-Luc Cambier as the technical monitor.

  3. Using Formal Methods to Assist in the Requirements Analysis of the Space Shuttle GPS Change Request

    NASA Technical Reports Server (NTRS)

    DiVito, Ben L.; Roberts, Larry W.

    1996-01-01

    We describe a recent NASA-sponsored pilot project intended to gauge the effectiveness of using formal methods in Space Shuttle software requirements analysis. Several Change Requests (CR's) were selected as promising targets to demonstrate the utility of formal methods in this application domain. A CR to add new navigation capabilities to the Shuttle, based on Global Positioning System (GPS) technology, is the focus of this report. Carried out in parallel with the Shuttle program's conventional requirements analysis process was a limited form of analysis based on formalized requirements. Portions of the GPS CR were modeled using the language of SRI's Prototype Verification System (PVS). During the formal methods-based analysis, numerous requirements issues were discovered and submitted as official issues through the normal requirements inspection process. Shuttle analysts felt that many of these issues were uncovered earlier than would have occurred with conventional methods. We present a summary of these encouraging results and conclusions we have drawn from the pilot project.

  4. Exploring connections between statistical mechanics and Green's functions for realistic systems: Temperature dependent electronic entropy and internal energy from a self-consistent second-order Green's function

    NASA Astrophysics Data System (ADS)

    Welden, Alicia Rae; Rusakov, Alexander A.; Zgid, Dominika

    2016-11-01

    Including finite-temperature effects from the electronic degrees of freedom in electronic structure calculations of semiconductors and metals is desired; however, in practice it remains exceedingly difficult when using zero-temperature methods, since these methods require an explicit evaluation of multiple excited states in order to account for any finite-temperature effects. Using a Matsubara Green's function formalism remains a viable alternative, since in this formalism it is easier to include thermal effects and to connect the dynamic quantities such as the self-energy with static thermodynamic quantities such as the Helmholtz energy, entropy, and internal energy. However, despite the promising properties of this formalism, little is known about the multiple solutions of the non-linear equations present in the self-consistent Matsubara formalism and only a few cases involving a full Coulomb Hamiltonian were investigated in the past. Here, to shed some light onto the iterative nature of the Green's function solutions, we self-consistently evaluate the thermodynamic quantities for a one-dimensional (1D) hydrogen solid at various interatomic separations and temperatures using the self-energy approximated to second-order (GF2). At many points in the phase diagram of this system, multiple phases such as a metal and an insulator exist, and we are able to determine the most stable phase from the analysis of Helmholtz energies. Additionally, we show the evolution of the spectrum of 1D boron nitride to demonstrate that GF2 is capable of qualitatively describing the temperature effects influencing the size of the band gap.

  5. Critical Analysis on Open Source LMSs Using FCA

    ERIC Educational Resources Information Center

    Sumangali, K.; Kumar, Ch. Aswani

    2013-01-01

    The objective of this paper is to apply Formal Concept Analysis (FCA) to identify the best open source Learning Management System (LMS) for an E-learning environment. FCA is a mathematical framework that represents knowledge derived from a formal context. In constructing the formal context, LMSs are treated as objects and their features as…

  6. Routes to failure: analysis of 41 civil aviation accidents from the Republic of China using the human factors analysis and classification system.

    PubMed

    Li, Wen-Chin; Harris, Don; Yu, Chung-San

    2008-03-01

    The human factors analysis and classification system (HFACS) is based upon Reason's organizational model of human error. HFACS was developed as an analytical framework for the investigation of the role of human error in aviation accidents; however, there is little empirical work formally describing the relationship between the components in the model. This research analyses 41 civil aviation accidents involving aircraft registered in the Republic of China (ROC) between 1999 and 2006 using the HFACS framework. The results show statistically significant relationships between errors at the operational level and organizational inadequacies at both the immediately adjacent level (preconditions for unsafe acts) and higher levels in the organization (unsafe supervision and organizational influences). The pattern of the 'routes to failure' observed in the data from this analysis of civil aircraft accidents shows great similarities to that observed in the analysis of military accidents. This research lends further support to Reason's model, which suggests that active failures are promoted by latent conditions in the organization. Statistical relationships linking fallible decisions at upper management levels were found to directly affect supervisory practices, thereby creating the psychological preconditions for unsafe acts and hence indirectly impairing the performance of pilots, ultimately leading to accidents.

  7. Nonparametric Residue Analysis of Dynamic PET Data With Application to Cerebral FDG Studies in Normals.

    PubMed

    O'Sullivan, Finbarr; Muzi, Mark; Spence, Alexander M; Mankoff, David M; O'Sullivan, Janet N; Fitzgerald, Niall; Newman, George C; Krohn, Kenneth A

    2009-06-01

    Kinetic analysis is used to extract metabolic information from dynamic positron emission tomography (PET) uptake data. The theory of indicator dilutions, developed in the seminal work of Meier and Zierler (1954), provides a probabilistic framework for representation of PET tracer uptake data in terms of a convolution between an arterial input function and a tissue residue. The residue is a scaled survival function associated with tracer residence in the tissue. Nonparametric inference for the residue, a deconvolution problem, provides a novel approach to kinetic analysis, critically one that is not reliant on specific compartmental modeling assumptions. A practical computational technique based on regularized cubic B-spline approximation of the residence time distribution is proposed. Nonparametric residue analysis allows formal statistical evaluation of specific parametric models to be considered. This analysis needs to properly account for the increased flexibility of the nonparametric estimator. The methodology is illustrated using data from a series of cerebral studies with PET and fluorodeoxyglucose (FDG) in normal subjects. Comparisons are made between key functionals of the residue (tracer flux, flow, etc.) resulting from a parametric analysis (the standard two-compartment model of Phelps et al., 1979) and a nonparametric analysis. Strong statistical evidence against the compartment model is found. Primarily these differences relate to the representation of the early temporal structure of the tracer residence, largely a function of the vascular supply network. There are convincing physiological arguments against the representations implied by the compartmental approach, but this is the first time that a rigorous statistical confirmation using PET data has been reported. The compartmental analysis produces suspect values for flow but, notably, the impact on the metabolic flux, though statistically significant, is limited to deviations on the order of 3%-4%. The general advantage of the nonparametric residue analysis is the ability to provide a valid kinetic quantitation in the context of studies where there may be heterogeneity or other uncertainty about the accuracy of a compartmental model approximation of the tissue residue.
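
    The essence of the nonparametric step is a regularized deconvolution: discretize the convolution between the arterial input and the residue, and estimate the residue under a smoothness penalty. The sketch below uses a crude second-difference penalty on a piecewise-constant residue rather than the regularized cubic B-spline estimator of the paper, and all curves and parameter values are simulated, illustrative assumptions.

```python
import numpy as np

# Discrete convolution model: tissue(t) = sum_s input(s) * residue(t - s) * dt.
dt = 0.5                                   # minutes per frame (hypothetical)
t = np.arange(0, 30, dt)
n = len(t)
cin = t * np.exp(-t / 1.5)                 # hypothetical arterial input function
true_residue = 0.6 * np.exp(-0.1 * t) + 0.4 * np.exp(-0.01 * t)

# Lower-triangular convolution matrix so that tissue = C @ residue.
C = np.zeros((n, n))
for i in range(n):
    C[i, : i + 1] = cin[i::-1] * dt
tissue = C @ true_residue + 0.02 * np.random.default_rng(2).standard_normal(n)

# Least squares with a second-difference smoothness penalty on the residue,
# a crude stand-in for the regularized B-spline estimator of the paper.
D = np.diff(np.eye(n), 2, axis=0)          # discrete second-difference operator
lam = 0.01
est_residue = np.linalg.solve(C.T @ C + lam * D.T @ D, C.T @ tissue)

print("estimated residue (first frames):", np.round(est_residue[:3], 2))
print("true residue      (first frames):", np.round(true_residue[:3], 2))
```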

  8. A New Measure of Text Formality: An Analysis of Discourse of Mao Zedong

    ERIC Educational Resources Information Center

    Li, Haiying; Graesser, Arthur C.; Conley, Mark; Cai, Zhiqiang; Pavlik, Philip I., Jr.; Pennebaker, James W.

    2016-01-01

    Formality has long been of interest in the study of discourse, with periodic discussions of the best measure of formality and the relationship between formality and text categories. In this research, we explored what features predict formality as humans perceive the construct. We categorized a corpus consisting of 1,158 discourse samples published…

  9. Statistical Neurodynamics.

    NASA Astrophysics Data System (ADS)

    Paine, Gregory Harold

    1982-03-01

    The primary objective of the thesis is to explore the dynamical properties of small nerve networks by means of the methods of statistical mechanics. To this end, a general formalism is developed and applied to elementary groupings of model neurons which are driven by either constant (steady state) or nonconstant (nonsteady state) forces. Neuronal models described by a system of coupled, nonlinear, first-order, ordinary differential equations are considered. A linearized form of the neuronal equations is studied in detail. A Lagrange function corresponding to the linear neural network is constructed which, through a Legendre transformation, provides a constant of motion. By invoking the Maximum-Entropy Principle with the single integral of motion as a constraint, a probability distribution function for the network in a steady state can be obtained. The formalism is implemented for some simple networks driven by a constant force; accordingly, the analysis focuses on a study of fluctuations about the steady state. In particular, a network composed of N noninteracting neurons, termed Free Thinkers, is considered in detail, with a view to interpretation and numerical estimation of the Lagrange multiplier corresponding to the constant of motion. As an archetypical example of a net of interacting neurons, the classical neural oscillator, consisting of two mutually inhibitory neurons, is investigated. It is further shown that in the case of a network driven by a nonconstant force, the Maximum-Entropy Principle can be applied to determine a probability distribution functional describing the network in a nonsteady state. The above examples are reconsidered with nonconstant driving forces which produce small deviations from the steady state. Numerical studies are performed on simplified models of two physical systems: the starfish central nervous system and the mammalian olfactory bulb. Discussions are given as to how statistical neurodynamics can be used to gain a better understanding of the behavior of these systems.

  10. Quantum formalism for classical statistics

    NASA Astrophysics Data System (ADS)

    Wetterich, C.

    2018-06-01

    In static classical statistical systems the problem of information transport from a boundary to the bulk finds a simple description in terms of wave functions or density matrices. While the transfer matrix formalism is a type of Heisenberg picture for this problem, we develop here the associated Schrödinger picture that keeps track of the local probabilistic information. The transport of the probabilistic information between neighboring hypersurfaces obeys a linear evolution equation, and therefore the superposition principle for the possible solutions. Operators are associated to local observables, with rules for the computation of expectation values similar to quantum mechanics. We discuss how non-commutativity naturally arises in this setting. Also other features characteristic of quantum mechanics, such as complex structure, change of basis or symmetry transformations, can be found in classical statistics once formulated in terms of wave functions or density matrices. We construct for every quantum system an equivalent classical statistical system, such that time in quantum mechanics corresponds to the location of hypersurfaces in the classical probabilistic ensemble. For suitable choices of local observables in the classical statistical system one can, in principle, compute all expectation values and correlations of observables in the quantum system from the local probabilistic information of the associated classical statistical system. Realizing a static memory material as a quantum simulator for a given quantum system is not a matter of principle, but rather of practical simplicity.

  11. Tsallis and Kaniadakis statistics from a point of view of the holographic equipartition law

    NASA Astrophysics Data System (ADS)

    Abreu, Everton M. C.; Ananias Neto, Jorge; Mendes, Albert C. R.; Bonilla, Alexander

    2018-02-01

    In this work, we have illustrated the difference between the Tsallis and Kaniadakis entropies through cosmological models obtained from the formalism proposed by Padmanabhan, called the holographic equipartition law. Similarly to the formalism proposed by Komatsu, we have obtained an extra driving constant term in the Friedmann equation if we deform the Tsallis entropy by Kaniadakis' formalism. We initially considered the Tsallis entropy as the black-hole (BH) area entropy. This constant term may lead the universe to be in an accelerated or decelerated mode. On the other hand, if we start with the Kaniadakis entropy as the BH area entropy and then modify the kappa expression by Tsallis' formalism, the same absolute value but with the opposite sign is obtained. In the opposite limit, no driving inflation term for the early universe is obtained from either deformation.

  12. A new and trustworthy formalism to compute entropy in quantum systems

    NASA Astrophysics Data System (ADS)

    Ansari, Mohammad

    Entropy is nonlinear in the density matrix and, as such, its evaluation in open quantum systems has not been fully understood. Recently a quantum formalism was proposed by Ansari and Nazarov that evaluates entropy using parallel time evolutions of multiple worlds. We can use this formalism to evaluate entropy flow in photovoltaic cells coupled to thermal reservoirs and cavity modes. Recently we studied the full counting statistics of energy transfers in such systems. This rigorously proves a nontrivial correspondence between energy exchanges and entropy changes in quantum systems, which can be simplified to the textbook second law of thermodynamics only in systems without entanglement. We evaluate the flow of entropy using this formalism. In the presence of entanglement, however, interestingly less information is exchanged than expected. This raises the upper limit on the capacity for information transfer and its conversion to energy in next-generation mesoscopic devices.

  13. The Strengths and Weaknesses of Logic Formalisms to Support Mishap Analysis

    NASA Technical Reports Server (NTRS)

    Johnson, C. W.; Holloway, C. M.

    2002-01-01

    The increasing complexity of many safety critical systems poses new problems for mishap analysis. Techniques developed in the sixties and seventies cannot easily scale-up to analyze incidents involving tightly integrated software and hardware components. Similarly, the realization that many failures have systemic causes has widened the scope of many mishap investigations. Organizations, including NASA and the NTSB, have responded by starting research and training initiatives to ensure that their personnel are well equipped to meet these challenges. One strand of research has identified a range of mathematically based techniques that can be used to reason about the causes of complex, adverse events. The proponents of these techniques have argued that they can be used to formally prove that certain events created the necessary and sufficient causes for a mishap to occur. Mathematical proofs can reduce the bias that is often perceived to affect the interpretation of adverse events. Others have opposed the introduction of these techniques by identifying social and political aspects of incident investigation that cannot easily be reconciled with a logic-based approach. Traditional theorem proving mechanisms cannot accurately capture the wealth of inductive, deductive and statistical forms of inference that investigators routinely use in their analysis of adverse events. This paper summarizes some of the benefits that logics provide, describes their weaknesses, and proposes a number of directions for future research.

  14. Cost-Benefit Analysis of U.S. Copyright Formalities. Final Report.

    ERIC Educational Resources Information Center

    King Research, Inc., Rockville, MD.

    This study of the feasibility of conducting a cost-benefit analysis in the complex environment of the formalities used in the United States as part of its administration of the copyright law focused on the formalities of copyright notice, deposit, registration, and recordation. The U.S. system is also compared with the less centralized copyright…

  15. Vibrational algorithms for quantitative crystallographic analyses of hydroxyapatite-based biomaterials: I, theoretical foundations.

    PubMed

    Pezzotti, Giuseppe; Zhu, Wenliang; Boffelli, Marco; Adachi, Tetsuya; Ichioka, Hiroaki; Yamamoto, Toshiro; Marunaka, Yoshinori; Kanamura, Narisato

    2015-05-01

    The Raman spectroscopic method has quantitatively been applied to the analysis of local crystallographic orientation in both single-crystal hydroxyapatite and human teeth. Raman selection rules for all the vibrational modes of the hexagonal structure were expanded into explicit functions of Euler angles in space and six Raman tensor elements (RTE). A theoretical treatment has also been put forward according to the orientation distribution function (ODF) formalism, which allows one to resolve the statistical orientation patterns of the nm-sized hydroxyapatite crystallites comprised in the Raman microprobe. Closed-form solutions could be obtained for the Euler angles and their statistical distributions resolved with respect to the direction of the average texture axis. Polarized Raman spectra from single-crystalline hydroxyapatite and textured polycrystalline (teeth enamel) samples were compared, and a validation of the proposed Raman method could be obtained through confirming the agreement between RTE values obtained from different samples.

  16. Asteroid orbital error analysis: Theory and application

    NASA Technical Reports Server (NTRS)

    Muinonen, K.; Bowell, Edward

    1992-01-01

    We present a rigorous Bayesian theory for asteroid orbital error estimation in which the probability density of the orbital elements is derived from the noise statistics of the observations. For Gaussian noise in a linearized approximation the probability density is also Gaussian, and the errors of the orbital elements at a given epoch are fully described by the covariance matrix. The law of error propagation can then be applied to calculate past and future positional uncertainty ellipsoids (Cappellari et al. 1976, Yeomans et al. 1987, Whipple et al. 1991). To our knowledge, this is the first time a Bayesian approach has been formulated for orbital element estimation. In contrast to the classical Fisherian school of statistics, the Bayesian school allows a priori information to be formally present in the final estimation. However, Bayesian estimation does give the same results as Fisherian estimation when no a priori information is assumed (Lehtinen 1988, and references therein).
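
    The error-propagation step referred to above is the standard linear mapping of a covariance matrix through the Jacobian of a derived quantity. The sketch below applies it to a made-up two-element toy problem; the elements, covariance, and derived quantity are illustrative assumptions, not the full six-element orbital case of the paper.

```python
import numpy as np

# Hypothetical toy problem: propagate the covariance of two orbital elements
# (semi-major axis a, eccentricity e) to the perihelion distance q = a * (1 - e).
elements = np.array([2.5, 0.15])            # a [AU], e
cov = np.array([[1.0e-6, 2.0e-8],
                [2.0e-8, 4.0e-8]])          # covariance of (a, e)

a, e = elements
q = a * (1.0 - e)

# Jacobian of q with respect to (a, e), evaluated at the nominal elements.
J = np.array([[1.0 - e, -a]])

# Law of error propagation: Sigma_q = J Sigma J^T.
cov_q = J @ cov @ J.T
print(f"perihelion distance: {q:.4f} AU")
print(f"1-sigma uncertainty: {float(np.sqrt(cov_q[0, 0])):.2e} AU")
```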

  17. Experimental observation of steady inertial wave turbulence in deep rotating flows

    NASA Astrophysics Data System (ADS)

    Yarom, Ehud; Sharon, Eran

    2015-11-01

    We present experimental evidence of inertial wave turbulence in deep rotating fluid. Experiments were performed in a rotating cylindrical water tank, where previous work showed statistics similar to 2D turbulence (specifically an inverse energy cascade). Using Fourier analysis of high resolution data in both space (3D) and time we show that most of the energy of a steady state flow is contained around the inertial wave dispersion relation. The nonlinear interaction between the waves is manifested by the widening of the time spectrum around the dispersion relation. We show that as the Rossby number increases so does the spectrum width, with a strong dependence on wave number. Our results suggest that in some parameters range, rotating turbulence velocity field can be represented as a field of interacting waves (wave turbulence). Such formalism may provide a better understanding of the flow statistics. This work was supported by the Israel Science Foundation, Grant No. 81/12.

  18. Enabling High-Energy, High-Voltage Lithium-Ion Cells: Standardization of Coin-Cell Assembly, Electrochemical Testing, and Evaluation of Full Cells

    DOE PAGES

    Long, Brandon R.; Rinaldo, Steven G.; Gallagher, Kevin G.; ...

    2016-11-09

    Coin-cells are often the test format of choice for laboratories engaged in battery research and development as they provide a convenient platform for rapid testing of new materials on a small scale. However, obtaining reliable, reproducible data via the coin-cell format is inherently difficult, particularly in the full-cell configuration. In addition, statistical evaluation to prove the consistency and reliability of such data is often neglected. Herein we report on several studies aimed at formalizing physical process parameters and coin-cell construction related to full cells. Statistical analysis and performance benchmarking approaches are advocated as a means to more confidently track changes in cell performance. Finally, we show that trends in the electrochemical data obtained from coin-cells can be reliable and informative when standardized approaches are implemented in a consistent manner.

  19. Complex polarization-phase and spatial-frequency selections of laser images of blood-plasma films in diagnostics of changes in their polycrystalline structure

    NASA Astrophysics Data System (ADS)

    Ushenko, Yu. A.; Angelskii, P. O.; Dubolazov, A. V.; Karachevtsev, A. O.; Sidor, M. I.; Mintser, O. P.; Oleinichenko, B. P.; Bizer, L. I.

    2013-10-01

    We present a theoretical formalism of correlation phase analysis of laser images of human blood plasma with spatial-frequency selection of manifestations of mechanisms of linear and circular birefringence of albumin and globulin polycrystalline networks. Comparative results of the measurement of coordinate distributions of the correlation parameter—the modulus of the degree of local correlation of amplitudes—of laser images of blood plasma taken from patients of three groups—healthy patients (donors), rheumatoid-arthritis patients, and breast-cancer patients—are presented. We investigate values and ranges of change of statistical (the first to fourth statistical moments), correlation (excess of autocorrelation functions), and fractal (slopes of approximating curves and dispersion of extrema of logarithmic dependences of power spectra) parameters of coordinate distributions of the degree of local correlation of amplitudes. Objective criteria for diagnostics of occurrence and differentiation of inflammatory and oncological states are determined.
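    The first to fourth statistical moments mentioned above can be computed directly from a sampled coordinate distribution; the sketch below uses a synthetic beta-distributed sample as a stand-in for a real map of the degree of local correlation of amplitudes.

      import numpy as np
      from scipy import stats

      # Synthetic stand-in for a coordinate distribution of the degree of local
      # correlation of amplitudes, flattened to a 1-D sample with values in [0, 1].
      rng = np.random.default_rng(1)
      w = rng.beta(2.0, 5.0, size=10_000)

      moments = {
          "mean": w.mean(),                # 1st moment
          "variance": w.var(ddof=1),       # 2nd central moment
          "skewness": stats.skew(w),       # 3rd standardized moment
          "kurtosis": stats.kurtosis(w),   # 4th standardized moment (excess)
      }
      print(moments)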

  20. Guidelines for the design and statistical analysis of experiments in papers submitted to ATLA.

    PubMed

    Festing, M F

    2001-01-01

    In vitro experiments need to be well designed and correctly analysed if they are to achieve their full potential to replace the use of animals in research. An "experiment" is a procedure for collecting scientific data in order to answer a hypothesis, or to provide material for generating new hypotheses, and differs from a survey because the scientist has control over the treatments that can be applied. Most experiments can be classified into one of a few formal designs, the most common being completely randomised and randomised block designs. These are quite common with in vitro experiments, which are often replicated in time. Some experiments involve a single independent (treatment) variable, while other "factorial" designs simultaneously vary two or more independent variables, such as drug treatment and cell line. Factorial designs often provide additional information at little extra cost. Experiments need to be carefully planned to avoid bias, be powerful yet simple, provide for a valid statistical analysis and, in some cases, have a wide range of applicability. Virtually all experiments need some sort of statistical analysis in order to take account of biological variation among the experimental subjects. Parametric methods using the t test or analysis of variance are usually more powerful than non-parametric methods, provided the underlying assumptions of normality of the residuals and equal variances are approximately valid. The statistical analyses of data from a completely randomised design, and from a randomised-block design are demonstrated in Appendices 1 and 2, and methods of determining sample size are discussed in Appendix 3. Appendix 4 gives a checklist for authors submitting papers to ATLA.
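    For the completely randomised design discussed above (and demonstrated in Appendix 1 of the guidelines), a one-way analysis of variance can be run in a few lines; the response values below are invented for illustration only.

      import numpy as np
      from scipy import stats

      # Hypothetical in vitro responses for three treatment groups in a
      # completely randomised design.
      control   = np.array([0.82, 0.91, 0.78, 0.88, 0.85])
      low_dose  = np.array([0.74, 0.69, 0.80, 0.72, 0.77])
      high_dose = np.array([0.55, 0.61, 0.58, 0.63, 0.57])

      f_stat, p_value = stats.f_oneway(control, low_dose, high_dose)
      print(f"F = {f_stat:.2f}, p = {p_value:.4f}")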

  1. Multivariate meta-analysis for non-linear and other multi-parameter associations

    PubMed Central

    Gasparrini, A; Armstrong, B; Kenward, M G

    2012-01-01

    In this paper, we formalize the application of multivariate meta-analysis and meta-regression to synthesize estimates of multi-parameter associations obtained from different studies. This modelling approach extends the standard two-stage analysis used to combine results across different sub-groups or populations. The most straightforward application is for the meta-analysis of non-linear relationships, described for example by regression coefficients of splines or other functions, but the methodology easily generalizes to any setting where complex associations are described by multiple correlated parameters. The modelling framework of multivariate meta-analysis is implemented in the package mvmeta within the statistical environment R. As an illustrative example, we propose a two-stage analysis for investigating the non-linear exposure–response relationship between temperature and non-accidental mortality using time-series data from multiple cities. Multivariate meta-analysis represents a useful analytical tool for studying complex associations through a two-stage procedure. Copyright © 2012 John Wiley & Sons, Ltd. PMID:22807043
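    The two-stage approach can be conveyed with a stripped-down, fixed-effect version of the second stage: study-specific coefficient vectors are pooled with inverse-(co)variance weights. This is only a sketch with invented numbers; the methodology described above (and the mvmeta package) also accommodates between-study random effects, which are omitted here.

      import numpy as np

      def pool_fixed_effect(estimates, covariances):
          """Inverse-variance-weighted fixed-effect pooling of multi-parameter estimates.

          estimates  : list of length-k coefficient vectors (one per study)
          covariances: list of k-by-k within-study covariance matrices
          Returns the pooled vector and its covariance.
          """
          precision = sum(np.linalg.inv(S) for S in covariances)
          pooled_cov = np.linalg.inv(precision)
          weighted_sum = sum(np.linalg.inv(S) @ b for b, S in zip(estimates, covariances))
          return pooled_cov @ weighted_sum, pooled_cov

      # Two hypothetical studies, each reporting two spline coefficients
      b1, S1 = np.array([0.10, 0.30]), np.array([[0.010, 0.002], [0.002, 0.020]])
      b2, S2 = np.array([0.14, 0.22]), np.array([[0.008, 0.001], [0.001, 0.015]])
      pooled, pooled_cov = pool_fixed_effect([b1, b2], [S1, S2])
      print(pooled, np.sqrt(np.diag(pooled_cov)))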

  2. Risk analysis for autonomous underwater vehicle operations in extreme environments.

    PubMed

    Brito, Mario Paulo; Griffiths, Gwyn; Challenor, Peter

    2010-12-01

    Autonomous underwater vehicles (AUVs) are used increasingly to explore hazardous marine environments. Risk assessment for such complex systems is based on subjective judgment and expert knowledge as much as on hard statistics. Here, we describe the use of a risk management process tailored to AUV operations, the implementation of which requires the elicitation of expert judgment. We conducted a formal judgment elicitation process where eight world experts in AUV design and operation were asked to assign a probability of AUV loss given the emergence of each fault or incident from the vehicle's life history of 63 faults and incidents. After discussing methods of aggregation and analysis, we show how the aggregated risk estimates obtained from the expert judgments were used to create a risk model. To estimate AUV survival with mission distance, we adopted a statistical survival function based on the nonparametric Kaplan-Meier estimator. We present theoretical formulations for the estimator, its variance, and confidence limits. We also present a numerical example where the approach is applied to estimate the probability that the Autosub3 AUV would survive a set of missions under Pine Island Glacier, Antarctica in January-March 2009. © 2010 Society for Risk Analysis.
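    A minimal sketch of the nonparametric Kaplan-Meier survival estimate over mission distance, assuming hypothetical loss/censoring records rather than the actual Autosub fault history.

      import numpy as np

      def kaplan_meier(distances, observed):
          """Kaplan-Meier survival estimate.

          distances: mission distance at loss or at censoring (mission completed)
          observed : 1 if the vehicle was lost at that distance, 0 if censored
          Returns (distance, survival probability) pairs at each loss event.
          """
          order = np.argsort(distances)
          d, e = np.asarray(distances)[order], np.asarray(observed)[order]
          at_risk = len(d)
          surv, points = 1.0, []
          for dist, event in zip(d, e):
              if event:
                  surv *= 1.0 - 1.0 / at_risk
                  points.append((dist, surv))
              at_risk -= 1
          return points

      # Hypothetical mission records (km); 1 = loss, 0 = mission completed (censored)
      print(kaplan_meier([120, 300, 450, 500, 600, 800], [0, 1, 0, 1, 0, 0]))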

  3. Probability distributions of bed load particle velocities, accelerations, hop distances, and travel times informed by Jaynes's principle of maximum entropy

    USGS Publications Warehouse

    Furbish, David; Schmeeckle, Mark; Schumer, Rina; Fathel, Siobhan

    2016-01-01

    We describe the most likely forms of the probability distributions of bed load particle velocities, accelerations, hop distances, and travel times, in a manner that formally appeals to inferential statistics while honoring mechanical and kinematic constraints imposed by equilibrium transport conditions. The analysis is based on E. Jaynes's elaboration of the implications of the similarity between the Gibbs entropy in statistical mechanics and the Shannon entropy in information theory. By maximizing the information entropy of a distribution subject to known constraints on its moments, our choice of the form of the distribution is unbiased. The analysis suggests that particle velocities and travel times are exponentially distributed and that particle accelerations follow a Laplace distribution with zero mean. Particle hop distances, viewed alone, ought to be distributed exponentially. However, the covariance between hop distances and travel times precludes this result. Instead, the covariance structure suggests that hop distances follow a Weibull distribution. These distributions are consistent with high-resolution measurements obtained from high-speed imaging of bed load particle motions. The analysis brings us closer to choosing distributions based on our mechanical insight.
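    As a reminder of the reasoning step behind the exponential result quoted above, maximizing the Shannon entropy of a nonnegative variable subject only to a fixed mean yields the exponential density; in LaTeX form:

      \max_{f}\; -\int_0^\infty f(x)\,\ln f(x)\,dx
      \quad\text{subject to}\quad
      \int_0^\infty f(x)\,dx = 1,\qquad
      \int_0^\infty x\,f(x)\,dx = \mu
      \;\;\Longrightarrow\;\;
      f(x) = e^{-1-\lambda_0-\lambda_1 x} = \frac{1}{\mu}\,e^{-x/\mu},\quad x \ge 0 .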

  4. "Of Course I'm Communicating; I Lecture Every Day": Enhancing Teaching and Learning in Introductory Statistics. Scholarship of Teaching and Learning

    ERIC Educational Resources Information Center

    Wulff, Shaun S.; Wulff, Donald H.

    2004-01-01

    This article focuses on one instructor's evolution from formal lecturing to interactive teaching and learning in a statistics course. Student perception data are used to demonstrate the instructor's use of communication to align the content, students, and instructor throughout the course. Results indicate that the students learned, that…

  5. Can Propensity Score Analysis Approximate Randomized Experiments Using Pretest and Demographic Information in Pre-K Intervention Research?

    PubMed

    Dong, Nianbo; Lipsey, Mark W

    2017-01-01

    It is unclear whether propensity score analysis (PSA) based on pretest and demographic covariates will meet the ignorability assumption for replicating the results of randomized experiments. This study applies within-study comparisons to assess whether pre-Kindergarten (pre-K) treatment effects on achievement outcomes estimated using PSA based on a pretest and demographic covariates can approximate those found in a randomized experiment. Data: Four studies with samples of pre-K children each provided data on two math achievement outcome measures with baseline pretests and child demographic variables that included race, gender, age, language spoken at home, and mother's highest education. Research Design and Data Analysis: A randomized study of a pre-K math curriculum provided benchmark estimates of effects on achievement measures. Comparison samples from other pre-K studies were then substituted for the original randomized control and the effects were reestimated using PSA. The correspondence was evaluated using multiple criteria. The effect estimates using PSA were in the same direction as the benchmark estimates, had similar but not identical statistical significance, and did not differ from the benchmarks at statistically significant levels. However, the magnitude of the effect sizes differed and displayed both absolute and relative bias larger than required to show statistical equivalence with formal tests, but those results were not definitive because of the limited statistical power. We conclude that treatment effect estimates based on a single pretest and demographic covariates in PSA correspond to those from a randomized experiment on the most general criteria for equivalence.
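    The sketch below illustrates one common PSA variant: propensity scores from a logistic regression on a pretest and a demographic covariate, followed by inverse-probability weighting. The data are simulated, and the covariate names (pretest, mother's education) are stand-ins rather than the study's actual files; the study itself evaluated PSA more generally, not this specific estimator.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      # Simulated data: pretest score, mother's education (years), binary pre-K indicator
      rng = np.random.default_rng(2)
      n = 500
      pretest = rng.normal(50, 10, n)
      mom_edu = rng.integers(9, 17, n)
      treated = rng.binomial(1, 1 / (1 + np.exp(-(0.03 * (pretest - 50) + 0.1 * (mom_edu - 12)))))
      outcome = 55 + 0.8 * (pretest - 50) + 2.0 * treated + rng.normal(0, 5, n)

      # Propensity scores from a logistic regression on the covariates
      X = np.column_stack([pretest, mom_edu])
      ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

      # Inverse-probability-of-treatment weighting estimate of the average treatment effect
      w = np.where(treated == 1, 1 / ps, 1 / (1 - ps))
      ate = (np.average(outcome[treated == 1], weights=w[treated == 1])
             - np.average(outcome[treated == 0], weights=w[treated == 0]))
      print(f"IPW estimate of the treatment effect: {ate:.2f}")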

  6. Meta-analysis of haplotype-association studies: comparison of methods and empirical evaluation of the literature

    PubMed Central

    2011-01-01

    Background: Meta-analysis is a popular methodology in several fields of medical research, including genetic association studies. However, the methods used for meta-analysis of association studies that report haplotypes have not been studied in detail. In this work, methods for performing meta-analysis of haplotype association studies are summarized, compared and presented in a unified framework along with an empirical evaluation of the literature. Results: We present multivariate methods that use summary-based data as well as methods that use binary and count data in a generalized linear mixed model framework (logistic regression, multinomial regression and Poisson regression). The methods presented here avoid the inflation of the type I error rate that could result from the traditional approach of comparing a haplotype against the remaining ones, whereas they can be fitted using standard software. Moreover, formal global tests are presented for assessing the statistical significance of the overall association. Although the methods presented here assume that the haplotypes are directly observed, they can be easily extended to allow for such uncertainty by weighting the haplotypes by their probability. Conclusions: An empirical evaluation of the published literature and a comparison against the meta-analyses that use single nucleotide polymorphisms suggest that the studies reporting meta-analysis of haplotypes include approximately half as many studies and produce significant results twice as often. We show that this excess of statistically significant results stems from the sub-optimal method of analysis used and, in approximately half of the cases, the statistical significance is refuted if the data are properly re-analyzed. Illustrative examples of code are given in Stata and it is anticipated that the methods developed in this work will be widely applied in the meta-analysis of haplotype association studies. PMID:21247440

  7. Condensate statistics in interacting and ideal dilute bose gases

    PubMed

    Kocharovsky; Kocharovsky; Scully

    2000-03-13

    We obtain analytical formulas for the statistics, in particular, for the characteristic function and all cumulants, of the Bose-Einstein condensate in dilute weakly interacting and ideal equilibrium gases in the canonical ensemble via the particle-number-conserving operator formalism of Girardeau and Arnowitt. We prove that the ground-state occupation statistics is not Gaussian even in the thermodynamic limit. We calculate the effect of Bogoliubov coupling on suppression of ground-state occupation fluctuations and show that they are governed by a pair-correlation, squeezing mechanism.

  8. Regular Formal Evaluation Sessions are Effective as Frame-of-Reference Training for Faculty Evaluators of Clerkship Medical Students.

    PubMed

    Hemmer, Paul A; Dadekian, Gregory A; Terndrup, Christopher; Pangaro, Louis N; Weisbrod, Allison B; Corriere, Mark D; Rodriguez, Rechell; Short, Patricia; Kelly, William F

    2015-09-01

    Face-to-face formal evaluation sessions between clerkship directors and faculty can facilitate the collection of trainee performance data and provide frame-of-reference training for faculty. We hypothesized that ambulatory faculty who attended evaluation sessions at least once in an academic year (attendees) would use the Reporter-Interpreter-Manager/Educator (RIME) terminology more appropriately than faculty who did not attend evaluation sessions (non-attendees). Investigators conducted a retrospective cohort study using the narrative assessments of ambulatory internal medicine clerkship students during the 2008-2009 academic year. The study included assessments of 49 clerkship medical students, which comprised 293 individual teacher narratives. Single-teacher written and transcribed verbal comments about student performance were masked and reviewed by a panel of experts who, by consensus, (1) determined whether RIME was used, (2) counted the number of RIME utterances, and (3) assigned a grade based on the comments. Analysis included descriptive statistics and Pearson correlation coefficients. The authors reviewed 293 individual teacher narratives regarding the performance of 49 students. Attendees explicitly used RIME more frequently than non-attendees (69.8 vs. 40.4 %; p < 0.0001). Grades recommended by attendees correlated more strongly with grades assigned by experts than grades recommended by non-attendees (r = 0.72; 95 % CI (0.65, 0.78) vs. 0.47; 95 % CI (0.26, 0.64); p = 0.005). Grade recommendations from individual attendees and non-attendees each correlated significantly with overall student clerkship clinical performance [r = 0.63; 95 % CI (0.54, 0.71) vs. 0.52 (0.36, 0.66), respectively], although the difference between the groups was not statistically significant (p = 0.21). On an ambulatory clerkship, teachers who attended evaluation sessions used RIME terminology more frequently and provided more accurate grade recommendations than teachers who did not attend. Formal evaluation sessions may provide frame-of-reference training for the RIME framework, a method that improves the validity and reliability of workplace assessment.

  9. Formalization of an environmental model using formal concept analysis - FCA

    NASA Astrophysics Data System (ADS)

    Bourdon-García, Rubén D.; Burgos-Salcedo, Javier D.

    2016-08-01

    Nowadays, there is a pressing need to generate novel strategies for the analysis of social-ecological systems in order to resolve global sustainability problems. The main purpose of this paper is to apply formal concept analysis to formalize the theory of Augusto Ángel Maya, who was, without a doubt, one of the most important environmental philosophers in South America; Ángel Maya proposed and established that Ecosystem-Culture relations, rather than Human-Nature ones, are determinant in our understanding and management of natural resources. On this basis, a concept lattice, formal concepts, subconcept-superconcept relations, partially ordered sets, the supremum and infimum of the lattice, and implications between attributes (the Duquenne-Guigues base) were determined for the ecosystem-culture relations.
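    To make the FCA machinery concrete, the sketch below enumerates the formal concepts of a tiny, entirely hypothetical object-attribute context using the two derivation (prime) operators; the objects and attributes are invented for illustration and are not Ángel Maya's actual data.

      from itertools import combinations

      # Toy formal context: objects are ecosystem-culture relations, attributes are traits
      # (entirely hypothetical, for illustration only).
      context = {
          "agriculture":   {"transforms_ecosystem", "cultural_practice"},
          "ritual":        {"cultural_practice", "symbolic"},
          "deforestation": {"transforms_ecosystem"},
      }
      attributes = set().union(*context.values())

      def extent(attrs):
          """Objects possessing every attribute in attrs (the ' operator on attribute sets)."""
          return {g for g, a in context.items() if attrs <= a}

      def intent(objs):
          """Attributes shared by every object in objs (the ' operator on object sets)."""
          return set.intersection(*(context[g] for g in objs)) if objs else set(attributes)

      # Enumerate formal concepts (extent, intent) by closing every subset of attributes.
      concepts = set()
      for r in range(len(attributes) + 1):
          for subset in combinations(sorted(attributes), r):
              e = extent(set(subset))
              concepts.add((frozenset(e), frozenset(intent(e))))

      for e, i in sorted(concepts, key=lambda c: len(c[0])):
          print(sorted(e), sorted(i))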

  10. An intervention for early mathematical success: outcomes from the hybrid version of the Building Math Readiness Parents as Partners (MRPP) project.

    PubMed

    Kritzer, Karen L; Pagliaro, Claudia M

    2013-01-01

    The Building Math Readiness in Young Deaf/Hard-of-Hearing Children: Parents as Partners (MRPP) Project works with parents to increase the understanding of foundational mathematics concepts in their preschool deaf/hard-of-hearing (d/hh) children in preparation for formal mathematics education. A multiple-case/single-unit case study incorporating descriptive statistics and grounded theory analysis was conducted on the hybrid version of the intervention. Results showed productive changes in parental behaviors, indicating a possible positive effect on parent knowledge, recognition, and mediation of early mathematics concepts with their young d/hh children.

  11. A microscopic model of the Stokes-Einstein relation in arbitrary dimension.

    PubMed

    Charbonneau, Benoit; Charbonneau, Patrick; Szamel, Grzegorz

    2018-06-14

    The Stokes-Einstein relation (SER) is one of the most robust and widely employed results from the theory of liquids. Yet sizable deviations can be observed for self-solvation, which cannot be explained by the standard hydrodynamic derivation. Here, we revisit the work of Masters and Madden [J. Chem. Phys. 74, 2450-2459 (1981)], who first solved a statistical mechanics model of the SER using the projection operator formalism. By generalizing their analysis to all spatial dimensions and to partially structured solvents, we identify a potential microscopic origin of some of these deviations. We also reproduce the SER-like result from the exact dynamics of infinite-dimensional fluids.

  12. An Ontology for State Analysis: Formalizing the Mapping to SysML

    NASA Technical Reports Server (NTRS)

    Wagner, David A.; Bennett, Matthew B.; Karban, Robert; Rouquette, Nicolas; Jenkins, Steven; Ingham, Michel

    2012-01-01

    State Analysis is a methodology developed over the last decade for architecting, designing and documenting complex control systems. Although it was originally conceived for designing robotic spacecraft, recent applications include the design of control systems for large ground-based telescopes. The European Southern Observatory (ESO) began a project to design the European Extremely Large Telescope (E-ELT), which will require coordinated control of over a thousand articulated mirror segments. The designers are using State Analysis as a methodology and the Systems Modeling Language (SysML) as a modeling and documentation language in this task. To effectively apply the State Analysis methodology in this context it became necessary to provide ontological definitions of the concepts and relations in State Analysis and greater flexibility through a mapping of State Analysis into a practical extension of SysML. The ontology provides the formal basis for verifying compliance with State Analysis semantics including architectural constraints. The SysML extension provides the practical basis for applying the State Analysis methodology with SysML tools. This paper will discuss the method used to develop these formalisms (the ontology), the formalisms themselves, the mapping to SysML and approach to using these formalisms to specify a control system and enforce architectural constraints in a SysML model.

  13. Chemodetection in fluctuating environments: receptor coupling, buffering, and antagonism.

    PubMed

    Lalanne, Jean-Benoît; François, Paul

    2015-02-10

    Variability in the chemical composition of the extracellular environment can significantly degrade the ability of cells to detect rare cognate ligands. Using concepts from statistical detection theory, we formalize the generic problem of detection of small concentrations of ligands in a fluctuating background of biochemically similar ligands binding to the same receptors. We discover that in contrast with expectations arising from considerations of signal amplification, inhibitory interactions between receptors can improve detection performance in the presence of substantial environmental variability, providing an adaptive interpretation to the phenomenon of ligand antagonism. Our results suggest that the structure of signaling pathways responsible for chemodetection in fluctuating and heterogeneous environments might be optimized with respect to the statistics and dynamics of environmental composition. The developed formalism stresses the importance of characterizing nonspecific interactions to understand function in signaling pathways.

  14. Comparing perceived self-management practices of adult type 2 diabetic patients after completion of a structured ADA certified diabetes self-management education program with unstructured individualized nurse practitioner led diabetes self-management education.

    PubMed

    Wooley, Dennis S; Kinner, Tracy J

    2016-11-01

    The purpose was to compare perceived self-management practices of adult type 2 diabetic patients after completing an American Diabetes Association (ADA) certified diabetes self-management education (DSME) program with unstructured, individualized, nurse practitioner-led DSME. Demographic questions and the Self-Care Inventory-Revised (SCI-R) were given to two convenience-sample patient groups: a formal DSME program group and a group within a clinical setting that received informal, unstructured individual education during patient encounters. A t-test was performed on the SCI-R individual scores of the formal ADA certified education sample and the informal sample, and a second t-test was performed on the two samples' SCI-R mean scores. The t-test determined no statistically significant difference between the formal ADA structured education and informal education samples' SCI-R individual scores, and there was no statistically significant difference between the samples' SCI-R mean scores. The study results suggest that there are no superior DSME settings and instructional approaches. Copyright © 2016 Elsevier Inc. All rights reserved.

  15. Statistical inference for tumor growth inhibition T/C ratio.

    PubMed

    Wu, Jianrong

    2010-09-01

    The tumor growth inhibition T/C ratio is commonly used to quantify treatment effects in drug screening tumor xenograft experiments. The T/C ratio is converted to an antitumor activity rating using an arbitrary cutoff point and often without any formal statistical inference. Here, we applied a nonparametric bootstrap method and a small sample likelihood ratio statistic to make a statistical inference of the T/C ratio, including both hypothesis testing and a confidence interval estimate. Furthermore, sample size and power are also discussed for statistical design of tumor xenograft experiments. Tumor xenograft data from an actual experiment were analyzed to illustrate the application.
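    A minimal sketch of the percentile bootstrap for the T/C ratio of group means, using invented tumor volumes; the paper also develops a small-sample likelihood ratio statistic, which is not reproduced here.

      import numpy as np

      rng = np.random.default_rng(3)

      # Hypothetical final tumor volumes (mm^3) in treated (T) and control (C) xenograft groups
      treated = np.array([310.0, 420.0, 365.0, 298.0, 455.0, 388.0])
      control = np.array([780.0, 845.0, 910.0, 720.0, 860.0, 795.0])

      def tc_ratio(t, c):
          return t.mean() / c.mean()

      # Nonparametric bootstrap: resample each group with replacement and recompute T/C
      boot = np.array([
          tc_ratio(rng.choice(treated, size=treated.size, replace=True),
                   rng.choice(control, size=control.size, replace=True))
          for _ in range(10_000)
      ])
      lo, hi = np.percentile(boot, [2.5, 97.5])
      print(f"T/C = {tc_ratio(treated, control):.2f}, 95% bootstrap CI = ({lo:.2f}, {hi:.2f})")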

  16. Geospatial and Remote Sensing-based Indicators of Settlement Type---Differentiating Informal and Formal Settlements in Guatemala City

    NASA Astrophysics Data System (ADS)

    Owen, Karen K.

    This research addresses the need for reliable, repeatable, quantitative measures to differentiate informal (slum) from formal (planned) settlements using commercial very high resolution imagery and elevation data. Measuring the physical, spatial and spectral qualities of informal settlements is an important precursor for evaluating success toward improving the lives of 100 million slum dwellers worldwide, as pledged by the United Nations Millennium Development Goal Target 7D. A variety of measures were tested based on surface material spectral properties, texture, built-up structure, road network accessibility, and geomorphology from twelve communities in Guatemala City to reveal statistically significant differences between informal and formal settlements that could be applied to other parts of the world without the need for costly or dangerous field surveys. When information from satellite imagery is constrained to roads and residential boundaries, a more precise understanding of human habitation is produced. A classification and regression tree (CART) approach and linear discriminant function analysis enabled a dimensionality reduction from the original 23 to 6 variables that are sufficient to differentiate a settlement as informal or formal. The results demonstrate that the entropy texture of roads, the degree of asphalt road surface, the vegetation patch compactness and patch size, the percent of bare soil land cover, the geomorphic profile convexity of the terrain, and the road density distinguish informal from formal settlements with 87-92% accuracy when results are cross-validated. The variables with highest contribution to model outcome that are common to both approaches are entropy texture of roads, vegetation patch size, and vegetation compactness, suggesting that road texture, surface materials and vegetation provide the necessary characteristics to distinguish the level of informality of a settlement. The results will assist urban planners and settlement analysts who must process vast amounts of imagery worldwide, enabling them to report annually on slum conditions. An added benefit is the ability to use the measures in data-poor regions of the world without field surveys.
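    A hedged sketch of the classification-tree step with a handful of the variables named above (road entropy texture, vegetation patch size and compactness); the feature values are simulated, not the Guatemala City measurements, so the cross-validated accuracy reported is purely illustrative.

      import numpy as np
      from sklearn.tree import DecisionTreeClassifier
      from sklearn.model_selection import cross_val_score

      # Simulated per-settlement features; label 1 = informal, 0 = formal.
      rng = np.random.default_rng(4)
      n = 60
      informal = rng.integers(0, 2, n)
      road_entropy = rng.normal(4.5, 0.5, n) + 0.8 * informal
      veg_patch_size = rng.normal(900, 200, n) - 300 * informal
      veg_compactness = rng.normal(0.6, 0.1, n) - 0.15 * informal
      X = np.column_stack([road_entropy, veg_patch_size, veg_compactness])

      tree = DecisionTreeClassifier(max_depth=3, random_state=0)
      scores = cross_val_score(tree, X, informal, cv=5)
      print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")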

  17. Joint analysis of input and parametric uncertainties in watershed water quality modeling: A formal Bayesian approach

    NASA Astrophysics Data System (ADS)

    Han, Feng; Zheng, Yi

    2018-06-01

    Input uncertainty is a significant source of error in watershed water quality (WWQ) modeling. It remains challenging to address the input uncertainty in a rigorous Bayesian framework. This study develops the Bayesian Analysis of Input and Parametric Uncertainties (BAIPU), an approach for the joint analysis of input and parametric uncertainties through a tight coupling of Markov Chain Monte Carlo (MCMC) analysis and Bayesian Model Averaging (BMA). The formal likelihood function for this approach is derived considering a lag-1 autocorrelated, heteroscedastic, and Skew Exponential Power (SEP) distributed error model. A series of numerical experiments were performed based on a synthetic nitrate pollution case and on a real study case in the Newport Bay Watershed, California. The Soil and Water Assessment Tool (SWAT) and Differential Evolution Adaptive Metropolis (DREAM(ZS)) were used as the representative WWQ model and MCMC algorithm, respectively. The major findings include the following: (1) the BAIPU can be implemented and used to appropriately identify the uncertain parameters and characterize the predictive uncertainty; (2) the compensation effect between the input and parametric uncertainties can seriously mislead modeling-based management decisions, if the input uncertainty is not explicitly accounted for; (3) the BAIPU accounts for the interaction between the input and parametric uncertainties and therefore provides more accurate calibration and uncertainty results than a sequential analysis of the uncertainties; and (4) the BAIPU quantifies the credibility of different input assumptions on a statistical basis and can be implemented as an effective inverse modeling approach to the joint inference of parameters and inputs.
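    The flavour of the formal likelihood can be conveyed with a simplified stand-in: the sketch below evaluates the log-likelihood of residuals under a lag-1 autocorrelated, heteroscedastic Gaussian error model, whereas the approach described above uses the heavier-tailed Skew Exponential Power distribution. All numbers are synthetic, and the parameterization is an assumption made for illustration.

      import numpy as np

      def log_likelihood(obs, sim, sigma0, sigma1, phi):
          """Log-likelihood of residuals under a lag-1 autocorrelated, heteroscedastic
          Gaussian error model (a simplified stand-in for an SEP error model).

          sigma_t = sigma0 + sigma1 * sim_t    (heteroscedastic standard deviation)
          eta_t   = (obs_t - sim_t) / sigma_t  (standardized residual)
          eta_t   = phi * eta_{t-1} + innovation, with unit marginal variance
          """
          sigma = sigma0 + sigma1 * sim
          eta = (obs - sim) / sigma
          innov = eta[1:] - phi * eta[:-1]
          innov_sd = np.sqrt(1.0 - phi**2)
          ll = -0.5 * np.sum((innov / innov_sd) ** 2) - innov.size * np.log(innov_sd)
          ll += -0.5 * eta[0] ** 2                       # stationary start
          ll -= np.sum(np.log(sigma)) + obs.size * 0.5 * np.log(2 * np.pi)
          return ll

      # Toy check against synthetic concentrations
      obs = np.array([2.1, 2.4, 2.0, 2.6, 2.3])
      sim = np.array([2.0, 2.3, 2.1, 2.5, 2.4])
      print(log_likelihood(obs, sim, sigma0=0.05, sigma1=0.05, phi=0.3))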

  18. An exploration of student midwives' language to describe non-formal learning in professional practice.

    PubMed

    Finnerty, Gina; Pope, Rosemary

    2005-05-01

    The essence of non-formal learning in midwifery practice has not been previously explored. This paper provides an in-depth analysis of the language of a sample of student midwives' descriptions of their practice learning in a range of clinical settings. The students submitted audio-diaries as part of a national study (Pope, R., Graham, L., Finnerty, G., Magnusson, C., 2003. An investigation of the preparation and assessment for midwifery practice within a range of settings. Project Report. University of Surrey). Participants detailed their learning activities and support obtained whilst working with their named mentors for approximately 10 days or shifts. The rich audio-diary data have been analysed using Discourse Analysis. A typology of non-formal learning (Eraut, M., 2000. Non-formal learning and implicit knowledge in professional work. British Journal of Educational Psychology 70, 113-136) has been used to provide a framework for the analysis. Non-formal learning is defined as any learning which does not take place within a formally organised learning programme (Eraut, M., 2000. Non-formal learning and implicit knowledge in professional work. British Journal of Educational Psychology 70, 113-136). Findings indicate that fear and ambiguity hindered students' learning. Recommendations include the protection of time by mentors within the clinical curriculum to guide and supervise students in both formal and non-formal elements of midwifery practice. This paper will explore the implications of the findings for practice-based education.

  19. Bayesian test for colocalisation between pairs of genetic association studies using summary statistics.

    PubMed

    Giambartolomei, Claudia; Vukcevic, Damjan; Schadt, Eric E; Franke, Lude; Hingorani, Aroon D; Wallace, Chris; Plagnol, Vincent

    2014-05-01

    Genetic association studies, in particular the genome-wide association study (GWAS) design, have provided a wealth of novel insights into the aetiology of a wide range of human diseases and traits, in particular cardiovascular diseases and lipid biomarkers. The next challenge consists of understanding the molecular basis of these associations. The integration of multiple association datasets, including gene expression datasets, can contribute to this goal. We have developed a novel statistical methodology to assess whether two association signals are consistent with a shared causal variant. An application is the integration of disease scans with expression quantitative trait locus (eQTL) studies, but any pair of GWAS datasets can be integrated in this framework. We demonstrate the value of the approach by re-analysing a gene expression dataset in 966 liver samples with a published meta-analysis of lipid traits including >100,000 individuals of European ancestry. Combining all lipid biomarkers, our re-analysis supported 26 out of 38 reported colocalisation results with eQTLs and identified 14 new colocalisation results, hence highlighting the value of a formal statistical test. In three cases of reported eQTL-lipid pairs (SYPL2, IFT172, TBKBP1) for which our analysis suggests that the eQTL pattern is not consistent with the lipid association, we identify alternative colocalisation results with SORT1, GCKR, and KPNB1, indicating that these genes are more likely to be causal in these genomic intervals. A key feature of the method is the ability to derive the output statistics from single SNP summary statistics, hence making it possible to perform systematic meta-analysis type comparisons across multiple GWAS datasets (implemented online at http://coloc.cs.ucl.ac.uk/coloc/). Our methodology provides information about candidate causal genes in associated intervals and has direct implications for the understanding of complex diseases as well as the design of drugs to target disease pathways.

  20. Local understandings of conservation in southeastern Mexico and their implications for community-based conservation as an alternative paradigm.

    PubMed

    Reyes-Garcia, Victoria; Ruiz-Mallen, Isabel; Porter-Bolland, Luciana; Garcia-Frapolli, Eduardo; Ellis, Edward A; Mendez, Maria-Elena; Pritchard, Diana J; Sanchez-Gonzalez, María-Consuelo

    2013-08-01

    Since the 1990s national and international programs have aimed to legitimize local conservation initiatives that might provide an alternative to the formal systems of state-managed or otherwise externally driven protected areas. We used discourse analysis (130 semistructured interviews with key informants) and descriptive statistics (679 surveys) to compare local perceptions of and experiences with state-driven versus community-driven conservation initiatives. We conducted our research in 6 communities in southeastern Mexico. Formalization of local conservation initiatives did not seem to be based on local knowledge and practices. Although interviewees thought community-based initiatives generated less conflict than state-managed conservation initiatives, the community-based initiatives conformed to the biodiversity conservation paradigm that emphasizes restricted use of and access to resources. This restrictive approach to community-based conservation in Mexico, promoted through state and international conservation organizations, increased the area of protected land and had local support but was not built on locally relevant and multifunctional landscapes, a model that community-based conservation is assumed to advance. © 2013 Society for Conservation Biology.

  1. Analyzing Seasonal Variations in Suicide With Fourier Poisson Time-Series Regression: A Registry-Based Study From Norway, 1969-2007.

    PubMed

    Bramness, Jørgen G; Walby, Fredrik A; Morken, Gunnar; Røislien, Jo

    2015-08-01

    Seasonal variation in the number of suicides has long been acknowledged. It has been suggested that this seasonality has declined in recent years, but studies have generally used statistical methods incapable of confirming this. We examined all suicides occurring in Norway during 1969-2007 (more than 20,000 suicides in total) to establish whether seasonality decreased over time. Fitting of additive Fourier Poisson time-series regression models allowed for formal testing of a possible linear decrease in seasonality, or a reduction at a specific point in time, while adjusting for a possible smooth nonlinear long-term change without having to categorize time into discrete yearly units. The models were compared using Akaike's Information Criterion and analysis of variance. A model with a seasonal pattern was significantly superior to a model without one. There was a reduction in seasonality during the period. The model assuming a linear decrease in seasonality and the model assuming a change at a specific point in time were both superior to a model assuming constant seasonality, thus confirming by formal statistical testing that the magnitude of the seasonality in suicides has diminished. The additive Fourier Poisson time-series regression model would also be useful for studying other temporal phenomena with seasonal components. © The Author 2015. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
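    A minimal sketch of a Fourier Poisson regression: monthly counts are modelled with a Poisson GLM whose design matrix contains a trend term and first-harmonic sine/cosine terms. The data are simulated, and a standard log link is used here rather than the additive formulation described above.

      import numpy as np
      import statsmodels.api as sm

      # Simulated monthly counts over 10 years with a mild seasonal component
      rng = np.random.default_rng(5)
      months = np.arange(120)
      rate = np.exp(3.0 + 0.15 * np.sin(2 * np.pi * months / 12))
      counts = rng.poisson(rate)

      # Design matrix: intercept, linear trend, and first-harmonic Fourier terms
      X = np.column_stack([
          np.ones_like(months, dtype=float),
          months / 120.0,
          np.sin(2 * np.pi * months / 12),
          np.cos(2 * np.pi * months / 12),
      ])
      fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
      print(fit.params)  # seasonality is captured by the sine/cosine coefficients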

  2. Longitudinal costs of caring for people with Alzheimer's disease.

    PubMed

    Gillespie, Paddy; O'Shea, Eamon; Cullinan, John; Buchanan, Jacqui; Bobula, Joel; Lacey, Loretto; Gallagher, Damien; Mhaolain, Aine Ni; Lawlor, Brian

    2015-05-01

    There has been an increasing interest in the relationship between severity of disease and costs in the care of people with dementia. Much of the current evidence is based on cross-sectional data, suggesting the need to examine trends over time for this important and growing cohort of the population. This paper estimates resource use and costs of care based on longitudinal data for 72 people with dementia in Ireland. Data were collected from the Enhancing Care in Alzheimer's Disease (ECAD) study at two time points: baseline and follow-up, two years later. Patients' dependence on others was measured using the Dependence Scale (DS), while patient function was measured using the Disability Assessment for Dementia (DAD) scale. Univariate and multivariate analyses were used to explore the effects of a range of variables on formal and informal care costs. Total costs of formal and informal care over six months rose from €9,266 (Standard Deviation (SD): 12,947) per patient at baseline to €21,266 (SD: 26,883) at follow-up, two years later. This constituted a statistically significant (p = 0.0014) increase in costs over time, driven primarily by an increase in estimated informal care costs. In the multivariate analysis, a one-point increase in the DS score, that is a one-unit increase in the patient's dependence on others, was associated with a 19% increase in total costs (p = 0.0610). Higher levels of dependence in people with Alzheimer's disease are significantly associated with increased costs of informal care as the disease progresses. Formal care services did not respond to increased dependence in people with dementia, leaving it to families to fill the caring gap, mainly through increased supervision as the disease progressed.

  3. Treatment of uncertainties in the IPCC: a philosophical analysis

    NASA Astrophysics Data System (ADS)

    Jebeile, J.; Drouet, I.

    2014-12-01

    The IPCC produces scientific reports out of findings on climate and climate change. Because the findings are uncertain in many respects, the production of reports requires aggregating assessments of uncertainties of different kinds. This difficult task is currently regulated by the Guidance note for lead authors of the IPCC fifth assessment report on consistent treatment of uncertainties. The note recommends that two metrics—i.e. confidence and likelihood—be used for communicating the degree of certainty in findings. Confidence is expressed qualitatively "based on the type, amount, quality, and consistency of evidence […] and the degree of agreement", while likelihood is expressed probabilistically "based on statistical analysis of observations or model results, or expert judgment". Therefore, depending on the evidence evaluated, authors can choose to present either an assigned level of confidence or a quantified measure of likelihood. But these two kinds of uncertainty assessment express distinct and conflicting methodologies, so the question arises whether the treatment of uncertainties in the IPCC is rationally justified. In order to answer the question, it is worth comparing the IPCC procedures with the formal normative theories of epistemic rationality that have been developed by philosophers. These theories—which include contributions to the philosophy of probability and to Bayesian probabilistic confirmation theory—are relevant for our purpose because they are commonly used to assess the rationality of collective judgement formation based on uncertain knowledge. In this paper we make the comparison and pursue the following objectives: (i) we determine whether the IPCC notions of confidence and likelihood can be compared with the notions of uncertainty targeted by or underlying the formal normative theories of epistemic rationality; (ii) we investigate whether the formal normative theories of epistemic rationality justify treating uncertainty along those two dimensions, and we indicate how this can be avoided.

  4. Home care in Austria: the interplay of family orientation, cash-for-care and migrant care.

    PubMed

    Österle, August; Bauer, Gudrun

    2012-05-01

    This article discusses the development of the home care sector in Austria. It analyses what impacts the interplay of the traditional family orientation to care, a universal cash-for-care scheme (reaching about 5% of the population) and a growing migrant care sector have on formal home care in Austria. The article is based on an analysis of research papers, policy documents and statistical data covering the period from the introduction of the cash-for-care scheme in 1993 up to 2011. Some authors have argued that generous cash benefits with no direct link to service use - as in the case of Austria - limit the development of home care, particularly in countries with a traditionally strong family orientation towards long-term care. Additionally, a tradition of family care and an emphasis on cash benefits may be conducive to the employment of migrant carers in private households, as a potential substitute for both family care and formal care. Despite this context, Austria has seen a substantial increase in formal home care over the past two decades. This has been driven by clients using their increased purchasing power and by policy priorities emphasising the extension of home care. Migrant care work was regularised in 2007, and the analysis suggests that while migrant care has usually worked as a substitute for other care arrangements, migrant care can also become a more integral element of care schemes. The article concludes that family orientation, unconditional cash benefits and the use of migrant carers do not necessarily preclude the development of a strong social service sector. However, there is a risk that budgetary limitations will primarily affect social service development. © 2011 Blackwell Publishing Ltd.

  5. 11.2 YIP Human In the Loop Statistical RelationalLearners

    DTIC Science & Technology

    2017-10-23

    learning formalisms including inverse reinforcement learning [4] and statistical relational learning [7, 5, 8]. We have also applied our algorithms in ... one introduced for label preferences. [Figure 2: Active Advice Seeking for Inverse Reinforcement Learning.] Active advice seeking is in selecting the ... learning tasks. 1.2.1 Sequential Decision-Making: Our previous work on advice for inverse reinforcement learning (IRL) defined advice as action ...

  6. 49 CFR 236.923 - Task analysis and basic requirements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... classroom, simulator, computer-based, hands-on, or other formally structured training and testing, except... for Processor-Based Signal and Train Control Systems § 236.923 Task analysis and basic requirements...) Based on a formal task analysis, identify the installation, maintenance, repair, modification...

  7. Proceedings of the Second NASA Formal Methods Symposium

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar (Editor)

    2010-01-01

    This publication contains the proceedings of the Second NASA Formal Methods Symposium sponsored by the National Aeronautics and Space Administration and held in Washington D.C. April 13-15, 2010. Topics covered include: Decision Engines for Software Analysis using Satisfiability Modulo Theories Solvers; Verification and Validation of Flight-Critical Systems; Formal Methods at Intel -- An Overview; Automatic Review of Abstract State Machines by Meta Property Verification; Hardware-independent Proofs of Numerical Programs; Slice-based Formal Specification Measures -- Mapping Coupling and Cohesion Measures to Formal Z; How Formal Methods Impels Discovery: A Short History of an Air Traffic Management Project; A Machine-Checked Proof of A State-Space Construction Algorithm; Automated Assume-Guarantee Reasoning for Omega-Regular Systems and Specifications; Modeling Regular Replacement for String Constraint Solving; Using Integer Clocks to Verify the Timing-Sync Sensor Network Protocol; Can Regulatory Bodies Expect Efficient Help from Formal Methods?; Synthesis of Greedy Algorithms Using Dominance Relations; A New Method for Incremental Testing of Finite State Machines; Verification of Faulty Message Passing Systems with Continuous State Space in PVS; Phase Two Feasibility Study for Software Safety Requirements Analysis Using Model Checking; A Prototype Embedding of Bluespec System Verilog in the PVS Theorem Prover; SimCheck: An Expressive Type System for Simulink; Coverage Metrics for Requirements-Based Testing: Evaluation of Effectiveness; Software Model Checking of ARINC-653 Flight Code with MCP; Evaluation of a Guideline by Formal Modelling of Cruise Control System in Event-B; Formal Verification of Large Software Systems; Symbolic Computation of Strongly Connected Components Using Saturation; Towards the Formal Verification of a Distributed Real-Time Automotive System; Slicing AADL Specifications for Model Checking; Model Checking with Edge-valued Decision Diagrams; and Data-flow based Model Analysis.

  8. Formalizing Space Shuttle Software Requirements

    NASA Technical Reports Server (NTRS)

    Crow, Judith; DiVito, Ben L.

    1996-01-01

    This paper describes two case studies in which requirements for new flight-software subsystems on NASA's Space Shuttle were analyzed, one using standard formal specification techniques, the other using state exploration. These applications serve to illustrate three main theses: (1) formal methods can complement conventional requirements analysis processes effectively, (2) formal methods confer benefits regardless of how extensively they are adopted and applied, and (3) formal methods are most effective when they are judiciously tailored to the application.

  9. Thomas-Fermi model for a bulk self-gravitating stellar object in two dimensions

    NASA Astrophysics Data System (ADS)

    De, Sanchari; Chakrabarty, Somenath

    2015-09-01

    In this article we have solved a hypothetical problem related to the stability and gross properties of two-dimensional self-gravitating stellar objects using the Thomas-Fermi model. The formalism presented here is an extension of the standard three-dimensional problem discussed in the book on statistical physics, Part I by Landau and Lifshitz. Further, the formalism presented in this article may be considered a class problem for post-graduate-level students of physics or may be assigned as a part of their dissertation project.

  10. Systematic review of learning curves for minimally invasive abdominal surgery: a review of the methodology of data collection, depiction of outcomes, and statistical analysis.

    PubMed

    Harrysson, Iliana J; Cook, Jonathan; Sirimanna, Pramudith; Feldman, Liane S; Darzi, Ara; Aggarwal, Rajesh

    2014-07-01

    To determine how minimally invasive surgical learning curves are assessed and define an ideal framework for this assessment. Learning curves have implications for training and adoption of new procedures and devices. In 2000, Ramsay et al reviewed the learning curve literature and called for improved reporting and statistical evaluation of learning curves. Since then, a body of literature on learning curves has emerged, but the presentation and analysis vary. A systematic search was performed of MEDLINE, EMBASE, ISI Web of Science, ERIC, and the Cochrane Library from 1985 to August 2012. The inclusion criteria were studies of minimally invasive abdominal surgery that formally analyzed the learning curve and were published in English. Of the identified studies, 592 (11.1%) met the selection criteria. Time is the most commonly used proxy for the learning curve (508, 86%). Intraoperative outcomes were used in 316 (53%) of the articles, postoperative outcomes in 306 (52%), technical skills in 102 (17%), and patient-oriented outcomes in 38 (6%) articles. Over time, there was evidence of an increase in the relative amount of laparoscopic and robotic studies (P < 0.001) without statistical evidence of a change in the complexity of analysis (P = 0.121). Assessment of learning curves is needed to inform surgical training and evaluate new clinical procedures. An ideal analysis would account for the degree of complexity of individual cases and the inherent differences between surgeons. There is no single proxy that best represents the success of surgery, and hence multiple outcomes should be collected.

  11. Cost model validation: a technical and cultural approach

    NASA Technical Reports Server (NTRS)

    Hihn, J.; Rosenberg, L.; Roust, K.; Warfield, K.

    2001-01-01

    This paper summarizes how JPL's parametric mission cost model (PMCM) has been validated using both formal statistical methods and a variety of peer and management reviews in order to establish organizational acceptance of the cost model estimates.

  12. Unified quantitative characterization of epithelial tissue development

    PubMed Central

    Guirao, Boris; Rigaud, Stéphane U; Bosveld, Floris; Bailles, Anaïs; López-Gay, Jesús; Ishihara, Shuji; Sugimura, Kaoru

    2015-01-01

    Understanding the mechanisms regulating development requires a quantitative characterization of cell divisions, rearrangements, cell size and shape changes, and apoptoses. We developed a multiscale formalism that relates the characterizations of each cell process to tissue growth and morphogenesis. Having validated the formalism on computer simulations, we quantified separately all morphogenetic events in the Drosophila dorsal thorax and wing pupal epithelia to obtain comprehensive statistical maps linking cell and tissue scale dynamics. While globally cell shape changes, rearrangements and divisions all significantly participate in tissue morphogenesis, locally, their relative participations display major variations in space and time. By blocking division we analyzed the impact of division on rearrangements, cell shape changes and tissue morphogenesis. Finally, by combining the formalism with mechanical stress measurement, we evidenced unexpected interplays between patterns of tissue elongation, cell division and stress. Our formalism provides a novel and rigorous approach to uncover mechanisms governing tissue development. DOI: http://dx.doi.org/10.7554/eLife.08519.001 PMID:26653285

  13. Ontological analysis of SNOMED CT.

    PubMed

    Héja, Gergely; Surján, György; Varga, Péter

    2008-10-27

    SNOMED CT is the most comprehensive medical terminology. However, its use for intelligent services based on formal reasoning is questionable. The analysis of the structure of SNOMED CT is based on the formal top-level ontology DOLCE. The analysis revealed several ontological and knowledge-engineering errors, the most important being errors in the hierarchy (mostly from an ontological point of view, but also regarding medical aspects) and the mixing of subsumption relations with other relation types (mostly 'part of'). The errors found impede formal reasoning. The paper presents a possible way to correct these problems.

  14. Formal Analysis of Extended Well-Clear Boundaries for Unmanned Aircraft

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar; Narkawicz, Anthony

    2016-01-01

    This paper concerns the application of formal methods to the definition of a detect and avoid concept for unmanned aircraft systems (UAS). In particular, it illustrates how formal analysis was used to explain and correct unexpected behaviors of the logic that issues alerts when two aircraft are predicted not to be well clear from one another. As a result of this analysis, a recommendation was proposed to, and subsequently adopted by, the US standards organization that defines the minimum operational requirements for the UAS detect and avoid concept.

  15. Quantum-like model for the adaptive dynamics of the genetic regulation of E. coli's metabolism of glucose/lactose.

    PubMed

    Asano, Masanari; Basieva, Irina; Khrennikov, Andrei; Ohya, Masanori; Tanaka, Yoshiharu; Yamato, Ichiro

    2012-06-01

    We developed a quantum-like model describing the gene regulation of glucose/lactose metabolism in a bacterium, Escherichia coli. Our quantum-like model can be considered a kind of operational formalism for microbiology and genetics. Instead of trying to describe processes in a cell in full detail, we propose a formal operator description. Such a description may be very useful in situations in which the detailed description of processes is impossible or extremely complicated. We analyze statistical data obtained from experiments, and we compute the degree of E. coli's preference within adaptive dynamics. It is known that there are several types of E. coli characterized by their metabolic systems. We demonstrate that the same type of E. coli can be described by well-determined operators; we find invariant operator quantities characterizing each type. Such invariant quantities can be calculated from the obtained statistical data.

  16. Strong correlations between the exponent α and the particle number for a Renyi monoatomic gas in Gibbs' statistical mechanics.

    PubMed

    Plastino, A; Rocca, M C

    2017-06-01

    Appealing to the 1902 Gibbs formalism for classical statistical mechanics (SM)-the first SM axiomatic theory ever that successfully explained equilibrium thermodynamics-we show that already at the classical level there is a strong correlation between Renyi's exponent α and the number of particles for very simple systems. No reference to heat baths is needed for such a purpose.

  17. Equitability, mutual information, and the maximal information coefficient.

    PubMed

    Kinney, Justin B; Atwal, Gurinder S

    2014-03-04

    How should one quantify the strength of association between two random variables without bias for relationships of a specific form? Despite its conceptual simplicity, this notion of statistical "equitability" has yet to receive a definitive mathematical formalization. Here we argue that equitability is properly formalized by a self-consistency condition closely related to the Data Processing Inequality. Mutual information, a fundamental quantity in information theory, is shown to satisfy this equitability criterion. These findings are at odds with the recent work of Reshef et al. [Reshef DN, et al. (2011) Science 334(6062):1518-1524], which proposed an alternative definition of equitability and introduced a new statistic, the "maximal information coefficient" (MIC), said to satisfy equitability in contradistinction to mutual information. These conclusions, however, were supported only with limited simulation evidence, not with mathematical arguments. Upon revisiting these claims, we prove that the mathematical definition of equitability proposed by Reshef et al. cannot be satisfied by any (nontrivial) dependence measure. We also identify artifacts in the reported simulation evidence. When these artifacts are removed, estimates of mutual information are found to be more equitable than estimates of MIC. Mutual information is also observed to have consistently higher statistical power than MIC. We conclude that estimating mutual information provides a natural (and often practical) way to equitably quantify statistical associations in large datasets.
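    A simple histogram (plug-in) estimate of mutual information for a noisy nonlinear relationship, sketched below with simulated data; finer points such as bias correction, equitability, and the comparison with MIC are beyond this illustration.

      import numpy as np
      from sklearn.metrics import mutual_info_score

      # Simulated nonlinear relationship: y = x^2 plus noise
      rng = np.random.default_rng(6)
      x = rng.uniform(-1, 1, 5_000)
      y = x**2 + rng.normal(0, 0.1, 5_000)

      # Histogram-based plug-in estimate of mutual information (in nats) from binned data
      x_binned = np.digitize(x, np.linspace(-1, 1, 20))
      y_binned = np.digitize(y, np.linspace(y.min(), y.max(), 20))
      print(f"estimated I(X;Y) = {mutual_info_score(x_binned, y_binned):.3f} nats")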

  18. Formal Assurance Arguments: A Solution In Search of a Problem?

    NASA Technical Reports Server (NTRS)

    Graydon, Patrick J.

    2015-01-01

    An assurance case comprises evidence and argument showing how that evidence supports assurance claims (e.g., about safety or security). It is unsurprising that some computer scientists have proposed formalizing assurance arguments: most associate formality with rigor. But while engineers can sometimes prove that source code refines a formal specification, it is not clear that formalization will improve assurance arguments or that this benefit is worth its cost. For example, formalization might reduce the benefits of argumentation by limiting the audience to people who can read formal logic. In this paper, we present (1) a systematic survey of the literature surrounding formal assurance arguments, (2) an analysis of errors that formalism can help to eliminate, (3) a discussion of existing evidence, and (4) suggestions for experimental work to definitively answer the question.

  19. What constitutes a good hand offs in the emergency department: a patient's perspective.

    PubMed

    Downey, La Vonne; Zun, Leslie; Burke, Trena

    2013-01-01

    The aim is to determine, from the patient's perspective, what constitutes a good hand-off procedure in the emergency department (ED). The secondary purpose is to evaluate what impact a formalized hand-off had on patient knowledge, throughput and customer service. This study used a randomized controlled clinical trial involving two unique hand-off approaches and a convenience sample. The study alternated between the current hand-off process, which documented the process but not specific elements (referred to as the informal process), and one using the IPASS the BATON process (considered the formal process). Consenting patients completed a 12-question validated questionnaire on how the process was perceived by patients and about their understanding of why they waited in the ED. Statistical analysis using SPSS calculated descriptive frequencies and t-tests. In total 107 patients were enrolled: 50 in the informal and 57 in the formal group. Most patients had positive answers to the customer survey. There were significant differences between the formal and informal groups: recalling the oncoming and outgoing physician coming to the patient's bed (p = 0.000), with more formal-group patients recalling that than informal-group patients; the oncoming physician introducing him/herself (p = 0.01), with more from the formal group answering yes; and the physician discussing tests and implications with formal-group patients (p = 0.02). This study was done at an urban inner-city ED, a fact that may have skewed its results. A comparison with suburban and rural EDs would make the results stronger. It also reflected a very high level of customer satisfaction within the ED. This lack of variance may have meant that the correlation between customer service and hand-offs was missed or underrepresented. There was no codified observation of either those using the IPASS the BATON script or those using informal procedures, so no comparison of the level and types of information given between the two groups was done. There could have been a bias from those attending physicians who had internalized the IPASS the BATON procedures and used them even when assigned to the informal group. A hand-off from one physician to the next in the emergency department is best done using a formalized process. IPASS the BATON is a useful tool for hand-offs in the ED, in part because it involves the patient in the process. The formal hand-off increased communication between patient and doctor, as its use increased the patient's opportunity to ask and respond to questions. The researchers evaluated an ED physician-specific hand-off process and illustrate the value and impact of involving patients in the hand-off process.

  20. Critical Analysis of the Mathematical Formalism of Theoretical Physics. II. Foundations of Vector Calculus

    NASA Astrophysics Data System (ADS)

    Kalanov, Temur Z.

    2014-03-01

    A critical analysis of the foundations of standard vector calculus is proposed. The methodological basis of the analysis is the unity of formal logic and of rational dialectics. It is proved that vector calculus is an incorrect theory because: (a) it is not based on a correct methodological basis - the unity of formal logic and of rational dialectics; (b) it does not contain the correct definitions of ``movement,'' ``direction,'' and ``vector''; (c) it does not take into consideration the dimensions of physical quantities (i.e., number names, denominate numbers, concrete numbers) characterizing the concept of ``physical vector,'' and, therefore, it has no natural-scientific meaning; (d) operations on ``physical vectors'' and the vector calculus propositions relating to ``physical vectors'' are contrary to formal logic.

  1. Quantum Statistics of the Toda Oscillator in the Wigner Function Formalism

    NASA Astrophysics Data System (ADS)

    Vojta, Günter; Vojta, Matthias

    Classical and quantum mechanical Toda systems (Toda molecules, Toda lattices, Toda quantum fields) recently found growing interest as nonlinear systems showing solitons and chaos. In this paper the statistical thermodynamics of a system of quantum mechanical Toda oscillators characterized by a potential energy V(q) = V0 cosh q is treated within the Wigner function formalism (phase space formalism of quantum statistics). The partition function is given as a Wigner-Kirkwood series expansion in terms of powers of ħ^2 (semiclassical expansion). The partition function and all thermodynamic functions are written, with considerable exactness, as simple closed expressions containing only the modified Hankel functions K0 and K1 of the purely imaginary argument i V0/kT.
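
    The role of K0 (and, through its derivative, K1) can be motivated already at the classical level. The following is a sketch, not taken from the paper, using only the stated potential and the standard integral representation of the modified Bessel function K0:

        Z_{cl} \propto \int_{-\infty}^{\infty} e^{-V_0 \cosh q / kT} \, dq
               = 2 \int_{0}^{\infty} e^{-V_0 \cosh q / kT} \, dq
               = 2 \, K_0\!\left(\frac{V_0}{kT}\right),

    since K_0(z) = \int_0^{\infty} e^{-z \cosh t} \, dt. The semiclassical (Wigner-Kirkwood) corrections in powers of ħ^2 then involve derivatives of K_0, and K_0'(z) = -K_1(z) is what brings K_1 into the closed-form expressions.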

  2. Who serves the urban poor? A geospatial and descriptive analysis of health services in slum settlements in Dhaka, Bangladesh.

    PubMed

    Adams, Alayne M; Islam, Rubana; Ahmed, Tanvir

    2015-03-01

    In Bangladesh, the health risks of unplanned urbanization are disproportionately shouldered by the urban poor. At the same time, affordable formal primary care services are scarce, and what exists is almost exclusively provided by non-government organizations (NGOs) working on a project basis. So where do the poor go for health care? A health facility mapping of six urban slum settlements in Dhaka was undertaken to explore the configuration of healthcare services proximate to where the poor reside. Three methods were employed: (1) Social mapping and listing of all Health Service Delivery Points (HSDPs); (2) Creation of a geospatial map including Global Positioning System (GPS) co-ordinates of all HSDPs in the six study areas, and (3) Implementation of a facility survey of all HSDPs within the six study areas. Descriptive statistics are used to examine the number, type and concentration of service provider types, as well as indicators of their accessibility in terms of location and hours of service. A total of 1041 HSDPs were mapped, of which 80% are privately operated and the rest by NGOs and the public sector. Pharmacies and non-formal or traditional doctors make up 75% of the private sector, while consultation chambers account for 20%. Most NGO and Urban Primary Health Care Project (UPHCP) static clinics are open 5-6 days/week, but close by 4-5 pm. Evening services are almost exclusively offered by private HSDPs; however, only 37% of private sector health staff possess some kind of formal medical qualification. This spatial analysis of health service supply in poor urban settlements emphasizes the importance of taking the informal private sector into account in efforts to increase effective coverage of quality services. Features of informal private sector service provision that have facilitated market penetration may be relevant in designing formal services that better meet the needs of the urban poor. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine © The Author 2015; all rights reserved.

  3. Formal Methods for Life-Critical Software

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Johnson, Sally C.

    1993-01-01

    The use of computer software in life-critical applications, such as for civil air transports, demands the use of rigorous formal mathematical verification procedures. This paper demonstrates how to apply formal methods to the development and verification of software by leading the reader step-by-step through requirements analysis, design, implementation, and verification of an electronic phone book application. The current maturity and limitations of formal methods tools and techniques are then discussed, and a number of examples of the successful use of formal methods by industry are cited.

  4. The Statistical Mechanics of Solar Wind Hydroxylation at the Moon, Within Lunar Magnetic Anomalies, and at Phobos

    NASA Technical Reports Server (NTRS)

    Farrell, W. M.; Hurley, D. M.; Esposito, V. J.; Mclain, J. L.; Zimmerman, M. I.

    2017-01-01

    We present a new formalism to describe the outgassing of hydrogen initially implanted by solar wind protons into exposed soils on airless bodies. The formalism applies a statistical mechanics approach similar to that applied recently to molecular adsorption onto activated surfaces. The key element enabling this formalism is the recognition that the interatomic potential between the implanted H and regolith-residing oxides does not have a single value but possesses a distribution of trapped energy values at a given temperature, F(U,T). All subsequent derivations of the outward diffusion and H retention rely on the specific properties of this distribution. We find that solar wind hydrogen can be retained if there are sites in the implantation layer with activation energy values exceeding 0.5 eV. We examine in particular the dependence of H retention using characteristic energy values found previously for irradiated silica and mature lunar samples. We also apply the formalism to two cases that differ from typical solar wind implantation at the Moon. First, we test a case of implantation in magnetic anomaly regions, where significantly lower-energy ions of solar wind origin are expected to be incident on the surface. In magnetic anomalies, H retention is found to be reduced due to the reduced ion flux and shallower depth of implantation. Second, we apply the model to Phobos, where the surface temperature range is not as extreme as at the Moon. We find that H atom retention in this second case is higher than in the lunar case due to the reduced thermal extremes (which reduce outgassing).
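
    The central idea, retention governed by a distribution of trap activation energies F(U,T) with Arrhenius-like release, can be illustrated numerically. Everything in the sketch below (the Gaussian form of the distribution, the attempt frequency, the temperature, and the timescale) is an assumed placeholder, not a value from the paper:

        # Minimal sketch: fraction of implanted H retained after time t when trap
        # activation energies U follow an assumed distribution and each trap empties
        # at a first-order Arrhenius rate nu0 * exp(-U / kT). All values illustrative.
        import numpy as np

        k_B = 8.617e-5           # Boltzmann constant, eV/K
        nu0 = 1e13               # attempt frequency, 1/s (assumed)
        T = 150.0                # surface temperature, K (assumed, for illustration)
        t = 3.15e7               # elapsed time, s (~1 year, assumed)

        U = np.linspace(0.2, 1.0, 2001)                 # trap activation energies, eV
        dU = U[1] - U[0]
        F = np.exp(-0.5 * ((U - 0.55) / 0.10) ** 2)     # assumed Gaussian stand-in for F(U, T)
        F /= F.sum() * dU                               # normalise to unit area

        survival = np.exp(-nu0 * np.exp(-U / (k_B * T)) * t)   # per-energy retention
        retained = np.sum(F * survival) * dU
        print(f"retained fraction after ~1 year: {retained:.2f}")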

  5. Horizon Entropy from Quantum Gravity Condensates.

    PubMed

    Oriti, Daniele; Pranzetti, Daniele; Sindoni, Lorenzo

    2016-05-27

    We construct condensate states encoding the continuum spherically symmetric quantum geometry of a horizon in full quantum gravity, i.e., without any classical symmetry reduction, in the group field theory formalism. Tracing over the bulk degrees of freedom, we show how the resulting reduced density matrix manifestly exhibits a holographic behavior. We derive a complete orthonormal basis of eigenstates for the reduced density matrix of the horizon and use it to compute the horizon entanglement entropy. By imposing consistency with the horizon boundary conditions and semiclassical thermodynamical properties, we recover the Bekenstein-Hawking entropy formula for any value of the Immirzi parameter. Our analysis supports the equivalence between the von Neumann (entanglement) entropy interpretation and the Boltzmann (statistical) one.

  6. Inflationary cosmology with Chaplygin gas in Palatini formalism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Borowiec, Andrzej; Wojnar, Aneta; Stachowski, Aleksander

    2016-01-01

    We present a simple generalisation of the ΛCDM model which, on the one hand, reaches very good agreement with present-day experimental data and, on the other hand, provides an internal inflationary mechanism. It is based on Palatini modified gravity with a quadratic Starobinsky term and generalized Chaplygin gas as a matter source, providing, besides the current accelerated expansion, an epoch of endogenous inflation driven by a type III freeze singularity. It follows from our statistical analysis that astronomical data favour a negative value of the parameter coupling the quadratic term into the Einstein-Hilbert Lagrangian and, as a consequence, a bounce instead of an initial Big-Bang singularity is preferred.

  7. What Influences Mental Illness? Discrepancies Between Medical Education and Conception.

    PubMed

    Einstein, Evan Hy; Klepacz, Lidia

    2017-01-01

    This preliminary study examined the differences between what was taught during formal medical education and medical students' and psychiatry residents' own conceptions of the causes and determinants of mental illness. The authors surveyed 74 medical students and 11 residents via convenience sampling. The survey contained 18 statements, each rated twice for truthfulness: once in terms of the participant's formal education and once in terms of the participant's own conception. Descriptive statistics and a Wilcoxon signed rank test determined differences between education and conception. Results showed that students were less likely to perceive a neurotransmitter imbalance as a cause of mental illness, as opposed to what was emphasized during their formal medical education. Students and residents also understood the importance of factors such as systemic racism and socioeconomic status in the development of mental illness, factors that did not receive heavy emphasis during medical education. Furthermore, students and residents believed not only that mental illnesses have nonuniform pathologies, but also that the Diagnostic and Statistical Manual of Mental Disorders has the propensity to sometimes arbitrarily categorize individuals, with potentially negative consequences. If these notions are part of students' and residents' conceptions, as well as documented in the literature, then it seems appropriate for medical education to be further developed to emphasize these ideas.

  8. Integrating Formal Methods and Testing 2002

    NASA Technical Reports Server (NTRS)

    Cukic, Bojan

    2002-01-01

    Traditionally, qualitative program verification methodologies and program testing are studied in separate research communities. Neither of them alone is powerful and practical enough to provide sufficient confidence in ultra-high reliability assessment when used exclusively. Significant advances can be made by accounting not only for formal verification and program testing, but also for the impact of many other standard V&V techniques, in a unified software reliability assessment framework. The first year of this research resulted in a statistical framework that, given assumptions on the success of the qualitative V&V and QA procedures, significantly reduces the amount of testing needed to confidently assess reliability at so-called high and ultra-high levels (10^-4 or better). The coming years will address methodologies to realistically estimate the impacts of various V&V techniques on system reliability and include the impact of operational risk in reliability assessment. The project aims are to: A) combine formal correctness verification, process and product metrics, and other standard qualitative software assurance methods with statistical testing, with the aim of gaining higher confidence in software reliability assessment for high-assurance applications; B) quantify the impact of these methods on software reliability; C) demonstrate that accounting for the effectiveness of these methods reduces the number of tests needed to attain a certain confidence level; and D) quantify and justify the reliability estimate for systems developed using various methods.
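
    The cost of the purely statistical route that this framework tries to shorten can be made concrete with the standard zero-failure binomial bound; the sketch below is illustrative arithmetic only, not the project's statistical framework:

        # Minimal sketch: number of independent, failure-free test executions needed to
        # claim a per-demand failure probability below p_max with confidence C, using
        # the zero-failure binomial bound (1 - p_max)^n <= 1 - C.
        import math

        def tests_needed(p_max: float, confidence: float) -> int:
            return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p_max))

        print(tests_needed(1e-4, 0.99))   # roughly 46,000 failure-free runs for 10^-4 at 99%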

  9. Wildland Arson as Clandestine Resource Management: A Space-Time Permutation Analysis and Classification of Informal Fire Management Regimes in Georgia, USA

    NASA Astrophysics Data System (ADS)

    Coughlan, Michael R.

    2016-05-01

    Forest managers are increasingly recognizing the value of disturbance-based land management techniques such as prescribed burning. Unauthorized, "arson" fires are common in the southeastern United States where a legacy of agrarian cultural heritage persists amidst an increasingly forest-dominated landscape. This paper reexamines unauthorized fire-setting in the state of Georgia, USA from a historical ecology perspective that aims to contribute to historically informed, disturbance-based land management. A space-time permutation analysis is employed to discriminate systematic, management-oriented unauthorized fires from more arbitrary or socially deviant fire-setting behaviors. This paper argues that statistically significant space-time clusters of unauthorized fire occurrence represent informal management regimes linked to the legacy of traditional land management practices. Recent scholarship has pointed out that traditional management has actively promoted sustainable resource use and, in some cases, enhanced biodiversity often through the use of fire. Despite broad-scale displacement of traditional management during the 20th century, informal management practices may locally circumvent more formal and regionally dominant management regimes. Space-time permutation analysis identified 29 statistically significant fire regimes for the state of Georgia. The identified regimes are classified by region and land cover type and their implications for historically informed disturbance-based resource management are discussed.

  10. Variational Bayesian Parameter Estimation Techniques for the General Linear Model

    PubMed Central

    Starke, Ludger; Ostwald, Dirk

    2017-01-01

    Variational Bayes (VB), variational maximum likelihood (VML), restricted maximum likelihood (ReML), and maximum likelihood (ML) are cornerstone parametric statistical estimation techniques in the analysis of functional neuroimaging data. However, the theoretical underpinnings of these model parameter estimation techniques are rarely covered in introductory statistical texts. Because of the widespread practical use of VB, VML, ReML, and ML in the neuroimaging community, we reasoned that a theoretical treatment of their relationships and their application in a basic modeling scenario may be helpful for both neuroimaging novices and practitioners alike. In this technical study, we thus revisit the conceptual and formal underpinnings of VB, VML, ReML, and ML and provide a detailed account of their mathematical relationships and implementational details. We further apply VB, VML, ReML, and ML to the general linear model (GLM) with non-spherical error covariance as commonly encountered in the first-level analysis of fMRI data. To this end, we explicitly derive the corresponding free energy objective functions and ensuing iterative algorithms. Finally, in the applied part of our study, we evaluate the parameter and model recovery properties of VB, VML, ReML, and ML, first in an exemplary setting and then in the analysis of experimental fMRI data acquired from a single participant under visual stimulation. PMID:28966572
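
    As a concrete anchor for one corner of that treatment: when the non-spherical error covariance is taken as known, the ML estimate of the GLM parameters reduces to generalized least squares. The sketch below uses synthetic data and is not the paper's derivation or any toolbox code:

        # Minimal sketch: ML/GLS estimation of beta in y = X @ beta + e with known
        # non-spherical error covariance V, computed by whitening. X, V, y are synthetic.
        import numpy as np

        rng = np.random.default_rng(1)
        n = 200
        X = np.column_stack([np.ones(n), rng.standard_normal(n)])          # design matrix
        V = np.eye(n) + 0.4 * (np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1))
        L = np.linalg.cholesky(V)
        y = X @ np.array([2.0, 0.5]) + L @ rng.standard_normal(n)          # correlated noise

        # beta_hat = (X' V^-1 X)^-1 X' V^-1 y, via whitening with L^-1
        Xw = np.linalg.solve(L, X)
        yw = np.linalg.solve(L, y)
        beta_hat, *_ = np.linalg.lstsq(Xw, yw, rcond=None)
        print(beta_hat)   # should be near [2.0, 0.5]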

  11. Wildland Arson as Clandestine Resource Management: A Space-Time Permutation Analysis and Classification of Informal Fire Management Regimes in Georgia, USA.

    PubMed

    Coughlan, Michael R

    2016-05-01

    Forest managers are increasingly recognizing the value of disturbance-based land management techniques such as prescribed burning. Unauthorized, "arson" fires are common in the southeastern United States where a legacy of agrarian cultural heritage persists amidst an increasingly forest-dominated landscape. This paper reexamines unauthorized fire-setting in the state of Georgia, USA from a historical ecology perspective that aims to contribute to historically informed, disturbance-based land management. A space-time permutation analysis is employed to discriminate systematic, management-oriented unauthorized fires from more arbitrary or socially deviant fire-setting behaviors. This paper argues that statistically significant space-time clusters of unauthorized fire occurrence represent informal management regimes linked to the legacy of traditional land management practices. Recent scholarship has pointed out that traditional management has actively promoted sustainable resource use and, in some cases, enhanced biodiversity often through the use of fire. Despite broad-scale displacement of traditional management during the 20th century, informal management practices may locally circumvent more formal and regionally dominant management regimes. Space-time permutation analysis identified 29 statistically significant fire regimes for the state of Georgia. The identified regimes are classified by region and land cover type and their implications for historically informed disturbance-based resource management are discussed.

  12. Formal Methods Specification and Analysis Guidebook for the Verification of Software and Computer Systems. Volume 2; A Practitioner's Companion

    NASA Technical Reports Server (NTRS)

    1995-01-01

    This guidebook, the second of a two-volume series, is intended to facilitate the transfer of formal methods to the avionics and aerospace community. The 1st volume concentrates on administrative and planning issues [NASA-95a], and the second volume focuses on the technical issues involved in applying formal methods to avionics and aerospace software systems. Hereafter, the term "guidebook" refers exclusively to the second volume of the series. The title of this second volume, A Practitioner's Companion, conveys its intent. The guidebook is written primarily for the nonexpert and requires little or no prior experience with formal methods techniques and tools. However, it does attempt to distill some of the more subtle ingredients in the productive application of formal methods. To the extent that it succeeds, those conversant with formal methods will also find the guidebook useful. The discussion is illustrated through the development of a realistic example, relevant fragments of which appear in each chapter. The guidebook focuses primarily on the use of formal methods for analysis of requirements and high-level design, the stages at which formal methods have been most productively applied. Although much of the discussion applies to low-level design and implementation, the guidebook does not discuss issues involved in the later life cycle application of formal methods.

  13. A statistical rain attenuation prediction model with application to the advanced communication technology satellite project. 1: Theoretical development and application to yearly predictions for selected cities in the United States

    NASA Technical Reports Server (NTRS)

    Manning, Robert M.

    1986-01-01

    A rain attenuation prediction model is described for use in calculating satellite communication link availability for any specific location in the world that is characterized by an extended record of rainfall. Such a formalism is necessary for the accurate assessment of such availability predictions in the case of the small user-terminal concept of the Advanced Communication Technology Satellite (ACTS) Project. The model employs the theory of extreme value statistics to generate the necessary statistical rainrate parameters from rain data in the form compiled by the National Weather Service. These location dependent rain statistics are then applied to a rain attenuation model to obtain a yearly prediction of the occurrence of attenuation on any satellite link at that location. The predictions of this model are compared to those of the Crane Two-Component Rain Model and some empirical data and found to be very good. The model is then used to calculate rain attenuation statistics at 59 locations in the United States (including Alaska and Hawaii) for the 20 GHz downlinks and 30 GHz uplinks of the proposed ACTS system. The flexibility of this modeling formalism is such that it allows a complete and unified treatment of the temporal aspects of rain attenuation that leads to the design of an optimum stochastic power control algorithm, the purpose of which is to efficiently counter such rain fades on a satellite link.
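
    A rough illustration of the extreme value fitting step is sketched below; the Gumbel (type I extreme value) law, the synthetic annual maxima, and the 100 mm/h threshold are assumptions made for the example and are not taken from the model or from National Weather Service data:

        # Minimal sketch: fit a Gumbel (type I extreme value) distribution to synthetic
        # annual-maximum rain rates and evaluate an exceedance probability.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        annual_max = stats.gumbel_r.rvs(loc=60.0, scale=15.0, size=30, random_state=rng)  # mm/h, synthetic

        loc, scale = stats.gumbel_r.fit(annual_max)
        threshold = 100.0  # mm/h, illustrative
        p_exceed = stats.gumbel_r.sf(threshold, loc=loc, scale=scale)
        print(f"P(annual maximum rain rate > {threshold} mm/h) = {p_exceed:.3f}")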

  14. THE CAUSAL ANALYSIS / DIAGNOSIS DECISION ...

    EPA Pesticide Factsheets

    CADDIS is an on-line decision support system that helps investigators in the regions, states and tribes find, access, organize, use and share information to produce causal evaluations in aquatic systems. It is based on the US EPA's Stressor Identification process, which is a formal method for identifying causes of impairments in aquatic systems. CADDIS 2007 increases access to relevant information useful for causal analysis and provides methods and tools that practitioners can use to analyze their own data. The new Candidate Cause section provides overviews of commonly encountered causes of impairments to aquatic systems: metals, sediments, nutrients, flow alteration, temperature, ionic strength, and low dissolved oxygen. CADDIS includes new Conceptual Models that illustrate the relationships from sources to stressors to biological effects. An Interactive Conceptual Model for phosphorus links the diagram with supporting literature citations. The new Analyzing Data section helps practitioners analyze their data sets and interpret and use those results as evidence within the USEPA causal assessment process. Downloadable tools include a graphical user interface statistical package (CADStat), programs for use with the freeware R statistical package, and a Microsoft Excel template. These tools can be used to quantify associations between causes and biological impairments using innovative methods such as species-sensitivity distributions and biological inference.

  15. Immersive Theater - a Proven Way to Enhance Learning Retention

    NASA Astrophysics Data System (ADS)

    Reiff, P. H.; Zimmerman, L.; Spillane, S.; Sumners, C.

    2014-12-01

    The portable immersive theater has gone from our first demonstration at fall AGU 2003 to a product offered by multiple companies in various versions to literally millions of users per year. As part of our NASA funded outreach program, we conducted a test of learning in a portable Discovery Dome as contrasted with learning the same materials (visuals and sound track) on a computer screen. We tested 200 middle school students (primarily underserved minorities). Paired t-tests and an independent t-test were used to compare the amount of learning that students achieved. Interest questionnaires were administered to participants in formal (public school) settings and focus groups were conducted in informal (museum camp and educational festival) settings. Overall results from the informal and formal educational setting indicated that there was a statistically significant increase in test scores after viewing We Choose Space. There was a statistically significant increase in test scores for students who viewed We Choose Space in the portable Discovery Dome (9.75) as well as with the computer (8.88). However, long-term retention of the material tested on the questionnaire indicated that for students who watched We Choose Space in the portable Discovery Dome, there was a statistically significant long-term increase in test scores (10.47), whereas, six weeks after learning on the computer, the improvements over the initial baseline (3.49) were far less and were not statistically significant. The test score improvement six weeks after learning in the dome was essentially the same as the post test immediately after watching the show, demonstrating virtually no loss of gained information in the six week interval. In the formal educational setting, approximately 34% of the respondents indicated that they wanted to learn more about becoming a scientist, while 35% expressed an interest in a career in space science. In the informal setting, 26% indicated that they were interested in pursuing a career in space science.

  16. Physiological time-series analysis: what does regularity quantify?

    NASA Technical Reports Server (NTRS)

    Pincus, S. M.; Goldberger, A. L.

    1994-01-01

    Approximate entropy (ApEn) is a recently developed statistic quantifying regularity and complexity that appears to have potential application to a wide variety of physiological and clinical time-series data. The focus here is to provide a better understanding of ApEn to facilitate its proper utilization, application, and interpretation. After giving the formal mathematical description of ApEn, we provide a multistep description of the algorithm as applied to two contrasting clinical heart rate data sets. We discuss algorithm implementation and interpretation and introduce a general mathematical hypothesis of the dynamics of a wide class of diseases, indicating the utility of ApEn to test this hypothesis. We indicate the relationship of ApEn to variability measures, the Fourier spectrum, and algorithms motivated by study of chaotic dynamics. We discuss further mathematical properties of ApEn, including the choice of input parameters, statistical issues, and modeling considerations, and we conclude with a section on caveats to ensure correct ApEn utilization.
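
    For orientation, ApEn has a compact standard definition, ApEn(m, r) = Phi^m(r) - Phi^(m+1)(r), where Phi^m(r) is the average logarithm of the frequency with which length-m templates match within tolerance r (Chebyshev distance, self-matches included). The sketch below implements that standard formulation and is not the authors' code:

        # Minimal sketch of approximate entropy ApEn(m, r) for a 1-D time series,
        # following the standard formulation ApEn = Phi(m) - Phi(m+1).
        import numpy as np

        def apen(x, m=2, r=None):
            x = np.asarray(x, dtype=float)
            if r is None:
                r = 0.2 * x.std()          # common convention: r = 0.2 * SD of the series

            def phi(m):
                n = len(x) - m + 1
                templates = np.array([x[i:i + m] for i in range(n)])
                # count matches (including self) within tolerance r, Chebyshev distance
                counts = np.array([
                    np.sum(np.max(np.abs(templates - t), axis=1) <= r) for t in templates
                ])
                return np.mean(np.log(counts / n))

            return phi(m) - phi(m + 1)

        # Regular signals should score lower than irregular ones
        t = np.linspace(0, 10 * np.pi, 500)
        print("sine :", round(apen(np.sin(t)), 3))
        print("noise:", round(apen(np.random.default_rng(3).standard_normal(500)), 3))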

  17. On the Mathematical Consequences of Binning Spike Trains.

    PubMed

    Cessac, Bruno; Le Ny, Arnaud; Löcherbach, Eva

    2017-01-01

    We initiate a mathematical analysis of hidden effects induced by binning spike trains of neurons. Assuming that the original spike train has been generated by a discrete Markov process, we show that binning generates a stochastic process that is no longer Markov but is instead a variable-length Markov chain (VLMC) with unbounded memory. We also show that the law of the binned raster is a Gibbs measure in the DLR (Dobrushin-Lanford-Ruelle) sense coined in mathematical statistical mechanics. This allows the derivation of several important consequences on statistical properties of binned spike trains. In particular, we introduce the DLR framework as a natural setting to mathematically formalize anticipation, that is, to tell "how good" our nervous system is at making predictions. In a probabilistic sense, this corresponds to conditioning a process on its future, and we discuss how binning may affect our conclusions on this ability. We finally comment on the possible consequences of binning in the detection of spurious phase transitions or in the detection of incorrect evidence of criticality.
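
    The binning operation whose hidden effects are analysed above is itself straightforward; a minimal sketch (the spike times and bin width are arbitrary illustrations) that turns a list of spike times into a binned count raster and the corresponding binarised raster:

        # Minimal sketch: bin a spike train (spike times in seconds) into fixed-width
        # windows, producing spike counts per bin and the 0/1 (binarised) raster.
        import numpy as np

        spike_times = np.array([0.012, 0.015, 0.031, 0.090, 0.091, 0.093, 0.160])  # arbitrary
        bin_width = 0.020                                                           # 20 ms bins
        t_max = 0.200

        edges = np.arange(0.0, t_max + bin_width, bin_width)
        counts, _ = np.histogram(spike_times, bins=edges)
        binary = (counts > 0).astype(int)

        print("counts:", counts)
        print("binary:", binary)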

  18. Statistical properties of Galactic CMB foregrounds: dust and synchrotron

    NASA Astrophysics Data System (ADS)

    Kandel, D.; Lazarian, A.; Pogosyan, D.

    2018-07-01

    Recent Planck observations have revealed some of the important statistical properties of synchrotron and dust polarization, namely, the B to E mode power and temperature-E (TE) mode cross-correlation. In this paper, we extend our analysis in Kandel et al. that studied the B to E mode power ratio for polarized dust emission to include TE cross-correlation and develop an analogous formalism for synchrotron signal, all using a realistic model of magnetohydrodynamical turbulence. Our results suggest that the Planck results for both synchrotron and dust polarization can be understood if the turbulence in the Galaxy is sufficiently sub-Alfvénic. Making use of the observed poor magnetic field-density correlation, we show that the observed positive TE correlation for dust corresponds to our theoretical expectations. We also show how the B to E ratio as well as the TE cross-correlation can be used to study media magnetization, compressibility, and level of density-magnetic field correlation.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kaurov, Alexander A., E-mail: kaurov@uchicago.edu

    The methods for studying the epoch of cosmic reionization vary from full radiative transfer simulations to purely analytical models. While numerical approaches are computationally expensive and are not suitable for generating many mock catalogs, analytical methods are based on assumptions and approximations. We explore the interconnection between both methods. First, we ask how the analytical framework of excursion set formalism can be used for statistical analysis of numerical simulations and visual representation of the morphology of ionization fronts. Second, we explore the methods of training the analytical model on a given numerical simulation. We present a new code which emerged from this study. Its main application is to match the analytical model with a numerical simulation. Then, it allows one to generate mock reionization catalogs with volumes exceeding the original simulation quickly and computationally inexpensively, meanwhile reproducing large-scale statistical properties. These mock catalogs are particularly useful for cosmic microwave background polarization and 21 cm experiments, where large volumes are required to simulate the observed signal.

  20. Restorative Practices as Formal and Informal Education

    ERIC Educational Resources Information Center

    Carter, Candice C.

    2013-01-01

    This article reviews restorative practices (RP) as education in formal and informal contexts of learning that are fertile sites for cultivating peace. Formal practices involve instruction about response to conflict, while informal learning occurs beyond academic lessons. The research incorporated content analysis and a critical examination of the…

  1. Formal hardware verification of digital circuits

    NASA Technical Reports Server (NTRS)

    Joyce, J.; Seger, C.-J.

    1991-01-01

    The use of formal methods to verify the correctness of digital circuits is less constrained by the growing complexity of digital circuits than conventional methods based on exhaustive simulation. This paper briefly outlines three main approaches to formal hardware verification: symbolic simulation, state machine analysis, and theorem-proving.

  2. Superstatistics of the Klein-Gordon equation in deformed formalism for modified Dirac delta distribution

    NASA Astrophysics Data System (ADS)

    Sargolzaeipor, S.; Hassanabadi, H.; Chung, W. S.

    2018-04-01

    The Klein-Gordon equation is extended in the presence of an Aharonov-Bohm magnetic field for the Cornell potential, and the corresponding wave functions as well as the spectra are obtained. After introducing superstatistics in statistical mechanics, we first derive the effective Boltzmann factor in the deformed formalism with a modified Dirac delta distribution. We then use the concepts of superstatistics to calculate the thermodynamic properties of the system. The well-known results are recovered in the limit of vanishing deformation parameter, and some graphs are plotted for clarity of the results.

  3. draco: Analysis and simulation of drift scan radio data

    NASA Astrophysics Data System (ADS)

    Shaw, J. Richard

    2017-12-01

    draco analyzes transit radio data with the m-mode formalism. It is telescope agnostic, and is used as part of the analysis and simulation pipeline for the CHIME (Canadian Hydrogen Intensity Mapping Experiment) telescope. It can simulate time stream data from maps of the sky (using the m-mode formalism) and add gain fluctuations and correctly correlated instrumental noise (i.e. Wishart distributed). Further, it can perform various cuts on the data and make maps of the sky from data using the m-mode formalism.

  4. Formalizing New Navigation Requirements for NASA's Space Shuttle

    NASA Technical Reports Server (NTRS)

    DiVito, Ben L.

    1996-01-01

    We describe a recent NASA-sponsored pilot project intended to gauge the effectiveness of using formal methods in Space Shuttle software requirements analysis. Several Change Requests (CRs) were selected as promising targets to demonstrate the utility of formal methods in this demanding application domain. A CR to add new navigation capabilities to the Shuttle, based on Global Positioning System (GPS) technology, is the focus of this industrial usage report. Portions of the GPS CR were modeled using the language of SRI's Prototype Verification System (PVS). During a limited analysis conducted on the formal specifications, numerous requirements issues were discovered. We present a summary of these encouraging results and conclusions we have drawn from the pilot project.

  5. Learning Needs Analysis of Collaborative E-Classes in Semi-Formal Settings: The REVIT Example

    ERIC Educational Resources Information Center

    Mavroudi, Anna; Hadzilacos, Thanasis

    2013-01-01

    Analysis, the first phase of the typical instructional design process, is often downplayed. This paper focuses on the analysis concerning a series of e-courses for collaborative adult education in semi-formal settings by reporting and generalizing results from the REVIT project. REVIT, an EU-funded research project, offered custom e-courses to…

  6. Pedagogical Basis of DAS Formalism in Engineering Education

    ERIC Educational Resources Information Center

    Hiltunen, J.; Heikkinen, E.-P.; Jaako, J.; Ahola, J.

    2011-01-01

    The paper presents a new approach for a bachelor-level curriculum structure in engineering. The approach is called DAS formalism according to its three phases: description, analysis and synthesis. Although developed specifically for process and environmental engineering, DAS formalism has a generic nature and it could also be used in other…

  7. Developing an approach for teaching and learning about Lewis structures

    NASA Astrophysics Data System (ADS)

    Kaufmann, Ilana; Hamza, Karim M.; Rundgren, Carl-Johan; Eriksson, Lars

    2017-08-01

    This study explores first-year university students' reasoning as they learn to draw Lewis structures. We also present a theoretical account of the formal procedure commonly taught for drawing these structures. Students' discussions during problem-solving activities were video recorded and detailed analyses of the discussions were made through the use of practical epistemology analysis (PEA). Our results show that the formal procedure was central for drawing Lewis structures, but its use varied depending on situational aspects. Commonly, the use of individual steps of the formal procedure was contingent on experiences of chemical structures, and other information such as the characteristics of the problem given. The analysis revealed a number of patterns in how students constructed, checked and modified the structure in relation to the formal procedure and the situational aspects. We suggest that explicitly teaching the formal procedure as a process of constructing, checking and modifying might be helpful for students learning to draw Lewis structures. By doing so, the students may learn to check the accuracy of the generated structure not only in relation to the octet rule and formal charge, but also to other experiences that are not explicitly included in the formal procedure.

  8. Assessment and statistics of surgically induced astigmatism.

    PubMed

    Naeser, Kristian

    2008-05-01

    The aim of the thesis was to develop methods for assessment of surgically induced astigmatism (SIA) in individual eyes, and in groups of eyes. The thesis is based on 12 peer-reviewed publications, published over a period of 16 years. In these publications older and contemporary literature was reviewed(1). A new method (the polar system) for analysis of SIA was developed. Multivariate statistical analysis of refractive data was described(2-4). Clinical validation studies were performed. Descriptions of a cylinder surface with polar values and with differential geometry were compared. The main results were: refractive data in the form of sphere, cylinder and axis may define an individual patient or data set, but are unsuited for mathematical and statistical analyses(1). The polar value system converts net astigmatisms to orthonormal components in dioptric space. A polar value is the difference in meridional power between two orthogonal meridians(5,6). Any pair of polar values, separated by an arc of 45 degrees, characterizes a net astigmatism completely(7). The two polar values represent the net curvital and net torsional power over the chosen meridian(8). The spherical component is described by the spherical equivalent power. Several clinical studies demonstrated the efficiency of multivariate statistical analysis of refractive data(4,9-11). Polar values and formal differential geometry describe astigmatic surfaces with similar concepts and mathematical functions(8). Other contemporary methods, such as Long's power matrix, Holladay's and Alpins' methods, Zernike(12) and Fourier analyses(8), are correlated to the polar value system. In conclusion, analysis of SIA should be performed with polar values or other contemporary component systems. The study was supported by Statens Sundhedsvidenskabeligt Forskningsråd, Cykelhandler P. Th. Rasmussen og Hustrus Mindelegat, Hotelejer Carl Larsen og Hustru Nicoline Larsens Mindelegat, Landsforeningen til Vaern om Synet, Forskningsinitiativet for Arhus Amt, Alcon Denmark, and Desirée and Niels Ydes Fond.
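
    As a rough illustration of the component idea (two orthonormal astigmatism components over meridians 45 degrees apart, plus a spherical equivalent), the sketch below converts a conventional sphere/cylinder/axis refraction into such components. The sign and reference-meridian conventions are assumptions for the example and may differ from the published polar-value definitions:

        # Minimal sketch (conventions assumed, not the published polar-value system):
        # decompose net astigmatism of magnitude `cylinder` at axis `axis_deg` into two
        # components 45 degrees apart, plus the spherical equivalent power.
        import math

        def astigmatism_components(sphere: float, cylinder: float, axis_deg: float):
            a = math.radians(axis_deg)
            comp_0 = cylinder * math.cos(2 * a)    # component over the 0/90-degree meridians
            comp_45 = cylinder * math.sin(2 * a)   # component over the 45/135-degree meridians
            spherical_equivalent = sphere + cylinder / 2.0
            return comp_0, comp_45, spherical_equivalent

        print(astigmatism_components(sphere=-1.00, cylinder=-0.75, axis_deg=20.0))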

  9. COgnitive behavioural therapy versus standardised medical care for adults with Dissociative non-Epileptic Seizures (CODES): statistical and economic analysis plan for a randomised controlled trial.

    PubMed

    Robinson, Emily J; Goldstein, Laura H; McCrone, Paul; Perdue, Iain; Chalder, Trudie; Mellers, John D C; Richardson, Mark P; Murray, Joanna; Reuber, Markus; Medford, Nick; Stone, Jon; Carson, Alan; Landau, Sabine

    2017-06-06

    Dissociative seizures (DSs), also called psychogenic non-epileptic seizures, are a distressing and disabling problem for many patients in neurological settings with high and often unnecessary economic costs. The COgnitive behavioural therapy versus standardised medical care for adults with Dissociative non-Epileptic Seizures (CODES) trial is an evaluation of a specifically tailored psychological intervention with the aims of reducing seizure frequency and severity and improving psychological well-being in adults with DS. The aim of this paper is to report in detail the quantitative and economic analysis plan for the CODES trial, as agreed by the trial steering committee. The CODES trial is a multicentre, pragmatic, parallel group, randomised controlled trial performed to evaluate the clinical effectiveness and cost-effectiveness of 13 sessions of cognitive behavioural therapy (CBT) plus standardised medical care (SMC) compared with SMC alone for adult outpatients with DS. The objectives and design of the trial are summarised, and the aims and procedures of the planned analyses are illustrated. The proposed analysis plan addresses statistical considerations such as maintaining blinding, monitoring adherence with the protocol, describing aspects of treatment and dealing with missing data. The formal analysis approach for the primary and secondary outcomes is described, as are the descriptive statistics that will be reported. This paper provides transparency to the planned inferential analyses for the CODES trial prior to the extraction of outcome data. It also provides an update to the previously published trial protocol and guidance to those conducting similar trials. ISRCTN registry ISRCTN05681227 (registered on 5 March 2014); ClinicalTrials.gov NCT02325544 (registered on 15 December 2014).

  10. The Epidemiology and Associated Phenomenology of Formal Thought Disorder: A Systematic Review

    PubMed Central

    Roche, Eric; Creed, Lisa; MacMahon, Donagh; Brennan, Daria; Clarke, Mary

    2015-01-01

    Background: Authors of the Diagnostic and Statistical Manual, Fifth Edition (DSM-V) have recommended to “integrate dimensions into clinical practice.” The epidemiology and associated phenomenology of formal thought disorder (FTD) have been described but not reviewed. We aimed to carry out a systematic review of FTD to this end. Methods: A systematic review of FTD literature, from 1978 to 2013, using Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. Results: A total of 881 abstracts were reviewed and 120 articles met inclusion criteria; articles describing FTD factor structure (n = 15), prevalence and longitudinal course (n = 41), role in diagnosis (n = 22), associated clinical variables (n = 56), and influence on outcome (n = 35) were included. Prevalence estimates for FTD in psychosis range from 5% to 91%. Dividing FTD into domains, by factor analysis, can accurately identify 91% of psychotic diagnoses. FTD is associated with increased clinical severity. Poorer outcomes are predicted by negative thought disorder, more so than the typical construct of “disorganized speech.” Conclusion: FTD is a common symptom of psychosis and may be considered a marker of illness severity. Detailed dimensional assessment of FTD can clarify diagnosis and may help predict prognosis. PMID:25180313

  11. Tiger in the fault tree jungle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rubel, P.

    1976-01-01

    There is yet little evidence of serious efforts to apply formal reliability analysis methods to evaluate, or even to identify, potential common-mode failures (CMF) of reactor safeguard systems. The prospects for event logic modeling in this regard are examined by the primitive device of reviewing actual CMF experience in terms of what the analyst might have perceived a priori. Further insight into the probability and risk aspects of CMFs is sought through consideration of three key likelihood factors: (1) the prior probability of the cause ever existing, (2) the opportunities for removing the cause, and (3) the probability that a CMF cause will be activated by conditions associated with a real system challenge. It was concluded that the principal needs for formal logical discipline in the endeavor to decrease CMF-related risks are to discover and to account for strong ''energetic'' dependency couplings that could arise in the major accidents usually classed as ''hypothetical.'' This application would help focus research, design and quality assurance efforts to cope with major CMF causes. But without extraordinary challenges to the reactor safeguard systems, there must continue to be virtually no statistical evidence pertinent to that class of failure dependencies.
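
    One quantitative reading of the three likelihood factors, stated here as an interpretation rather than a claim from the report, is that they combine multiplicatively into the chance that a common-mode failure cause is present and active when a real challenge occurs:

        P_{CMF} \approx P(\text{cause introduced}) \times \bigl(1 - P(\text{cause removed in time})\bigr) \times P(\text{cause activated} \mid \text{real challenge}).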

  12. Probability distribution and statistical properties of spherically compensated cosmic regions in ΛCDM cosmology

    NASA Astrophysics Data System (ADS)

    Alimi, Jean-Michel; de Fromont, Paul

    2018-04-01

    The statistical properties of cosmic structures are well known to be strong probes for cosmology. In particular, several studies have tried to use cosmic void number counts to obtain tight constraints on dark energy. In this paper, we model the statistical properties of these regions using the CoSphere formalism (de Fromont & Alimi) in both the primordial and the non-linearly evolved Universe in the standard Λ cold dark matter model. This formalism applies similarly to minima (voids) and maxima (such as DM haloes), which are here considered symmetrically. We first derive the full joint Gaussian distribution of CoSphere's parameters in the Gaussian random field. We recover the results of Bardeen et al. only in the limit where the compensation radius becomes very large, i.e. when the central extremum decouples from its cosmic environment. We compute the probability distribution of the compensation size in this primordial field. We show that this distribution is redshift independent and can be used to model the cosmic void size distribution. We also derive the statistical distribution of the peak parameters introduced by Bardeen et al. and discuss their correlation with the cosmic environment. We show that small central extrema with low density are associated with narrow compensation regions with deep compensation density, while higher central extrema are preferentially located in larger but smoother over/under massive regions.

  13. Formal verification of a fault tolerant clock synchronization algorithm

    NASA Technical Reports Server (NTRS)

    Rushby, John; Vonhenke, Frieder

    1989-01-01

    A formal specification and mechanically assisted verification of the interactive convergence clock synchronization algorithm of Lamport and Melliar-Smith is described. Several technical flaws in the analysis given by Lamport and Melliar-Smith were discovered, even though their presentation is unusually precise and detailed. It seems that these flaws were not detected by informal peer scrutiny. The flaws are discussed and a revised presentation of the analysis is given that not only corrects the flaws but is also more precise and easier to follow. Some of the corrections to the flaws require slight modifications to the original assumptions underlying the algorithm and to the constraints on its parameters, and thus change the external specifications of the algorithm. The formal analysis of the interactive convergence clock synchronization algorithm was performed using the Enhanced Hierarchical Development Methodology (EHDM) formal specification and verification environment. This application of EHDM provides a demonstration of some of the capabilities of the system.

  14. Segmentation of fluorescence microscopy images for quantitative analysis of cell nuclear architecture.

    PubMed

    Russell, Richard A; Adams, Niall M; Stephens, David A; Batty, Elizabeth; Jensen, Kirsten; Freemont, Paul S

    2009-04-22

    Considerable advances in microscopy, biophysics, and cell biology have provided a wealth of imaging data describing the functional organization of the cell nucleus. Until recently, cell nuclear architecture has largely been assessed by subjective visual inspection of fluorescently labeled components imaged by the optical microscope. This approach is inadequate to fully quantify spatial associations, especially when the patterns are indistinct, irregular, or highly punctate. Accurate image processing techniques as well as statistical and computational tools are thus necessary to interpret this data if meaningful spatial-function relationships are to be established. Here, we have developed a thresholding algorithm, stable count thresholding (SCT), to segment nuclear compartments in confocal laser scanning microscopy image stacks to facilitate objective and quantitative analysis of the three-dimensional organization of these objects using formal statistical methods. We validate the efficacy and performance of the SCT algorithm using real images of immunofluorescently stained nuclear compartments and fluorescent beads as well as simulated images. In all three cases, the SCT algorithm delivers a segmentation that is far better than standard thresholding methods, and more importantly, is comparable to manual thresholding results. By applying the SCT algorithm and statistical analysis, we quantify the spatial configuration of promyelocytic leukemia nuclear bodies with respect to irregular-shaped SC35 domains. We show that the compartments are closer than expected under a null model for their spatial point distribution, and furthermore that their spatial association varies according to cell state. The methods reported are general and can readily be applied to quantify the spatial interactions of other nuclear compartments.
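
    The abstract does not spell out the stable count criterion itself, so the sketch below is an assumption-laden illustration of one plausible reading: sweep candidate thresholds, count segmented objects at each, and choose a threshold from the longest run over which the count stays constant. It is not a reimplementation of the published SCT algorithm:

        # Minimal, assumption-laden sketch of a "stable count" style threshold search:
        # count connected components at each candidate threshold and return the centre
        # of the longest plateau of constant counts. Illustration only, not the SCT code.
        import numpy as np
        from scipy import ndimage

        def stable_count_threshold(image, thresholds):
            counts = []
            for t in thresholds:
                _, n_objects = ndimage.label(image > t)
                counts.append(n_objects)
            counts = np.array(counts)

            best_start, best_len, start = 0, 1, 0
            for i in range(1, len(counts)):
                if counts[i] != counts[i - 1]:
                    start = i
                if i - start + 1 > best_len:
                    best_start, best_len = start, i - start + 1
            return thresholds[best_start + best_len // 2]

        rng = np.random.default_rng(4)
        img = rng.random((64, 64))                      # placeholder image
        print(stable_count_threshold(img, np.linspace(0.1, 0.9, 33)))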

  15. Segmentation of Fluorescence Microscopy Images for Quantitative Analysis of Cell Nuclear Architecture

    PubMed Central

    Russell, Richard A.; Adams, Niall M.; Stephens, David A.; Batty, Elizabeth; Jensen, Kirsten; Freemont, Paul S.

    2009-01-01

    Considerable advances in microscopy, biophysics, and cell biology have provided a wealth of imaging data describing the functional organization of the cell nucleus. Until recently, cell nuclear architecture has largely been assessed by subjective visual inspection of fluorescently labeled components imaged by the optical microscope. This approach is inadequate to fully quantify spatial associations, especially when the patterns are indistinct, irregular, or highly punctate. Accurate image processing techniques as well as statistical and computational tools are thus necessary to interpret this data if meaningful spatial-function relationships are to be established. Here, we have developed a thresholding algorithm, stable count thresholding (SCT), to segment nuclear compartments in confocal laser scanning microscopy image stacks to facilitate objective and quantitative analysis of the three-dimensional organization of these objects using formal statistical methods. We validate the efficacy and performance of the SCT algorithm using real images of immunofluorescently stained nuclear compartments and fluorescent beads as well as simulated images. In all three cases, the SCT algorithm delivers a segmentation that is far better than standard thresholding methods, and more importantly, is comparable to manual thresholding results. By applying the SCT algorithm and statistical analysis, we quantify the spatial configuration of promyelocytic leukemia nuclear bodies with respect to irregular-shaped SC35 domains. We show that the compartments are closer than expected under a null model for their spatial point distribution, and furthermore that their spatial association varies according to cell state. The methods reported are general and can readily be applied to quantify the spatial interactions of other nuclear compartments. PMID:19383481

  16. Formal and Informal Learning and First-Year Psychology Students’ Development of Scientific Thinking: A Two-Wave Panel Study

    PubMed Central

    Soyyılmaz, Demet; Griffin, Laura M.; Martín, Miguel H.; Kucharský, Šimon; Peycheva, Ekaterina D.; Vaupotič, Nina; Edelsbrunner, Peter A.

    2017-01-01

    Scientific thinking is a predicate for scientific inquiry, and thus important to develop early in psychology students as potential future researchers. The present research is aimed at fathoming the contributions of formal and informal learning experiences to psychology students’ development of scientific thinking during their 1st-year of study. We hypothesize that informal experiences are relevant beyond formal experiences. First-year psychology student cohorts from various European countries will be assessed at the beginning and again at the end of the second semester. Assessments of scientific thinking will include scientific reasoning skills, the understanding of basic statistics concepts, and epistemic cognition. Formal learning experiences will include engagement in academic activities which are guided by university authorities. Informal learning experiences will include non-compulsory, self-guided learning experiences. Formal and informal experiences will be assessed with a newly developed survey. As dispositional predictors, students’ need for cognition and self-efficacy in psychological science will be assessed. In a structural equation model, students’ learning experiences and personal dispositions will be examined as predictors of their development of scientific thinking. Commonalities and differences in predictive weights across universities will be tested. The project is aimed at contributing information for designing university environments to optimize the development of students’ scientific thinking. PMID:28239363

  17. Structure of multiphoton quantum optics. I. Canonical formalism and homodyne squeezed states

    NASA Astrophysics Data System (ADS)

    dell'Anno, Fabio; de Siena, Silvio; Illuminati, Fabrizio

    2004-03-01

    We introduce a formalism of nonlinear canonical transformations for general systems of multiphoton quantum optics. For single-mode systems the transformations depend on a tunable free parameter, the homodyne local-oscillator angle; for n-mode systems they depend on n heterodyne mixing angles. The canonical formalism realizes nontrivial mixing of pairs of conjugate quadratures of the electromagnetic field in terms of homodyne variables for single-mode systems, and in terms of heterodyne variables for multimode systems. In the first instance the transformations yield nonquadratic model Hamiltonians of degenerate multiphoton processes and define a class of non-Gaussian, nonclassical multiphoton states that exhibit properties of coherence and squeezing. We show that such homodyne multiphoton squeezed states are generated by unitary operators with a nonlinear time evolution that realizes the homodyne mixing of a pair of conjugate quadratures. Tuning of the local-oscillator angle allows us to vary at will the statistical properties of such states. We discuss the relevance of the formalism for the study of degenerate (up-)down-conversion processes. In a companion paper [F. Dell'Anno, S. De Siena, and F. Illuminati, 69, 033813 (2004)], we provide the extension of the nonlinear canonical formalism to multimode systems, we introduce the associated heterodyne multiphoton squeezed states, and we discuss their possible experimental realization.

  18. Structure of multiphoton quantum optics. I. Canonical formalism and homodyne squeezed states

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dell'Anno, Fabio; De Siena, Silvio; Illuminati, Fabrizio

    2004-03-01

    We introduce a formalism of nonlinear canonical transformations for general systems of multiphoton quantum optics. For single-mode systems the transformations depend on a tunable free parameter, the homodyne local-oscillator angle; for n-mode systems they depend on n heterodyne mixing angles. The canonical formalism realizes nontrivial mixing of pairs of conjugate quadratures of the electromagnetic field in terms of homodyne variables for single-mode systems, and in terms of heterodyne variables for multimode systems. In the first instance the transformations yield nonquadratic model Hamiltonians of degenerate multiphoton processes and define a class of non-Gaussian, nonclassical multiphoton states that exhibit properties of coherence and squeezing. We show that such homodyne multiphoton squeezed states are generated by unitary operators with a nonlinear time evolution that realizes the homodyne mixing of a pair of conjugate quadratures. Tuning of the local-oscillator angle allows us to vary at will the statistical properties of such states. We discuss the relevance of the formalism for the study of degenerate (up-)down-conversion processes. In a companion paper [F. Dell'Anno, S. De Siena, and F. Illuminati, 69, 033813 (2004)], we provide the extension of the nonlinear canonical formalism to multimode systems, we introduce the associated heterodyne multiphoton squeezed states, and we discuss their possible experimental realization.

  19. Formal and Informal Learning and First-Year Psychology Students' Development of Scientific Thinking: A Two-Wave Panel Study.

    PubMed

    Soyyılmaz, Demet; Griffin, Laura M; Martín, Miguel H; Kucharský, Šimon; Peycheva, Ekaterina D; Vaupotič, Nina; Edelsbrunner, Peter A

    2017-01-01

    Scientific thinking is a predicate for scientific inquiry, and thus important to develop early in psychology students as potential future researchers. The present research is aimed at fathoming the contributions of formal and informal learning experiences to psychology students' development of scientific thinking during their 1st-year of study. We hypothesize that informal experiences are relevant beyond formal experiences. First-year psychology student cohorts from various European countries will be assessed at the beginning and again at the end of the second semester. Assessments of scientific thinking will include scientific reasoning skills, the understanding of basic statistics concepts, and epistemic cognition. Formal learning experiences will include engagement in academic activities which are guided by university authorities. Informal learning experiences will include non-compulsory, self-guided learning experiences. Formal and informal experiences will be assessed with a newly developed survey. As dispositional predictors, students' need for cognition and self-efficacy in psychological science will be assessed. In a structural equation model, students' learning experiences and personal dispositions will be examined as predictors of their development of scientific thinking. Commonalities and differences in predictive weights across universities will be tested. The project is aimed at contributing information for designing university environments to optimize the development of students' scientific thinking.

  20. A spatial exploration of informal trail networks within Great Falls Park, VA

    USGS Publications Warehouse

    Wimpey, Jeremy; Marion, Jeffrey L.

    2011-01-01

    Informal (visitor-created) trails represent a threat to the natural resources of protected natural areas around the globe. These trails can remove vegetation, displace wildlife, alter hydrology, alter habitat, spread invasive species, and fragment landscapes. This study examines informal and formal trails within Great Falls Park, VA, a sub-unit of the George Washington Memorial Parkway, managed by the U.S. National Park Service. The study sought to answer three specific questions: 1) Are the physical characteristics and topographic alignments of informal trails significantly different from those of formal trails? 2) Can landscape fragmentation metrics be used to summarize the relative impacts of formal and informal trail networks on a protected natural area? 3) What can we learn from examining the spatial distribution of informal trails within protected natural areas? Statistical comparisons between formal and informal trails in this park indicate that informal trails have less sustainable topographic alignments than their formal counterparts. Spatial summaries of the lineal and areal extent of, and fragmentation associated with, the trail networks by park management zone allow park management goals to be compared with the assessed attributes. Hot spot analyses highlight areas of high trail density within the park, and the findings provide insights regarding potential causes for the development of dense informal trail networks.

  1. Nurse manager succession planning: A cost-benefit analysis.

    PubMed

    Phillips, Tracy; Evans, Jennifer L; Tooley, Stephanie; Shirey, Maria R

    2018-03-01

    This commentary presents a cost-benefit analysis to advocate for the use of succession planning to mitigate the problems ensuing from nurse manager turnover. An estimated 75% of nurse managers will leave the workforce by 2020. Many benefits are associated with proactively identifying and developing internal candidates. Fewer than 7% of health care organisations have implemented formal leadership succession planning programmes. A cost-benefit analysis of a formal succession-planning programme from one hospital illustrates the benefits of the programme in their organisation and can be replicated easily. Assumptions of nursing manager succession planning cost-benefit analysis are identified and discussed. The succession planning exemplar demonstrates the integration of cost-benefit analysis principles. Comparing the costs of a formal nurse manager succession planning strategy with the status quo results in a positive cost-benefit ratio. The implementation of a formal nurse manager succession planning programme effectively reduces replacement costs and time to transition into the new role. This programme provides an internal pipeline of future leaders who will be more successful than external candidates. Using an actual cost-benefit analysis equips nurse managers with valuable evidence depicting succession planning as a viable business strategy. © 2017 John Wiley & Sons Ltd.

  2. EFL Teachers' Formal Assessment Practices Based on Exam Papers

    ERIC Educational Resources Information Center

    Kiliçkaya, Ferit

    2016-01-01

    This study reports initial findings from a small-scale qualitative study aimed at gaining insights into English language teachers' assessment practices in Turkey by examining the formal exam papers. Based on the technique of content analysis, formal exam papers were analyzed in terms of assessment items, language skills tested as well as the…

  3. A global estimate of the Earth's magnetic crustal thickness

    NASA Astrophysics Data System (ADS)

    Vervelidou, Foteini; Thébault, Erwan

    2014-05-01

    The Earth's lithosphere is considered to be magnetic only down to the Curie isotherm. Therefore the Curie isotherm can, in principle, be estimated by analysis of magnetic data. Here, we propose such an analysis in the spectral domain by means of a newly introduced regional spatial power spectrum. This spectrum is based on the Revised Spherical Cap Harmonic Analysis (R-SCHA) formalism (Thébault et al., 2006). We briefly discuss its properties and its relationship with the Spherical Harmonic spatial power spectrum. This relationship allows us to adapt any theoretical expression of the lithospheric field power spectrum expressed in Spherical Harmonic degrees to the regional formulation. We compared previously published statistical expressions (Jackson, 1994; Voorhies et al., 2002) to the recent lithospheric field models derived from the CHAMP and airborne measurements, and we developed a new statistical form for the power spectrum of the Earth's magnetic lithosphere that we think provides more consistent results. This expression depends on the mean magnetization, the mean crustal thickness and a power-law value that describes the amount of spatial correlation of the sources. In this study, we make combined use of the R-SCHA surface power spectrum and this statistical form. We conduct a series of regional spectral analyses for the entire Earth. For each region, we estimate the R-SCHA surface power spectrum of the NGDC-720 Spherical Harmonic model (Maus, 2010). We then fit each of these observational spectra to the statistical expression of the power spectrum of the Earth's lithosphere. By doing so, we estimate the long wavelengths of the magnetic crustal thickness on a global scale that are not accessible directly from the magnetic measurements due to the masking core field. We then discuss these results and compare them to the results we obtained by conducting a similar spectral analysis, this time in Cartesian coordinates, by means of a published statistical expression (Maus et al., 1997). We also compare our results to crustal thickness global maps derived by means of additional geophysical data (Purucker et al., 2002).

  4. Matching biomedical ontologies based on formal concept analysis.

    PubMed

    Zhao, Mengyi; Zhang, Songmao; Li, Weizhuo; Chen, Guowei

    2018-03-19

    The goal of ontology matching is to identify correspondences between entities from different yet overlapping ontologies so as to facilitate semantic integration, reuse and interoperability. As a well-developed mathematical model for analyzing individuals and structuring concepts, Formal Concept Analysis (FCA) has been applied to ontology matching (OM) tasks since the beginning of OM research, whereas ontological knowledge exploited in FCA-based methods is limited. This motivates the study in this paper, i.e., to empower FCA with as much ontological knowledge as possible for identifying mappings across ontologies. We propose a method based on Formal Concept Analysis to identify and validate mappings across ontologies, including one-to-one mappings, complex mappings and correspondences between object properties. Our method, called FCA-Map, incrementally generates a total of five types of formal contexts and extracts mappings from the lattices derived. First, the token-based formal context describes how class names, labels and synonyms share lexical tokens, leading to lexical mappings (anchors) across ontologies. Second, the relation-based formal context describes how classes are in taxonomic, partonomic and disjoint relationships with the anchors, leading to positive and negative structural evidence for validating the lexical matching. Third, the positive relation-based context can be used to discover structural mappings. Afterwards, the property-based formal context describes how object properties are used in axioms to connect anchor classes across ontologies, leading to property mappings. Last, the restriction-based formal context describes co-occurrence of classes across ontologies in anonymous ancestors of anchors, from which extended structural mappings and complex mappings can be identified. Evaluation on the Anatomy, the Large Biomedical Ontologies, and the Disease and Phenotype track of the 2016 Ontology Alignment Evaluation Initiative campaign demonstrates the effectiveness of FCA-Map and its competitiveness with the top-ranked systems. FCA-Map can achieve a better balance between precision and recall for large-scale domain ontologies through constructing multiple FCA structures, whereas it performs unsatisfactorily for smaller-sized ontologies with less lexical and semantic expression. Compared with other FCA-based OM systems, the study in this paper is more comprehensive as an attempt to push the envelope of the Formal Concept Analysis formalism in ontology matching tasks. Five types of formal contexts are constructed incrementally, and their derived concept lattices are used to cluster the commonalities among classes at the lexical and structural levels, respectively. Experiments on large, real-world domain ontologies show promising results and reveal the power of FCA.
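
    The basic FCA machinery on which such methods rest is the formal context (objects, attributes, and an incidence relation) and its closed (extent, intent) pairs, the formal concepts. The following is a minimal, self-contained sketch of that machinery on a toy token-based context; the class names and tokens are hypothetical and this is not FCA-Map's code.

```python
# Minimal sketch of Formal Concept Analysis on a toy token-based context,
# illustrating the kind of structure FCA-Map builds (not its actual code).
from itertools import combinations

# Hypothetical context: ontology classes (objects) x lexical tokens (attributes).
context = {
    "OntologyA:heart_valve": {"heart", "valve"},
    "OntologyB:cardiac_valve": {"cardiac", "valve"},
    "OntologyA:heart": {"heart"},
}
objects = list(context)
attributes = sorted(set().union(*context.values()))

def common_attributes(objs):
    """Attributes shared by every object in objs (the prime operator on objects)."""
    if not objs:
        return set(attributes)
    return set.intersection(*(context[o] for o in objs))

def objects_having(attrs):
    """Objects possessing every attribute in attrs (the prime operator on attributes)."""
    return {o for o in objects if attrs <= context[o]}

# Enumerate all formal concepts (extent, intent) by brute force: a pair is a
# concept exactly when it is closed under the two prime operators.
concepts = set()
for r in range(len(objects) + 1):
    for objs in combinations(objects, r):
        intent = common_attributes(set(objs))
        extent = objects_having(intent)
        concepts.add((frozenset(extent), frozenset(intent)))

for extent, intent in sorted(concepts, key=lambda c: len(c[0])):
    print(sorted(extent), "<->", sorted(intent))
```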

  5. Who Cares? Infant Educators' Responses to Professional Discourses of Care

    ERIC Educational Resources Information Center

    Davis, Belinda; Degotardi, Sheila

    2015-01-01

    This paper explores the construction of "care" in early childhood curriculum and practice. An increasing number of infants are attending formal early childhood settings in Australia (Australian Bureau of Statistics, 2011. "Childhood education and care, Australia, June 2011." (4402.0). Retrieved from…

  6. Tertiary Education and Training in Australia, 2010

    ERIC Educational Resources Information Center

    National Centre for Vocational Education Research (NCVER), 2012

    2012-01-01

    This publication presents information on tertiary education and training during 2010, including statistics on participation and outcomes. The definition of tertiary education and training adopted for this publication is formal study in vocational education and training (VET) and higher education, including enrolments in Australian Qualifications…

  7. Large deviation principle at work: Computation of the statistical properties of the exact one-point aperture mass

    NASA Astrophysics Data System (ADS)

    Reimberg, Paulo; Bernardeau, Francis

    2018-01-01

    We present a formalism based on the large deviation principle (LDP) applied to cosmological density fields, and more specifically to arbitrary functionals of density profiles, and we apply it to the derivation of the cumulant generating function and one-point probability distribution function (PDF) of the aperture mass (Map), a common observable for cosmic shear observations. We show that the LDP can indeed be used in practice for a much larger family of observables than previously envisioned, such as those built from continuous and nonlinear functionals of density profiles. Taking advantage of this formalism, we can extend previous results, which were based on crude definitions of the aperture mass, with top-hat windows and the use of the reduced shear approximation (replacing the reduced shear with the shear itself). We are thus able to quantify precisely how this latter approximation affects the Map statistical properties. In particular, we derive the corrective term for the skewness of the Map and reconstruct its one-point PDF.

  8. Apes are intuitive statisticians.

    PubMed

    Rakoczy, Hannes; Clüver, Annette; Saucke, Liane; Stoffregen, Nicole; Gräbener, Alice; Migura, Judith; Call, Josep

    2014-04-01

    Inductive learning and reasoning, as we use it both in everyday life and in science, is characterized by flexible inferences based on statistical information: inferences from populations to samples and vice versa. Many forms of such statistical reasoning have been found to develop late in human ontogeny, depending on formal education and language, and to be fragile even in adults. New revolutionary research, however, suggests that even preverbal human infants make use of intuitive statistics. Here, we conducted the first investigation of such intuitive statistical reasoning with non-human primates. In a series of 7 experiments, Bonobos, Chimpanzees, Gorillas and Orangutans drew flexible statistical inferences from populations to samples. These inferences, furthermore, were truly based on statistical information regarding the relative frequency distributions in a population, and not on absolute frequencies. Intuitive statistics in its most basic form is thus an evolutionarily more ancient rather than a uniquely human capacity. Copyright © 2014 Elsevier B.V. All rights reserved.
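
    The distinction the study turns on, inference from relative rather than absolute frequencies, can be made concrete with a two-line calculation; the item counts below are hypothetical and are only meant to illustrate the logic, not the experimental stimuli.

```python
# Why relative rather than absolute frequencies matter for population-to-sample
# inference: population B contains fewer preferred items in absolute terms, yet
# a random draw from it is more likely to yield a preferred item.
pop_A = {"preferred": 80, "neutral": 320}   # 400 items, 20% preferred
pop_B = {"preferred": 40, "neutral": 60}    # 100 items, 40% preferred

for name, pop in [("A", pop_A), ("B", pop_B)]:
    total = sum(pop.values())
    print(f"population {name}: P(preferred draw) = {pop['preferred'] / total:.2f}")
```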

  9. Galaxy Redshifts from Discrete Optimization of Correlation Functions

    NASA Astrophysics Data System (ADS)

    Lee, Benjamin C. G.; Budavári, Tamás; Basu, Amitabh; Rahman, Mubdi

    2016-12-01

    We propose a new method of constraining the redshifts of individual extragalactic sources based on celestial coordinates and their ensemble statistics. Techniques from integer linear programming (ILP) are utilized to optimize simultaneously for the angular two-point cross- and autocorrelation functions. Our novel formalism introduced here not only transforms the otherwise hopelessly expensive, brute-force combinatorial search into a linear system with integer constraints but also is readily implementable in off-the-shelf solvers. We adopt Gurobi, a commercial optimization solver, and use Python to build the cost function dynamically. The preliminary results on simulated data show potential for future applications to sky surveys by complementing and enhancing photometric redshift estimators. Our approach is the first application of ILP to astronomical analysis.
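
    To make the "linear system with integer constraints" concrete, here is a toy integer-linear-programming sketch: each source is assigned to exactly one redshift bin while a linear proxy cost is minimised. The real objective in the paper encodes the two-point cross- and autocorrelation functions and the authors use Gurobi; this sketch only shows how binary assignment variables and constraints are handed to an off-the-shelf solver (the open-source PuLP/CBC stack is used here as a stand-in), and the sources, bins and photometric estimates are hypothetical.

```python
# Toy ILP sketch: assign each source to one redshift bin, minimising a linear
# proxy cost. Not the paper's correlation-function objective.
import pulp

sources = ["s1", "s2", "s3"]
bins = [0.1, 0.3, 0.5]                              # hypothetical bin centres
photo_z = {"s1": 0.12, "s2": 0.28, "s3": 0.45}      # hypothetical prior estimates

prob = pulp.LpProblem("redshift_assignment", pulp.LpMinimize)

# x[s, b] = 1 if source s is assigned to bin b
x = {(s, b): pulp.LpVariable(f"x_{s}_{b}", cat="Binary")
     for s in sources for b in bins}

# Linear objective: total |photo_z - bin centre| over the chosen assignments.
prob += pulp.lpSum(abs(photo_z[s] - b) * x[s, b] for s in sources for b in bins)

# Each source goes into exactly one bin.
for s in sources:
    prob += pulp.lpSum(x[s, b] for b in bins) == 1

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for s in sources:
    chosen = [b for b in bins if x[s, b].value() == 1]
    print(s, "->", chosen[0])
```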

  10. Parametric inference for biological sequence analysis.

    PubMed

    Pachter, Lior; Sturmfels, Bernd

    2004-11-16

    One of the major successes in computational biology has been the unification, by using the graphical model formalism, of a multitude of algorithms for annotating and comparing biological sequences. Graphical models that have been applied to these problems include hidden Markov models for annotation, tree models for phylogenetics, and pair hidden Markov models for alignment. A single algorithm, the sum-product algorithm, solves many of the inference problems that are associated with different statistical models. This article introduces the polytope propagation algorithm for computing the Newton polytope of an observation from a graphical model. This algorithm is a geometric version of the sum-product algorithm and is used to analyze the parametric behavior of maximum a posteriori inference calculations for graphical models.
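
    The sum-product algorithm the article refers to is, for a hidden Markov model over a sequence, the familiar forward recursion. The sketch below runs that recursion for a two-state toy model; the states, transition and emission probabilities are hypothetical, and the polytope propagation of the article is the geometric analogue of this numeric computation, not reproduced here.

```python
# Minimal sum-product (forward algorithm) sketch for a two-state HMM over a
# DNA-like observation sequence; all parameter values are hypothetical.
import numpy as np

symbols = {"A": 0, "C": 1, "G": 2, "T": 3}

pi = np.array([0.6, 0.4])                        # initial state distribution
A = np.array([[0.9, 0.1],                        # state transition matrix
              [0.2, 0.8]])
B = np.array([[0.30, 0.20, 0.20, 0.30],          # emission probabilities
              [0.15, 0.35, 0.35, 0.15]])

def forward(seq):
    """Sum-product message passing along the chain: returns P(observations)."""
    obs = [symbols[c] for c in seq]
    alpha = pi * B[:, obs[0]]                    # messages into the first node
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]            # propagate, then multiply in evidence
    return alpha.sum()

print(forward("ACGCGT"))
```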

  11. Animal Social Network Theory Can Help Wildlife Conservation.

    PubMed

    Snijders, Lysanne; Blumstein, Daniel T; Stanley, Christina R; Franks, Daniel W

    2017-08-01

    Many animals preferentially associate with certain other individuals. This social structuring can influence how populations respond to changes to their environment, thus making network analysis a promising technique for understanding, predicting, and potentially manipulating population dynamics. Various network statistics can correlate with individual fitness components and key population-level processes, yet the logical role and formal application of animal social network theory for conservation and management have not been well articulated. We outline how understanding of direct and indirect relationships between animals can be profitably applied by wildlife managers and conservationists. By doing so, we aim to stimulate the development and implementation of practical tools for wildlife conservation and management and to inspire novel behavioral research in this field. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Control mechanisms for stochastic biochemical systems via computation of reachable sets.

    PubMed

    Lakatos, Eszter; Stumpf, Michael P H

    2017-08-01

    Controlling the behaviour of cells by rationally guiding molecular processes is an overarching aim of much of synthetic biology. Molecular processes, however, are notoriously noisy and frequently nonlinear. We present an approach to studying the impact of control measures on motifs of molecular interactions that addresses the problems faced in many biological systems: stochasticity, parameter uncertainty and nonlinearity. We show that our reachability analysis formalism can describe the potential behaviour of biological (naturally evolved as well as engineered) systems, and provides a set of bounds on their dynamics at the level of population statistics: for example, we can obtain the possible ranges of means and variances of mRNA and protein expression levels, even in the presence of uncertainty about model parameters.

  13. Control mechanisms for stochastic biochemical systems via computation of reachable sets

    PubMed Central

    Lakatos, Eszter

    2017-01-01

    Controlling the behaviour of cells by rationally guiding molecular processes is an overarching aim of much of synthetic biology. Molecular processes, however, are notoriously noisy and frequently nonlinear. We present an approach to studying the impact of control measures on motifs of molecular interactions that addresses the problems faced in many biological systems: stochasticity, parameter uncertainty and nonlinearity. We show that our reachability analysis formalism can describe the potential behaviour of biological (naturally evolved as well as engineered) systems, and provides a set of bounds on their dynamics at the level of population statistics: for example, we can obtain the possible ranges of means and variances of mRNA and protein expression levels, even in the presence of uncertainty about model parameters. PMID:28878957
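
    The flavour of the result, ranges of population statistics attainable under parameter uncertainty, can be illustrated with a brute-force scan of a simple birth-death mRNA model; this is not the reachability formalism of the paper, and the rate intervals are hypothetical. For this model the stationary copy-number distribution is Poisson, so mean and variance coincide at k/gamma.

```python
# Brute-force sketch (not the paper's reachability analysis): scan a box of
# uncertain parameters for a birth-death mRNA model, dm/dt = k - gamma*m, and
# report the attainable range of the stationary mean and variance.
import numpy as np

k_range = np.linspace(2.0, 6.0, 50)        # transcription rate (1/min), uncertain
gamma_range = np.linspace(0.1, 0.3, 50)    # degradation rate (1/min), uncertain

means = np.array([[k / g for g in gamma_range] for k in k_range])

print(f"stationary mean mRNA copy number in [{means.min():.1f}, {means.max():.1f}]")
print(f"stationary variance (Poisson)     in [{means.min():.1f}, {means.max():.1f}]")
```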

  14. ANALYSIS OF SEEING-INDUCED POLARIZATION CROSS-TALK AND MODULATION SCHEME PERFORMANCE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Casini, R.; De Wijn, A. G.; Judge, P. G.

    2012-09-20

    We analyze the generation of polarization cross-talk in Stokes polarimeters by atmospheric seeing, and its effects on the noise statistics of spectropolarimetric measurements for both single-beam and dual-beam instruments. We investigate the time evolution of seeing-induced correlations between different states of one modulation cycle and compare the response to these correlations of two popular polarization modulation schemes in a dual-beam system. Extension of the formalism to encompass an arbitrary number of modulation cycles enables us to compare our results with earlier work. Even though we discuss examples pertinent to solar physics, the general treatment of the subject and its fundamental results might be useful to a wider community.

  15. Error tolerance analysis of wave diagnostic based on coherent modulation imaging in high power laser system

    NASA Astrophysics Data System (ADS)

    Pan, Xingchen; Liu, Cheng; Zhu, Jianqiang

    2018-02-01

    Coherent modulation imaging, which provides fast convergence and high resolution from a single diffraction pattern, is a promising technique to satisfy the urgent demand for on-line multiple-parameter diagnostics with a single setup in high-power laser facilities (HPLF). However, the influence of noise on the final calculated parameters of interest has not been investigated yet. Based on a series of simulations with twenty different sampling beams generated from the practical parameters and performance of HPLF, a quantitative analysis based on statistical results was carried out for five different error sources. We found that the detector's background noise and high quantization error seriously affect the final accuracy, and that different parameters have different sensitivities to different noise sources. The simulation results and the corresponding analysis point to directions for further improving the final accuracy of parameter diagnostics, which is critically important for its formal application in the daily routines of HPLF.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Romano, J.D.; Woan, G.

    Data from the Laser Interferometer Space Antenna (LISA) is expected to be dominated by frequency noise from its lasers. However, the noise from any one laser appears more than once in the data and there are combinations of the data that are insensitive to this noise. These combinations, called time delay interferometry (TDI) variables, have received careful study and point the way to how LISA data analysis may be performed. Here we approach the problem from the direction of statistical inference, and show that these variables are a direct consequence of a principal component analysis of the problem. We present a formal analysis for a simple LISA model and show that there are eigenvectors of the noise covariance matrix that do not depend on laser frequency noise. Importantly, these orthogonal basis vectors correspond to linear combinations of TDI variables. As a result we show that the likelihood function for source parameters using LISA data can be based on TDI combinations of the data without loss of information.
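
    The key observation, that eigenvectors of the noise covariance orthogonal to the direction in which laser noise enters are insensitive to it, can be illustrated numerically. The three-channel model and noise levels below are purely illustrative, not the LISA model of the paper.

```python
# Toy sketch: if laser frequency noise enters the data only along a fixed
# direction v, eigenvectors of the noise covariance orthogonal to v are
# insensitive to that noise (analogous to TDI combinations).
import numpy as np

sigma_laser = 100.0      # dominant laser frequency noise (arbitrary units)
sigma_instr = 1.0        # uncorrelated instrumental noise
v = np.array([1.0, -1.0, 0.0])
v = v / np.linalg.norm(v)            # direction along which laser noise appears

cov = sigma_laser**2 * np.outer(v, v) + sigma_instr**2 * np.eye(3)

eigvals, eigvecs = np.linalg.eigh(cov)
for lam, vec in zip(eigvals, eigvecs.T):
    coupling = abs(vec @ v)          # ~0 means this combination cancels laser noise
    print(f"eigenvalue {lam:10.1f}   laser-noise coupling {coupling:.2f}")
```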

  17. Integration of expert knowledge and uncertainty in natural risk assessment

    NASA Astrophysics Data System (ADS)

    Baruffini, Mirko; Jaboyedoff, Michel

    2010-05-01

    Natural hazards occurring in alpine regions during the last decades have clearly shown the potential impacts of failures on the performance of infrastructure systems: interruptions of the Swiss railway power supply and closures of the Gotthard highway due to such events have increased the awareness of infrastructure vulnerability in Switzerland. This calls for a high level of surveillance and preservation along the transalpine lines. Traditional simulation models are only partially capable of predicting the behaviour of complex systems, and the protection strategies designed and implemented on their basis cannot mitigate the full spectrum of risk consequences. They are costly, and maximal protection is most probably not economically feasible. In addition, quantitative risk assessment approaches such as fault tree analysis, event tree analysis and equivalent annual fatality analysis rely heavily on statistical information. Collecting sufficient data on which to base a statistical probability of risk is costly and, in many situations, such data do not exist; thus, expert knowledge and experience or engineering judgement can be exploited to estimate risk qualitatively. To overcome the lack of statistics, we used models based on expert knowledge that produce qualitative predictions from linguistic appraisals, which are more expressive and natural in risk assessment. Fuzzy reasoning (FR) provides a mechanism of computing with words (Zadeh, 1965) for modelling qualitative human thought processes in the analysis of complex systems and decisions; uncertainty in predicting risk levels arises in such situations because no fully formalized knowledge is available. Another possibility is to use probabilities based on a triangular probability density function (T-PDF), which can follow the same flow chart as FR. We implemented the Swiss natural hazard recommendations with both FR and T-PDF probabilities in order to obtain hazard zoning and its uncertainties. We followed the same approach for each component of risk, i.e. hazard, vulnerability, elements at risk and exposure. This risk approach can be supported by a comprehensive use of several artificial intelligence (AI) technologies, for example: (1) GIS techniques; (2) FR or T-PDF for qualitative prediction of risk; and (3) multi-criteria evaluation for analysing weak points. The main advantages of FR or T-PDF include the ability to express knowledge that is not fully formalized, easy knowledge representation and acquisition, and self-updatability. The results show that such an approach points out quite wide zones of uncertainty. REFERENCES Zadeh L.A. 1965: Fuzzy Sets. Information and Control, 8:338-353.
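
    Both FR and the T-PDF approach rest on the same elementary building block: a triangular function over an expert's "low / most likely / high" appraisal, read either as a fuzzy membership or (after normalisation) as a probability density. The sketch below shows that building block only; the scale and values are hypothetical and this is not the implementation of the Swiss recommendations.

```python
# Triangular membership (or, after normalisation, probability density) over an
# expert estimate given as (low, mode, high); values are hypothetical.
def triangular(x, low, mode, high):
    """Degree to which x matches the expert appraisal (low, mode, high)."""
    if x <= low or x >= high:
        return 0.0
    if x <= mode:
        return (x - low) / (mode - low)
    return (high - x) / (high - mode)

# Expert appraisal of event intensity on a 0-10 scale: "around 6, between 4 and 9".
for x in [3, 5, 6, 8]:
    print(x, round(triangular(x, low=4, mode=6, high=9), 2))
```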

  18. Influence of formal maternal education on the use of maternity services in Enugu, Nigeria.

    PubMed

    Ikeako, L C; Onah, H E; Iloabachie, G C

    2006-01-01

    Although some previous studies have suggested formal maternal education as the most potent tool for reducing the mortality ratio in Nigeria, other studies found that the depressed Nigerian economy since 1986 has marginalised the benefits of education with the result that educated women stopped making use of existing health facilities because they could not afford the cost of health services. This study was carried out to determine the current influence of formal maternal education and other factors on the choice of place of delivery by pregnant women in Enugu, south-eastern Nigeria. It was a pre-tested interviewer-administered questionnaire study of women who delivered within 3 months before the date of data collection in the study area. In an increasing order of level of care, the outcome variable (place where the last delivery took place) was categorised into seven, with home deliveries representing the lowest category and private hospitals run by specialist obstetricians as the highest category. These were further sub-categorised into non-institutional deliveries and institutional deliveries. Maternal educational level was the main predictor variable. Other predictor variables were sociodemographic factors. Data analysis was by means of descriptive and inferential statistics including means, frequencies and chi2-tests at the 95% confidence (CI) level. Out of a total of 1,450 women to whom the questionnaires were administered, 1,095 women responded (a response rate of 75.5%). A total of 579 (52.9%) of the respondents delivered outside health institutions, while the remaining 516 (47.1%) delivered within health institutions. Regarding the educational levels of the respondents, 301 (27.5%) had no formal education; 410 (37.4%) had primary education; 148 (13.5%) secondary education and 236 (21.5%) post-secondary education. There was a significant positive correlation between the educational levels of the respondents and their husbands (r=0.86, p=0.000). With respect to occupational categories of the respondents, 88 (8.0%) of them belonged to occupational class I, 158 (14.4%) to occupational class II, 107 (9.8%) to occupational class III, 14 (1.3%) to occupational class IV and 728 to occupational class V. There was a significant positive correlation between the respondents' and their husbands' occupational levels (r=0.89, p=0.000). There were statistically significant associations between choice of institutional or non-institutional deliveries and respondents' educational level as well as place of residence (urban/rural), religion, tribe, marital status, occupational level, husband's occupational and educational levels, age and parity (p

  19. Medical statistics and hospital medicine: the case of the smallpox vaccination.

    PubMed

    Rusnock, Andrea

    2007-01-01

    Between 1799 and 1806, trials of vaccination to determine its safety and efficacy were undertaken in hospitals in London, Paris, Vienna, and Boston. These trials were among the first instances of formal hospital evaluations of a medical procedure and signal a growing acceptance of a relatively new approach to medical practice. These early evaluations of smallpox vaccination also relied on descriptive and quantitative accounts, as well as probabilistic analyses, and thus occupy a significant, yet hitherto unexamined, place in the history of medical statistics.

  20. The space of ultrametric phylogenetic trees.

    PubMed

    Gavryushkin, Alex; Drummond, Alexei J

    2016-08-21

    The reliability of a phylogenetic inference method from genomic sequence data is ensured by its statistical consistency. Bayesian inference methods produce a sample of phylogenetic trees from the posterior distribution given sequence data. Hence the question of statistical consistency of such methods is equivalent to the consistency of the summary of the sample. More generally, statistical consistency is ensured by the tree space used to analyse the sample. In this paper, we consider two standard parameterisations of phylogenetic time-trees used in evolutionary models: inter-coalescent interval lengths and absolute times of divergence events. For each of these parameterisations we introduce a natural metric space on ultrametric phylogenetic trees. We compare the introduced spaces with existing models of tree space, formulate several formal requirements that a metric space on phylogenetic trees must possess in order to be a satisfactory space for statistical analysis, and justify them. We show that only a few known constructions of the space of phylogenetic trees satisfy these requirements. However, our results suggest that these basic requirements are not enough to distinguish between the two metric spaces we introduce, and that the choice between them requires additional properties to be considered. In particular, the summary tree minimising the squared distance to the trees in the sample might differ between parameterisations. This suggests that further fundamental insight is needed into the problem of statistical consistency of phylogenetic inference methods. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  1. Formal reasoning about systems biology using theorem proving

    PubMed Central

    Hasan, Osman; Siddique, Umair; Tahar, Sofiène

    2017-01-01

    Systems biology provides the basis to understand the behavioral properties of complex biological organisms at different levels of abstraction. Traditionally, the analysis of systems-biology-based models of various diseases has been carried out by paper-and-pencil proofs and simulations. However, these methods cannot provide an accurate analysis, which is a serious drawback for the safety-critical domain of human medicine. In order to overcome these limitations, we propose a framework to formally analyze biological networks and pathways. In particular, we formalize the notion of reaction kinetics in higher-order logic and formally verify some of the commonly used reaction-based models of biological networks using the HOL Light theorem prover. Furthermore, we have ported our earlier formalization of Zsyntax, i.e., a deductive language for reasoning about biological networks and pathways, from HOL4 to the HOL Light theorem prover to make it compatible with the above-mentioned formalization of reaction kinetics. To illustrate the usefulness of the proposed framework, we present the formal analysis of three case studies, i.e., the pathway leading to TP53 phosphorylation, the pathway leading to the death of cancer stem cells, and tumor growth based on cancer stem cells, which is used for prognosis and future drug design to treat cancer patients. PMID:28671950

  2. Evaluation of consent for peer physical examination: students reflect on their clinical skills learning experience.

    PubMed

    Wearn, Andy; Bhoopatkar, Harsh

    2006-10-01

    Early clinical skills teaching often requires students to learn through examining one another. This model should acknowledge ethical, practical and individual issues, disclosure and identification of abnormalities. Consent to peer physical examination (PPE) is usually expected rather than discussed and sought. We sought to evaluate a formal written consent process for PPE and to explore students' views of this approach. A survey tool was designed and distributed to all years 2 and 3 students in the Auckland University medical programme (2004). Results were analysed using univariate statistics and thematic analysis. The response rate was 57% (146/258). Most students had read the participant information sheet prior to signing, with 78% giving consent. They had not felt coerced and the in-course experience matched the 'promise'. Comments included: PPE gave insights into the 'patient's world', encouraged peer learning and raised some professional issues. More than 95% of students took the examination role at least once (less likely if female, P = 0.002). Some European, Maori and Pacific students never took the role; all Asian students did at least once. Students preferred PPE in groups consisting of 'friends'. The task influenced group composition by sex (P < 0.0001) but not ethnicity. Students accept and support a formal consent process. PPE participation rates are similar to predictions. The experience must match the promises made. Formal preparation alone might have produced similar student outcomes. Female students are more selective about tasks undertaken. The influence of ethnicity and the effect on future behaviour and attitudes needs further exploration.

  3. The Uphill Battle of Performing Education Scholarship: Barriers Educators and Education Researchers Face.

    PubMed

    Jordan, Jaime; Coates, Wendy C; Clarke, Samuel; Runde, Daniel; Fowlkes, Emilie; Kurth, Jaqueline; Yarris, Lalena

    2018-05-01

    Educators and education researchers report that their scholarship is limited by lack of time, funding, mentorship, expertise, and reward. This study aims to evaluate these groups' perceptions regarding barriers to scholarship and potential strategies for success. Core emergency medicine (EM) educators and education researchers completed an online survey consisting of multiple-choice, 10-point Likert scale, and free-response items in 2015. Descriptive statistics were reported. We used qualitative analysis applying a thematic approach to free-response items. A total of 204 educators and 42 education researchers participated. Education researchers were highly productive: 19/42 reported more than 20 peer-reviewed education scholarship publications on their curricula vitae. In contrast, 68/197 educators reported no education publications within five years. Only a minority, 61/197 had formal research training compared to 25/42 education researchers. Barriers to performing research for both groups were lack of time, competing demands, lack of support, lack of funding, and challenges achieving scientifically rigorous methods and publication. The most common motivators identified were dissemination of knowledge, support of evidence-based practices, and promotion. Respondents advised those who seek greater education research involvement to pursue mentorship, formal research training, collaboration, and rigorous methodological standards. The most commonly cited barriers were lack of time and competing demands. Stakeholders were motivated by the desire to disseminate knowledge, support evidence-based practices, and achieve promotion. Suggested strategies for success included formal training, mentorship, and collaboration. This information may inform interventions to support educators in their scholarly pursuits and improve the overall quality of education research in EM.

  4. Occupational injuries identified by an emergency department based injury surveillance system in Nicaragua

    PubMed Central

    Noe, R; Rocha, J; Clavel-Arcas, C; Aleman, C; Gonzales, M; Mock, C

    2004-01-01

    Objectives: To identify and describe the work related injuries in both the formal and informal work sectors captured in an emergency department based injury surveillance system in Managua, Nicaragua. Setting: Urban emergency department in Managua, Nicaragua serving 200–300 patients per day. Methods: Secondary analysis from the surveillance system data. All cases indicating an injury while working and seen for treatment at the emergency department between 1 August 2001 and 31 July 2002 were included. There was no exclusion based on place of occurrence (home, work, school), age, or gender. Results: There were 3801 work related injuries identified, which accounted for 18.6% of the total 20 425 injuries captured by the surveillance system. Twenty-seven work related fatalities were recorded, compared with the 1998 International Labor Organization statistic of 25 occupational fatalities for all of Nicaragua. Injuries occurring outside of a formal work location accounted for more than 60% of the work related injuries. Almost half of these occurred at home, while 19% occurred on the street. The leading mechanisms for work related injuries were falls (30%), blunt objects (28%), and stabs/cuts (23%). Falls were by far the most severe mechanism in the study, causing 37% of the work related deaths and more than half of the fractures. Conclusions: Occupational injuries are grossly underreported in Nicaragua. This study demonstrated that an emergency department can be a data source for work related injuries in developing countries because it captures both the formal and informal workforce injuries. Fall prevention initiatives could significantly reduce the magnitude and severity of occupational injuries in Managua, Nicaragua. PMID:15314050

  5. [Formal sample size calculation and its limited validity in animal studies of medical basic research].

    PubMed

    Mayer, B; Muche, R

    2013-01-01

    Animal studies are highly relevant for basic medical research, although their usage is discussed controversially in public. Thus, an optimal sample size for these projects should be aimed at from a biometrical point of view. Statistical sample size calculation is usually the appropriate methodology in planning medical research projects. However, required information is often not valid or only available during the course of an animal experiment. This article critically discusses the validity of formal sample size calculation for animal studies. Within the discussion, some requirements are formulated to fundamentally regulate the process of sample size determination for animal experiments.
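
    For readers unfamiliar with what a formal sample size calculation involves, the following is a minimal example for a two-group comparison with a t-test; the standardised effect size, significance level and target power are hypothetical planning values, and, as the article stresses, such inputs are often the weak point in animal studies.

```python
# Example of a formal sample size calculation: animals per group for a
# two-sample t-test, given assumed planning values (all hypothetical).
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=1.2,   # assumed Cohen's d from pilot data
                                   alpha=0.05,
                                   power=0.8,
                                   alternative="two-sided")
print(f"required animals per group: {n_per_group:.1f} (round up)")
```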

  6. Collective Bargaining: Its Impact on Educational Cost.

    ERIC Educational Resources Information Center

    Atherton, P. J.

    Since the Ontario (Canada) legislation in 1975 that formalized collective bargaining for teachers, public concern has focused on collective bargaining as the possible cause of recent enrollment declines and increases in schooling costs. However, according to Ontario provincial statistics, enrollment in elementary schools had begun to decline…

  7. Heuristic Elements of Plausible Reasoning.

    ERIC Educational Resources Information Center

    Dudczak, Craig A.

    At least some of the reasoning processes involved in argumentation rely on inferences which do not fit within the traditional categories of inductive or deductive reasoning. The reasoning processes involved in plausibility judgments have neither the formal certainty of deduction nor the imputed statistical probability of induction. When utilizing…

  8. Building Intuitions about Statistical Inference Based on Resampling

    ERIC Educational Resources Information Center

    Watson, Jane; Chance, Beth

    2012-01-01

    Formal inference, which makes theoretical assumptions about distributions and applies hypothesis testing procedures with null and alternative hypotheses, is notoriously difficult for tertiary students to master. The debate about whether this content should appear in Years 11 and 12 of the "Australian Curriculum: Mathematics" has gone on…

  9. Confirmatory and Competitive Evaluation of Alternative Gene-Environment Interaction Hypotheses

    ERIC Educational Resources Information Center

    Belsky, Jay; Pluess, Michael; Widaman, Keith F.

    2013-01-01

    Background: Most gene-environment interaction (GXE) research, though based on clear, vulnerability-oriented hypotheses, is carried out using exploratory rather than hypothesis-informed statistical tests, limiting power and making formal evaluation of competing GXE propositions difficult. Method: We present and illustrate a new regression technique…

  10. Love and Sex: Can We Talk About That in School?

    ERIC Educational Resources Information Center

    Vance, Paul C.

    1985-01-01

    Gives statistical information on the "national epidemic" of teenage sexual activity and pregnancy and its consequences. Discusses social causes of this problem. Proposes that schools can help solve the problem by providing a formal sex education curriculum for pupils in kindergarten through grade 12. (CB)

  11. Approaching Bose-Einstein Condensation

    ERIC Educational Resources Information Center

    Ferrari, Loris

    2011-01-01

    Bose-Einstein condensation (BEC) is discussed at the level of an advanced course of statistical thermodynamics, clarifying some formal and physical aspects that are usually not covered by the standard pedagogical literature. The non-conventional approach adopted starts by showing that the continuum limit, in certain cases, cancels out the crucial…

  12. Formal verification of an oral messages algorithm for interactive consistency

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1992-01-01

    The formal specification and verification of an algorithm for Interactive Consistency based on the Oral Messages algorithm for Byzantine Agreement is described. We compare our treatment with that of Bevier and Young, who presented a formal specification and verification for a very similar algorithm. Unlike Bevier and Young, who observed that 'the invariant maintained in the recursive subcases of the algorithm is significantly more complicated than is suggested by the published proof' and who found its formal verification 'a fairly difficult exercise in mechanical theorem proving,' our treatment is very close to the previously published analysis of the algorithm, and our formal specification and verification are straightforward. This example illustrates how delicate choices in the formulation of the problem can have significant impact on the readability of its formal specification and on the tractability of its formal verification.

  13. Probabilistic Graphical Model Representation in Phylogenetics

    PubMed Central

    Höhna, Sebastian; Heath, Tracy A.; Boussau, Bastien; Landis, Michael J.; Ronquist, Fredrik; Huelsenbeck, John P.

    2014-01-01

    Recent years have seen a rapid expansion of the model space explored in statistical phylogenetics, emphasizing the need for new approaches to statistical model representation and software development. Clear communication and representation of the chosen model is crucial for: (i) reproducibility of an analysis, (ii) model development, and (iii) software design. Moreover, a unified, clear and understandable framework for model representation lowers the barrier for beginners and nonspecialists to grasp complex phylogenetic models, including their assumptions and parameter/variable dependencies. Graphical modeling is a unifying framework that has gained in popularity in the statistical literature in recent years. The core idea is to break complex models into conditionally independent distributions. The strength lies in the comprehensibility, flexibility, and adaptability of this formalism, and the large body of computational work based on it. Graphical models are well-suited to teach statistical models, to facilitate communication among phylogeneticists and in the development of generic software for simulation and statistical inference. Here, we provide an introduction to graphical models for phylogeneticists and extend the standard graphical model representation to the realm of phylogenetics. We introduce a new graphical model component, tree plates, to capture the changing structure of the subgraph corresponding to a phylogenetic tree. We describe a range of phylogenetic models using the graphical model framework and introduce modules to simplify the representation of standard components in large and complex models. Phylogenetic model graphs can be readily used in simulation, maximum likelihood inference, and Bayesian inference using, for example, Metropolis–Hastings or Gibbs sampling of the posterior distribution. [Computation; graphical models; inference; modularization; statistical phylogenetics; tree plate.] PMID:24951559
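
    The "core idea ... to break complex models into conditionally independent distributions" is the standard factorisation of a directed graphical model over the parents of each node, which the phylogenetic graphical models (including the tree plates introduced here) exploit:

```latex
% Joint density of a directed graphical model, factorised over the parents
% pa(x_i) of each node.
p(x_1, \dots, x_n) \;=\; \prod_{i=1}^{n} p\!\left(x_i \mid \mathrm{pa}(x_i)\right)
```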

  14. Enrichment analysis in high-throughput genomics - accounting for dependency in the NULL.

    PubMed

    Gold, David L; Coombes, Kevin R; Wang, Jing; Mallick, Bani

    2007-03-01

    Translating the overwhelming amount of data generated in high-throughput genomics experiments into biologically meaningful evidence, which may for example point to a series of biomarkers or hint at a relevant pathway, is a matter of great interest in bioinformatics these days. Genes showing similar experimental profiles, it is hypothesized, share biological mechanisms that if understood could provide clues to the molecular processes leading to pathological events. It is the topic of further study to learn if or how a priori information about the known genes may serve to explain coexpression. One popular method of knowledge discovery in high-throughput genomics experiments, enrichment analysis (EA), seeks to infer if an interesting collection of genes is 'enriched' for a particular set of a priori Gene Ontology Consortium (GO) classes. For the purposes of statistical testing, the conventional methods offered in EA software implicitly assume independence between the GO classes. Genes may be annotated for more than one biological classification, and therefore the resulting test statistics of enrichment between GO classes can be highly dependent if the overlapping gene sets are relatively large. There is a need to formally determine if conventional EA results are robust to the independence assumption. We derive the exact null distribution for testing enrichment of GO classes by relaxing the independence assumption using well-known statistical theory. In applications with publicly available data sets, our test results are similar to the conventional approach which assumes independence. We argue that the independence assumption is not detrimental.
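
    The conventional per-class test whose across-class independence assumption the paper examines is typically a hypergeometric (Fisher-type) test. A minimal sketch of that single-class test follows; the counts are hypothetical.

```python
# Conventional enrichment test for one GO class: 40 of 500 genes on the array
# belong to the class, and 12 of the 50 "interesting" genes fall in it.
from scipy.stats import hypergeom

N, K, n, k = 500, 40, 50, 12                # population, class size, sample, overlap
p_enriched = hypergeom.sf(k - 1, N, K, n)   # P(overlap >= k) under the null
print(f"enrichment p-value: {p_enriched:.4f}")
```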

  15. Clinical significance in nursing research: A discussion and descriptive analysis.

    PubMed

    Polit, Denise F

    2017-08-01

    It is widely understood that statistical significance should not be equated with clinical significance, but the topic of clinical significance has not received much attention in the nursing literature. By contrast, interest in conceptualizing and operationalizing clinical significance has been a "hot topic" in other health care fields for several decades. The major purpose of this paper is to briefly describe recent advances in defining and quantifying clinical significance. The overview covers both group-level indicators of clinical significance (e.g., effect size indexes), and individual-level benchmarks (e.g., the minimal important change index). A secondary purpose is to describe the extent to which developments in clinical significance have penetrated the nursing literature. A descriptive analysis of a sample of primary research articles published in three high-impact nursing research journals in 2016 was undertaken. A total of 362 articles were electronically searched for terms relating to statistical and clinical significance. Of the 362 articles, 261 were reports of quantitative studies, the vast majority of which (93%) included a formal evaluation of the statistical significance of the results. By contrast, the term "clinical significance" or related surrogate terms were found in only 33 papers, and most often the term was used informally, without explicit definition or assessment. Raising consciousness about clinical significance should be an important priority among nurse researchers. Several recommendations are offered to improve the visibility and salience of clinical significance in nursing science. Copyright © 2017 Elsevier Ltd. All rights reserved.
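
    The contrast the paper draws can be made concrete with the two kinds of indicator it surveys: a group-level effect size and an individual-level benchmark. The sketch below computes both for hypothetical symptom scores and a hypothetical minimal important change (MIC) threshold; it is an illustration of the concepts, not an analysis from the paper.

```python
# Statistical vs clinical significance: a paired t-test and group-level effect
# size (Cohen's d) alongside an individual-level count of patients reaching a
# minimal important change (MIC). All numbers are hypothetical.
import numpy as np
from scipy import stats

baseline  = np.array([52, 48, 55, 60, 47, 51, 58, 50, 49, 54], dtype=float)
follow_up = np.array([47, 45, 50, 52, 44, 49, 53, 46, 45, 50], dtype=float)
change = baseline - follow_up            # improvement on a symptom scale

t, p = stats.ttest_rel(baseline, follow_up)
d = change.mean() / change.std(ddof=1)   # Cohen's d of the within-person change

MIC = 5.0                                # hypothetical minimal important change
responders = int((change >= MIC).sum())

print(f"paired t-test p = {p:.4f}, Cohen's d = {d:.2f}")
print(f"patients reaching the MIC: {responders}/{len(change)}")
```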

  16. Shared Governance in the Community College: An Analysis of Formal Authority in Collective Bargaining Agreements

    ERIC Educational Resources Information Center

    McDermott, Linda A.

    2012-01-01

    This qualitative study examines shared governance in Washington State's community and technical colleges and provides an analysis of faculty participation in governance based on formal authority in collective bargaining agreements. Contracts from Washington's thirty community and technical college districts were reviewed in order to identify in…

  17. Formalization and Analysis of Reasoning by Assumption

    ERIC Educational Resources Information Center

    Bosse, Tibor; Jonker, Catholijn M.; Treur, Jan

    2006-01-01

    This article introduces a novel approach for the analysis of the dynamics of reasoning processes and explores its applicability for the reasoning pattern called reasoning by assumption. More specifically, for a case study in the domain of a Master Mind game, it is shown how empirical human reasoning traces can be formalized and automatically…

  18. Precision, Reliability, and Effect Size of Slope Variance in Latent Growth Curve Models: Implications for Statistical Power Analysis

    PubMed Central

    Brandmaier, Andreas M.; von Oertzen, Timo; Ghisletta, Paolo; Lindenberger, Ulman; Hertzog, Christopher

    2018-01-01

    Latent Growth Curve Models (LGCM) have become a standard technique to model change over time. Prediction and explanation of inter-individual differences in change are major goals in lifespan research. The major determinants of statistical power to detect individual differences in change are the magnitude of true inter-individual differences in linear change (LGCM slope variance), design precision, alpha level, and sample size. Here, we show that design precision can be expressed as the inverse of effective error. Effective error is determined by instrument reliability and the temporal arrangement of measurement occasions. However, it also depends on another central LGCM component, the variance of the latent intercept and its covariance with the latent slope. We derive a new reliability index for LGCM slope variance—effective curve reliability (ECR)—by scaling slope variance against effective error. ECR is interpretable as a standardized effect size index. We demonstrate how effective error, ECR, and statistical power for a likelihood ratio test of zero slope variance formally relate to each other and how they function as indices of statistical power. We also provide a computational approach to derive ECR for arbitrary intercept-slope covariance. With practical use cases, we argue for the complementary utility of the proposed indices of a study's sensitivity to detect slope variance when making a priori longitudinal design decisions or communicating study designs. PMID:29755377
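
    As a hedged sketch only (the paper's exact definition should be consulted), one natural reliability-type reading of "scaling slope variance against effective error" is the ratio

```latex
% Sketch of a reliability-type reading of ECR; sigma^2_slope is the true
% inter-individual variance in linear change and sigma^2_eff the effective error.
\mathrm{ECR} \;=\; \frac{\sigma^2_{\text{slope}}}{\sigma^2_{\text{slope}} + \sigma^2_{\text{eff}}}
```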

  19. The Verification-based Analysis of Reliable Multicast Protocol

    NASA Technical Reports Server (NTRS)

    Wu, Yunqing

    1996-01-01

    Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP Multicasting. In this paper, we develop formal models for RMP using existing automatic verification systems, and perform verification-based analysis on the formal RMP specifications. We also use the formal models of the RMP specifications to generate a test suite for conformance testing of the RMP implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress between the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding for the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.

  20. Improving accuracy and power with transfer learning using a meta-analytic database.

    PubMed

    Schwartz, Yannick; Varoquaux, Gaël; Pallier, Christophe; Pinel, Philippe; Poline, Jean-Baptiste; Thirion, Bertrand

    2012-01-01

    Typical cohorts in brain imaging studies are not large enough for systematic testing of all the information contained in the images. To build testable working hypotheses, investigators thus rely on analysis of previous work, sometimes formalized in a so-called meta-analysis. In brain imaging, this approach underlies the specification of regions of interest (ROIs) that are usually selected on the basis of the coordinates of previously detected effects. In this paper, we propose to use a database of images, rather than coordinates, and frame the problem as transfer learning: learning a discriminant model on a reference task to apply it to a different but related new task. To facilitate statistical analysis of small cohorts, we use a sparse discriminant model that selects predictive voxels on the reference task and thus provides a principled procedure to define ROIs. The benefits of our approach are twofold. First it uses the reference database for prediction, i.e., to provide potential biomarkers in a clinical setting. Second it increases statistical power on the new task. We demonstrate on a set of 18 pairs of functional MRI experimental conditions that our approach gives good prediction. In addition, on a specific transfer situation involving different scanners at different locations, we show that voxel selection based on transfer learning leads to higher detection power on small cohorts.
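
    The two-step idea, select predictive voxels with a sparse discriminant model on a large reference task, then analyse only those voxels on the small new task, can be sketched with synthetic data; this is not the paper's pipeline, and the L1-penalised logistic regression below merely stands in for whatever sparse discriminant model is used.

```python
# Transfer-learning sketch: L1-penalised model on a large reference task selects
# voxels; the small new task is analysed within that transferred ROI only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_voxels = 200

# Reference task: large cohort, signal carried by the first 10 voxels.
X_ref = rng.normal(size=(300, n_voxels))
y_ref = (X_ref[:, :10].sum(axis=1) + rng.normal(size=300)) > 0

sparse_model = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
sparse_model.fit(X_ref, y_ref)
roi = np.flatnonzero(sparse_model.coef_[0])      # voxels kept by the sparse model
print(f"{roi.size} voxels selected as the ROI")

# New, small-cohort task: fit and score only within the transferred ROI.
X_new = rng.normal(size=(20, n_voxels))
y_new = (X_new[:, :10].sum(axis=1) + rng.normal(size=20)) > 0
small_model = LogisticRegression().fit(X_new[:, roi], y_new)
print("in-sample accuracy on the small task:", small_model.score(X_new[:, roi], y_new))
```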

  1. EFFICIENTLY ESTABLISHING CONCEPTS OF INFERENTIAL STATISTICS AND HYPOTHESIS DECISION MAKING THROUGH CONTEXTUALLY CONTROLLED EQUIVALENCE CLASSES

    PubMed Central

    Fienup, Daniel M; Critchfield, Thomas S

    2010-01-01

    Computerized lessons that reflect stimulus equivalence principles were used to teach college students concepts related to inferential statistics and hypothesis decision making. Lesson 1 taught participants concepts related to inferential statistics, and Lesson 2 taught them to base hypothesis decisions on a scientific hypothesis and the direction of an effect. Lesson 3 taught the conditional influence of inferential statistics over decisions regarding the scientific and null hypotheses. Participants entered the study with low scores on the targeted skills and left the study demonstrating a high level of accuracy on these skills, which involved mastering more relations than were taught formally. This study illustrates the efficiency of equivalence-based instruction in establishing academic skills in sophisticated learners. PMID:21358904

  2. Testing the statistical compatibility of independent data sets

    NASA Astrophysics Data System (ADS)

    Maltoni, M.; Schwetz, T.

    2003-08-01

    We discuss a goodness-of-fit method which tests the compatibility between statistically independent data sets. The method gives sensible results even in cases where the χ2 minima of the individual data sets are very low or when several parameters are fitted to a large number of data points. In particular, it avoids the problem that a possible disagreement between data sets becomes diluted by data points which are insensitive to the crucial parameters. A formal derivation of the probability distribution function for the proposed test statistics is given, based on standard theorems of statistics. The application of the method is illustrated on data from neutrino oscillation experiments, and its complementarity to the standard goodness-of-fit is discussed.
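
    As a sketch of the construction the abstract describes (assuming the usual parameter goodness-of-fit form; the paper gives the formal derivation and degrees of freedom), the compatibility statistic compares the global chi-square minimum with the sum of the minima reached by each data set separately:

```latex
% Compatibility ("parameter goodness-of-fit") statistic: global minimum minus
% the sum of per-data-set minima; P_i is the number of parameters to which data
% set i is sensitive and P the total number of fitted parameters.
\bar{\chi}^2 \;=\; \chi^2_{\min,\,\mathrm{global}} \;-\; \sum_{i} \chi^2_{\min,\,i},
\qquad
\bar{\chi}^2 \sim \chi^2\!\Bigl(\textstyle\sum_i P_i - P\Bigr)
```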

  3. What Influences Recommendations Issued by the Agency for Health Technology Assessment in Poland? A Glimpse Into Decision Makers' Preferences.

    PubMed

    Niewada, Maciej; Polkowska, Małgorzata; Jakubczyk, Michał; Golicki, Dominik

    This study aimed to evaluate the factors that are associated with positive (supporting public funding) and negative recommendations of the Agency for Health Technology Assessment in Poland. Two independent analysts reviewed all the recommendations publicly available online before October 7, 2011. For each recommendation, predefined decision rationales, that is, clinical efficacy, safety, cost-effectiveness, and formal aspects, were sought, either advocating or discouraging the public financing. In the analysis, we used descriptive statistics and a logistic regression model so as to identify the association between predefined criteria and the recommendation being positive. We identified 344 recommendations-218 positive (62.8%) and 126 negative (37.2%). Negative recommendations were better justified and also the comments were less ambiguous in accordance with the recommendation (except for clinical efficacy). In general, the specified criteria supported the decision (either positive or negative) in 209 (60.8%), 107 (31.1%), 124 (36.0%), 96 (27.9%), and 61 (17.7%) recommendations, respectively, and ran contrary to the actual decision in the remaining ones. Threshold values for either cost-effectiveness or budget impact distinguishing positive from negative recommendations could not be specified. The following parameters reached statistical significance in logistic regression: clinical efficacy (both explicitly positive and explicitly negative evaluations impacted in opposite directions), lack of impact on hard end points, unfavorable safety profile, cost-effectiveness results, and formal shortcomings (all reduced the probability of a positive recommendation). Decision making of the Agency for Health Technology Assessment in Poland is multicriterial, and its results cannot be easily decomposed into simple associations or easily predicted. Still, efficacy and safety seem to contribute most to final recommendations. Copyright © 2013, International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc.

  4. Testing New Physics with the Cosmic Microwave Background

    NASA Astrophysics Data System (ADS)

    Gluscevic, Vera

    2013-01-01

    In my thesis work, I have developed and applied tests of new fundamental physics that utilize high-precision CMB polarization measurements. I especially focused on a wide class of dark energy models that propose existence of new scalar fields to explain accelerated expansion of the Universe. Such fields naturally exhibit a weak interaction with photons, giving rise to "cosmic birefringence"---a rotation of the polarization plane of light traveling cosmological distances, which alters the statistics of the CMB fluctuations in the sky by inducing a characteristic B-mode polarization. A birefringent rotation of the CMB would be smoking-gun evidence that dark energy is a dynamical component rather than a cosmological constant, while its absence gives clues about the allowed regions of the parameter space for new models. I developed a full-sky formalism to search for cosmic birefringence by cross-correlating CMB temperature and polarization maps, after allowing for the rotation angle to vary across the sky. With my collaborators, I also proposed a cross-correlation of the rotation-angle estimator with the CMB temperature as a novel statistical probe which can boost signal-to-noise in the case of marginal detection and help disentangle the underlying physical models. I then investigated the degeneracy between the rotation signal and the signals from other exotic scenarios that induce a similar B-mode polarization signature, such as chiral primordial gravitational waves, and demonstrated that these effects are completely separable. Finally, I applied this formalism to WMAP-7 data and derived the first CMB constraint on the power spectrum of the birefringent-rotation angle and presented forecasts for future experiments. To demonstrate the value of this analysis method beyond the search for direction-dependent cosmic birefringence, I have also used it to probe patchy screening from the epoch of cosmic reionization with WMAP-7 data.

  5. Extracting Prior Distributions from a Large Dataset of In-Situ Measurements to Support SWOT-based Estimation of River Discharge

    NASA Astrophysics Data System (ADS)

    Hagemann, M.; Gleason, C. J.

    2017-12-01

    The upcoming (2021) Surface Water and Ocean Topography (SWOT) NASA satellite mission aims, in part, to estimate discharge on major rivers worldwide using reach-scale measurements of stream width, slope, and height. Current formalizations of channel and floodplain hydraulics are insufficient to fully constrain this problem mathematically, resulting in an infinitely large solution set for any set of satellite observations. Recent work has reformulated this problem in a Bayesian statistical setting, in which the likelihood distributions derive directly from hydraulic flow-law equations. When coupled with prior distributions on unknown flow-law parameters, this formulation probabilistically constrains the parameter space, and results in a computationally tractable description of discharge. Using a curated dataset of over 200,000 in-situ acoustic Doppler current profiler (ADCP) discharge measurements from over 10,000 USGS gaging stations throughout the United States, we developed empirical prior distributions for flow-law parameters that are not observable by SWOT, but that are required in order to estimate discharge. This analysis quantified prior uncertainties on quantities including cross-sectional area, at-a-station hydraulic geometry width exponent, and discharge variability, that are dependent on SWOT-observable variables including reach-scale statistics of width and height. When compared against discharge estimation approaches that do not use this prior information, the Bayesian approach using ADCP-derived priors demonstrated consistently improved performance across a range of performance metrics. This Bayesian approach formally transfers information from in-situ gaging stations to remote-sensed estimation of discharge, in which the desired quantities are not directly observable. Further investigation using large in-situ datasets is therefore a promising way forward in improving satellite-based estimates of river discharge.
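
    The prior-extraction step can be sketched by fitting a simple parametric distribution to an in-situ sample of a SWOT-unobservable quantity such as cross-sectional area. The lognormal form and the synthetic "ADCP" sample below are illustrative assumptions, not the authors' actual fits.

```python
import numpy as np
from scipy import stats

# Stand-in for ADCP-derived cross-sectional areas (m^2); the real dataset has >200,000 measurements
rng = np.random.default_rng(1)
areas = rng.lognormal(mean=5.0, sigma=1.2, size=5000)

# Fit a lognormal prior for cross-sectional area
shape, loc, scale = stats.lognorm.fit(areas, floc=0)
mu, sigma = np.log(scale), shape
print(f"empirical prior: log(A) ~ Normal(mu={mu:.2f}, sigma={sigma:.2f})")
```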

  6. [The workplace-based learning: a main paradigm of an effective continuing medical education].

    PubMed

    Lelli, Maria Barbara

    2010-01-01

    On the strength of the literature analysis and the Emilia-Romagna Region experience, we suggest a reflection on workplace-based learning that goes beyond the analysis of the effectiveness of specific didactic methodologies and aspects related to Continuing Medical Education. The health education and training issue is viewed from a wider perspective that integrates the three learning dimensions (formal, non-formal and informal). In such a perspective, workplace-based learning becomes an essential paradigm to reshape the explicit knowledge conveyed in formal contexts and to emphasize informal contexts where innovation is generated.

  7. Beginning Teacher Induction: A Report on Beginning Teacher Effectiveness and Retention.

    ERIC Educational Resources Information Center

    Serpell, Zewelanji; Bozeman, Leslie A.

    National statistics show a rise in the number of beginning teachers undergoing formal induction in their first year of teaching. This report discusses the effectiveness of induction programs and resulting outcomes for beginning teacher retention, beginning teacher effectiveness, and mentor participation. The various components of induction…

  8. Statistical Knowledge and Learning in Phonology

    ERIC Educational Resources Information Center

    Dunbar, Ewan Michael

    2013-01-01

    This dissertation deals with the theory of the phonetic component of grammar in a formal probabilistic inference framework: (1) it has been recognized since the beginning of generative phonology that some language-specific phonetic implementation is actually context-dependent, and thus it can be said that there are gradient "phonetic…

  9. Mathematical Literacy--It's Become Fundamental

    ERIC Educational Resources Information Center

    McCrone, Sharon Soucy; Dossey, John A.

    2007-01-01

    The rising tide of numbers and statistics in daily life signals a need for a fundamental broadening of the concept of literacy: mathematical literacy assuming a coequal role in the curriculum alongside language-based literacy. Mathematical literacy is not about studying higher levels of formal mathematics, but about making math relevant and…

  10. Developing Sensitivity to Subword Combinatorial Orthographic Regularity (SCORe): A Two-Process Framework

    ERIC Educational Resources Information Center

    Mano, Quintino R.

    2016-01-01

    Accumulating evidence suggests that literacy acquisition involves developing sensitivity to the statistical regularities of the textual environment. To organize accumulating evidence and help guide future inquiry, this article integrates data from disparate fields of study and formalizes a new two-process framework for developing sensitivity to…

  11. Prison Clinicians' Perceptions of Antisocial Personality Disorder as a Formal Diagnosis.

    ERIC Educational Resources Information Center

    Stevens, Gail Flint

    1994-01-01

    Surveyed and interviewed 53 clinicians who work with prison inmates. Results indicated that clinicians used diagnosis of antisocial personality disorder liberally among inmates and felt majority of inmates could be so diagnosed. Large minority of clinicians went beyond Diagnostic and Statistical Manual of Mental Disorders criteria and reported…

  12. Ethical Reasoning Instruction in Non-Ethics Business Courses: A Non-Intrusive Approach

    ERIC Educational Resources Information Center

    Wilhelm, William J.

    2010-01-01

    This article discusses four confirmatory studies designed to corroborate findings from prior developmental research which yielded statistically significant improvements in student moral reasoning when specific instructional strategies and content materials were utilized in non-ethics business courses by instructors not formally trained in business…

  13. The Lay Concept of Childhood Mental Disorder

    ERIC Educational Resources Information Center

    Giummarra, Melita J.; Haslam, Nick

    2005-01-01

    The structure of lay people's concepts of childhood mental disorder was investigated in a questionnaire study and examined for convergence with the Diagnostic and Statistical Manual (DSM-IV). Eighty-four undergraduates who had no formal education in abnormal psychology rated 54 conditions--36 DSM-IV childhood disorders and 18 non-disorders--on…

  14. From Mere Coincidences to Meaningful Discoveries

    ERIC Educational Resources Information Center

    Griffiths, Thomas L.; Tenenbaum, Joshua B.

    2007-01-01

    People's reactions to coincidences are often cited as an illustration of the irrationality of human reasoning about chance. We argue that coincidences may be better understood in terms of rational statistical inference, based on their functional role in processes of causal discovery and theory revision. We present a formal definition of…

  15. Structured Statistical Models of Inductive Reasoning

    ERIC Educational Resources Information Center

    Kemp, Charles; Tenenbaum, Joshua B.

    2009-01-01

    Everyday inductive inferences are often guided by rich background knowledge. Formal models of induction should aim to incorporate this knowledge and should explain how different kinds of knowledge lead to the distinctive patterns of reasoning found in different inductive contexts. This article presents a Bayesian framework that attempts to meet…

  16. Evaluating Teachers and Schools Using Student Growth Models

    ERIC Educational Resources Information Center

    Schafer, William D.; Lissitz, Robert W.; Zhu, Xiaoshu; Zhang, Yuan; Hou, Xiaodong; Li, Ying

    2012-01-01

    Interest in Student Growth Modeling (SGM) and Value Added Modeling (VAM) arises from educators concerned with measuring the effectiveness of teaching and other school activities through changes in student performance as a companion and perhaps even an alternative to status. Several formal statistical models have been proposed for year-to-year…

  17. An empirical Bayes method for updating inferences in analysis of quantitative trait loci using information from related genome scans.

    PubMed

    Zhang, Kui; Wiener, Howard; Beasley, Mark; George, Varghese; Amos, Christopher I; Allison, David B

    2006-08-01

    Individual genome scans for quantitative trait loci (QTL) mapping often suffer from low statistical power and imprecise estimates of QTL location and effect. This lack of precision yields large confidence intervals for QTL location, which are problematic for subsequent fine mapping and positional cloning. In prioritizing areas for follow-up after an initial genome scan and in evaluating the credibility of apparent linkage signals, investigators typically examine the results of other genome scans of the same phenotype and informally update their beliefs about which linkage signals in their scan most merit confidence and follow-up via a subjective-intuitive integration approach. A method that acknowledges the wisdom of this general paradigm but formally borrows information from other scans to increase confidence in objectivity would be a benefit. We developed an empirical Bayes analytic method to integrate information from multiple genome scans. The linkage statistic obtained from a single genome scan study is updated by incorporating statistics from other genome scans as prior information. This technique does not require that all studies have an identical marker map or a common estimated QTL effect. The updated linkage statistic can then be used for the estimation of QTL location and effect. We evaluate the performance of our method by using extensive simulations based on actual marker spacing and allele frequencies from available data. Results indicate that the empirical Bayes method can account for between-study heterogeneity, estimate the QTL location and effect more precisely, and provide narrower confidence intervals than results from any single individual study. We also compared the empirical Bayes method with a method originally developed for meta-analysis (a closely related but distinct purpose). In the face of marked heterogeneity among studies, the empirical Bayes method outperforms the comparator.
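
    The flavour of such an update can be conveyed by a generic normal-normal, precision-weighted shrinkage of a new scan's linkage statistic toward the combined evidence from other scans. This is a simplified illustration, not the authors' empirical Bayes procedure; all values are hypothetical.

```python
import numpy as np

def empirical_bayes_update(z_new, se_new, z_other, se_other):
    """Shrink a linkage statistic toward prior evidence pooled from other genome scans."""
    w = 1.0 / np.square(se_other)                        # precision of each prior scan
    prior_mean = np.sum(w * z_other) / np.sum(w)
    prior_var = 1.0 / np.sum(w)

    w_new, w_prior = 1.0 / se_new**2, 1.0 / prior_var    # combine new data with the pooled prior
    post_mean = (w_new * z_new + w_prior * prior_mean) / (w_new + w_prior)
    post_se = np.sqrt(1.0 / (w_new + w_prior))
    return post_mean, post_se

print(empirical_bayes_update(z_new=2.1, se_new=0.8,
                             z_other=np.array([1.5, 2.6]),
                             se_other=np.array([0.9, 1.1])))
```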

  18. The Alignment of the Informal and Formal Organizational Supports for Reform: Implications for Improving Teaching in Schools

    ERIC Educational Resources Information Center

    Penuel, William R.; Riel, Margaret; Joshi, Aasha; Pearlman, Leslie; Kim, Chong Min; Frank, Kenneth A.

    2010-01-01

    Previous qualitative studies show that when the formal organization of a school and patterns of informal interaction are aligned, faculty and leaders in a school are better able to coordinate instructional change. This article combines social network analysis with interview data to analyze how well the formal and informal aspects of a school's…

  19. English Language Education in Formal and Cram School Contexts: An Analysis of Listening Strategy and Learning Style

    ERIC Educational Resources Information Center

    Chou, Mu-hsuan

    2017-01-01

    Formal English language education in Taiwan now starts at Year 3 in primary school, with an emphasis on communicative proficiency. In addition to formal education, attending English cram schools after regular school has become a common phenomenon for Taiwanese students. The main purpose of gaining additional reinforcement in English cram schools…

  20. (Finite) statistical size effects on compressive strength.

    PubMed

    Weiss, Jérôme; Girard, Lucas; Gimbert, Florent; Amitrano, David; Vandembroucq, Damien

    2014-04-29

    The larger structures are, the lower their mechanical strength. Already discussed by Leonardo da Vinci and Edmé Mariotte several centuries ago, size effects on strength remain of crucial importance in modern engineering for the elaboration of safety regulations in structural design or the extrapolation of laboratory results to geophysical field scales. Under tensile loading, statistical size effects are traditionally modeled with a weakest-link approach. One of its prominent results is a prediction of vanishing strength at large scales that can be quantified in the framework of extreme value statistics. Despite a frequent use outside its range of validity, this approach remains the dominant tool in the field of statistical size effects. Here we focus on compressive failure, which concerns a wide range of geophysical and geotechnical situations. We show on historical and recent experimental data that weakest-link predictions are not obeyed. In particular, the mechanical strength saturates at a nonzero value toward large scales. Accounting explicitly for the elastic interactions between defects during the damage process, we build a formal analogy of compressive failure with the depinning transition of an elastic manifold. This critical transition interpretation naturally entails finite-size scaling laws for the mean strength and its associated variability. Theoretical predictions are in remarkable agreement with measurements reported for various materials such as rocks, ice, coal, or concrete. This formalism, which can also be extended to the flowing instability of granular media under multiaxial compression, has important practical consequences for future design rules.
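
    The contrast drawn here can be made concrete with the two functional forms involved: the weakest-link (Weibull) prediction, in which mean strength decays as a power law of sample size, versus a finite-size scaling form that saturates at a nonzero asymptotic strength. All parameter values below are illustrative assumptions, not fits to the materials discussed.

```python
import numpy as np

sigma0, V0, m = 10.0, 1.0, 6.0              # reference strength, reference volume, Weibull modulus
sigma_inf, delta, nu = 4.0, 3.0, 1.0        # asymptotic strength and finite-size-scaling parameters

V = np.logspace(0, 6, 7)                    # sample volumes spanning six decades
L = V ** (1.0 / 3.0)                        # linear size

sigma_weibull = sigma0 * (V0 / V) ** (1.0 / m)            # weakest-link: vanishes at large scales
sigma_saturating = sigma_inf + delta * L ** (-1.0 / nu)   # depinning-like: saturates at sigma_inf

for v, sw, ss in zip(V, sigma_weibull, sigma_saturating):
    print(f"V = {v:>9.0f}   weakest-link = {sw:5.2f}   saturating = {ss:5.2f}")
```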

  1. Rich analysis and rational models: Inferring individual behavior from infant looking data

    PubMed Central

    Piantadosi, Steven T.; Kidd, Celeste; Aslin, Richard

    2013-01-01

    Studies of infant looking times over the past 50 years have provided profound insights about cognitive development, but their dependent measures and analytic techniques are quite limited. In the context of infants' attention to discrete sequential events, we show how a Bayesian data analysis approach can be combined with a rational cognitive model to create a rich data analysis framework for infant looking times. We formalize (i) a statistical learning model (ii) a parametric linking between the learning model's beliefs and infants' looking behavior, and (iii) a data analysis model that infers parameters of the cognitive model and linking function for groups and individuals. Using this approach, we show that recent findings from Kidd, Piantadosi, and Aslin (2012) of a U-shaped relationship between look-away probability and stimulus complexity even holds within infants and is not due to averaging subjects with different types of behavior. Our results indicate that individual infants prefer stimuli of intermediate complexity, reserving attention for events that are moderately predictable given their probabilistic expectations about the world. PMID:24750256

  2. Rich analysis and rational models: inferring individual behavior from infant looking data.

    PubMed

    Piantadosi, Steven T; Kidd, Celeste; Aslin, Richard

    2014-05-01

    Studies of infant looking times over the past 50 years have provided profound insights about cognitive development, but their dependent measures and analytic techniques are quite limited. In the context of infants' attention to discrete sequential events, we show how a Bayesian data analysis approach can be combined with a rational cognitive model to create a rich data analysis framework for infant looking times. We formalize (i) a statistical learning model, (ii) a parametric linking between the learning model's beliefs and infants' looking behavior, and (iii) a data analysis approach and model that infers parameters of the cognitive model and linking function for groups and individuals. Using this approach, we show that recent findings from Kidd, Piantadosi, and Aslin (2012) of a U-shaped relationship between look-away probability and stimulus complexity even holds within infants and is not due to averaging subjects with different types of behavior. Our results indicate that individual infants prefer stimuli of intermediate complexity, reserving attention for events that are moderately predictable given their probabilistic expectations about the world. © 2014 John Wiley & Sons Ltd.

  3. Generalized Linear Covariance Analysis

    NASA Technical Reports Server (NTRS)

    Carpenter, James R.; Markley, F. Landis

    2014-01-01

    This talk presents a comprehensive approach to filter modeling for generalized covariance analysis of both batch least-squares and sequential estimators. We review and extend in two directions the results of prior work that allowed for partitioning of the state space into "solve-for" and "consider" parameters, accounted for differences between the formal values and the true values of the measurement noise, process noise, and a priori solve-for and consider covariances, and explicitly partitioned the errors into subspaces containing only the influence of the measurement noise, process noise, and solve-for and consider covariances. In this work, we explicitly add sensitivity analysis to this prior work, and relax an implicit assumption that the batch estimator's epoch time occurs prior to the definitive span. We also apply the method to an integrated orbit and attitude problem, in which gyro and accelerometer errors, though not estimated, influence the orbit determination performance. We illustrate our results using two graphical presentations, which we call the "variance sandpile" and the "sensitivity mosaic", and we compare the linear covariance results to confidence intervals associated with ensemble statistics from a Monte Carlo analysis.

  4. Why Engineers Should Consider Formal Methods

    NASA Technical Reports Server (NTRS)

    Holloway, C. Michael

    1997-01-01

    This paper presents a logical analysis of a typical argument favoring the use of formal methods for software development, and suggests an alternative argument that is simpler and stronger than the typical one.

  5. Software Formal Inspections Guidebook

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The Software Formal Inspections Guidebook is designed to support the inspection process of software developed by and for NASA. This document provides information on how to implement a recommended and proven method for conducting formal inspections of NASA software. This Guidebook is a companion document to NASA Standard 2202-93, Software Formal Inspections Standard, approved April 1993, which provides the rules, procedures, and specific requirements for conducting software formal inspections. Application of the Formal Inspections Standard is optional to NASA program or project management. In cases where program or project management decide to use the formal inspections method, this Guidebook provides additional information on how to establish and implement the process. The goal of the formal inspections process as documented in the above-mentioned Standard and this Guidebook is to provide a framework and model for an inspection process that will enable the detection and elimination of defects as early as possible in the software life cycle. An ancillary aspect of the formal inspection process incorporates the collection and analysis of inspection data to effect continual improvement in the inspection process and the quality of the software subjected to the process.

  6. Teaching Fundamental Skills in Microsoft Excel to First-Year Students in Quantitative Analysis

    ERIC Educational Resources Information Center

    Rubin, Samuel J.; Abrams, Binyomin

    2015-01-01

    Despite their technological savvy, most students entering university lack the necessary computer skills to succeed in a quantitative analysis course, in which they are often expected to input, analyze, and plot results of experiments without any previous formal education in Microsoft Excel or similar programs. This lack of formal education results…

  7. Applications of Formal Methods to Specification and Safety of Avionics Software

    NASA Technical Reports Server (NTRS)

    Hoover, D. N.; Guaspari, David; Humenn, Polar

    1996-01-01

    This report treats several topics in applications of formal methods to avionics software development. Most of these topics concern decision tables, an orderly, easy-to-understand format for formally specifying complex choices among alternative courses of action. The topics relating to decision tables include: generalizations of decision tables that are more concise and support the use of decision tables in a refinement-based formal software development process; a formalism for systems of decision tables with behaviors; an exposition of Parnas tables for users of decision tables; and test coverage criteria and decision tables. We outline features of a revised version of ORA's decision table tool, Tablewise, which will support many of the new ideas described in this report. We also survey formal safety analysis of specifications and software.

  8. An assessment of the validity and discrimination of the intensive time-series design by monitoring learning differences between students with different cognitive tendencies

    NASA Astrophysics Data System (ADS)

    Farnsworth, Carolyn H.; Mayer, Victor J.

    Intensive time-series designs for classroom investigations have been under development since 1975. Studies have been conducted to determine their feasibility (Mayer & Lewis, 1979), their potential for monitoring knowledge acquisition (Mayer & Kozlow, 1980), and the potential threat to validity of the frequency of testing inherent in the design (Mayer & Rojas, 1982). This study, an extension of those previous studies, is an attempt to determine the degree of discrimination the design allows in collecting data on achievement. It also serves as a replication of the Mayer and Kozlow study, an attempt to determine design validity for collecting achievement data. The investigator used her eighth-grade earth science students from a suburban Columbus (Ohio) junior high school. A multiple-group single intervention time-series design (Glass, Willson, & Gottman, 1975) was adapted to the collection of daily data on achievement in the topic of the intervention, a unit on plate tectonics. Single multiple-choice items were randomly assigned to each of three groups of students, identified on the basis of their ranking on a written test of cognitive level (Lawson, 1978). The top third, or those with formal cognitive tendencies, were compared on the basis of knowledge achievement and understanding achievement with the lowest third of the students, or those with concrete cognitive tendencies, to determine if the data collected in the design would discriminate between the two groups. Several studies (Goodstein & Howe, 1978; Lawson & Renner, 1975) indicated that students with formal cognitive tendencies should learn a formal concept such as plate tectonics with greater understanding than should students with concrete cognitive tendencies. Analyses used were a comparison of regression lines in each of the three study stages: baseline, intervention, and follow-up; t-tests of means of days summed across each stage; and a time-series analysis program. Statistically significant differences were found between the two groups both in slopes of regression lines (0.0001) and in t-tests (0.0005) on both knowledge and understanding levels of learning. These differences confirm the discrimination of the intensive time-series design in showing that it can distinguish differences in learning between students with formal cognitive tendencies and those with concrete cognitive tendencies. The time-series analysis model with a trend in the intervention was better than a model with no trend for both groups of students, in that it accounted for a greater amount of variance in the data from both knowledge and understanding levels of learning. This finding adds additional confidence in the validity of the design for obtaining achievement data. When the analysis model with trend was used on data from the group with formal cognitive tendencies, it accounted for a greater degree of variance than the same model applied to the data from the group with concrete cognitive tendencies. This more conservative analysis, therefore, gave results consistent with those from the more usual linear regression techniques and t-tests, further adding to the confidence in the discrimination of the design.

  9. Introduction of formal debate into a postgraduate specialty track education programme in periodontics in Japan.

    PubMed

    Saito, A; Fujinami, K

    2011-02-01

    To evaluate the formal debate as an active learning strategy within a postgraduate specialty track education programme in periodontics. A formal debate was implemented as an active learning strategy in the programme. The participants were full-time faculty, residents and dentists attending special courses at a teaching hospital in Japan. They were grouped into two evenly matched opposing teams, judges and audience. As a preparation for the debate, the participants attended a lecture on critical thinking. At the time of the debate, each team provided a theme report with a list of references. Performances and contents of the debate were evaluated by the course instructors and audience. Pre- and post-debate testing was used to assess the participants' objective knowledge on clinical periodontology. Evaluation of the debate by the participants revealed that scores for criteria such as presentation performance, response with logic, and rebuttal effectiveness were relatively low. Thirty-eight per cent of the participants demonstrated higher test scores after the debate, although there was no statistically significant difference in the mean scores between pre- and post-tests. At the end of the debate, the vast majority of participants recognised the significance and importance of the formal debate in the programme. It was suggested that the incorporation of the formal debate could serve as an educational tool for the postgraduate specialty track programme. © 2011 John Wiley & Sons A/S.

  10. Bayesian Analysis for Risk Assessment of Selected Medical Events in Support of the Integrated Medical Model Effort

    NASA Technical Reports Server (NTRS)

    Gilkey, Kelly M.; Myers, Jerry G.; McRae, Michael P.; Griffin, Elise A.; Kallrui, Aditya S.

    2012-01-01

    The Exploration Medical Capability project is creating a catalog of risk assessments using the Integrated Medical Model (IMM). The IMM is a software-based system intended to assist mission planners in preparing for spaceflight missions by helping them to make informed decisions about medical preparations and supplies needed for combating and treating various medical events using Probabilistic Risk Assessment. The objective is to use statistical analyses to inform the IMM decision tool with estimated probabilities of medical events occurring during an exploration mission. Because data regarding astronaut health are limited, Bayesian statistical analysis is used. Bayesian inference combines prior knowledge, such as data from the general U.S. population, the U.S. Submarine Force, or the analog astronaut population located at the NASA Johnson Space Center, with observed data for the medical condition of interest. The posterior results reflect the best evidence for specific medical events occurring in flight. Bayes theorem provides a formal mechanism for combining available observed data with data from similar studies to support the quantification process. The IMM team performed Bayesian updates on the following medical events: angina, appendicitis, atrial fibrillation, atrial flutter, dental abscess, dental caries, dental periodontal disease, gallstone disease, herpes zoster, renal stones, seizure, and stroke.
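
    For a single medical event with a binary outcome, a Bayesian update of this kind can be sketched with a conjugate beta-binomial model. The prior and observed counts below are hypothetical placeholders, not IMM data.

```python
from scipy import stats

# Hypothetical prior evidence from an analog population: 3 events among 1000 individuals
events_prior, n_prior = 3, 1000
alpha0, beta0 = events_prior + 1, n_prior - events_prior + 1   # Beta prior (uniform + analog counts)

# Hypothetical in-flight observations: 1 event in 250 person-missions
k, m = 1, 250
posterior = stats.beta(alpha0 + k, beta0 + m - k)              # conjugate Beta posterior

print("posterior mean event probability:", posterior.mean())
print("95% credible interval:", posterior.interval(0.95))
```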

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vrugt, Jasper A; Robinson, Bruce A; Ter Braak, Cajo J F

    In recent years, a strong debate has emerged in the hydrologic literature regarding what constitutes an appropriate framework for uncertainty estimation. Particularly, there is strong disagreement whether an uncertainty framework should have its roots within a proper statistical (Bayesian) context, or whether such a framework should be based on a different philosophy and implement informal measures and weaker inference to summarize parameter and predictive distributions. In this paper, we compare a formal Bayesian approach using Markov Chain Monte Carlo (MCMC) with generalized likelihood uncertainty estimation (GLUE) for assessing uncertainty in conceptual watershed modeling. Our formal Bayesian approach is implemented using the recently developed differential evolution adaptive metropolis (DREAM) MCMC scheme with a likelihood function that explicitly considers model structural, input and parameter uncertainty. Our results demonstrate that DREAM and GLUE can generate very similar estimates of total streamflow uncertainty. This suggests that formal and informal Bayesian approaches have more common ground than the hydrologic literature and ongoing debate might suggest. The main advantage of formal approaches is, however, that they attempt to disentangle the effect of forcing, parameter and model structural error on total predictive uncertainty. This is key to improving hydrologic theory and to better understand and predict the flow of water through catchments.
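
    The informal (GLUE) side of the comparison can be sketched in a few lines: sample parameter sets, score them with an informal likelihood such as the Nash-Sutcliffe efficiency, keep the "behavioural" sets above a threshold, and form prediction bounds from the retained simulations. The toy model, threshold, and data below are all hypothetical.

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """Informal likelihood measure commonly used with GLUE."""
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

rng = np.random.default_rng(2)
obs = rng.gamma(2.0, 5.0, size=100)                       # stand-in streamflow record

# Monte Carlo sample of parameter sets pushed through a placeholder "model"
params = rng.uniform(0.5, 1.5, size=2000)
sims = params[:, None] * obs[None, :] + rng.normal(0.0, 2.0, size=(2000, obs.size))

scores = np.array([nash_sutcliffe(obs, s) for s in sims])
behavioural = sims[scores > 0.6]                          # informal-likelihood threshold
lower, upper = np.quantile(behavioural, [0.05, 0.95], axis=0)
print(f"{len(behavioural)} behavioural sets; mean 90% band width {np.mean(upper - lower):.2f}")
```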

  12. Approximate Micromechanics Treatise of Composite Impact

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Handler, Louis M.

    2005-01-01

    A formalism is described for micromechanic impact of composites. The formalism consists of numerous equations which describe all aspects of impact from impactor and composite conditions to impact contact, damage progression, and penetration or containment. The formalism is based on through-the-thickness displacement increments simulation which makes it convenient to track local damage in terms of microfailure modes and their respective characteristics. A flow chart is provided to cast the formalism (numerous equations) into a computer code for embedment in composite mechanic codes and/or finite element composite structural analysis.

  13. Considering Horn's Parallel Analysis from a Random Matrix Theory Point of View.

    PubMed

    Saccenti, Edoardo; Timmerman, Marieke E

    2017-03-01

    Horn's parallel analysis is a widely used method for assessing the number of principal components and common factors. We discuss the theoretical foundations of parallel analysis for principal components based on a covariance matrix by making use of arguments from random matrix theory. In particular, we show that (i) for the first component, parallel analysis is an inferential method equivalent to the Tracy-Widom test, (ii) its use to test high-order eigenvalues is equivalent to the use of the joint distribution of the eigenvalues, and thus should be discouraged, and (iii) a formal test for higher-order components can be obtained based on a Tracy-Widom approximation. We illustrate the performance of the two testing procedures using simulated data generated under both a principal component model and a common factors model. For the principal component model, the Tracy-Widom test performs consistently in all conditions, while parallel analysis shows unpredictable behavior for higher-order components. For the common factor model, including major and minor factors, both procedures are heuristic approaches, with variable performance. We conclude that the Tracy-Widom procedure is preferred over parallel analysis for statistically testing the number of principal components based on a covariance matrix.
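
    For reference, the parallel-analysis procedure under discussion can be sketched as follows. This is the common correlation-matrix variant with synthetic data; thresholding at the 95th percentile of simulated eigenvalues is one conventional choice.

```python
import numpy as np

def parallel_analysis(X, n_sim=200, quantile=0.95, seed=0):
    """Horn's parallel analysis: retain components whose observed eigenvalues
    exceed the chosen quantile of eigenvalues from random data of the same shape."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    obs = np.linalg.eigvalsh(np.corrcoef(X, rowvar=False))[::-1]
    sim = np.empty((n_sim, p))
    for i in range(n_sim):
        R = rng.standard_normal((n, p))
        sim[i] = np.linalg.eigvalsh(np.corrcoef(R, rowvar=False))[::-1]
    return int(np.sum(obs > np.quantile(sim, quantile, axis=0)))

# Synthetic data with two underlying components plus noise
rng = np.random.default_rng(1)
X = rng.standard_normal((300, 2)) @ rng.standard_normal((2, 10)) + 0.5 * rng.standard_normal((300, 10))
print("components retained:", parallel_analysis(X))
```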

  14. A quantitative approach to evolution of music and philosophy

    NASA Astrophysics Data System (ADS)

    Vieira, Vilson; Fabbri, Renato; Travieso, Gonzalo; Oliveira, Osvaldo N., Jr.; da Fontoura Costa, Luciano

    2012-08-01

    The development of new statistical and computational methods is increasingly making it possible to bridge the gap between hard sciences and humanities. In this study, we propose an approach based on a quantitative evaluation of attributes of objects in fields of humanities, from which concepts such as dialectics and opposition are formally defined mathematically. As case studies, we analyzed the temporal evolution of classical music and philosophy by obtaining data for 8 features characterizing the corresponding fields for 7 well-known composers and philosophers, which were treated with multivariate statistics and pattern recognition methods. A bootstrap method was applied to avoid statistical bias caused by the small sample data set, with which hundreds of artificial composers and philosophers were generated, influenced by the 7 names originally chosen. Upon defining indices for opposition, skewness and counter-dialectics, we confirmed the intuitive analysis of historians in that classical music evolved according to a master-apprentice tradition, while in philosophy changes were driven by opposition. Though these case studies were meant only to show the possibility of treating phenomena in humanities quantitatively, including a quantitative measure of concepts such as dialectics and opposition, the results are encouraging for further application of the approach presented here to many other areas, since it is entirely generic.

  15. Statistical quantifiers of memory for an analysis of human brain and neuro-system diseases

    NASA Astrophysics Data System (ADS)

    Demin, S. A.; Yulmetyev, R. M.; Panischev, O. Yu.; Hänggi, Peter

    2008-03-01

    On the basis of a memory function formalism for correlation functions of time series we investigate statistical memory effects by the use of appropriate spectral and relaxation parameters of measured stochastic data for neuro-system diseases. In particular, we study the dynamics of the walk of a patient who suffers from Parkinson's disease (PD), Huntington's disease (HD), amyotrophic lateral sclerosis (ALS), and compare against the data of healthy people (CO - control group). We employ an analytical method which is able to characterize the stochastic properties of stride-to-stride variations of gait cycle timing. Our results allow us to estimate quantitatively a few human locomotion function abnormalities occurring in the human brain and in the central nervous system (CNS). Particularly, the patient's gait dynamics are characterized by an increased memory behavior together with sizable fluctuations as compared with the locomotion dynamics of healthy patients. Moreover, we complement our findings with peculiar features as detected in phase-space portraits and spectral characteristics for the different data sets (PD, HD, ALS and healthy people). The evaluation of statistical quantifiers of the memory function is shown to provide a useful toolkit which can be put to work to identify various abnormalities of locomotion dynamics. Moreover, it allows one to diagnose qualitatively and quantitatively serious brain and central nervous system diseases.

  16. On the analysis of studies of choice

    PubMed Central

    Mullins, Eamonn; Agunwamba, Christian C.; Donohoe, Anthony J.

    1982-01-01

    In a review of 103 sets of data from 23 different studies of choice, Baum (1979) concluded that whereas undermatching was most commonly observed for responses, the time measure generally conformed to the matching relation. A reexamination of the evidence presented by Baum concludes that undermatching is the most commonly observed finding for both measures. Use of the coefficient of determination by both Baum (1979) and de Villiers (1977) for assessing when matching occurs is criticized on statistical grounds. An alternative to the loss-in-predictability criterion used by Baum (1979) is proposed. This alternative statistic has a simple operational meaning and is related to the usual F-ratio test. It can therefore be used as a formal test of the hypothesis that matching occurs. Baum (1979) also suggests that slope values of between .90 and 1.11 can be considered good approximations to matching. It is argued that the establishment of a fixed interval as a criterion for determining when matching occurs, is inappropriate. A confidence interval based on the data from any given experiment is suggested as a more useful method of assessment. PMID:16812271
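
    The confidence-interval criterion proposed here can be illustrated by fitting the generalized matching relation on log ratios and checking whether a slope of 1 (strict matching) lies inside the interval, rather than applying a fixed band such as .90 to 1.11. The choice data below are hypothetical.

```python
import numpy as np
from scipy import stats

# Hypothetical reinforcer and response ratios across conditions
reinforcer_ratio = np.array([0.125, 0.25, 0.5, 1.0, 2.0, 4.0, 8.0])
response_ratio   = np.array([0.20, 0.33, 0.55, 1.00, 1.70, 2.90, 5.10])

# Generalized matching: log(B1/B2) = a*log(R1/R2) + log(b); a = 1 is strict matching
x, y = np.log(reinforcer_ratio), np.log(response_ratio)
fit = stats.linregress(x, y)

t = stats.t.ppf(0.975, df=len(x) - 2)                     # 95% confidence interval for the slope
ci = (fit.slope - t * fit.stderr, fit.slope + t * fit.stderr)
print(f"slope a = {fit.slope:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
print("matching (a = 1) is", "inside" if ci[0] <= 1 <= ci[1] else "outside", "the interval")
```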

  17. Mathematical problems in the application of multilinear models to facial emotion processing experiments

    NASA Astrophysics Data System (ADS)

    Andersen, Anders H.; Rayens, William S.; Li, Ren-Cang; Blonder, Lee X.

    2000-10-01

    In this paper we describe the enormous potential that multilinear models hold for the analysis of data from neuroimaging experiments that rely on functional magnetic resonance imaging (MRI) or other imaging modalities. A case is made for why one might fully expect that the successful introduction of these models to the neuroscience community could define the next generation of structure-seeking paradigms in the area. In spite of the potential for immediate application, there is much to do from the perspective of statistical science. That is, although multilinear models have already been particularly successful in chemistry and psychology, relatively little is known about their statistical properties. To that end, our research group at the University of Kentucky has made significant progress. In particular, we are in the process of developing formal influence measures for multilinear methods as well as associated classification models and effective implementations. We believe that these problems will be among the most important and useful to the scientific community. Details are presented herein and an application is given in the context of facial emotion processing experiments.

  18. Path Integrals for Electronic Densities, Reactivity Indices, and Localization Functions in Quantum Systems

    PubMed Central

    Putz, Mihai V.

    2009-01-01

    The density matrix theory, the ancestor of density functional theory, provides the immediate framework for Path Integral (PI) development, allowing the canonical density to be extended for the many-electronic systems through the density functional closure relationship. Yet, the use of path integral formalism for electronic density prescription presents several advantages: assures the inner quantum mechanical description of the system by parameterized paths; averages the quantum fluctuations; behaves as the propagator for time-space evolution of quantum information; resembles the Schrödinger equation; allows quantum statistical description of the system through partition function computing. In this framework, four levels of path integral formalism were presented: the Feynman quantum mechanical, the semiclassical, the Feynman-Kleinert effective classical, and the Fokker-Planck non-equilibrium ones. In each case the density matrix or/and the canonical density were rigorously defined and presented. The practical specializations for quantum free and harmonic motions, for statistical high and low temperature limits, the smearing justification for Bohr’s quantum stability postulate with the paradigmatic Hydrogen atomic excursion, along with the quantum chemical calculation of semiclassical electronegativity and hardness, of chemical action and Mulliken electronegativity, as well as by the Markovian generalizations of Becke-Edgecombe electronic focalization functions – all advocate for the reliability of assuming PI formalism of quantum mechanics as a versatile one, suited for analytically and/or computationally modeling of a variety of fundamental physical and chemical reactivity concepts characterizing the (density driving) many-electronic systems. PMID:20087467

  19. Path integrals for electronic densities, reactivity indices, and localization functions in quantum systems.

    PubMed

    Putz, Mihai V

    2009-11-10

    The density matrix theory, the ancestor of density functional theory, provides the immediate framework for Path Integral (PI) development, allowing the canonical density to be extended for the many-electronic systems through the density functional closure relationship. Yet, the use of path integral formalism for electronic density prescription presents several advantages: assures the inner quantum mechanical description of the system by parameterized paths; averages the quantum fluctuations; behaves as the propagator for time-space evolution of quantum information; resembles the Schrödinger equation; allows quantum statistical description of the system through partition function computing. In this framework, four levels of path integral formalism were presented: the Feynman quantum mechanical, the semiclassical, the Feynman-Kleinert effective classical, and the Fokker-Planck non-equilibrium ones. In each case the density matrix or/and the canonical density were rigorously defined and presented. The practical specializations for quantum free and harmonic motions, for statistical high and low temperature limits, the smearing justification for Bohr's quantum stability postulate with the paradigmatic Hydrogen atomic excursion, along with the quantum chemical calculation of semiclassical electronegativity and hardness, of chemical action and Mulliken electronegativity, as well as by the Markovian generalizations of Becke-Edgecombe electronic focalization functions - all advocate for the reliability of assuming PI formalism of quantum mechanics as a versatile one, suited for analytically and/or computationally modeling of a variety of fundamental physical and chemical reactivity concepts characterizing the (density driving) many-electronic systems.

  20. Current tobacco smoking, formal education, and the risk of rheumatoid arthritis.

    PubMed

    Uhlig, T; Hagen, K B; Kvien, T K

    1999-01-01

    To identify whether tobacco smoking or sociodemographic characteristics are risk factors of rheumatoid arthritis (RA). From a county RA register 361 patients in the age range 20-79 years were recruited from incidence cohorts with recent disease onset (mean 3.4 years) and compared with 5851 randomly selected individuals from the same population area. Data on selected risk factors were collected by questionnaires (response rates 75% and 59%, respectively) and associations with smoking and risk factors were expressed as odds ratios (OR) with 95% confidence intervals (CI) in a multiple regression analysis. Age and female sex were, as expected, identified as risk factors of RA. In addition, current smoking was an overall risk factor (OR 1.46, 95% CI 1.10-1.94), in men (OR 2.38, 95% CI 1.45-3.92), especially in men with seropositive RA (OR 4.77, 95% CI 2.09-10.90). Separate analyses revealed no statistically significant risk in women (OR 1.14, 95% CI 0.80-1.62). Low level of formal education, body mass index, and marital or employment status were not significantly associated with risk of RA. Current smoking in men was identified as an independent risk factor for RA, whereas surrogate markers of socioeconomic status were unrelated to the onset of RA.

  1. Learning Competences in Open Mobile Environments: A Comparative Analysis between Formal and Non-Formal Spaces

    ERIC Educational Resources Information Center

    Figaredo, Daniel Domínguez; Miravalles, Paz Trillo

    2014-01-01

    As a result of the increasing use of mobile devices in education, new approaches to define the learning competences in the field of digitally mediated learning have emerged. This paper examines these approaches, using data obtained from empirical research with a group of Spanish university students. The analysis is focused on the experiences of…

  2. State Event Models for the Formal Analysis of Human-Machine Interactions

    NASA Technical Reports Server (NTRS)

    Combefis, Sebastien; Giannakopoulou, Dimitra; Pecheur, Charles

    2014-01-01

    The work described in this paper was motivated by our experience with applying a framework for formal analysis of human-machine interactions (HMI) to a realistic model of an autopilot. The framework is built around a formally defined conformance relation called "full-control" between an actual system and the mental model according to which the system is operated. Systems are well-designed if they can be described by relatively simple, full-control mental models for their human operators. For this reason, our framework supports automated generation of minimal full-control mental models for HMI systems, where both the system and the mental models are described as labelled transition systems (LTS). The autopilot that we analysed has been developed in the NASA Ames HMI prototyping tool ADEPT. In this paper, we describe how we extended the models that our HMI analysis framework handles to allow adequate representation of ADEPT models. We then provide a property-preserving reduction from these extended models to LTSs, to enable application of our LTS-based formal analysis algorithms. Finally, we briefly discuss the analyses we were able to perform on the autopilot model with our extended framework.

  3. Learning in non-formal education: Is it "youthful" for youth in action?

    NASA Astrophysics Data System (ADS)

    Norqvist, Lars; Leffler, Eva

    2017-04-01

    This article offers insights into the practices of a non-formal education programme for youth provided by the European Union (EU). It takes a qualitative approach and is based on a case study of the European Voluntary Service (EVS). Data were collected during individual and focus group interviews with learners (the EVS volunteers), decision takers and trainers, with the aim of deriving an understanding of learning in non-formal education. The research questions concerned learning, the recognition of learning and perspectives of usefulness. The study also examined the Youthpass documentation tool as a key to understanding the recognition of learning and to determine whether the learning was useful for learners (the volunteers). The findings and analysis offer several interpretations of learning, and the recognition of learning, which take place in non-formal education. The findings also revealed that it is complicated to divide learning into formal and non-formal categories; instead, non-formal education is useful for individual learners when both formal and non-formal educational contexts are integrated. As a consequence, the division of formal and non-formal (and possibly even informal) learning creates a gap which works against the development of flexible and interconnected education with ubiquitous learning and mobility within and across formal and non-formal education. This development is not in the best interests of learners, especially when seeking useful learning and education for youth (what the authors term "youthful" for youth in action).

  4. Exoplanet Biosignatures: Future Directions

    PubMed Central

    Bains, William; Cronin, Leroy; DasSarma, Shiladitya; Danielache, Sebastian; Domagal-Goldman, Shawn; Kacar, Betul; Kiang, Nancy Y.; Lenardic, Adrian; Reinhard, Christopher T.; Moore, William; Schwieterman, Edward W.; Shkolnik, Evgenya L.; Smith, Harrison B.

    2018-01-01

    Abstract We introduce a Bayesian method for guiding future directions for detection of life on exoplanets. We describe empirical and theoretical work necessary to place constraints on the relevant likelihoods, including those emerging from better understanding stellar environment, planetary climate and geophysics, geochemical cycling, the universalities of physics and chemistry, the contingencies of evolutionary history, the properties of life as an emergent complex system, and the mechanisms driving the emergence of life. We provide examples for how the Bayesian formalism could guide future search strategies, including determining observations to prioritize or deciding between targeted searches or larger lower resolution surveys to generate ensemble statistics and address how a Bayesian methodology could constrain the prior probability of life with or without a positive detection. Key Words: Exoplanets—Biosignatures—Life detection—Bayesian analysis. Astrobiology 18, 779–824. PMID:29938538
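
    At its simplest, the Bayesian update described here reduces to Bayes' theorem for a candidate biosignature detection; the probabilities below are purely illustrative placeholders.

```python
def posterior_probability_of_life(prior_life, p_signal_given_life, p_signal_given_no_life):
    """Bayes' theorem for a candidate biosignature detection."""
    evidence = (p_signal_given_life * prior_life
                + p_signal_given_no_life * (1.0 - prior_life))
    return p_signal_given_life * prior_life / evidence

# A pessimistic prior and a signal with a plausible abiotic explanation (hypothetical numbers)
print(posterior_probability_of_life(prior_life=0.01,
                                    p_signal_given_life=0.9,
                                    p_signal_given_no_life=0.2))
```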

  5. Linkage analysis of chromosome 22q12-13 in a United Kingdom/Icelandic sample of 23 multiplex schizophrenia families

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kalsi, G.; Read, T.; Butler, R.

    A possible linkage to a genetic subtype of schizophrenia and related disorders has been reported on the long arm of chromosome 22 at q12-13. However, formal statistical tests in a combined sample could not reject homogeneity and prove that there was a linked subgroup of families. We have studied 23 schizophrenia pedigrees to test whether some multiplex schizophrenia families may be linked to the microsatellite markers D22S274 and D22S283 which span the 22q12-13 region. Two-point followed by multipoint lod and non-parametric linkage analyses under the assumption of heterogeneity provided no evidence for linkage over the relevant region. 16 refs., 4 tabs.

  6. On the theory of Carriers' Electrostatic Interaction near an Interface

    NASA Astrophysics Data System (ADS)

    Waters, Michael; Hashemi, Hossein; Kieffer, John

    2015-03-01

    Heterojunction interfaces are common in non-traditional photovoltaic device designs, such as those based on small molecules, polymers, and perovskites. We have examined a number of the effects of the heterojunction interface region on carrier/exciton energetics using a mixture of semi-classical and quantum electrostatic methods, ab initio methods, and statistical mechanics. Our theoretical analysis has yielded several useful relationships and numerical recipes that should be considered in device design regardless of the particular materials system. As a demonstration, we highlight these formalisms as applied to carriers and polaron pairs near a C60/subphthalocyanine interface. On the regularly ordered areas of the heterojunction, the effect of the interface is a significant set of corrections to the carrier energies, which in turn directly affects device performance.

  7. Population forecasts for Bangladesh, using a Bayesian methodology.

    PubMed

    Mahsin, Md; Hossain, Syed Shahadat

    2012-12-01

    Population projection for many developing countries can be quite a challenging task for demographers, mostly due to the lack of sufficient reliable data. The objective of this paper is to present an overview of the existing methods for population forecasting and to propose an alternative based on Bayesian statistics that retains the formality of inference. The analysis has been made using the Markov Chain Monte Carlo (MCMC) technique for Bayesian methodology available with the software WinBUGS. Convergence diagnostic techniques available with the WinBUGS software have been applied to ensure the convergence of the chains necessary for the implementation of MCMC. The Bayesian approach allows for the use of observed data and expert judgements by means of appropriate priors, and more realistic population forecasts, along with the associated uncertainty, have been possible.
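
    The MCMC machinery referred to here can be illustrated with a minimal random-walk Metropolis sampler for a toy exponential-growth model. This is only a sketch, not the WinBUGS model used in the paper; the census figures, error model, and prior are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)

years = np.array([1981, 1991, 2001, 2011])      # hypothetical census years
pop = np.array([90.0, 106.0, 124.0, 144.0])     # hypothetical population counts (millions)

def log_posterior(r):
    """Exponential growth pop0*exp(r*dt), Gaussian errors (sd = 2), Normal(0, 0.05) prior on r."""
    pred = pop[0] * np.exp(r * (years - years[0]))
    return -0.5 * np.sum(((pop - pred) / 2.0) ** 2) - 0.5 * (r / 0.05) ** 2

samples, r = [], 0.02
for _ in range(20000):                           # random-walk Metropolis
    prop = r + rng.normal(0.0, 0.002)
    if np.log(rng.uniform()) < log_posterior(prop) - log_posterior(r):
        r = prop
    samples.append(r)

post = np.array(samples[5000:])                  # discard burn-in
print(f"posterior growth rate: {post.mean():.4f} +/- {post.std():.4f}")
forecast_2031 = pop[0] * np.exp(post * (2031 - years[0]))
print("2031 forecast, 95% interval (millions):", np.percentile(forecast_2031, [2.5, 97.5]))
```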

  8. Exoplanet Biosignatures: Future Directions.

    PubMed

    Walker, Sara I; Bains, William; Cronin, Leroy; DasSarma, Shiladitya; Danielache, Sebastian; Domagal-Goldman, Shawn; Kacar, Betul; Kiang, Nancy Y; Lenardic, Adrian; Reinhard, Christopher T; Moore, William; Schwieterman, Edward W; Shkolnik, Evgenya L; Smith, Harrison B

    2018-06-01

    We introduce a Bayesian method for guiding future directions for detection of life on exoplanets. We describe empirical and theoretical work necessary to place constraints on the relevant likelihoods, including those emerging from better understanding stellar environment, planetary climate and geophysics, geochemical cycling, the universalities of physics and chemistry, the contingencies of evolutionary history, the properties of life as an emergent complex system, and the mechanisms driving the emergence of life. We provide examples for how the Bayesian formalism could guide future search strategies, including determining observations to prioritize or deciding between targeted searches or larger lower resolution surveys to generate ensemble statistics and address how a Bayesian methodology could constrain the prior probability of life with or without a positive detection. Key Words: Exoplanets-Biosignatures-Life detection-Bayesian analysis. Astrobiology 18, 779-824.

  9. Statistical manifestation of quantum correlations via disequilibrium

    NASA Astrophysics Data System (ADS)

    Pennini, F.; Plastino, A.

    2017-12-01

    The statistical notion of disequilibrium (D) was introduced by López-Ruiz, Mancini, and Calbet (LMC) (1995) [1] more than 20 years ago. D measures the amount of "correlational structure" of a system. We wish to use D to analyze one of the simplest types of quantum correlations, those present in gaseous systems due to symmetry considerations. To this end we extend the LMC formalism to the grand canonical environment and show that D displays distinctive behaviors for simple gases that allow for interesting insights into their structural properties.
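
    The disequilibrium measure itself is simple to compute from a discrete probability distribution. The sketch below follows the original LMC definitions (D as the squared distance from the uniform distribution, and complexity C = H·D); the grand-canonical extension developed in the paper is not shown.

```python
import numpy as np

def disequilibrium(p):
    """LMC disequilibrium: squared distance from the uniform distribution."""
    p = np.asarray(p, dtype=float)
    return np.sum((p - 1.0 / p.size) ** 2)

def lmc_complexity(p):
    """LMC statistical complexity C = H * D, with H the normalized Shannon entropy."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    H = -np.sum(nz * np.log(nz)) / np.log(p.size)
    return H * disequilibrium(p)

uniform = np.full(4, 0.25)
peaked = np.array([0.85, 0.05, 0.05, 0.05])
print(disequilibrium(uniform), disequilibrium(peaked))   # D = 0 for the uniform case, > 0 otherwise
print(lmc_complexity(uniform), lmc_complexity(peaked))   # C = 0 for the uniform case
```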

  10. Bayesian Decision Support

    NASA Astrophysics Data System (ADS)

    Berliner, M.

    2017-12-01

    Bayesian statistical decision theory offers a natural framework for decision-policy making in the presence of uncertainty. Key advantages of the approach include efficient incorporation of information and observations. However, in complicated settings it is very difficult, perhaps essentially impossible, to formalize the mathematical inputs needed in the approach. Nevertheless, using the approach as a template is useful for decision support; that is, organizing and communicating our analyses. Bayesian hierarchical modeling is valuable in quantifying and managing uncertainty in such cases. I review some aspects of the idea, emphasizing statistical model development and use in the context of sea-level rise.

  11. Structuring Formal Control Systems Specifications for Reuse: Surviving Hardware Changes

    NASA Technical Reports Server (NTRS)

    Thompson, Jeffrey M.; Heimdahl, Mats P. E.; Erickson, Debra M.

    2000-01-01

    Formal capture and analysis of the required behavior of control systems have many advantages. For instance, they encourage rigorous requirements analysis, the required behavior is unambiguously defined, and we can assure that various safety properties are satisfied. Formal modeling is, however, a costly and time-consuming process, and if one could reuse the formal models over a family of products, significant cost savings would be realized. In an ongoing project we are investigating how to structure state-based models to achieve a high level of reusability within product families. In this paper we discuss a high-level structure of requirements models that achieves reusability of the desired control behavior across varying hardware platforms in a product family. The structuring approach is demonstrated through a case study in the mobile robotics domain where the desired robot behavior is reused on two diverse platforms: one commercial mobile platform and one built in-house. We use our language RSML-e to capture the control behavior for reuse and our tool NIMBUS to demonstrate how the formal specification can be validated and used as a prototype on the two platforms.

  12. Statistical nature of infrared dynamics on de Sitter background

    NASA Astrophysics Data System (ADS)

    Tokuda, Junsei; Tanaka, Takahiro

    2018-02-01

    In this study, we formulate a systematic way of deriving an effective equation of motion (EoM) for long-wavelength modes of a massless scalar field with a general potential V(φ) on de Sitter background, and investigate whether or not the effective EoM can be described as a classical stochastic process. Our formulation extends the usual stochastic formalism to include sub-leading secular growth coming from the nonlinearity of short-wavelength modes. Applying our formalism to λφ⁴ theory, we explicitly derive an effective EoM which correctly recovers the next-to-leading secularly growing part at late times, and show that this effective EoM can be seen as a classical stochastic process. Our extended stochastic formalism can describe all secularly growing terms which appear in all correlation functions with a specific operator ordering. The restriction of the operator ordering will not be a big drawback because the commutator of a light scalar field becomes negligible at large scales owing to the squeezing.
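
    For reference, the leading-order stochastic ("Starobinsky") formalism that the paper extends evolves the coarse-grained field with the Langevin equation dφ/dN = -V'(φ)/(3H²) + (H/2π)ξ(N). The Python sketch below integrates it for V = λφ⁴/4 with illustrative parameters; the next-to-leading corrections derived in the paper are not included.

        import numpy as np

        rng = np.random.default_rng(1)
        H, lam = 1.0, 0.1                       # Hubble rate and quartic coupling (illustrative)
        dN, n_steps, n_traj = 0.01, 20_000, 2_000

        phi = np.zeros(n_traj)
        for _ in range(n_steps):
            drift = -lam * phi**3 / (3.0 * H**2)
            kick = (H / (2.0 * np.pi)) * np.sqrt(dN) * rng.standard_normal(n_traj)
            phi += drift * dN + kick            # Euler-Maruyama step in e-folds N

        print("late-time <phi^2> over trajectories:", float(np.mean(phi**2)))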

  13. Evaluating the Effectiveness of a Large-Scale Professional Development Programme

    ERIC Educational Resources Information Center

    Main, Katherine; Pendergast, Donna

    2017-01-01

    An evaluation of the effectiveness of a large-scale professional development (PD) programme delivered to 258 schools in Queensland, Australia, is presented. Formal evaluations were conducted at two stages during the programme using a tool developed from Desimone's five core features of effective PD. Descriptive statistics of 38 questions and…

  14. Making Heads or Tails of Probability: An Experiment with Random Generators

    ERIC Educational Resources Information Center

    Morsanyi, Kinga; Handley, Simon J.; Serpell, Sylvie

    2013-01-01

    Background: The equiprobability bias is a tendency for individuals to think of probabilistic events as "equiprobable" by nature, and to judge outcomes that occur with different probabilities as equally likely. The equiprobability bias has been repeatedly found to be related to formal education in statistics, and it is claimed to be based…

  15. Emergent Feature Structures: Harmony Systems in Exemplar Models of Phonology

    ERIC Educational Resources Information Center

    Cole, Jennifer

    2009-01-01

    In exemplar models of phonology, phonotactic constraints are modeled as emergent from patterns of high activation between units that co-occur with statistical regularity, or as patterns of low activation or inhibition between units that co-occur less frequently or not at all. Exemplar models posit no a "priori" formal or representational…

  16. On Testability of Missing Data Mechanisms in Incomplete Data Sets

    ERIC Educational Resources Information Center

    Raykov, Tenko

    2011-01-01

    This article is concerned with the question of whether the missing data mechanism routinely referred to as missing completely at random (MCAR) is statistically examinable via a test for lack of distributional differences between groups with observed and missing data, and related consequences. A discussion is initially provided, from a formal logic…

  17. Integrated Postsecondary Education Data System Data Quality Study. Methodology Report. NCES 2005-175

    ERIC Educational Resources Information Center

    Jackson, Kenneth W.; Peecksen, Scott; Jang, Donsig; Sukasih, Amang

    2005-01-01

    The Integrated Postsecondary Education Data System (IPEDS) of the National Center for Education Statistics (NCES) was initiated in 1986 to collect data about all identified institutions whose primary purpose is to provide postsecondary education. Postsecondary education is defined within IPEDS as "the provision of a formal instructional…

  18. Bootstrapping in a Language of Thought: A Formal Model of Numerical Concept Learning

    ERIC Educational Resources Information Center

    Piantadosi, Steven T.; Tenenbaum, Joshua B.; Goodman, Noah D.

    2012-01-01

    In acquiring number words, children exhibit a qualitative leap in which they transition from understanding a few number words, to possessing a rich system of interrelated numerical concepts. We present a computational framework for understanding this inductive leap as the consequence of statistical inference over a sufficiently powerful…

  19. Behavioral and Social Science Research: A National Resource. Part II.

    ERIC Educational Resources Information Center

    Adams, Robert McC., Ed.; And Others

    Areas of behavioral and social science research that have achieved significant breakthroughs in knowledge or application or that show future promise of achieving such breakthroughs are discussed in 12 papers. For example, the paper on formal demography shows how mathematical or statistical techniques can be used to explain and predict change in…

  20. Students and Courses 2002: At a Glance. Australian Vocational Education and Training Statistics.

    ERIC Educational Resources Information Center

    National Centre for Vocational Education Research, Leabrook (Australia).

    The public vocational education and training (VET) system in Australia encompasses formal learning activities intended to develop knowledge and skills that are relevant in the workplace for those past the age of compulsory schooling, but excludes bachelor and post-graduate courses and learning for leisure, recreation or personal enrichment. Some…

  1. Mechanics of Brittle Materials. Part 1. Preliminary Mechanical Properties and Statistical Representations

    DTIC Science & Technology

    1973-10-01

    intensity computation are shown in Figure 17. Using the same formal procedure outlined by Winne & Wundt, a notch geometry can be chosen to induce...Nitride at Elevated Temperatures. Winne, D.H. and Wundt, B.M., "Application of the Griffith-Irwin Theory of Crack Propagation to the Bursting Behavior

  2. Beyond Literacy: Non-Formal Education Programmes for Adults in Mozambique

    ERIC Educational Resources Information Center

    van der Linden, Josje; Manuel, Alzira Munguambe

    2011-01-01

    Thirty-five years after independence the Mozambican illiteracy rate has been reduced from 93% to just over 50% according to official statistics. Although this indicates an enormous achievement in the area of education, the challenge of today still is to design appropriate adult basic education programmes including literacy, numeracy and life…

  3. 77 FR 72715 - Informal Entry Limit and Removal of a Formal Entry Requirement

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-06

    ... required certifications, enforcement information, and statistical data. An agency may not conduct or..., 1623, 1624, 3314. * * * * * Sec. 10.1 [Amended] 2. In Sec. 10.1: a. Paragraph (a) introductory text... revising "19------" to read "20------"; c. Paragraph (a)(2) introductory text is amended in the last...

  4. Kolmogorov complexity, statistical regularization of inverse problems, and Birkhoff's formalization of beauty

    NASA Astrophysics Data System (ADS)

    Kreinovich, Vladik; Longpre, Luc; Koshelev, Misha

    1998-09-01

    Most practical applications of statistical methods are based on the implicit assumption that if an event has a very small probability, then it cannot occur. For example, the probability that a kettle placed on a cold stove would start boiling by itself is not 0, it is positive, but it is so small that physicists conclude that such an event is simply impossible. This assumption is difficult to formalize in traditional probability theory, because this theory only describes measures on sets and does not allow us to divide functions into 'random' and non-random ones. This distinction was made possible by the idea of algorithmic randomness, introduced by Kolmogorov and his student Martin-Löf in the 1960s. We show that this idea can also be used for inverse problems. In particular, we prove that for every probability measure, the corresponding set of random functions is compact, and, therefore, the corresponding restricted inverse problem is well-defined. The resulting technique turns out to be interestingly related to the qualitative esthetic measure introduced by G. Birkhoff as order/complexity.

  5. Random dopant fluctuations and statistical variability in n-channel junctionless FETs

    NASA Astrophysics Data System (ADS)

    Akhavan, N. D.; Umana-Membreno, G. A.; Gu, R.; Antoszewski, J.; Faraone, L.

    2018-01-01

    The influence of random dopant fluctuations on the statistical variability of the electrical characteristics of an n-channel silicon junctionless nanowire transistor (JNT) has been studied using three-dimensional quantum simulations based on the non-equilibrium Green's function (NEGF) formalism. Average randomly distributed body doping densities of 2 × 10¹⁹, 6 × 10¹⁹ and 1 × 10²⁰ cm⁻³ have been considered employing an atomistic model for JNTs with gate lengths of 5, 10 and 15 nm. We demonstrate that by properly adjusting the doping density in the JNT, a near ideal statistical variability and electrical performance can be achieved, which can pave the way for the continuation of scaling in silicon CMOS technology.

  6. How do physicians learn to provide palliative care?

    PubMed

    Schulman-Green, Dena

    2003-01-01

    Medical interns, residents, and fellows are heavily involved in caring for dying patients and interacting with their families. Due to a lack of formal medical education in the area, these house staff often have a limited knowledge of palliative care. The purpose of this study was to determine how, given inadequate formal education, house staff learn to provide palliative care. Specifically, this study sought to explore the extent to which physicians learn to provide palliative care through formal medical education, from physicians and other hospital staff, and by on-the-job learning. Twenty physicians were interviewed about their medical education and other learning experiences in palliative care. ATLAS/ti software was used for data coding and analysis. Analysis of transcripts indicated that house staff learn little to nothing through formal education, to varying degrees from attending physicians and hospital staff, and mostly on the job and by making mistakes.

  7. ADGS-2100 Adaptive Display and Guidance System Window Manager Analysis

    NASA Technical Reports Server (NTRS)

    Whalen, Mike W.; Innis, John D.; Miller, Steven P.; Wagner, Lucas G.

    2006-01-01

    Recent advances in modeling languages have made it feasible to formally specify and analyze the behavior of large system components. Synchronous data flow languages, such as Lustre, SCR, and RSML-e are particularly well suited to this task, and commercial versions of these tools such as SCADE and Simulink are growing in popularity among designers of safety critical systems, largely due to their ability to automatically generate code from the models. At the same time, advances in formal analysis tools have made it practical to formally verify important properties of these models to ensure that design defects are identified and corrected early in the lifecycle. This report describes how these tools have been applied to the ADGS-2100 Adaptive Display and Guidance Window Manager being developed by Rockwell Collins Inc. This work demonstrates how formal methods can be easily and cost-efficiently used to remove defects early in the design cycle.

  8. Statistical mechanics of complex neural systems and high dimensional data

    NASA Astrophysics Data System (ADS)

    Advani, Madhu; Lahiri, Subhaneil; Ganguli, Surya

    2013-03-01

    Recent experimental advances in neuroscience have opened new vistas into the immense complexity of neuronal networks. This proliferation of data challenges us on two parallel fronts. First, how can we form adequate theoretical frameworks for understanding how dynamical network processes cooperate across widely disparate spatiotemporal scales to solve important computational problems? Second, how can we extract meaningful models of neuronal systems from high dimensional datasets? To aid in these challenges, we give a pedagogical review of a collection of ideas and theoretical methods arising at the intersection of statistical physics, computer science and neurobiology. We introduce the interrelated replica and cavity methods, which originated in statistical physics as powerful ways to quantitatively analyze large highly heterogeneous systems of many interacting degrees of freedom. We also introduce the closely related notion of message passing in graphical models, which originated in computer science as a distributed algorithm capable of solving large inference and optimization problems involving many coupled variables. We then show how both the statistical physics and computer science perspectives can be applied in a wide diversity of contexts to problems arising in theoretical neuroscience and data analysis. Along the way we discuss spin glasses, learning theory, illusions of structure in noise, random matrices, dimensionality reduction and compressed sensing, all within the unified formalism of the replica method. Moreover, we review recent conceptual connections between message passing in graphical models, and neural computation and learning. Overall, these ideas illustrate how statistical physics and computer science might provide a lens through which we can uncover emergent computational functions buried deep within the dynamical complexities of neuronal networks.
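
    One of the data-analysis topics the review surveys, compressed sensing, can be illustrated in a few lines of Python (a toy demonstration, not code from the review): a k-sparse signal of dimension n is recovered from m << n random linear measurements by L1-regularized least squares.

        import numpy as np
        from sklearn.linear_model import Lasso

        rng = np.random.default_rng(2)
        n, m, k = 200, 60, 5
        x_true = np.zeros(n)
        x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k)

        A = rng.normal(size=(m, n)) / np.sqrt(m)   # random measurement matrix
        y = A @ x_true                             # noiseless measurements

        x_hat = Lasso(alpha=1e-3, max_iter=50_000).fit(A, y).coef_
        print("relative recovery error:",
              float(np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)))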

  9. Selecting the right statistical model for analysis of insect count data by using information theoretic measures.

    PubMed

    Sileshi, G

    2006-10-01

    Researchers and regulatory agencies often make statistical inferences from insect count data using modelling approaches that assume homogeneous variance. Such models do not allow for formal appraisal of variability which in its different forms is the subject of interest in ecology. Therefore, the objectives of this paper were to (i) compare models suitable for handling variance heterogeneity and (ii) select optimal models to ensure valid statistical inferences from insect count data. The log-normal, standard Poisson, Poisson corrected for overdispersion, zero-inflated Poisson, the negative binomial distribution and zero-inflated negative binomial models were compared using six count datasets on foliage-dwelling insects and five families of soil-dwelling insects. Akaike's and Schwarz Bayesian information criteria were used for comparing the various models. Over 50% of the counts were zeros even in locally abundant species such as Ootheca bennigseni Weise, Mesoplatys ochroptera Stål and Diaecoderus spp. The Poisson model after correction for overdispersion and the standard negative binomial distribution model provided better description of the probability distribution of seven out of the 11 insects than the log-normal, standard Poisson, zero-inflated Poisson or zero-inflated negative binomial models. It is concluded that excess zeros and variance heterogeneity are common data phenomena in insect counts. If not properly modelled, these properties can invalidate the normal distribution assumptions resulting in biased estimation of ecological effects and jeopardizing the integrity of the scientific inferences. Therefore, it is recommended that statistical models appropriate for handling these data properties be selected using objective criteria to ensure efficient statistical inference.
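
    The model-selection workflow described above can be sketched in Python with statsmodels on simulated, zero-heavy counts (not the insect data): fit Poisson, negative binomial and zero-inflated Poisson models with an intercept-only design and compare them by AIC.

        import numpy as np
        import statsmodels.api as sm
        from statsmodels.discrete.count_model import ZeroInflatedPoisson

        rng = np.random.default_rng(3)
        lam = rng.gamma(shape=0.5, scale=4.0, size=300)            # overdispersed means
        y = np.where(rng.random(300) < 0.4, 0, rng.poisson(lam))   # extra structural zeros
        X = np.ones((y.size, 1))                                   # intercept-only design

        fits = {
            "Poisson":     sm.Poisson(y, X).fit(disp=0),
            "NegBinomial": sm.NegativeBinomial(y, X).fit(disp=0),
            "ZIPoisson":   ZeroInflatedPoisson(y, X).fit(disp=0),
        }
        for name, res in fits.items():
            print(f"{name:12s} AIC = {res.aic:8.1f}")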

  10. Assessment of change in conservation attitudes through zoo education

    NASA Astrophysics Data System (ADS)

    Randall, Teresa

    2011-12-01

    This study was conducted at the Oklahoma City Zoo in fall 2010, and subjects were students aged 14-18 who either participated in a formal conservation education class led by zoo educators or in a field trip in which they were engaged in free-choice learning. The two research questions were: 1) Does a trip to the zoo affect conservation attitudes? and 2) Does learning experience, free-choice or formal, affect conservation attitudes? A criterion group design was used, and the instrument used to measure conservation attitudes was Tool 4 from the Visitor Evaluation Toolbox produced by the Association of Zoos and Aquariums MIRP study (Falk, J., Bronnenkant, K., Vernon, C., & Heimlich, J., 2009). Group one (N=110) engaged in a free-choice (field trip only) experience and group two (N=367) engaged in a formal conservation education class. The survey was administered retrospectively to both groups upon completion of their learning experience at the zoo. Statistical analysis was conducted using SPSS 17.0. A paired-sample t-test showed that the overall mean within both groups increased in a positive direction from 67.965 (retrospective) to 72.345 (present). With alpha set at .05, the two-tailed probability was <0.001, confirming that the change in conservation attitudes was significant. An independent-sample t-test of the change in scores between the groups produced p values of 0.792 and 0.773, revealing that the between-group difference was not significant. Findings illustrate that a trip to the zoo positively and significantly affected conservation attitudes among teens and that the type of learning experience did not significantly affect the change in conservation attitude scores.
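
    The two analyses reported above are standard paired and independent t-tests; a minimal Python sketch on made-up scores (not the study data) looks like this:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(4)
        retro = rng.normal(68, 8, size=110)
        present = retro + rng.normal(4.4, 5, size=110)    # positive attitude shift

        t_paired, p_paired = stats.ttest_rel(present, retro)
        print("within-group change:      t = %.2f, p = %.2g" % (t_paired, p_paired))

        change_free_choice = present - retro
        change_formal = rng.normal(4.4, 5, size=367)      # a similar shift in the other group
        t_ind, p_ind = stats.ttest_ind(change_free_choice, change_formal, equal_var=False)
        print("between-group difference: t = %.2f, p = %.2g" % (t_ind, p_ind))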

  11. Formal Methods for Verification and Validation of Partial Specifications: A Case Study

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Callahan, John

    1997-01-01

    This paper describes our work exploring the suitability of formal specification methods for independent verification and validation (IV&V) of software specifications for large, safety critical systems. An IV&V contractor often has to perform rapid analysis on incomplete specifications, with no control over how those specifications are represented. Lightweight formal methods show significant promise in this context, as they offer a way of uncovering major errors, without the burden of full proofs of correctness. We describe a case study of the use of partial formal models for V&V of the requirements for Fault Detection Isolation and Recovery on the space station. We conclude that the insights gained from formalizing a specification are valuable, and it is the process of formalization, rather than the end product that is important. It was only necessary to build enough of the formal model to test the properties in which we were interested. Maintenance of fidelity between multiple representations of the same requirements (as they evolve) is still a problem, and deserves further study.

  12. The inner formal structure of the H-T-P drawings: an exploratory study.

    PubMed

    Vass, Z

    1998-08-01

    The study describes some interrelated patterns of traits of the House-Tree-Person (H-T-P) drawings with the instruments of hierarchical cluster analysis. First, according to the literature, 17 formal or structural aspects of the projective drawings were collected, after which a detailed manual for coding was compiled. Second, the interrater reliability and the consistency of this manual were tested. Third, the hierarchical cluster structure of the reliable and consistent formal aspects was analysed. Results are: (a) a psychometrically tested coding manual of the investigated formal-structural aspects, each of them illustrated with drawings that showed the highest interrater agreement; and (b) the hierarchic cluster structure of the formal aspects of the H-T-P drawings of "normal" adults.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nunes, Rafael C.; Abreu, Everton M.C.; Neto, Jorge Ananias

    Based on the relationship between thermodynamics and gravity we propose, with the aid of Verlinde's formalism, an alternative interpretation of the dynamical evolution of the Friedmann-Robertson-Walker Universe. This description takes into account the entropy and temperature intrinsic to the horizon of the universe due to the information holographically stored there through non-Gaussian statistical theories proposed by Tsallis and Kaniadakis. The effect of these non-Gaussian statistics in the cosmological context is to change the strength of the gravitational constant. In this paper, we consider the wCDM model modified by the non-Gaussian statistics and investigate the compatibility of these non-Gaussian modifications with the cosmological observations. In order to analyze to what extent the cosmological data constrain these non-extensive statistics, we use type Ia supernovae, baryon acoustic oscillations, the Hubble expansion rate function and the linear growth of matter density perturbations data. We show that Tsallis' statistics is favored at the 1σ confidence level.

  14. Data-based Non-Markovian Model Inference

    NASA Astrophysics Data System (ADS)

    Ghil, Michael

    2015-04-01

    This talk concentrates on obtaining stable and efficient data-based models for simulation and prediction in the geosciences and life sciences. The proposed model derivation relies on using a multivariate time series of partial observations from a large-dimensional system, and the resulting low-order models are compared with the optimal closures predicted by the non-Markovian Mori-Zwanzig formalism of statistical physics. Multilayer stochastic models (MSMs) are introduced as both a very broad generalization and a time-continuous limit of existing multilevel, regression-based approaches to data-based closure, in particular of empirical model reduction (EMR). We show that the multilayer structure of MSMs can provide a natural Markov approximation to the generalized Langevin equation (GLE) of the Mori-Zwanzig formalism. A simple correlation-based stopping criterion for an EMR-MSM model is derived to assess how well it approximates the GLE solution. Sufficient conditions are given for the nonlinear cross-interactions between the constitutive layers of a given MSM to guarantee the existence of a global random attractor. This existence ensures that no blow-up can occur for a very broad class of MSM applications. The EMR-MSM methodology is first applied to a conceptual, nonlinear, stochastic climate model of coupled slow and fast variables, in which only slow variables are observed. The resulting reduced model with energy-conserving nonlinearities captures the main statistical features of the slow variables, even when there is no formal scale separation and the fast variables are quite energetic. Second, an MSM is shown to successfully reproduce the statistics of a partially observed, generalized Lotka-Volterra model of population dynamics in its chaotic regime. The positivity constraint on the solutions' components here replaces the quadratic-energy-preserving constraint of fluid-flow problems and it successfully prevents blow-up. This work is based on a close collaboration with M.D. Chekroun, D. Kondrashov, S. Kravtsov and A.W. Robertson.

  15. Uncertainty and inference in the world of paleoecological data

    NASA Astrophysics Data System (ADS)

    McLachlan, J. S.; Dawson, A.; Dietze, M.; Finley, M.; Hooten, M.; Itter, M.; Jackson, S. T.; Marlon, J. R.; Raiho, A.; Tipton, J.; Williams, J.

    2017-12-01

    Proxy data in paleoecology and paleoclimatology share a common set of biases and uncertainties: spatiotemporal error associated with the taphonomic processes of deposition, preservation, and dating; calibration error between proxy data and the ecosystem states of interest; and error in the interpolation of calibrated estimates across space and time. Researchers often account for this daunting suite of challenges by applying qualitative expert judgment: inferring the past states of ecosystems and assessing the level of uncertainty in those states subjectively. The effectiveness of this approach can be seen by the extent to which future observations confirm previous assertions. Hierarchical Bayesian (HB) statistical approaches offer an alternative way of accounting for multiple uncertainties in paleo data. HB estimates of ecosystem state formally account for each of the common uncertainties listed above. HB approaches can readily incorporate additional data, and data of different types, into estimates of ecosystem state. And HB estimates of ecosystem state, with associated uncertainty, can be used to constrain forecasts of ecosystem dynamics based on mechanistic ecosystem models using data assimilation. Decisions about how to structure an HB model are also subjective, which creates a parallel framework for deciding how to interpret data from the deep past. Our group, the Paleoecological Observatory Network (PalEON), has applied hierarchical Bayesian statistics to formally account for uncertainties in proxy-based estimates of past climate, fire, primary productivity, biomass, and vegetation composition. Our estimates often reveal new patterns of past ecosystem change, which is an unambiguously good thing, but we also often estimate a level of uncertainty that is uncomfortably high for many researchers. High levels of uncertainty are due to several features of the HB approach: spatiotemporal smoothing, the formal aggregation of multiple types of uncertainty, and a coarseness in statistical models of taphonomic process. Each of these features provides useful opportunities for statisticians and data-generating researchers to assess what we know about the signal and the noise in paleo data and to improve inference about past changes in ecosystem state.
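
    A toy Python sketch of a hierarchical Bayesian model of the general kind described above (all names and numbers are invented, and the real PalEON models are far richer): noisy site-level proxy estimates y_j of true states theta_j, which are in turn drawn around a regional mean mu. A short Gibbs sampler with known variances shows how the hierarchy pools information and propagates uncertainty.

        import numpy as np

        rng = np.random.default_rng(6)
        J, sigma, tau = 8, 1.0, 0.5                       # sites, obs. error, site spread (assumed)
        theta_true = rng.normal(10.0, tau, size=J)
        y = theta_true + rng.normal(0.0, sigma, size=J)   # one proxy estimate per site

        mu, keep_mu, keep_theta = y.mean(), [], []
        for it in range(5000):
            prec = 1 / sigma**2 + 1 / tau**2              # theta_j | mu, y_j (conjugate normal)
            mean = (y / sigma**2 + mu / tau**2) / prec
            theta = rng.normal(mean, np.sqrt(1 / prec))
            mu = rng.normal(theta.mean(), tau / np.sqrt(J))  # mu | theta, flat prior
            if it >= 1000:                                # discard burn-in
                keep_mu.append(mu); keep_theta.append(theta)

        keep_theta = np.array(keep_theta)
        print("site 0 posterior: mean %.2f, sd %.2f"
              % (keep_theta[:, 0].mean(), keep_theta[:, 0].std()))
        print("regional mean posterior: %.2f +/- %.2f" % (np.mean(keep_mu), np.std(keep_mu)))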

  16. A review of geographic variation and Geographic Information Systems (GIS) applications in prescription drug use research.

    PubMed

    Wangia, Victoria; Shireman, Theresa I

    2013-01-01

    While understanding geography's role in healthcare has been an area of research for over 40 years, the application of geography-based analyses to prescription medication use is limited. The body of literature was reviewed to assess the current state of such studies to demonstrate the scale and scope of projects in order to highlight potential research opportunities. To review systematically how researchers have applied geography-based analyses to medication use data. Empiric, English language research articles were identified through PubMed and bibliographies. Original research articles were independently reviewed as to the medications or classes studied, data sources, measures of medication exposure, geographic units of analysis, geospatial measures, and statistical approaches. From 145 publications matching key search terms, forty publications met the inclusion criteria. Cardiovascular and psychotropic classes accounted for the largest proportion of studies. Prescription drug claims were the primary source, and medication exposure was frequently captured as period prevalence. Medication exposure was documented across a variety of geopolitical units such as countries, provinces, regions, states, and postal codes. Most results were descriptive and formal statistical modeling capitalizing on geospatial techniques was rare. Despite the extensive research on small area variation analysis in healthcare, there are a limited number of studies that have examined geographic variation in medication use. Clearly, there is opportunity to collaborate with geographers and GIS professionals to harness the power of GIS technologies and to strengthen future medication studies by applying more robust geospatial statistical methods. Copyright © 2013 Elsevier Inc. All rights reserved.

  17. Knowledge, skills and attitudes of hospital pharmacists in the use of information technology and electronic tools to support clinical practice: A Brazilian survey

    PubMed Central

    Vasconcelos, Hemerson Bruno da Silva; Woods, David John

    2017-01-01

    This study aimed to identify the knowledge, skills and attitudes of Brazilian hospital pharmacists in the use of information technology and electronic tools to support clinical practice. Methods: A questionnaire was sent by email to clinical pharmacists working in public and private hospitals in Brazil. The instrument was validated using the method of Polit and Beck to determine the content validity index. Data (n = 348) were analyzed using descriptive statistics, Pearson's Chi-square test and Gamma correlation tests. Results: Pharmacists had 1–4 electronic devices for personal use, mainly smartphones (84.8%; n = 295) and laptops (81.6%; n = 284). At work, pharmacists had access to a computer (89.4%; n = 311), mostly connected to the internet (83.9%; n = 292). They felt competent (very capable/capable) searching for a web page/web site on a specific subject (100%; n = 348), downloading files (99.7%; n = 347), using spreadsheets (90.2%; n = 314), searching using MeSH terms in PubMed (97.4%; n = 339) and general searching for articles in bibliographic databases (such as Medline/PubMed: 93.4%; n = 325). Pharmacists did not feel competent in using statistical analysis software (somewhat capable/incapable: 78.4%; n = 273). Most pharmacists reported that they had not received formal education to perform most of these actions except searching using MeSH terms. Access to bibliographic databases was available in Brazilian hospitals; however, most pharmacists (78.7%; n = 274) reported daily use of a non-specific search engine such as Google. This result may reflect the lack of formal knowledge and training in the use of bibliographic databases and difficulty with the English language. The need to expand knowledge about information search tools was recognized by most pharmacists in clinical practice in Brazil, especially those with less time dedicated exclusively to clinical activity (Chi-square, p = 0.006). Conclusion: These results will assist in defining minimal competencies for the training of pharmacists in the field of information technology to support clinical practice. Knowledge and skill gaps are evident in the use of bibliographic databases, spreadsheets and statistical tools. PMID:29272292

  18. Knowledge, skills and attitudes of hospital pharmacists in the use of information technology and electronic tools to support clinical practice: A Brazilian survey.

    PubMed

    Néri, Eugenie Desirèe Rabelo; Meira, Assuero Silva; Vasconcelos, Hemerson Bruno da Silva; Woods, David John; Fonteles, Marta Maria de França

    2017-01-01

    This study aimed to identify the knowledge, skills and attitudes of Brazilian hospital pharmacists in the use of information technology and electronic tools to support clinical practice. A questionnaire was sent by email to clinical pharmacists working in public and private hospitals in Brazil. The instrument was validated using the method of Polit and Beck to determine the content validity index. Data (n = 348) were analyzed using descriptive statistics, Pearson's Chi-square test and Gamma correlation tests. Pharmacists had 1-4 electronic devices for personal use, mainly smartphones (84.8%; n = 295) and laptops (81.6%; n = 284). At work, pharmacists had access to a computer (89.4%; n = 311), mostly connected to the internet (83.9%; n = 292). They felt competent (very capable/capable) searching for a web page/web site on a specific subject (100%; n = 348), downloading files (99.7%; n = 347), using spreadsheets (90.2%; n = 314), searching using MeSH terms in PubMed (97.4%; n = 339) and general searching for articles in bibliographic databases (such as Medline/PubMed: 93.4%; n = 325). Pharmacists did not feel competent in using statistical analysis software (somewhat capable/incapable: 78.4%; n = 273). Most pharmacists reported that they had not received formal education to perform most of these actions except searching using MeSH terms. Access to bibliographic databases was available in Brazilian hospitals; however, most pharmacists (78.7%; n = 274) reported daily use of a non-specific search engine such as Google. This result may reflect the lack of formal knowledge and training in the use of bibliographic databases and difficulty with the English language. The need to expand knowledge about information search tools was recognized by most pharmacists in clinical practice in Brazil, especially those with less time dedicated exclusively to clinical activity (Chi-square, p = 0.006). These results will assist in defining minimal competencies for the training of pharmacists in the field of information technology to support clinical practice. Knowledge and skill gaps are evident in the use of bibliographic databases, spreadsheets and statistical tools.
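
    A hedged illustration of the kind of association test reported above (Pearson's chi-square on a 2x2 table), in Python; the counts are invented, not the survey data.

        import numpy as np
        from scipy.stats import chi2_contingency

        #                 recognizes need to expand knowledge:   yes    no
        table = np.array([[180, 40],     # less time dedicated to clinical activity
                          [110, 60]])    # more time dedicated to clinical activity
        chi2, p, dof, expected = chi2_contingency(table)
        print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4g}")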

  19. MTS dye based colorimetric CTLL-2 cell proliferation assay for product release and stability monitoring of interleukin-15: assay qualification, standardization and statistical analysis.

    PubMed

    Soman, Gopalan; Yang, Xiaoyi; Jiang, Hengguang; Giardina, Steve; Vyas, Vinay; Mitra, George; Yovandich, Jason; Creekmore, Stephen P; Waldmann, Thomas A; Quiñones, Octavio; Alvord, W Gregory

    2009-08-31

    A colorimetric cell proliferation assay using soluble tetrazolium salt [(CellTiter 96(R) Aqueous One Solution) cell proliferation reagent, containing the (3-(4,5-dimethylthiazol-2-yl)-5-(3-carboxymethoxyphenyl)-2-(4-sulfophenyl)-2H-tetrazolium, inner salt) and an electron coupling reagent phenazine ethosulfate], was optimized and qualified for quantitative determination of IL-15 dependent CTLL-2 cell proliferation activity. An in-house recombinant Human (rHu)IL-15 reference lot was standardized (IU/mg) against an international reference standard. Specificity of the assay for IL-15 was documented by illustrating the ability of neutralizing anti-IL-15 antibodies to block the product specific CTLL-2 cell proliferation and the lack of blocking effect with anti-IL-2 antibodies. Under the defined assay conditions, the linear dose-response concentration range was between 0.04 and 0.17 ng/ml of the rHuIL-15 produced in-house and 0.5-3.0 IU/ml for the international standard. Statistical analysis of the data was performed with the use of scripts written in the R Statistical Language and Environment utilizing a four-parameter logistic regression fit analysis procedure. The overall variation in the ED50 values for the in-house reference standard from 55 independent estimates performed over a period of 1 year was 12.3% of the average. Excellent intra-plate and within-day/inter-plate consistency was observed for all four parameter estimates in the model. Different preparations of rHuIL-15 showed excellent intra-plate consistency in the parameter estimates corresponding to the lower and upper asymptotes as well as to the 'slope' factor at the mid-point. The ED50 values showed statistically significant differences for different lots and for control versus stressed samples. Three R-scripts improve data analysis capabilities allowing one to describe assay variations, to draw inferences between data sets from formal statistical tests, and to set up improved assay acceptance criteria based on comparability and consistency in the four parameters of the model. The assay is precise, accurate and robust and can be fully validated. Applications of the assay were established including process development support, release of the rHuIL-15 product for pre-clinical and clinical studies, and for monitoring storage stability.
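
    A Python sketch of a four-parameter logistic (4PL) dose-response fit of the kind used for the ED50 estimates above, on simulated absorbance readings (not assay data); the parameterization and starting values are illustrative.

        import numpy as np
        from scipy.optimize import curve_fit

        def four_pl(x, bottom, top, ed50, slope):
            # response rises from "bottom" to "top" with half-maximal response at ed50
            return bottom + (top - bottom) / (1.0 + (ed50 / x) ** slope)

        rng = np.random.default_rng(7)
        conc = np.geomspace(0.01, 1.0, 12)                          # ng/mL, illustrative
        signal = four_pl(conc, 0.1, 1.5, 0.09, 2.0) + rng.normal(0, 0.03, conc.size)

        popt, pcov = curve_fit(four_pl, conc, signal, p0=[0.1, 1.5, 0.1, 1.0])
        ed50, ed50_se = popt[2], np.sqrt(np.diag(pcov))[2]
        print(f"ED50 = {ed50:.3f} +/- {ed50_se:.3f} ng/mL")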

  20. Optimism bias leads to inconclusive results - an empirical study

    PubMed Central

    Djulbegovic, Benjamin; Kumar, Ambuj; Magazin, Anja; Schroen, Anneke T.; Soares, Heloisa; Hozo, Iztok; Clarke, Mike; Sargent, Daniel; Schell, Michael J.

    2010-01-01

    Objective: Optimism bias refers to unwarranted belief in the efficacy of new therapies. We assessed the impact of optimism bias on the proportion of trials that did not answer their research question successfully, and explored whether poor accrual or optimism bias is responsible for inconclusive results. Study Design: Systematic review. Setting: Retrospective analysis of a consecutive series of phase III randomized controlled trials (RCTs) performed under the aegis of National Cancer Institute Cooperative groups. Results: 359 trials (374 comparisons) enrolling 150,232 patients were analyzed. 70% (262/374) of the trials generated conclusive results according to the statistical criteria. Investigators made definitive statements related to the treatment preference in 73% (273/374) of studies. Investigators' judgments and statistical inferences were concordant in 75% (279/374) of trials. Investigators consistently overestimated their expected treatment effects, but to a significantly larger extent for inconclusive trials. The median ratio of expected over observed hazard ratio or odds ratio was 1.34 (range 0.19-15.40) in conclusive trials compared to 1.86 (range 1.09-12.00) in inconclusive studies (p<0.0001). Only 17% of the trials had treatment effects that matched original researchers' expectations. Conclusion: Formal statistical inference is sufficient to answer the research question in 75% of RCTs. The answers to the other 25% depend mostly on subjective judgments, which at times are in conflict with statistical inference. Optimism bias significantly contributes to inconclusive results. PMID:21163620

  1. Optimism bias leads to inconclusive results-an empirical study.

    PubMed

    Djulbegovic, Benjamin; Kumar, Ambuj; Magazin, Anja; Schroen, Anneke T; Soares, Heloisa; Hozo, Iztok; Clarke, Mike; Sargent, Daniel; Schell, Michael J

    2011-06-01

    Optimism bias refers to unwarranted belief in the efficacy of new therapies. We assessed the impact of optimism bias on a proportion of trials that did not answer their research question successfully and explored whether poor accrual or optimism bias is responsible for inconclusive results. Systematic review. Retrospective analysis of a consecutive-series phase III randomized controlled trials (RCTs) performed under the aegis of National Cancer Institute Cooperative groups. Three hundred fifty-nine trials (374 comparisons) enrolling 150,232 patients were analyzed. Seventy percent (262 of 374) of the trials generated conclusive results according to the statistical criteria. Investigators made definitive statements related to the treatment preference in 73% (273 of 374) of studies. Investigators' judgments and statistical inferences were concordant in 75% (279 of 374) of trials. Investigators consistently overestimated their expected treatment effects but to a significantly larger extent for inconclusive trials. The median ratio of expected and observed hazard ratio or odds ratio was 1.34 (range: 0.19-15.40) in conclusive trials compared with 1.86 (range: 1.09-12.00) in inconclusive studies (P<0.0001). Only 17% of the trials had treatment effects that matched original researchers' expectations. Formal statistical inference is sufficient to answer the research question in 75% of RCTs. The answers to the other 25% depend mostly on subjective judgments, which at times are in conflict with statistical inference. Optimism bias significantly contributes to inconclusive results. Copyright © 2011 Elsevier Inc. All rights reserved.

  2. Multivariate assessment of event-related potentials with the t-CWT method.

    PubMed

    Bostanov, Vladimir

    2015-11-05

    Event-related brain potentials (ERPs) are usually assessed with univariate statistical tests although they are essentially multivariate objects. Brain-computer interface applications are a notable exception to this practice, because they are based on multivariate classification of single-trial ERPs. Multivariate ERP assessment can be facilitated by feature extraction methods. One such method is t-CWT, a mathematical-statistical algorithm based on the continuous wavelet transform (CWT) and Student's t-test. This article begins with a geometric primer on some basic concepts of multivariate statistics as applied to ERP assessment in general and to the t-CWT method in particular. Further, it presents for the first time a detailed, step-by-step, formal mathematical description of the t-CWT algorithm. A new multivariate outlier rejection procedure based on principal component analysis in the frequency domain is presented as an important pre-processing step. The MATLAB and GNU Octave implementation of t-CWT is also made publicly available for the first time as free and open source code. The method is demonstrated on some example ERP data obtained in a passive oddball paradigm. Finally, some conceptually novel applications of the multivariate approach in general and of the t-CWT method in particular are suggested and discussed. Hopefully, the publication of both the t-CWT source code and its underlying mathematical algorithm along with a didactic geometric introduction to some basic concepts of multivariate statistics would make t-CWT more accessible to both users and developers in the field of neuroscience research.
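
    A much-simplified Python sketch of the t-CWT idea (an assumption-laden toy, not the published algorithm or its MATLAB/Octave implementation): convolve single-trial ERPs with Ricker (Mexican-hat) wavelets at several scales, then screen the resulting coefficients with Student's t-test between two experimental conditions.

        import numpy as np
        from scipy.stats import ttest_ind

        def ricker(width, n):
            t = np.arange(n) - n // 2
            return (1 - (t / width) ** 2) * np.exp(-t**2 / (2 * width**2))

        rng = np.random.default_rng(8)
        n_trials, n_samples = 40, 256
        component = np.exp(-((np.arange(n_samples) - 150) ** 2) / (2 * 20.0**2))  # toy "P300"
        cond_a = rng.normal(0, 1.0, (n_trials, n_samples)) + 0.8 * component      # targets
        cond_b = rng.normal(0, 1.0, (n_trials, n_samples))                        # standards

        widths = [5, 10, 20, 40]
        def cwt_features(trials):
            # returns (trials, scales, samples) array of wavelet coefficients
            return np.stack([np.apply_along_axis(
                lambda row: np.convolve(row, ricker(w, 6 * w), mode="same"), 1, trials)
                for w in widths], axis=1)

        t, p = ttest_ind(cwt_features(cond_a), cwt_features(cond_b), axis=0)
        scale_i, sample_i = np.unravel_index(np.argmax(np.abs(t)), t.shape)
        print("most discriminative width/sample:", widths[scale_i], sample_i,
              "t =", round(float(t[scale_i, sample_i]), 2))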

  3. Hunting high and low: disentangling primordial and late-time non-Gaussianity with cosmic densities in spheres

    NASA Astrophysics Data System (ADS)

    Uhlemann, C.; Pajer, E.; Pichon, C.; Nishimichi, T.; Codis, S.; Bernardeau, F.

    2018-03-01

    Non-Gaussianities of dynamical origin are disentangled from primordial ones using the formalism of large deviation statistics with spherical collapse dynamics. This is achieved by relying on accurate analytical predictions for the one-point probability distribution function and the two-point clustering of spherically averaged cosmic densities (sphere bias). Sphere bias extends the idea of halo bias to intermediate density environments and voids as underdense regions. In the presence of primordial non-Gaussianity, sphere bias displays a strong scale dependence relevant for both high- and low-density regions, which is predicted analytically. The statistics of densities in spheres are built to model primordial non-Gaussianity via an initial skewness with a scale dependence that depends on the bispectrum of the underlying model. The analytical formulas with the measured non-linear dark matter variance as input are successfully tested against numerical simulations. For local non-Gaussianity with a range from f_NL = -100 to +100, they are found to agree within 2 per cent or better for densities ρ ∈ [0.5, 3] in spheres of radius 15 Mpc h⁻¹ down to z = 0.35. The validity of the large deviation statistics formalism is thereby established for all observationally relevant local-type departures from perfectly Gaussian initial conditions. The corresponding estimators for the amplitude of the non-linear variance σ₈ and primordial skewness f_NL are validated using a fiducial joint maximum likelihood experiment. The influence of observational effects and the prospects for a future detection of primordial non-Gaussianity from joint one- and two-point densities-in-spheres statistics are discussed.

  4. Developing a Non-Formal Education and Literacy Database in the Asia-Pacific Region. Final Report of the Expert Group Consultation Meeting (Dhaka, Bangladesh, December 15-18, 1997).

    ERIC Educational Resources Information Center

    United Nations Educational, Scientific, and Cultural Organization, Bangkok (Thailand). Regional Office for Education in Asia and the Pacific.

    The objectives of the Expert Group Consultation Meeting for Developing a Non-Formal Education and Literacy Database in the Asia-Pacific Region were: to exchange information and review the state-of-the-art in the field of data collection, analysis and indicators of non-formal education and literacy programs; to examine and review the set of…

  5. What Can History Teach Us A Comparative Historical Analysis On the Reserve Officer Training Corps and the Department of Homeland Security

    DTIC Science & Technology

    2015-12-01

    professional development aspirations. An organization that realized a very similar narrative to the DHS is the Department of Defense (DOD), more...is one that finds itself embedded in several debates surrounding the development of formalized education/preparatory efforts for its core civilian... development of formalized education efforts for its workforce. There is formalized preparatory training for several different kinds of homeland security

  6. Influence Strategy: Principles and Levels of Analysis

    DTIC Science & Technology

    2011-12-01

    expended its own. The United States formally entered the war in December 1941 following the Japanese surprise attack at Pearl Harbor. Less formally...placed in key positions and the Reich Cinema Law (RLG) was introduced as a means to exercise further control. For instance, the RLG required all film...Western Europe by Germany. However, for this purpose it will not be counted until the formal declaration of war in 1941. Following the Japanese

  7. Statistical mechanics of shell models for two-dimensional turbulence

    NASA Astrophysics Data System (ADS)

    Aurell, E.; Boffetta, G.; Crisanti, A.; Frick, P.; Paladin, G.; Vulpiani, A.

    1994-12-01

    We study shell models that conserve the analogs of energy and enstrophy and hence are designed to mimic fluid turbulence in two dimensions (2D). The main result is that the observed state is well described as a formal statistical equilibrium, closely analogous to the approach to two-dimensional ideal hydrodynamics of Onsager [Nuovo Cimento Suppl. 6, 279 (1949)], Hopf [J. Rat. Mech. Anal. 1, 87 (1952)], and Lee [Q. Appl. Math. 10, 69 (1952)]. In the presence of forcing and dissipation we observe a forward flux of enstrophy and a backward flux of energy. These fluxes can be understood as mean diffusive drifts from a source to two sinks in a system which is close to local equilibrium with Lagrange multipliers ("shell temperatures") changing slowly with scale. This is clear evidence that the simplest shell models are not adequate to reproduce the main features of two-dimensional turbulence. The dimensional predictions on the power spectra from a supposed forward cascade of enstrophy and from one branch of the formal statistical equilibrium coincide in these shell models in contrast to the corresponding predictions for the Navier-Stokes and Euler equations in 2D. This coincidence has previously led to the mistaken conclusion that shell models exhibit a forward cascade of enstrophy. We also study the dynamical properties of the models and the growth of perturbations.

  8. Number statistics for β-ensembles of random matrices: Applications to trapped fermions at zero temperature.

    PubMed

    Marino, Ricardo; Majumdar, Satya N; Schehr, Grégory; Vivo, Pierpaolo

    2016-09-01

    Let P_β^(V)(N_I) be the probability that an N×N β-ensemble of random matrices with confining potential V(x) has N_I eigenvalues inside an interval I=[a,b] on the real line. We introduce a general formalism, based on the Coulomb gas technique and the resolvent method, to compute analytically P_β^(V)(N_I) for large N. We show that this probability scales for large N as P_β^(V)(N_I) ≈ exp[-β N² ψ^(V)(N_I/N)], where β is the Dyson index of the ensemble. The rate function ψ^(V)(k_I), independent of β, is computed in terms of single integrals that can be easily evaluated numerically. The general formalism is then applied to the classical β-Gaussian (I=[-L,L]), β-Wishart (I=[1,L]), and β-Cauchy (I=[-L,L]) ensembles. Expanding the rate function around its minimum, we find that generically the number variance var(N_I) exhibits a nonmonotonic behavior as a function of the size of the interval, with a maximum that can be precisely characterized. These analytical results, corroborated by numerical simulations, provide the full counting statistics of many systems where random matrix models apply. In particular, we present results for the full counting statistics of zero-temperature one-dimensional spinless fermions in a harmonic trap.
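
    A quick numerical illustration of these number statistics for the β = 1 (GOE) case, in Python: sample N×N GOE matrices, count eigenvalues inside a symmetric interval [-L, L], and inspect the mean and variance of that count. The scaling puts the spectrum roughly on [-2, 2]; all parameters are illustrative, not from the paper.

        import numpy as np

        rng = np.random.default_rng(9)
        N, n_samples, L = 100, 400, 0.5
        counts = []
        for _ in range(n_samples):
            A = rng.standard_normal((N, N))
            H = (A + A.T) / np.sqrt(2 * N)          # GOE matrix, spectrum roughly in [-2, 2]
            eig = np.linalg.eigvalsh(H)
            counts.append(int(np.sum((eig > -L) & (eig < L))))
        counts = np.array(counts)
        print("mean N_I = %.1f, var N_I = %.2f" % (counts.mean(), counts.var()))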

  9. On the simulation of indistinguishable fermions in the many-body Wigner formalism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sellier, J.M., E-mail: jeanmichel.sellier@gmail.com; Dimov, I.

    2015-01-01

    The simulation of quantum systems consisting of interacting, indistinguishable fermions is an extraordinarily difficult mathematical problem which poses formidable numerical challenges. Many sophisticated methods addressing this problem are available which are based on the many-body Schrödinger formalism. Recently a Monte Carlo technique for the resolution of the many-body Wigner equation has been introduced and successfully applied to the simulation of distinguishable, spinless particles. This numerical approach presents several advantages over other methods. Indeed, it is based on an intuitive formalism in which quantum systems are described in terms of a quasi-distribution function, and it is highly scalable due to its Monte Carlo nature. In this work, we extend the many-body Wigner Monte Carlo method to the simulation of indistinguishable fermions. To this end, we first show how fermions are incorporated into the Wigner formalism. Then we demonstrate that the Pauli exclusion principle is intrinsic to the formalism. As a matter of fact, a numerical simulation of two strongly interacting fermions (electrons) is performed which clearly shows the appearance of a Fermi (or exchange-correlation) hole in the phase-space, a clear signature of the presence of the Pauli principle. To conclude, we simulate 4, 8 and 16 non-interacting fermions, isolated in a closed box, and show that, as the number of fermions increases, we gradually recover the Fermi-Dirac statistics, a clear proof of the reliability of our proposed method for the treatment of indistinguishable particles.

  10. Dietary diversity of formal and informal residents in Johannesburg, South Africa

    PubMed Central

    2013-01-01

    Background This paper considers the question of dietary diversity as a proxy for nutrition insecurity in communities living in the inner city and the urban informal periphery in Johannesburg. It argues that the issue of nutrition insecurity demands urgent and immediate attention by policy makers. Methods A cross-sectional survey was undertaken for households from urban informal (n = 195) and urban formal (n = 292) areas in Johannesburg, South Africa. Foods consumed by the respondents the previous day were used to calculate a Dietary Diversity Score; a score < 4 was considered low. Results Statistical comparisons of means between groups revealed that respondents from informal settlements consumed mostly cereals and meat/poultry/fish, while respondents in formal settlements consumed a more varied diet. Significantly more respondents living in informal settlements consumed a diet of low diversity (68.1%) versus those in formal settlements (15.4%). When grouped in quintiles, two-thirds of respondents from informal settlements fell in the lowest two, versus 15.4% living in formal settlements. Households who experienced periods of food shortages during the previous 12 months had a lower mean DDS than those from food secure households (4.00 ± 1.6 versus 4.36 ± 1.7; p = 0.026). Conclusions Respondents in the informal settlements were more nutritionally vulnerable. Achieving nutrition security requires policies, strategies and plans to include specific nutrition considerations. PMID:24088249

  11. Evidence-based surgery: knowledge, attitudes, and perceived barriers among surgical trainees.

    PubMed

    Mittal, Rohin; Perakath, Benjamin

    2010-01-01

    This study was conducted to assess the knowledge and attitude of surgical trainees toward evidence-based medicine (EBM) and their perceived barriers to its practice. The McColl questionnaire and the BARRIERS scale were modified and incorporated into a single questionnaire, which was administered to all surgical trainees attending a Continuing Surgical Education meeting. Department of Surgery, Christian Medical College, Vellore, India. One hundred ten surgical trainees from 22 medical colleges. In all, 84.5% (93/110) of trainees returned the questionnaire. The attitudes toward EBM were welcoming, although individual participants reported they welcomed EBM more than their colleagues did. Participants agreed that EBM was useful in everyday practice and that it improved patient care. About 50% of actual practice was considered evidence based. In all, 12.6% (10/89) of participants had received formal training in EBM, and 64.3% (54/84) of participants were aware of the Cochrane database of systemic reviews, but only 35.7% (30/84) read it regularly. Also, 67.8% (61/90) of respondents used protocols and guidelines developed by colleagues. However, 61.5% (56/91) of participants were interested in learning the skills of EBM. The terms absolute risk, relative risk, and clinical effectiveness were understood by >80% of respondents, whereas publication bias, confidence interval, and heterogeneity were poorly understood. The major barriers to practice of EBM were the inability to understand statistical analysis, inadequate facilities for implementation, lack of a single compiled source of literature, relevant literature not being readily available, and insufficient time on the job. Surgical trainees have a positive attitude towards EBM and have some familiarity with the common terms used in EBM. There is a need to increase awareness of, and provide access to, available sources of medical literature. Formal training in EBM, as well as basic statistical analysis, should form a part of the surgical curriculum to foster an environment favorable to the practice of EBM. Copyright © 2010 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  12. INVESTIGATING DIFFERENCES IN BRAIN FUNCTIONAL NETWORKS USING HIERARCHICAL COVARIATE-ADJUSTED INDEPENDENT COMPONENT ANALYSIS.

    PubMed

    Shi, Ran; Guo, Ying

    2016-12-01

    Human brains perform tasks via complex functional networks consisting of separated brain regions. A popular approach to characterize brain functional networks in fMRI studies is independent component analysis (ICA), which is a powerful method to reconstruct latent source signals from their linear mixtures. In many fMRI studies, an important goal is to investigate how brain functional networks change according to specific clinical and demographic variabilities. Existing ICA methods, however, cannot directly incorporate covariate effects in ICA decomposition. Heuristic post-ICA analysis to address this need can be inaccurate and inefficient. In this paper, we propose a hierarchical covariate-adjusted ICA (hc-ICA) model that provides a formal statistical framework for estimating covariate effects and testing differences between brain functional networks. Our method provides a more reliable and powerful statistical tool for evaluating group differences in brain functional networks while appropriately controlling for potential confounding factors. We present an analytically tractable EM algorithm to obtain maximum likelihood estimates of our model. We also develop a subspace-based approximate EM that runs significantly faster while retaining high accuracy. To test the differences in functional networks, we introduce a voxel-wise approximate inference procedure which eliminates the need of computationally expensive covariance matrix estimation and inversion. We demonstrate the advantages of our methods over the existing method via simulation studies. We apply our method to an fMRI study to investigate differences in brain functional networks associated with post-traumatic stress disorder (PTSD).
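
    As background for readers unfamiliar with ICA itself (hc-ICA extends it with a covariate-adjusted hierarchy not shown here), the Python sketch below recovers independent sources from toy linear mixtures using scikit-learn's FastICA; the sources and mixing matrix are synthetic, not fMRI data.

        import numpy as np
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(10)
        t = np.linspace(0, 8, 2000)
        sources = np.c_[np.sin(2 * t),                  # slow oscillation
                        np.sign(np.sin(3 * t)),         # square wave
                        rng.laplace(size=t.size)]       # heavy-tailed noise source
        mixing = rng.normal(size=(3, 3))
        observed = sources @ mixing.T                   # linear mixtures ("voxel" time series)

        recovered = FastICA(n_components=3, random_state=0).fit_transform(observed)
        # Absolute correlations between recovered components and the true sources;
        # each row should have one entry near 1 (components come back permuted/flipped).
        corr = np.abs(np.corrcoef(recovered.T, sources.T)[:3, 3:])
        print(np.round(corr, 2))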

  13. Formal Analysis of BPMN Models Using Event-B

    NASA Astrophysics Data System (ADS)

    Bryans, Jeremy W.; Wei, Wei

    The use of business process models has gone far beyond documentation purposes. In the development of business applications, they can play the role of an artifact on which high level properties can be verified and design errors can be revealed in an effort to reduce overhead at later software development and diagnosis stages. This paper demonstrates how formal verification may add value to the specification, design and development of business process models in an industrial setting. The analysis of these models is achieved via an algorithmic translation from the de-facto standard business process modeling language BPMN to Event-B, a widely used formal language supported by the Rodin platform which offers a range of simulation and verification technologies.

  14. Steroid-antivirals treatment versus steroids alone for the treatment of Bell's palsy: a meta-analysis.

    PubMed

    Dong, Yabing; Zhu, Yong; Ma, Chuan; Zhao, Huaqiang

    2015-01-01

    To determine whether steroid-antiviral treatment achieves better recovery in patients with Bell's palsy than treatment with steroids alone. We conducted an exhaustive search of PubMed/Medline, Ovid, Elsevier and the Cochrane Library to collect randomized controlled trials comparing steroid-antiviral treatment with steroids alone in patients with Bell's palsy. The quality of the relevant articles was assessed with GRADE, which was used to present the overall quality of evidence as recommended by the Cochrane Handbook for Systematic Reviews of Interventions. Two investigators evaluated the papers independently and resolved disagreements by discussion. In the end, 8 eligible papers (1,816 patients: 896 treated with steroid-antivirals and 920 treated with steroids alone) met the criteria. Based on the formal test for heterogeneity (χ² = 12.57, P = 0.08, I² = 44%), a fixed-effect meta-analysis model was chosen. Facial muscle recovery differed significantly between the steroid-antiviral group and the steroids-alone group (OR = 1.52, 95% CI: 1.20-1.94), whereas adverse effects showed no statistically significant difference (OR = 1.28, 95% CI: 0.71-2.31). The present meta-analysis indicates that steroid-antiviral treatment improves the recovery rate in patients with Bell's palsy compared with steroids alone and achieves better outcomes. Clinicians should consider steroid-antiviral therapy as an alternative choice for patients with Bell's palsy.
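
    A minimal numerical sketch of the inverse-variance fixed-effect pooling and heterogeneity statistics named above is given below; the 2x2 counts are invented for illustration and are not the eight trials analyzed in the paper.

        # Fixed-effect (inverse-variance) pooling of odds ratios with Cochran's Q and I^2.
        import numpy as np

        # each row: events and total in the steroid-antiviral arm, then in the steroid arm
        trials = np.array([
            [80, 100, 70, 100],
            [45,  60, 40,  60],
            [90, 120, 78, 118],
        ])
        a, n1, c, n2 = trials.T
        b, d = n1 - a, n2 - c

        log_or = np.log((a * d) / (b * c))
        var = 1 / a + 1 / b + 1 / c + 1 / d        # variance of each log odds ratio
        w = 1 / var                                 # inverse-variance weights

        pooled = np.sum(w * log_or) / np.sum(w)
        se = np.sqrt(1 / np.sum(w))
        ci = np.exp([pooled - 1.96 * se, pooled + 1.96 * se])
        Q = np.sum(w * (log_or - pooled) ** 2)      # Cochran's Q
        I2 = max(0.0, (Q - (len(trials) - 1)) / Q) * 100

        print(f"pooled OR = {np.exp(pooled):.2f}, 95% CI ({ci[0]:.2f}, {ci[1]:.2f}), I^2 = {I2:.0f}%")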

  15. Big data to smart data in Alzheimer's disease: Real-world examples of advanced modeling and simulation.

    PubMed

    Haas, Magali; Stephenson, Diane; Romero, Klaus; Gordon, Mark Forrest; Zach, Neta; Geerts, Hugo

    2016-09-01

    Many disease-modifying clinical development programs in Alzheimer's disease (AD) have failed to date, and development of new and advanced preclinical models that generate actionable knowledge is desperately needed. This review reports on computer-based modeling and simulation approach as a powerful tool in AD research. Statistical data-analysis techniques can identify associations between certain data and phenotypes, such as diagnosis or disease progression. Other approaches integrate domain expertise in a formalized mathematical way to understand how specific components of pathology integrate into complex brain networks. Private-public partnerships focused on data sharing, causal inference and pathway-based analysis, crowdsourcing, and mechanism-based quantitative systems modeling represent successful real-world modeling examples with substantial impact on CNS diseases. Similar to other disease indications, successful real-world examples of advanced simulation can generate actionable support of drug discovery and development in AD, illustrating the value that can be generated for different stakeholders. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  16. Precarious employment in Chile: psychometric properties of the Chilean version of Employment Precariousness Scale in private sector workers.

    PubMed

    Vives-Vergara, Alejandra; González-López, Francisca; Solar, Orielle; Bernales-Baksai, Pamela; González, María José; Benach, Joan

    2017-04-20

    The purpose of this study is to perform a psychometric analysis (acceptability, reliability and factor structure) of the Chilean version of the new Employment Precariousness Scale (EPRES). The data is drawn from a sample of 4,248 private salaried workers with a formal contract from the first Chilean Employment Conditions, Work, Health and Quality of Life (ENETS) survey, applied to a nationally representative sample of the Chilean workforce in 2010. Item and scale-level statistics were performed to assess scaling properties, acceptability and reliability. The six-dimensional factor structure was examined with confirmatory factor analysis. The scale exhibited high acceptability (roughly 80%) and reliability (Cronbach's alpha 0.83) and the factor structure was confirmed. One subscale (rights) demonstrated poorer metric properties without compromising the overall scale. The Chilean version of the Employment Precariousness Scale (EPRES-Ch) demonstrated good metric properties, pointing to its suitability for use in epidemiologic and public health research.
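
    For readers unfamiliar with the reliability figure quoted above, the sketch below computes Cronbach's alpha for a toy item-response matrix; the data are random placeholders, not the ENETS survey responses.

        # Cronbach's alpha for a respondents-by-items matrix of Likert-type scores.
        import numpy as np

        def cronbach_alpha(items):
            """items: 2-D array, rows = respondents, columns = scale items."""
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1 - item_vars / total_var)

        rng = np.random.default_rng(1)
        toy = rng.integers(1, 6, size=(200, 6)).astype(float)   # 200 workers, 6 items, scores 1-5
        print(f"alpha = {cronbach_alpha(toy):.2f}")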

  17. Breast milk donation and social support: reports of women donors.

    PubMed

    De Alencar, Lucienne Christine Estevez; Seidl, Eliane Maria Fleury

    2010-01-01

    The study aimed to characterize the behavior of human milk donation and to describe the informal social and formal institutional support, according to reports from women donors. It is an exploratory, cross-sectional, descriptive study using domicile interviews based on structured and semi-structured scripts. The participants were 36 women enrolled in two human milk banks of the public health system of the Federal District. Statistical analysis of quantitative data and categorical content analysis of qualitative data were performed. Categories of reasons that most influenced the frequency of expressing were: food, time availability, negative emotions and fluid intake. The manual expressing technique was reported as predominant. The use of breast shells was cited by almost a third of the donors. Most frequent suggestions for improving institutional support were more attention and support from the milk banks for the donor. The study may serve as a stimulus for the implementation of technical and political strategies to encourage this practice.

  18. Early-childhood housing mobility and subsequent PTSD in adolescence: a Moving to Opportunity reanalysis.

    PubMed

    Norris, David C; Wilson, Andrew

    2016-01-01

    In a 2014 report on adolescent mental health outcomes in the Moving to Opportunity for Fair Housing Demonstration (MTO), Kessler et al. reported that, at 10- to 15-year follow-up, boys from households randomized to an experimental housing voucher intervention experienced 12-month prevalence of post-traumatic stress disorder (PTSD) at several times the rate of boys from control households. We reanalyze this finding here, bringing to light a PTSD outcome imputation procedure used in the original analysis, but not described in the study report. By bootstrapping with repeated draws from the frequentist sampling distribution of the imputation model used by Kessler et al., and by varying two pseudorandom number generator seeds that fed their analysis, we account for several purely statistical components of the uncertainty inherent in their imputation procedure. We also discuss other sources of uncertainty in this procedure that were not accessible to a formal reanalysis.
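
    The general shape of such a reanalysis can be illustrated with the toy bootstrap below: resample the data, refit an outcome-imputation model, re-impute the missing outcomes stochastically, and recompute the effect measure. Everything in the sketch (data, model, prevalence-ratio summary) is a synthetic stand-in, not the MTO data or the imputation model of Kessler et al.

        # Toy bootstrap around a stochastic outcome-imputation step.
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(42)
        n = 1000
        treat = rng.integers(0, 2, size=n)                 # voucher vs control (synthetic)
        x = rng.standard_normal(n)                         # auxiliary predictor
        p = 1 / (1 + np.exp(-(-2.0 + 0.8 * treat + 0.5 * x)))
        ptsd = rng.binomial(1, p)                          # "true" outcome
        observed = rng.random(n) > 0.3                     # roughly 30% of outcomes missing

        ratios = []
        for _ in range(500):
            idx = rng.integers(0, n, size=n)               # bootstrap resample
            t, xx, y, obs = treat[idx], x[idx], ptsd[idx], observed[idx]
            model = LogisticRegression().fit(np.column_stack([t, xx])[obs], y[obs])
            y_imp = y.copy()
            y_imp[~obs] = rng.binomial(
                1, model.predict_proba(np.column_stack([t, xx])[~obs])[:, 1])
            ratios.append(y_imp[t == 1].mean() / y_imp[t == 0].mean())

        print("bootstrap 95% interval for the prevalence ratio:",
              np.percentile(ratios, [2.5, 97.5]).round(2))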

  19. Analyzing thresholds and efficiency with hierarchical Bayesian logistic regression.

    PubMed

    Houpt, Joseph W; Bittner, Jennifer L

    2018-07-01

    Ideal observer analysis is a fundamental tool used widely in vision science for analyzing the efficiency with which a cognitive or perceptual system uses available information. The performance of an ideal observer provides a formal measure of the amount of information in a given experiment. The ratio of human to ideal performance is then used to compute efficiency, a construct that can be directly compared across experimental conditions while controlling for the differences due to the stimuli and/or task specific demands. In previous research using ideal observer analysis, the effects of varying experimental conditions on efficiency have been tested using ANOVAs and pairwise comparisons. In this work, we present a model that combines Bayesian estimates of psychometric functions with hierarchical logistic regression for inference about both unadjusted human performance metrics and efficiencies. Our approach improves upon the existing methods by constraining the statistical analysis using a standard model connecting stimulus intensity to human observer accuracy and by accounting for variability in the estimates of human and ideal observer performance scores. This allows for both individual and group level inferences. Copyright © 2018 Elsevier Ltd. All rights reserved.
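
    For contrast with the hierarchical approach proposed in the paper, the sketch below shows the conventional two-step pipeline: fit a logistic psychometric function to each observer's accuracy data by maximum likelihood and take a simple ratio of ideal to human thresholds as the efficiency. The guess rate, the 75%-correct threshold convention, and all numbers are illustrative assumptions (published efficiency definitions often use squared contrast or energy ratios).

        # Conventional psychometric-function fit and threshold-ratio "efficiency".
        import numpy as np
        from scipy.optimize import minimize

        def psychometric(x, alpha, beta, gamma=0.5):
            """Guess rate gamma, threshold alpha, slope beta."""
            return gamma + (1 - gamma) / (1 + np.exp(-beta * (x - alpha)))

        def fit_threshold(intensity, n_correct, n_trials):
            def nll(params):
                p = np.clip(psychometric(intensity, *params), 1e-6, 1 - 1e-6)
                return -np.sum(n_correct * np.log(p) + (n_trials - n_correct) * np.log(1 - p))
            return minimize(nll, x0=[np.median(intensity), 1.0], method="Nelder-Mead").x[0]

        intensity = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
        human = fit_threshold(intensity, np.array([11, 13, 15, 18, 19, 20]), np.full(6, 20))
        ideal = fit_threshold(intensity, np.array([14, 17, 19, 20, 20, 20]), np.full(6, 20))
        print(f"efficiency (ideal/human threshold) = {ideal / human:.2f}")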

  20. On the Correct Analysis of the Foundations of Theoretical Physics

    NASA Astrophysics Data System (ADS)

    Kalanov, Temur Z.

    2007-04-01

    The problem of truth in science -- the most urgent problem of our time -- is discussed. The correct theoretical analysis of the foundations of theoretical physics is proposed. The principle of the unity of formal logic and rational dialectics is the methodological basis of the analysis. The main result is as follows: the generally accepted foundations of theoretical physics (i.e. Newtonian mechanics, Maxwell electrodynamics, thermodynamics, statistical physics and physical kinetics, the theory of relativity, quantum mechanics) contain a set of logical errors. These errors are explained by the existence of a global cause: the errors are a collateral and inevitable result of the inductive way of cognition of Nature, i.e. a result of the movement from the formation of separate concepts to the formation of a system of concepts. Consequently, theoretical physics has entered a profound crisis. This means that physics as a science of phenomena gives way to a science of essence (information). Acknowledgment: The books ``Surprises in Theoretical Physics'' (1979) and ``More Surprises in Theoretical Physics'' (1991) by Sir Rudolf Peierls stimulated my 25-year work.

  1. Dalitz plot analysis of the D+→K-π+π+ decay in the FOCUS experiment

    NASA Astrophysics Data System (ADS)

    Link, J. M.; Yager, P. M.; Anjos, J. C.; Bediaga, I.; Castromonte, C.; Machado, A. A.; Magnin, J.; Massafferri, A.; de Miranda, J. M.; Pepe, I. M.; Polycarpo, E.; Dos Reis, A. C.; Carrillo, S.; Casimiro, E.; Cuautle, E.; Sánchez-Hernández, A.; Uribe, C.; Vázquez, F.; Agostino, L.; Cinquini, L.; Cumalat, J. P.; Frisullo, V.; O'Reilly, B.; Segoni, I.; Stenson, K.; Butler, J. N.; Cheung, H. W. K.; Chiodini, G.; Gaines, I.; Garbincius, P. H.; Garren, L. A.; Gottschalk, E.; Kasper, P. H.; Kreymer, A. E.; Kutschke, R.; Wang, M.; Benussi, L.; Bianco, S.; Fabbri, F. L.; Zallo, A.; Reyes, M.; Cawlfield, C.; Kim, D. Y.; Rahimi, A.; Wiss, J.; Gardner, R.; Kryemadhi, A.; Chung, Y. S.; Kang, J. S.; Ko, B. R.; Kwak, J. W.; Lee, K. B.; Cho, K.; Park, H.; Alimonti, G.; Barberis, S.; Boschini, M.; Cerutti, A.; D'Angelo, P.; Dicorato, M.; Dini, P.; Edera, L.; Erba, S.; Inzani, P.; Leveraro, F.; Malvezzi, S.; Menasce, D.; Mezzadri, M.; Moroni, L.; Pedrini, D.; Pontoglio, C.; Prelz, F.; Rovere, M.; Sala, S.; Davenport, T. F.; Arena, V.; Boca, G.; Bonomi, G.; Gianini, G.; Liguori, G.; Lopes Pegna, D.; Merlo, M. M.; Pantea, D.; Ratti, S. P.; Riccardi, C.; Vitulo, P.; Göbel, C.; Otalora, J.; Hernandez, H.; Lopez, A. M.; Mendez, H.; Paris, A.; Quinones, J.; Ramirez, J. E.; Zhang, Y.; Wilson, J. R.; Handler, T.; Mitchell, R.; Engh, D.; Hosack, M.; Johns, W. E.; Luiggi, E.; Nehring, M.; Sheldon, P. D.; Vaandering, E. W.; Webster, M.; Sheaff, M.; Pennington, M. R.; Focus Collaboration

    2007-09-01

    Using data collected by the high-energy photoproduction experiment FOCUS at Fermilab we performed a Dalitz plot analysis of the Cabibbo favored decay D+ →K-π+π+. This study uses 53653 Dalitz-plot events with a signal fraction of ∼ 97%, and represents the highest statistics, most complete Dalitz plot analysis for this channel. Results are presented and discussed using two different formalisms. The first is a simple sum of Breit-Wigner functions with freely fitted masses and widths. It is the model traditionally adopted and serves as comparison with the already published analyses. The second uses a K-matrix approach for the dominant S-wave, in which the parameters are fixed by first fitting Kπ scattering data and continued to threshold by Chiral Perturbation Theory. We show that the Dalitz plot distribution for this decay is consistent with the assumption of two-body dominance of the final state interactions and the description of these interactions is in agreement with other data on the Kπ final state.
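
    The first (isobar) formalism mentioned above amounts to a coherent sum of Breit-Wigner terms; the fragment below evaluates |sum_k c_k BW_k(m^2)|^2 for two placeholder resonances. Masses, widths, and coefficients are invented, and the angular and form-factor terms of a full Dalitz-plot fit are omitted.

        # Coherent sum of Breit-Wigner amplitudes (simplified isobar ingredient).
        import numpy as np

        def breit_wigner(m2, m0, gamma0):
            """Relativistic Breit-Wigner propagator as a function of m^2 (GeV^2)."""
            return 1.0 / (m0**2 - m2 - 1j * m0 * gamma0)

        resonances = [                       # (mass GeV, width GeV, complex coefficient)
            (0.892, 0.050, 1.0 + 0.0j),      # K*(892)-like placeholder
            (1.430, 0.270, 0.6 * np.exp(1j * 1.2)),
        ]

        m2 = np.linspace(0.4, 3.0, 500)      # K-pi invariant mass squared
        amplitude = sum(c * breit_wigner(m2, m, g) for m, g, c in resonances)
        intensity = np.abs(amplitude) ** 2   # projected intensity along one Dalitz axis
        print(intensity[:5].round(3))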

  2. NASA software specification and evaluation system design, part 1

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The research to develop methods for reducing the effort expended in software development and verification is reported. The development of a formal software requirements methodology, a formal specifications language, a programming language, a language preprocessor, and code analysis tools is discussed.

  3. Is the Training of Imaging Informatics Personnel in New Zealand Adequate?

    PubMed

    Hughes, Kevin; Poletti, John L

    2016-12-01

    The purpose of this study of Imaging Informatics Professionals (IIPs) in New Zealand was to assess their experience, background, educational qualifications and needs for support and continuing education. The IIP role includes administration of DICOM modalities, picture archiving and communication systems (PACS), radiology information systems (RIS) and many additional software and hardware systems, including the interface to New Zealand's nationwide individual electronic medical records (EMR) system. Despite the complexity of current systems, training programmes for IIPs are almost non-existent in Australasia. This cross-sectional qualitative case study used triangulated data sources, via online questionnaire, interview and critical incident analysis. Demographic data was also obtained from the questionnaire. Participants included about one third of the IIPs in New Zealand. Quantitative results were summarised with descriptive statistics or frequency data. Qualitative data was assessed by iterative multi-staged thematic analysis. This study found that the IIP role is undertaken by personnel from diverse backgrounds. Most of the IIPs learned what they know from vendors and on the job. Many feel that their biggest issue is in not knowing what they do not know and therefore not having sufficient understanding of the imaging informatics field. Only one IIP had any formal certification in PACS administration. Most respondents indicated their desire for some form of additional training. The number of IIPs in New Zealand healthcare is very small, so neither a formal training programme nor regulatory body is viable or justified. However, IIPs believe there is a need for education, regulation and recognition that their role is a critical component in healthcare.

  4. Exploring High School Students' Beginning Reasoning about Significance Tests with Technology

    ERIC Educational Resources Information Center

    García, Víctor N.; Sánchez, Ernesto

    2017-01-01

    In the present study we analyze how students reason about or make inferences given a particular hypothesis testing problem (without having studied formal methods of statistical inference) when using Fathom. They use Fathom to create an empirical sampling distribution through computer simulation. It is found that most students' reasoning relies on…
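
    The simulation the students carry out in Fathom can be reproduced in a few lines: draw many samples under the null hypothesis, build the empirical sampling distribution of the proportion, and read off an empirical p-value. The sample size, null proportion, and observed proportion below are arbitrary.

        # Empirical sampling distribution of a proportion under H0 and a one-sided p-value.
        import numpy as np

        rng = np.random.default_rng(0)
        n, p_null, observed = 50, 0.5, 0.64

        sim = rng.binomial(n, p_null, size=10_000) / n    # simulated sample proportions
        p_value = np.mean(sim >= observed)                # one-sided empirical p-value
        print(f"empirical p-value = {p_value:.3f}")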

  5. An Introduction to Distributions Using Weighted Dice

    ERIC Educational Resources Information Center

    Holland, Bart K.

    2011-01-01

    Distributions are the basis for an enormous amount of theoretical and applied work in statistics. While there are formal definitions of distributions and many formulas to characterize them, it is important that students at first get a clear introduction to this basic concept. For many of them, neither words nor formulas can match the power of a…
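
    The weighted-dice idea translates directly into a short simulation: fix unequal face probabilities, roll many times, and compare the empirical distribution with the theoretical one. The weights below are arbitrary.

        # Empirical vs theoretical distribution of a weighted six-sided die.
        import numpy as np

        rng = np.random.default_rng(7)
        faces = np.arange(1, 7)
        weights = np.array([0.10, 0.10, 0.15, 0.15, 0.20, 0.30])   # must sum to 1

        rolls = rng.choice(faces, size=10_000, p=weights)
        empirical = np.bincount(rolls, minlength=7)[1:] / rolls.size
        for face, (w, e) in enumerate(zip(weights, empirical), start=1):
            print(f"face {face}: theoretical {w:.2f}, empirical {e:.3f}")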

  6. Uncertainty in eddy covariance measurements and its application to physiological models

    Treesearch

    D.Y. Hollinger; A.D. Richardson; A.D. Richardson

    2005-01-01

    Flux data are noisy, and this uncertainty is largely due to random measurement error. Knowledge of uncertainty is essential for the statistical evaluation of modeled and measured fluxes, for comparison of parameters derived by fitting models to measured fluxes and in formal data-assimilation efforts. We used the difference between simultaneous measurements from two...

  7. A Statistical Ontology-Based Approach to Ranking for Multiword Search

    ERIC Educational Resources Information Center

    Kim, Jinwoo

    2013-01-01

    Keyword search is a prominent data retrieval method for the Web, largely because the simple and efficient nature of keyword processing allows a large amount of information to be searched with fast response. However, keyword search approaches do not formally capture the clear meaning of a keyword query and fail to address the semantic relationships…

  8. Reliability Considerations for the Operation of Large Accelerator User Facilities

    DOE PAGES

    Willeke, F. J.

    2016-01-29

    The lecture provides an overview of considerations relevant for achieving highly reliable operation of accelerator based user facilities. The article starts with an overview of statistical reliability formalism which is followed by high reliability design considerations with examples. Finally, the article closes with operational aspects of high reliability such as preventive maintenance and spares inventory.
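
    The statistical reliability formalism referred to above typically starts from constant failure rates combined in series; a minimal sketch of that bookkeeping is given below with invented subsystem failure rates and repair time.

        # Series-system reliability and steady-state availability from constant failure rates.
        import numpy as np

        failure_rates = np.array([2.0, 0.5, 1.2, 0.8]) / 8760.0   # failures per hour (invented)
        mttr_hours = 4.0                                           # mean time to repair (invented)

        lam = failure_rates.sum()                                  # combined failure rate
        mtbf = 1.0 / lam                                           # mean time between failures
        availability = mtbf / (mtbf + mttr_hours)
        print(f"MTBF = {mtbf:.0f} h, availability = {availability:.3%}")
        print(f"P(no failure over one week) = {np.exp(-lam * 168):.3f}")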

  9. A Formal Derivation of the Gibbs Entropy for Classical Systems Following the Schrodinger Quantum Mechanical Approach

    ERIC Educational Resources Information Center

    Santillan, M.; Zeron, E. S.; Del Rio-Correa, J. L.

    2008-01-01

    In the traditional statistical mechanics textbooks, the entropy concept is first introduced for the microcanonical ensemble and then extended to the canonical and grand-canonical cases. However, in the authors' experience, this procedure makes it difficult for the student to see the bigger picture and, although quite ingenuous, the subtleness of…

  10. Open Educational Resources: A Faculty Author's Perspective

    ERIC Educational Resources Information Center

    Illowsky, Barbara

    2012-01-01

    As the coauthor (with Susan Dean) of a formerly for-profit and now open (i.e., free on the web) textbook, "Collaborative Statistics," this author has received many questions about open educational resources (OER), which can be summarized as follows: (1) What are OER?; (2) Why do you support, actively promote, and speak about OER?; (3) If a book is…

  11. 76 FR 30306 - New England Fishery Management Council; Public Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-25

    ...The New England Fishery Management Council (Council) is scheduling a public meeting of its Scientific and Statistical Committee on June 14-15, 2011 to consider actions affecting New England fisheries in the exclusive economic zone (EEZ). Recommendations from this group will be brought to the full Council for formal consideration and action, if appropriate.

  12. 76 FR 43266 - New England Fishery Management Council; Public Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-20

    ...The New England Fishery Management Council (Council) is scheduling a public meeting of its Scientific and Statistical Committee, on August 9-10, 2011, to consider actions affecting New England fisheries in the exclusive economic zone (EEZ). Recommendations from this group will be brought to the full Council for formal consideration and action, if appropriate.

  13. Standing by Their Principles: Two Librarians Who Faced Challenges

    ERIC Educational Resources Information Center

    Adams, Helen; Leu, DaNae; Venuto, Dee Ann

    2015-01-01

    What do school librarians fear most? Hands down, their biggest fear is a formal challenge to a resource in the school library. There are no accurate statistics about the number of challenges to school library resources. The staff of ALA's Office for Intellectual Freedom estimates that only about 20 percent are reported to ALA annually. For the…

  14. Exploring the Implementation, Effectiveness and Costs of the Reading Partners Program

    ERIC Educational Resources Information Center

    Jacob, Robin; Elson, Dean; Bowden, Brooks; Armstrong, Catherine

    2015-01-01

    Reading skills are the key building blocks of a child's formal education. Yet, the national statistics on literacy attainment are profoundly distressing: two out of three American fourth graders are reading below grade level and almost one third of children nationwide lack even basic reading skills. This study reports on an evaluation of the…

  15. 76 FR 66875 - Informal Entry Limit and Removal of a Formal Entry Requirement

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-28

    ... to properly assess duties on the merchandise and collect accurate statistics with respect to the.... In Sec. 10.1: a. Introductory paragraph (a) is amended by removing the word ``shall'' and adding in... removing the word ``shall'' and adding in its place the word ``must''; m. Introductory paragraph (h)(4) is...

  16. NASA Langley's Formal Methods Research in Support of the Next Generation Air Transportation System

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Munoz, Cesar A.

    2008-01-01

    This talk will provide a brief introduction to the formal methods developed at NASA Langley and the National Institute for Aerospace (NIA) for air traffic management applications. NASA Langley's formal methods research supports the Interagency Joint Planning and Development Office (JPDO) effort to define and develop the 2025 Next Generation Air Transportation System (NGATS). The JPDO was created by the passage of the Vision 100 Century of Aviation Reauthorization Act in December 2003. The NGATS vision calls for a major transformation of the nation's air transportation system that will enable growth to 3 times the traffic of the current system. The transformation will require an unprecedented level of safety-critical automation used in complex procedural operations based on 4-dimensional (4D) trajectories that enable dynamic reconfiguration of airspace scalable to geographic and temporal demand. The goal of our formal methods research is to provide verification methods that can be used to ensure the safety of the NGATS system. Our work has focused on the safety assessment of concepts of operation and fundamental algorithms for conflict detection and resolution (CD&R) and self-spacing in the terminal area. Formal analysis of a concept of operations is a novel area of application of formal methods. Here one must establish that a system concept involving aircraft, pilots, and ground resources is safe. The formal analysis of algorithms is a more traditional endeavor. However, the formal analysis of ATM algorithms involves reasoning about the interaction of algorithmic logic and aircraft trajectories defined over an airspace. These trajectories are described using 2D and 3D vectors and are often constrained by trigonometric relations. Thus, in many cases it has been necessary to unload the full power of an advanced theorem prover. The verification challenge is to establish that the safety-critical algorithms produce valid solutions that are guaranteed to maintain separation under all possible scenarios. Current research has assumed perfect knowledge of the location of other aircraft in the vicinity so absolute guarantees are possible, but increasingly we are relaxing the assumptions to allow incomplete, inaccurate, and/or faulty information from communication sources.
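
    To give a flavor of the geometric reasoning involved (and only that; this is not the formally verified CD&R algorithm), the toy probe below assumes constant 2D velocities, computes the time and distance of closest approach for an aircraft pair, and checks the result against a required separation. Units and numbers are illustrative.

        # Toy pairwise conflict probe: closest approach under constant 2D velocities.
        import numpy as np

        def closest_approach(p1, v1, p2, v2):
            dp = np.asarray(p2, float) - np.asarray(p1, float)   # relative position (nmi)
            dv = np.asarray(v2, float) - np.asarray(v1, float)   # relative velocity (knots)
            denom = dv @ dv
            t = 0.0 if denom == 0 else max(0.0, -(dp @ dv) / denom)  # no looking backwards
            return t, float(np.linalg.norm(dp + t * dv))

        t_cpa, d_min = closest_approach(p1=(0, 0), v1=(480, 0),
                                        p2=(40, 4), v2=(-420, 0))
        print(f"conflict: {d_min < 5.0}, t_cpa = {t_cpa:.3f} h, min separation = {d_min:.1f} nmi")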

  17. "Do You Want Me to Translate This in English or in a Better Mandinka Language?": Unequal Literacy Regimes and Grassroots Spelling Practices in Peri-Urban Gambia

    ERIC Educational Resources Information Center

    Juffermans, Kasper

    2011-01-01

    This paper presents a comparative ethnographic analysis of two versions of a grassroots text in Mandinka language, one written by a non-formally educated man, the other a respelling by a formally educated urbanite. The analysis points at a crucial difference in spelling practices and inequality in literacy regimes, i.e., between established…

  18. Participation in Non-Formal Learning in EU-15 and EU-8 Countries: Demand and Supply Side Factors

    ERIC Educational Resources Information Center

    Roosmaa, Eve-Liis; Saar, Ellu

    2012-01-01

    The main purpose of this paper is to provide an in-depth analysis of participation in non-formal learning in different European Union member states. The paper also seeks to extend analysis of the training gap by pursuing the distinction between the supply and the demand for skills. We use aggregate data from the Adult Education Survey (Eurostat)…

  19. Quantum-Like Models for Decision Making in Psychology and Cognitive Science

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei.

    2009-02-01

    We show that (in contrast to rather common opinion) the domain of applications of the mathematical formalism of quantum mechanics is not restricted to physics. This formalism can be applied to the description of various kinds of quantum-like (QL) information processing. In particular, the calculus of quantum (and more general QL) probabilities can be used to explain some paradoxical statistical data that were collected in psychology and cognitive science. The main lesson of our study is that one should sharply distinguish the mathematical apparatus of QM from QM as a physical theory. The domain of application of the mathematical apparatus is essentially wider than quantum physics. Keywords: quantum-like representation algorithm, formula of total probability, interference of probabilities, psychology, cognition, decision making.
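
    For orientation, the quantum-like formula of total probability that this line of work exploits can be written, for a dichotomous variable B and in notation chosen here rather than quoted from the paper, as

        P(A = a) = \sum_{b} P(B = b)\, P(A = a \mid B = b)
                   + 2 \cos\theta_a \sqrt{\prod_{b} P(B = b)\, P(A = a \mid B = b)},

    where the interference angle \theta_a vanishes in the classical (Kolmogorovian) case; a nonzero \cos\theta_a is what accommodates the "paradoxical" statistical data mentioned above.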

  20. 41 CFR 109-1.5204 - Review and approval of a designated contractor's personal property management system.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... overhaul; and (2) An analysis of the cost to implement the overhaul within a year versus a proposed... be based on a formal comprehensive appraisal or a series of formal appraisals of the functional...

  1. International Workshop on Principles of Program Analysis

    DTIC Science & Technology

    1999-01-01

    with respect to a semantics of the programming language. It is a sad fact that new program analyses often contain subtle bugs, and a formal ... It defines a higher-order function f with formal parameter x and body x 1; then it defines two functions g and h that are given as actual parameters...begin by presenting a formal semantics for WHILE. The material of this section may be skimmed through on a first reading; however, it is frequently

  2. Formal Foundations for Hierarchical Safety Cases

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Pai, Ganesh; Whiteside, Iain

    2015-01-01

    Safety cases are increasingly being required in many safety-critical domains to assure, using structured argumentation and evidence, that a system is acceptably safe. However, comprehensive system-wide safety arguments present appreciable challenges to develop, understand, evaluate, and manage, partly due to the volume of information that they aggregate, such as the results of hazard analysis, requirements analysis, testing, formal verification, and other engineering activities. Previously, we have proposed hierarchical safety cases, hicases, to aid the comprehension of safety case argument structures. In this paper, we build on a formal notion of safety case to formalise the use of hierarchy as a structuring technique, and show that hicases satisfy several desirable properties. Our aim is to provide a formal, theoretical foundation for safety cases. In particular, we believe that tools for high assurance systems should be granted similar assurance to the systems to which they are applied. To this end, we formally specify and prove the correctness of key operations for constructing and managing hicases, which gives the specification for implementing hicases in AdvoCATE, our toolset for safety case automation. We motivate and explain the theory with the help of a simple running example, extracted from a real safety case and developed using AdvoCATE.

  3. [Discussion between informal and formal caregivers of community-dwelling older adults].

    PubMed

    Jacobs, M T; Broese van Groenou, M I; Deeg, D J H

    2014-04-01

    Current Dutch policy on long-term care is aimed at a stronger connection between formal home care and informal care. We examined if formal and informal caregivers of community-dwelling older adults discuss the care and whether this is related to characteristics of the older adult, the care network and the individual caregivers. Data are derived from 63 community-dwelling older adults, including their health, their perceived control of the care and their care network. In addition, 79 informal and 90 formal caregivers are interviewed on their motives and vision on caregiving. The 112 dyads between those formal and informal caregivers are the units of analysis in the current study. Bivariate analyses reveal that informal caregivers are more likely to discuss the care with formal caregivers when they are residing with the older adult, when they provide a lot of care and/or when they are strongly motivated to keep the older adult at home. This is particularly the case when the care demands are high. Characteristics of the formal caregivers were not important. In conclusion, discussion of care between non-resident informal caregivers and formal caregivers is not self-evident and requires more effort to be established.

  4. Formal faculty observation and assessment of bedside skills for 3rd-year neurology clerks

    PubMed Central

    Mooney, Christopher; Wexler, Erika; Mink, Jonathan; Post, Jennifer; Jozefowicz, Ralph F.

    2016-01-01

    Objective: To evaluate the feasibility and utility of instituting a formalized bedside skills evaluation (BSE) for 3rd-year medical students on the neurology clerkship. Methods: A neurologic BSE was developed for 3rd-year neurology clerks at the University of Rochester for the 2012–2014 academic years. Faculty directly observed 189 students completing a full history and neurologic examination on real inpatients. Mock grades were calculated utilizing the BSE in the final grade, and number of students with a grade difference was determined when compared to true grade. Correlation was explored between the BSE and clinical scores, National Board of Medical Examiners (NBME) scores, case complexity, and true final grades. A survey was administered to students to assess their clinical skills exposure and the usefulness of the BSE. Results: Faculty completed and submitted a BSE form for 88.3% of students. There was a mock final grade change for 13.2% of students. Correlation coefficients between BSE score and clinical score/NBME score were 0.36 and 0.35, respectively. A statistically significant effect of BSE was found on final clerkship grade (F2,186 = 31.9, p < 0.0001). There was no statistical difference between BSE score and differing case complexities. Conclusions: Incorporating a formal faculty-observed BSE into the 3rd year neurology clerkship was feasible. Low correlation between BSE score and other evaluations indicated a unique measurement to contribute to student grade. Using real patients with differing case complexity did not alter the grade. PMID:27770072

  5. Insight into structural phase transitions from the decoupled anharmonic mode approximation

    NASA Astrophysics Data System (ADS)

    Adams, Donat J.; Passerone, Daniele

    2016-08-01

    We develop a formalism (decoupled anharmonic mode approximation, DAMA) that allows calculation of the vibrational free energy using density functional theory even for materials that exhibit negative curvature of the potential energy surface with respect to atomic displacements. We investigate vibrational modes beyond the harmonic approximation and approximate the potential energy surface with the superposition of the accurate potential along each normal mode. We show that the free energy can stabilize crystal structures at finite temperatures that appear dynamically unstable at T = 0. The DAMA formalism is computationally fast because it avoids statistical sampling through molecular dynamics calculations, and is in principle completely ab initio. It is free of statistical uncertainties and independent of model parameters, but can give insight into the mechanism of a structural phase transition. We apply the formalism to the perovskite cryolite, and investigate the temperature-driven phase transition from the P21/n to the Immm space group. We calculate a phase transition temperature between 710 and 950 K, in fair agreement with the experimental value of 885 K. This can be related to the underestimation of the interaction of the vibrational states. We also calculate the main axes of the thermal ellipsoid and can explain the experimentally observed increase of its volume for the fluorine by 200-300% throughout the phase transition. Our calculations suggest the appearance of tunneling states in the high temperature phase. The convergence of the vibrational DOS and of the critical temperature with respect to reciprocal-space sampling is investigated using the polarizable-ion model.
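
    A plausible reading of the mode-decoupled free energy sketched above, written here as an illustration rather than the paper's exact expression, is

        F(T) \approx E_0 + \sum_i F_i(T), \qquad
        F_i(T) = -k_B T \, \ln \sum_n \exp\!\left(-\frac{\varepsilon_{i,n}}{k_B T}\right),

    where E_0 is the static lattice energy, the sum over i runs over the decoupled normal modes, and \varepsilon_{i,n} are eigenvalues of a one-dimensional problem along mode i with the full anharmonic DFT potential; replacing that potential by a parabola recovers the usual harmonic phonon free energy.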

  6. Insight into structural phase transitions from the decoupled anharmonic mode approximation.

    PubMed

    Adams, Donat J; Passerone, Daniele

    2016-08-03

    We develop a formalism (decoupled anharmonic mode approximation, DAMA) that allows calculation of the vibrational free energy using density functional theory even for materials that exhibit negative curvature of the potential energy surface with respect to atomic displacements. We investigate vibrational modes beyond the harmonic approximation and approximate the potential energy surface with the superposition of the accurate potential along each normal mode. We show that the free energy can stabilize crystal structures at finite temperatures that appear dynamically unstable at T = 0. The DAMA formalism is computationally fast because it avoids statistical sampling through molecular dynamics calculations, and is in principle completely ab initio. It is free of statistical uncertainties and independent of model parameters, but can give insight into the mechanism of a structural phase transition. We apply the formalism to the perovskite cryolite, and investigate the temperature-driven phase transition from the P21/n to the Immm space group. We calculate a phase transition temperature between 710 and 950 K, in fair agreement with the experimental value of 885 K. This can be related to the underestimation of the interaction of the vibrational states. We also calculate the main axes of the thermal ellipsoid and can explain the experimentally observed increase of its volume for the fluorine by 200-300% throughout the phase transition. Our calculations suggest the appearance of tunneling states in the high temperature phase. The convergence of the vibrational DOS and of the critical temperature with respect to reciprocal-space sampling is investigated using the polarizable-ion model.

  7. Formal faculty observation and assessment of bedside skills for 3rd-year neurology clerks.

    PubMed

    Thompson Stone, Robert; Mooney, Christopher; Wexler, Erika; Mink, Jonathan; Post, Jennifer; Jozefowicz, Ralph F

    2016-11-22

    To evaluate the feasibility and utility of instituting a formalized bedside skills evaluation (BSE) for 3rd-year medical students on the neurology clerkship. A neurologic BSE was developed for 3rd-year neurology clerks at the University of Rochester for the 2012-2014 academic years. Faculty directly observed 189 students completing a full history and neurologic examination on real inpatients. Mock grades were calculated utilizing the BSE in the final grade, and number of students with a grade difference was determined when compared to true grade. Correlation was explored between the BSE and clinical scores, National Board of Medical Examiners (NBME) scores, case complexity, and true final grades. A survey was administered to students to assess their clinical skills exposure and the usefulness of the BSE. Faculty completed and submitted a BSE form for 88.3% of students. There was a mock final grade change for 13.2% of students. Correlation coefficients between BSE score and clinical score/NBME score were 0.36 and 0.35, respectively. A statistically significant effect of BSE was found on final clerkship grade (F2,186 = 31.9, p < 0.0001). There was no statistical difference between BSE score and differing case complexities. Incorporating a formal faculty-observed BSE into the 3rd-year neurology clerkship was feasible. Low correlation between BSE score and other evaluations indicated a unique measurement to contribute to student grade. Using real patients with differing case complexity did not alter the grade. © 2016 American Academy of Neurology.

  8. Perceptions of registered nurses in four state health institutions on continuing formal education.

    PubMed

    Richards, L; Potgieter, E

    2010-06-01

    This study investigated the perceptions of registered nurses in four selected state health institutions with regard to continuing formal education. The relevance of continuing formal education is being emphasised globally by the increasing quest for quality assurance and quality management systems within an ethos of continuous improvement. According to Tlholoe (2006:5), it is important to be committed to continual learning, as people's knowledge becomes less relevant because skills gained early in a career are insufficient to avoid costly mistakes made through ignorance. Continuing formal education in nursing is a key element in the maintenance of quality in health care delivery. The study described registered nurses' views on continuing formal education and their perceived barriers to it. A quantitative descriptive survey design was chosen, using a questionnaire for data collection. The sample consisted of 40 registered nurses working at four state health institutions in the Western Cape Province, South Africa. Convenience sampling was selected to include registered nurses who were on duty on the days during which the researcher visited the health institutions to distribute the questionnaires. The questionnaire contained mainly closed-ended and a few open-ended questions. Content validity of the instrument was ensured by a thorough literature review before item construction and by a pretest. Reliability was established by the pretest and by providing the same information to all respondents before completion of the questionnaires. The ethical considerations of informed consent, anonymity and confidentiality were adhered to and consent to conduct the study was obtained from relevant authorities. Descriptive statistics, based on calculations using the Microsoft (MS) Excel (for Windows 2000) programme, were used to summarise and describe the research results. The research results indicated that most registered nurses perceive continuing formal education as beneficial to their personal and professional growth and that it could lead towards improving the quality of patient/client care, but barriers exist that prevent or deter them from undertaking continuing formal education programmes. The main structural barriers included lack of funding and lack of coherent staff development planning; physical barriers included job and family responsibilities.

  9. Sectoral networks and macroeconomic tail risks in an emerging economy.

    PubMed

    Romero, Pedro P; López, Ricardo; Jiménez, Carlos

    2018-01-01

    This paper aims to explain the macroeconomic volatility due to microeconomic shocks to one or several sectors, recognizing the non-symmetrical relation in the interaction among the Ecuadorian economic sectors. To grasp the economic structure of this emerging economy, a statistical analysis of network data is applied to the respective input-output matrix of Ecuador from 1975 until 2012. We find periods wherein the production of domestic inputs is concentrated in a few suppliers; for example, in 2010, the concentration significantly affects sectors and their downstream providers, thus influencing aggregate volatility. Compared to the US productive structure, this emerging economy presents fewer sectors and degree distributions with less extreme fat-tail behavior. In this simpler economy, we continue to find a link between microeconomic shocks and aggregate volatility. Two new theoretical propositions are introduced to formalize our results.
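
    Two of the ingredients such an analysis typically works with can be sketched in a few lines of numpy: the weighted out-degrees of the input-output network (how important each sector is as a supplier) and a Leontief-style influence vector that, in the granular/network-origins literature, links sector-level shocks to aggregate volatility. The 4x4 matrix, the shock size, and the sign conventions below are invented assumptions, not Ecuador's input-output tables.

        # Weighted out-degrees and an influence-vector volatility proxy for a toy IO matrix.
        import numpy as np

        # A[i, j] = share of sector j's inputs purchased from sector i (invented)
        A = np.array([
            [0.10, 0.30, 0.05, 0.20],
            [0.05, 0.10, 0.25, 0.10],
            [0.20, 0.05, 0.10, 0.05],
            [0.05, 0.10, 0.10, 0.10],
        ])
        n = A.shape[0]

        out_degree = A.sum(axis=1)                                    # supplier importance
        influence = np.linalg.solve(np.eye(n) - A.T, np.ones(n)) / n  # one common convention
        sigma = 0.05                                                  # std of sector shocks (invented)
        agg_vol = sigma * np.linalg.norm(influence)                   # first-order volatility proxy

        print("out-degrees:", out_degree.round(2))
        print("influence vector:", influence.round(3), " aggregate volatility proxy:", round(agg_vol, 4))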

  10. [EVALUATION OF THE EFFECTIVENESS OF ADDITIONAL PROFESSIONAL EDUCATION ON THE BASIS OF HEALTH CARE FACILITY].

    PubMed

    Bohomaz, V M; Rymarenko, P V

    2014-01-01

    In this study we tested methods of on-site (facility-based) learning for health care workers as part of a modern model of quality management of medical services. A statistical and qualitative analysis was performed of the effectiveness of additional training in emergency medical care at the health facility, using an adapted curriculum and special mannequins. Under the guidance of a certified instructor, a focus group of 53 doctors and junior medical specialists studied for 22 hours. According to a survey of the trained employees, their self-assessed level of knowledge and skills increased significantly. The proportion of correct answers in formalized testing also increased significantly for both categories of workers. Using an andragogical learning model, mannequin simulators and training in small groups at the workplace created the most favorable conditions for the effective development of individual and group practical skills in emergency medicine.

  11. Parametric scaling from species to growth-form diversity: an interesting analogy with multifractal functions.

    PubMed

    Ricotta, Carlo; Pacini, Alessandra; Avena, Giancarlo

    2002-01-01

    We propose a measure of divergence from species to life-form diversity aimed at summarizing the ecological similarity among different plant communities without losing information on traditional taxonomic diversity. First, species and life-form relative abundances within a given plant community are determined. Next, using Rényi's generalized entropy, the diversity profiles of the analyzed community are computed both from species and life-form relative abundances. Finally, the speed of decrease from species to life-form diversity is obtained by combining the outcome of both profiles. Interestingly, the proposed measure shows some formal analogies with multifractal functions developed in statistical physics for the analysis of spatial patterns. As an application for demonstration, a small data set from a plant community sampled in the archaeological site of Paestum (southern Italy) is used.
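
    A compact way to see the construction is to compute Rényi diversity profiles for the same community at the species level and at the growth-form level and then compare them; the sketch below does this for invented abundances, with a crude relative-drop summary standing in for the paper's "speed of decrease" measure.

        # Renyi diversity profiles at two taxonomic resolutions of one (invented) community.
        import numpy as np

        def renyi_profile(p, alphas):
            """Renyi entropy H_alpha = ln(sum p_i^alpha) / (1 - alpha); alpha = 1 is Shannon."""
            p = np.asarray(p, dtype=float)
            p = p[p > 0] / p.sum()
            values = []
            for a in alphas:
                if np.isclose(a, 1.0):
                    values.append(-np.sum(p * np.log(p)))        # Shannon limit
                else:
                    values.append(np.log(np.sum(p ** a)) / (1 - a))
            return np.array(values)

        alphas = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
        species_ab  = [0.30, 0.20, 0.15, 0.10, 0.10, 0.05, 0.05, 0.05]   # 8 species
        lifeform_ab = [0.55, 0.30, 0.15]                                  # 3 growth forms

        h_species = renyi_profile(species_ab, alphas)
        h_forms = renyi_profile(lifeform_ab, alphas)
        print("relative drop per alpha:", np.round((h_species - h_forms) / h_species, 3))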

  12. Sectoral networks and macroeconomic tail risks in an emerging economy

    PubMed Central

    López, Ricardo; Jiménez, Carlos

    2018-01-01

    This paper aims to explain the macroeconomic volatility due to microeconomic shocks to one or several sectors, recognizing the non-symmetrical relation in the interaction among the Ecuadorian economic sectors. To grasp the economic structure of this emerging economy, a statistical analysis of network data is applied to the respective input-output matrix of Ecuador from 1975 until 2012. We find periods wherein the production of domestic inputs is concentrated in a few suppliers; for example, in 2010, the concentration significantly affects sectors and their downstream providers, thus influencing aggregate volatility. Compared to the US productive structure, this emerging economy presents fewer sectors and degree distributions with less extreme fat-tail behavior. In this simpler economy, we continue to find a link between microeconomic shocks and aggregate volatility. Two new theoretical propositions are introduced to formalize our results. PMID:29293567

  13. Come bien, camina y no se preocupe--eat right, walk, and do not worry: selective biculturalism during pregnancy in a Mexican American community.

    PubMed

    Laganá, Kathleen

    2003-04-01

    Mexican American childbearing women appear to offer a healthy model for pregnancy. However, statistics suggest that they may be at increased risk for poor birth outcome as they acculturate to a U.S. lifestyle. An ethnographic study in Watsonville, California, examined the influence of acculturation on pregnancy beliefs and practices of 29 Mexican American childbearing women. Data from formal semi-structured interviews were submitted to content analysis. During pregnancy, women balanced well-documented, traditional Mexican cultural beliefs with the individualistic beliefs common to Anglo-Americans. Selective biculturalism emerged as a protective approach to stress reduction and health promotion. Stress reduction interventions as part of routine prenatal care have potential benefit for all pregnant women. Future research on cultural barriers to family-based social support during pregnancy is needed.

  14. Lead exposure and eclampsia in Britain, 1883-1934.

    PubMed

    Troesken, Werner

    2006-07-01

    Eclampsia refers to a coma or seizure activity in a pregnant woman with no prior history of such activity. This paper presents a mix of historical and epidemiological evidence consistent with the hypothesis that chronic lead exposure is a predisposing factor for eclampsia. The historical evidence is based on research conducted by British physicians around 1900 showing that the geographic variation in eclampsia across England and Wales was correlated with lead levels in local drinking water supplies. A formal epidemiological analysis based on a data set of English and Welsh counties observed in 1883 corroborates the evidence presented by historical observers. In particular, the statistical results show that the death rate from eclampsia in counties with high-water-lead levels exceeded the death rate in counties with low-water-lead levels by a factor of 2.34 (95% CI: 1.54-3.14).

  15. Jarzynski equality in the context of maximum path entropy

    NASA Astrophysics Data System (ADS)

    González, Diego; Davis, Sergio

    2017-06-01

    In the global framework of finding an axiomatic derivation of nonequilibrium Statistical Mechanics from fundamental principles, such as the maximum path entropy - also known as Maximum Caliber principle -, this work proposes an alternative derivation of the well-known Jarzynski equality, a nonequilibrium identity of great importance today due to its applications to irreversible processes: biological systems (protein folding), mechanical systems, among others. This equality relates the free energy differences between two equilibrium thermodynamic states with the work performed when going between those states, through an average over a path ensemble. In this work the analysis of Jarzynski's equality will be performed using the formalism of inference over path space. This derivation highlights the wide generality of Jarzynski's original result, which could even be used in non-thermodynamical settings such as social systems, financial and ecological systems.
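
    For reference, the identity in question is the standard Jarzynski equality

        \left\langle e^{-\beta W} \right\rangle = e^{-\beta \Delta F}, \qquad \beta = \frac{1}{k_B T},

    where W is the work performed along an individual nonequilibrium realization, \Delta F is the equilibrium free-energy difference between the initial and final states, and the average is taken over the path ensemble; Jensen's inequality then gives the second-law statement \langle W \rangle \ge \Delta F.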

  16. Automatic Generation of Algorithms for the Statistical Analysis of Planetary Nebulae Images

    NASA Technical Reports Server (NTRS)

    Fischer, Bernd

    2004-01-01

    Analyzing data sets collected in experiments or by observations is a core scientific activity. Typically, experimental and observational data are fraught with uncertainty, and the analysis is based on a statistical model of the conjectured underlying processes. The large data volumes collected by modern instruments make computer support indispensable for this. Consequently, scientists spend significant amounts of their time on the development and refinement of the data analysis programs. AutoBayes [GF+02, FS03] is a fully automatic synthesis system for generating statistical data analysis programs. Externally, it looks like a compiler: it takes an abstract problem specification and translates it into executable code. Its input is a concise description of a data analysis problem in the form of a statistical model as shown in Figure 1; its output is optimized and fully documented C/C++ code which can be linked dynamically into the Matlab and Octave environments. Internally, however, it is quite different: AutoBayes derives a customized algorithm implementing the given model using a schema-based process, and then further refines and optimizes the algorithm into code. A schema is a parameterized code template with associated semantic constraints which define and restrict the template's applicability. The schema parameters are instantiated in a problem-specific way during synthesis as AutoBayes checks the constraints against the original model or, recursively, against emerging sub-problems. The AutoBayes schema library contains problem decomposition operators (which are justified by theorems in a formal logic in the domain of Bayesian networks) as well as machine learning algorithms (e.g., EM, k-Means) and numeric optimization methods (e.g., Nelder-Mead simplex, conjugate gradient). AutoBayes augments this schema-based approach by symbolic computation to derive closed-form solutions whenever possible. This is a major advantage over other statistical data analysis systems which use numerical approximations even in cases where closed-form solutions exist. AutoBayes is implemented in Prolog and comprises approximately 75,000 lines of code. In this paper, we take one typical scientific data analysis problem, analyzing planetary nebulae images taken by the Hubble Space Telescope, and show how AutoBayes can be used to automate the implementation of the necessary analysis programs. We initially follow the analysis described by Knuth and Hajian [KH02] and use AutoBayes to derive code for the published models. We show the details of the code derivation process, including the symbolic computations and automatic integration of library procedures, and compare the results of the automatically generated and manually implemented code. We then go beyond the original analysis and use AutoBayes to derive code for a simple image segmentation procedure based on a mixture model which can be used to automate a manual preprocessing step. Finally, we combine the original approach with the simple segmentation, which yields a more detailed analysis. This also demonstrates that AutoBayes makes it easy to combine different aspects of data analysis.

  17. Psychometric considerations in the measurement of event-related brain potentials: Guidelines for measurement and reporting.

    PubMed

    Clayson, Peter E; Miller, Gregory A

    2017-01-01

    Failing to consider psychometric issues related to reliability and validity, differential deficits, and statistical power potentially undermines the conclusions of a study. In research using event-related brain potentials (ERPs), numerous contextual factors (population sampled, task, data recording, analysis pipeline, etc.) can impact the reliability of ERP scores. The present review considers the contextual factors that influence ERP score reliability and the downstream effects that reliability has on statistical analyses. Given the context-dependent nature of ERPs, it is recommended that ERP score reliability be formally assessed on a study-by-study basis. Recommended guidelines for ERP studies include 1) reporting the threshold of acceptable reliability and reliability estimates for observed scores, 2) specifying the approach used to estimate reliability, and 3) justifying how trial-count minima were chosen. A reliability threshold for internal consistency of at least 0.70 is recommended, and a threshold of 0.80 is preferred. The review also advocates the use of generalizability theory for estimating score dependability (the generalizability theory analog to reliability) as an improvement on classical test theory reliability estimates, suggesting that the latter is less well suited to ERP research. To facilitate the calculation and reporting of dependability estimates, an open-source Matlab program, the ERP Reliability Analysis Toolbox, is presented. Copyright © 2016 Elsevier B.V. All rights reserved.
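
    As a schematic of the generalizability-theory calculation advocated above, the sketch below estimates person and residual variance components for a single-facet (persons x trials) design from one-way ANOVA mean squares and forms a dependability-like coefficient for a trial-averaged score. The data are random, the design is deliberately over-simplified (the trial facet is folded into the residual), and this is not a description of the ERP Reliability Analysis Toolbox.

        # One-facet (persons x trials) variance components and a dependability-style coefficient.
        import numpy as np

        rng = np.random.default_rng(3)
        n_person, n_trial = 30, 40
        true_amp = rng.normal(5.0, 2.0, size=(n_person, 1))          # stable person effect
        scores = true_amp + rng.normal(0.0, 4.0, size=(n_person, n_trial))

        grand = scores.mean()
        ms_person = n_trial * np.sum((scores.mean(axis=1) - grand) ** 2) / (n_person - 1)
        ms_resid = np.sum((scores - scores.mean(axis=1, keepdims=True)) ** 2) / (n_person * (n_trial - 1))

        var_person = (ms_person - ms_resid) / n_trial
        dependability = var_person / (var_person + ms_resid / n_trial)
        print(f"estimated dependability of a {n_trial}-trial average = {dependability:.2f}")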

  18. Māori identity signatures: A latent profile analysis of the types of Māori identity.

    PubMed

    Greaves, Lara M; Houkamau, Carla; Sibley, Chris G

    2015-10-01

    Māori are the indigenous peoples of New Zealand. However, the term 'Māori' can refer to a wide range of people of varying ethnic compositions and cultural identity. We present a statistical model identifying 6 distinct types, or 'Māori Identity Signatures,' and estimate their proportion in the Māori population. The model is tested using a Latent Profile Analysis of a national probability sample of 686 Māori drawn from the New Zealand Attitudes and Values Study. We identify 6 distinct signatures: Traditional Essentialists (22.6%), Traditional Inclusives (16%), High Moderates (31.7%), Low Moderates (18.7%), Spiritually Orientated (4.1%), and Disassociated (6.9%). These distinct Identity Signatures predicted variation in deprivation, age, mixed-ethnic affiliation, and religion. This research presents the first formal statistical model assessing how people's identity as Māori is psychologically structured, documents the relative proportion of these different patterns of structures, and shows that these patterns reliably predict differences in core demographics. We identify a range of patterns of Māori identity far more diverse than has been previously proposed based on qualitative data, and also show that the majority of Māori fit a moderate or traditional identity pattern. The application of our model for studying Māori health and identity development is discussed. (c) 2015 APA, all rights reserved.

  19. Relational event models for longitudinal network data with an application to interhospital patient transfers.

    PubMed

    Vu, Duy; Lomi, Alessandro; Mascia, Daniele; Pallotti, Francesca

    2017-06-30

    The main objective of this paper is to introduce and illustrate relational event models, a new class of statistical models for the analysis of time-stamped data with complex temporal and relational dependencies. We outline the main differences between recently proposed relational event models and more conventional network models based on the graph-theoretic formalism typically adopted in empirical studies of social networks. Our main contribution involves the definition and implementation of a marked point process extension of currently available models. According to this approach, the sequence of events of interest is decomposed into two components: (a) event time and (b) event destination. This decomposition transforms the problem of selection of event destination in relational event models into a conditional multinomial logistic regression problem. The main advantages of this formulation are the possibility of controlling for the effect of event-specific data and a significant reduction in the estimation time of currently available relational event models. We demonstrate the empirical value of the model in an analysis of interhospital patient transfers within a regional community of health care organizations. We conclude with a discussion of how the models we presented help to overcome some of the limitations of statistical models for networks that are currently available. Copyright © 2017 John Wiley & Sons, Ltd.
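
    The destination-choice component described above reduces to a conditional (multinomial) logit: given the sender of an event, each candidate receiver gets a softmax probability driven by receiver-specific statistics. The statistics and coefficients in the sketch are invented, not estimates from the interhospital transfer data.

        # Conditional-logit destination choice over candidate receivers.
        import numpy as np

        def destination_probs(stats, beta):
            """stats: (candidates x covariates) matrix of receiver-specific statistics."""
            eta = stats @ beta
            eta -= eta.max()                   # numerical stability
            w = np.exp(eta)
            return w / w.sum()

        # 4 candidate hospitals; columns: past transfer volume from this sender, proximity score.
        stats = np.array([
            [5.0, 0.9],
            [1.0, 0.4],
            [0.0, 0.8],
            [2.0, 0.1],
        ])
        beta = np.array([0.5, 1.2])            # hypothetical effects
        print(destination_probs(stats, beta).round(3))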

  20. Critical Analysis of the Mathematical Formalism of Theoretical Physics. I. Foundations of Differential and Integral Calculus

    NASA Astrophysics Data System (ADS)

    Kalanov, Temur Z.

    2013-04-01

    Critical analysis of the standard foundations of differential and integral calculus -- as the mathematical formalism of theoretical physics -- is proposed. The methodological basis of the analysis is the unity of formal logic and rational dialectics. It is shown that: (a) the foundations (i.e. $\frac{dy}{dx} = \lim_{\delta x \to 0} \frac{\delta y}{\delta x} = \lim_{\delta x \to 0} \frac{f(x + \delta x) - f(x)}{\delta x}$, $dy = \delta y$, $dx = \delta x$, where $y = f(x)$ is a continuous function of one argument $x$; $\delta x$ and $\delta y$ are increments; $dx$ and $dy$ are differentials) do not satisfy the formal-logic law of identity; (b) the infinitesimal quantities $dx$, $dy$ are fictitious quantities: they have neither algebraic nor geometrical meaning because they do not take numerical values and, therefore, have no quantitative measure; (c) expressions of the kind $x + dx$ are erroneous because $x$ (a finite quantity) and $dx$ (an infinitely diminished quantity) have a different sense, a different qualitative determinacy; since $x = \mathrm{const}$ under $\delta x \to 0$, a derivative does not contain the variable quantity $x$ and depends only on a constant $c$. Consequently, the standard concepts ``infinitesimal quantity (uninterruptedly diminishing quantity)'', ``derivative'', ``derivative as a function of a variable quantity'' represent an incorrect basis for mathematics and theoretical physics.

  1. Development of a Software Safety Process and a Case Study of Its Use

    NASA Technical Reports Server (NTRS)

    Knight, J. C.

    1997-01-01

    Research in the year covered by this reporting period has been primarily directed toward the following areas: (1) Formal specification of user interfaces; (2) Fault-tree analysis including software; (3) Evaluation of formal specification notations; (4) Evaluation of formal verification techniques; (5) Expanded analysis of the shell architecture concept; (6) Development of techniques to address the problem of information survivability; and (7) Development of a sophisticated tool for the manipulation of formal specifications written in Z. This report summarizes activities under the grant. The technical results relating to this grant and the remainder of the principal investigator's research program are contained in various reports and papers. The remainder of this report is organized as follows. In the next section, an overview of the project is given. This is followed by a summary of accomplishments during the reporting period and details of students funded. Seminars presented describing work under this grant are listed in the following section, and the final section lists publications resulting from this grant.

  2. Interoperability between phenotype and anatomy ontologies.

    PubMed

    Hoehndorf, Robert; Oellrich, Anika; Rebholz-Schuhmann, Dietrich

    2010-12-15

    Phenotypic information is important for the analysis of the molecular mechanisms underlying disease. A formal ontological representation of phenotypic information can help to identify, interpret and infer phenotypic traits based on experimental findings. The methods that are currently used to represent data and information about phenotypes fail to make the semantics of the phenotypic trait explicit and do not interoperate with ontologies of anatomy and other domains. Therefore, valuable resources for the analysis of phenotype studies remain unconnected and inaccessible to automated analysis and reasoning. We provide a framework to formalize phenotypic descriptions and make their semantics explicit. Based on this formalization, we provide the means to integrate phenotypic descriptions with ontologies of other domains, in particular anatomy and physiology. We demonstrate how our framework leads to the capability to represent disease phenotypes, perform powerful queries that were not possible before and infer additional knowledge. http://bioonto.de/pmwiki.php/Main/PheneOntology.

  3. A Logical Analysis of Quantum Voting Protocols

    NASA Astrophysics Data System (ADS)

    Rad, Soroush Rafiee; Shirinkalam, Elahe; Smets, Sonja

    2017-12-01

    In this paper we provide a logical analysis of the Quantum Voting Protocol for Anonymous Surveying as developed by Horoshko and Kilin in (Phys. Lett. A 375, 1172-1175 2011). In particular we make use of the probabilistic logic of quantum programs as developed in (Int. J. Theor. Phys. 53, 3628-3647 2014) to provide a formal specification of the protocol and to derive its correctness. Our analysis is part of a wider program on the application of quantum logics to the formal verification of protocols in quantum communication and quantum computation.

  4. Formal Solutions for Polarized Radiative Transfer. II. High-order Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Janett, Gioele; Steiner, Oskar; Belluzzi, Luca, E-mail: gioele.janett@irsol.ch

    When integrating the radiative transfer equation for polarized light, the necessity of high-order numerical methods is well known. In fact, well-performing high-order formal solvers enable higher accuracy and the use of coarser spatial grids. Aiming to provide a clear comparison between formal solvers, this work presents different high-order numerical schemes and applies the systematic analysis proposed by Janett et al., emphasizing their advantages and drawbacks in terms of order of accuracy, stability, and computational cost.

  5. Effects of Sex Education and Kegel Exercises on the Sexual Function of Postmenopausal Women: A Randomized Clinical Trial.

    PubMed

    Nazarpour, Soheila; Simbar, Masoumeh; Ramezani Tehrani, Fahimeh; Alavi Majd, Hamid

    2017-07-01

    The sex lives of women are strongly affected by menopause. Non-pharmacologic approaches to improving the sexual function of postmenopausal women might prove effective. To compare two methods of intervention (formal sex education and Kegel exercises) with routine postmenopausal care services in a randomized clinical trial. A randomized clinical trial was conducted of 145 postmenopausal women residing in Chalus and Noshahr, Iran. Their sexual function statuses were assessed using the Female Sexual Function Index (FSFI) questionnaire. After obtaining written informed consents, they were randomly assigned to one of three groups: (i) formal sex education, (ii) Kegel exercises, or (iii) routine postmenopausal care. After 12 weeks, all participants completed the FSFI again. Analysis of covariance was used to compare the participants' sexual function before and after the interventions, and multiple linear regression analysis was used to determine the predictive factors for variation in FSFI scores in the postintervention stage. Sexual function was assessed using the FSFI. There were no statistically significant differences in demographic and socioeconomic characteristics and FSFI total scores among the three study groups at the outset of the study. After 12 weeks, the scores of arousal in the formal sex education and Kegel groups were significantly higher compared with the control group (3.38 and 3.15 vs 2.77, respectively). The scores of orgasm and satisfaction in the Kegel group were significantly higher compared with the control group (4.43 and 4.88 vs 3.95 and 4.39, respectively). Formal sex education and Kegel exercises were used as two non-pharmacologic approaches to improve the sexual function of women after menopause. The main strength of this study was its design: a well-organized randomized trial using precise eligibility criteria with a small sample loss. The second strength was the methods of intervention used, namely non-pharmacologic approaches that are simple, easily accessible, and fairly inexpensive. The main limitation of the study was our inability to objectively assess the participants' commitment to exercise and the sexual function of their partners. Sex education programs and Kegel exercises could cause improvements in some domains of sexual function-specifically arousal, orgasm, and satisfaction-in postmenopausal women. Nazarpour S, Simbar M, Tehrani FR, Majd HA. Effects of Sex Education and Kegel Exercises on the Sexual Function of Postmenopausal Women: A Randomized Clinical Trial. J Sex Med 2017;14:959-967. Copyright © 2017 International Society for Sexual Medicine. Published by Elsevier Inc. All rights reserved.
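
    A hedged sketch of the analysis of covariance described above, comparing post-intervention scores across the three groups while adjusting for the baseline score; the data frame is simulated and the variable names are hypothetical, not the trial data:

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(0)
      n = 145
      df = pd.DataFrame({
          "group": rng.choice(["sex_education", "kegel", "control"], size=n),
          "baseline": rng.normal(22, 4, size=n),
      })
      df["post"] = df["baseline"] + (df["group"] != "control") * 1.5 + rng.normal(0, 2, size=n)

      # post-score differences between groups, adjusted for the baseline score
      ancova = smf.ols("post ~ C(group, Treatment(reference='control')) + baseline", data=df).fit()
      print(ancova.summary().tables[1])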

  6. δ M formalism and anisotropic chaotic inflation power spectrum

    NASA Astrophysics Data System (ADS)

    Talebian-Ashkezari, A.; Ahmadi, N.

    2018-05-01

    A new analytical approach to linear perturbations in anisotropic inflation has been introduced in [A. Talebian-Ashkezari, N. Ahmadi and A.A. Abolhasani, JCAP 03 (2018) 001] under the name of δ M formalism. In this paper we apply the mentioned approach to a model of anisotropic inflation driven by a scalar field, coupled to the kinetic term of a vector field with a U(1) symmetry. The δ M formalism provides an efficient way of computing tensor-tensor, tensor-scalar as well as scalar-scalar 2-point correlations that are needed for the analysis of the observational features of an anisotropic model on the CMB. A comparison between δ M results and the tedious calculations using in-in formalism shows the aptitude of the δ M formalism in calculating accurate two point correlation functions between physical modes of the system.

  7. Rule acquisition in formal decision contexts based on formal, object-oriented and property-oriented concept lattices.

    PubMed

    Ren, Yue; Li, Jinhai; Aswani Kumar, Cherukuri; Liu, Wenqi

    2014-01-01

    Rule acquisition is one of the main purposes in the analysis of formal decision contexts. Up to now, there have been several types of rules in formal decision contexts such as decision rules, decision implications, and granular rules, which can be viewed as ∧-rules since all of them have the following form: "if conditions 1,2,…, and m hold, then decisions hold." In order to enrich the existing rule acquisition theory in formal decision contexts, this study puts forward two new types of rules which are called ∨-rules and ∨-∧ mixed rules based on formal, object-oriented, and property-oriented concept lattices. Moreover, a comparison of ∨-rules, ∨-∧ mixed rules, and ∧-rules is made from the perspectives of inclusion and inference relationships. Finally, some real examples and numerical experiments are conducted to compare the proposed rule acquisition algorithms with the existing one in terms of the running efficiency.

  8. The role of early language abilities on math skills among Chinese children.

    PubMed

    Zhang, Juan; Fan, Xitao; Cheung, Sum Kwing; Meng, Yaxuan; Cai, Zhihui; Hu, Bi Ying

    2017-01-01

    The present study investigated the role of early language abilities in the development of math skills among Chinese K-3 students. About 2000 children in China, who were on average aged 6 years, were assessed for both informal math (e.g., basic number concepts such as counting objects) and formal math (calculations including addition and subtraction) skills, language abilities and nonverbal intelligence. Correlation analysis showed that language abilities were more strongly associated with informal than formal math skills, and regression analyses revealed that children's language abilities could uniquely predict both informal and formal math skills with age, gender, and nonverbal intelligence controlled. Mediation analyses demonstrated that the relationship between children's language abilities and formal math skills was partially mediated by informal math skills. The current findings indicate that 1) children's language abilities are of strong predictive value for both informal and formal math skills; and 2) language abilities impact formal math skills partially through the mediation of informal math skills.
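
    A minimal sketch of the product-of-coefficients mediation estimate reported above (indirect effect of language on formal math via informal math), with a simple percentile bootstrap; the data are simulated and the study's covariates (age, gender, nonverbal intelligence) are omitted:

      import numpy as np

      rng = np.random.default_rng(0)
      n = 2000
      language = rng.normal(size=n)
      informal = 0.5 * language + rng.normal(size=n)                  # path a
      formal = 0.3 * informal + 0.2 * language + rng.normal(size=n)   # paths b and c'

      def indirect(lang, inf, form):
          a = np.polyfit(lang, inf, 1)[0]                 # slope of informal ~ language
          X = np.column_stack([np.ones_like(lang), inf, lang])
          b = np.linalg.lstsq(X, form, rcond=None)[0][1]  # informal coefficient in formal ~ informal + language
          return a * b

      boot = []
      for _ in range(500):
          idx = rng.integers(0, n, n)
          boot.append(indirect(language[idx], informal[idx], formal[idx]))
      print(indirect(language, informal, formal), np.percentile(boot, [2.5, 97.5]))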

  9. The role of early language abilities on math skills among Chinese children

    PubMed Central

    Fan, Xitao; Cheung, Sum Kwing; Cai, Zhihui; Hu, Bi Ying

    2017-01-01

    Background The present study investigated the role of early language abilities in the development of math skills among Chinese K-3 students. About 2000 children in China, who were on average aged 6 years, were assessed for both informal math (e.g., basic number concepts such as counting objects) and formal math (calculations including addition and subtraction) skills, language abilities and nonverbal intelligence. Methodology Correlation analysis showed that language abilities were more strongly associated with informal than formal math skills, and regression analyses revealed that children’s language abilities could uniquely predict both informal and formal math skills with age, gender, and nonverbal intelligence controlled. Mediation analyses demonstrated that the relationship between children’s language abilities and formal math skills was partially mediated by informal math skills. Results The current findings indicate that 1) children’s language abilities are of strong predictive value for both informal and formal math skills; and 2) language abilities impact formal math skills partially through the mediation of informal math skills. PMID:28749950

  10. Rule Acquisition in Formal Decision Contexts Based on Formal, Object-Oriented and Property-Oriented Concept Lattices

    PubMed Central

    Ren, Yue; Aswani Kumar, Cherukuri; Liu, Wenqi

    2014-01-01

    Rule acquisition is one of the main purposes in the analysis of formal decision contexts. Up to now, there have been several types of rules in formal decision contexts such as decision rules, decision implications, and granular rules, which can be viewed as ∧-rules since all of them have the following form: “if conditions 1,2,…, and m hold, then decisions hold.” In order to enrich the existing rule acquisition theory in formal decision contexts, this study puts forward two new types of rules which are called ∨-rules and ∨-∧ mixed rules based on formal, object-oriented, and property-oriented concept lattices. Moreover, a comparison of ∨-rules, ∨-∧ mixed rules, and ∧-rules is made from the perspectives of inclusion and inference relationships. Finally, some real examples and numerical experiments are conducted to compare the proposed rule acquisition algorithms with the existing one in terms of the running efficiency. PMID:25165744

  11. Applying formal methods and object-oriented analysis to existing flight software

    NASA Technical Reports Server (NTRS)

    Cheng, Betty H. C.; Auernheimer, Brent

    1993-01-01

    Correctness is paramount for safety-critical software control systems. Critical software failures in medical radiation treatment, communications, and defense are familiar to the public. The significant quantity of software malfunctions regularly reported to the software engineering community, the laws concerning liability, and a recent NRC Aeronautics and Space Engineering Board report additionally motivate the use of error-reducing and defect detection software development techniques. The benefits of formal methods in requirements-driven software development ('forward engineering') are well documented. One advantage of rigorously engineering software is that formal notations are precise, verifiable, and facilitate automated processing. This paper describes the application of formal methods to reverse engineering, where formal specifications are developed for a portion of the shuttle on-orbit digital autopilot (DAP). Three objectives of the project were to: demonstrate the use of formal methods on a shuttle application, facilitate the incorporation and validation of new requirements for the system, and verify the safety-critical properties to be exhibited by the software.

  12. Using software security analysis to verify the secure socket layer (SSL) protocol

    NASA Technical Reports Server (NTRS)

    Powell, John D.

    2004-01-01

    The National Aeronautics and Space Administration (NASA) has tens of thousands of networked computer systems and applications. Software security vulnerabilities present risks such as lost or corrupted data, information theft, and unavailability of critical systems. These risks represent potentially enormous costs to NASA. The NASA Code Q research initiative 'Reducing Software Security Risk (RSSR) Through an Integrated Approach' offers, among its capabilities, formal verification of software security properties through the use of model based verification (MBV) to address software security risks. [1,2,3,4,5,6] MBV is a formal approach to software assurance that combines analysis of software, via abstract models, with technology, such as model checkers, that provides automation of the mechanical portions of the analysis process. This paper will discuss: the need for formal analysis to assure software systems with respect to software security, and why testing alone cannot provide it; the means by which MBV with a Flexible Modeling Framework (FMF) accomplishes the necessary analysis task; and an example of FMF-style MBV in the verification of properties over the Secure Socket Layer (SSL) communication protocol as a demonstration.

  13. Leadership for Community Engagement--A Distributed Leadership Perspective

    ERIC Educational Resources Information Center

    Liang, Jia G.; Sandmann, Lorilee R.

    2015-01-01

    This article presents distributed leadership as a framework for analysis, showing how the phenomenon complements formal higher education structures by mobilizing leadership from various sources, formal and informal. This perspective more accurately portrays the reality of leading engaged institutions. Using the application data from 224…

  14. On the formalization and reuse of scientific research.

    PubMed

    King, Ross D; Liakata, Maria; Lu, Chuan; Oliver, Stephen G; Soldatova, Larisa N

    2011-10-07

    The reuse of scientific knowledge obtained from one investigation in another investigation is basic to the advance of science. Scientific investigations should therefore be recorded in ways that promote the reuse of the knowledge they generate. The use of logical formalisms to describe scientific knowledge has potential advantages in facilitating such reuse. Here, we propose a formal framework for using logical formalisms to promote reuse. We demonstrate the utility of this framework by using it in a worked example from biology: demonstrating cycles of investigation formalization [F] and reuse [R] to generate new knowledge. We first used logic to formally describe a Robot scientist investigation into yeast (Saccharomyces cerevisiae) functional genomics [f(1)]. With Robot scientists, unlike human scientists, the production of comprehensive metadata about their investigations is a natural by-product of the way they work. We then demonstrated how this formalism enabled the reuse of the research in investigating yeast phenotypes [r(1) = R(f(1))]. This investigation found that the removal of non-essential enzymes generally resulted in enhanced growth. The phenotype investigation was then formally described using the same logical formalism as the functional genomics investigation [f(2) = F(r(1))]. We then demonstrated how this formalism enabled the reuse of the phenotype investigation to investigate yeast systems-biology modelling [r(2) = R(f(2))]. This investigation found that yeast flux-balance analysis models fail to predict the observed changes in growth. Finally, the systems biology investigation was formalized for reuse in future investigations [f(3) = F(r(2))]. These cycles of reuse are a model for the general reuse of scientific knowledge.

  15. Longitudinal Effects on Early Adolescent Language: A Twin Study

    PubMed Central

    DeThorne, Laura Segebart; Smith, Jamie Mahurin; Betancourt, Mariana Aparicio; Petrill, Stephen A.

    2016-01-01

    Purpose We evaluated genetic and environmental contributions to individual differences in language skills during early adolescence, measured by both language sampling and standardized tests, and examined the extent to which these genetic and environmental effects are stable across time. Method We used structural equation modeling on latent factors to estimate additive genetic, shared environmental, and nonshared environmental effects on variance in standardized language skills (i.e., Formal Language) and productive language-sample measures (i.e., Productive Language) in a sample of 527 twins across 3 time points (mean ages 10–12 years). Results Individual differences in the Formal Language factor were influenced primarily by genetic factors at each age, whereas individual differences in the Productive Language factor were primarily due to nonshared environmental influences. For the Formal Language factor, the stability of genetic effects was high across all 3 time points. For the Productive Language factor, nonshared environmental effects showed low but statistically significant stability across adjacent time points. Conclusions The etiology of language outcomes may differ substantially depending on assessment context. In addition, the potential mechanisms for nonshared environmental influences on language development warrant further investigation. PMID:27732720

  16. A Short Biography of Paul A. M. Dirac and Historical Development of Dirac Delta Function

    ERIC Educational Resources Information Center

    Debnath, Lokenath

    2013-01-01

    This paper deals with a short biography of Paul Dirac, his first celebrated work on quantum mechanics, his first formal systematic use of the Dirac delta function and his famous work on quantum electrodynamics and quantum statistics. Included are his first discovery of the Dirac relativistic wave equation, existence of positron and the intrinsic…

  17. Formal Solutions for Polarized Radiative Transfer. III. Stiffness and Instability

    NASA Astrophysics Data System (ADS)

    Janett, Gioele; Paganini, Alberto

    2018-04-01

    Efficient numerical approximation of the polarized radiative transfer equation is challenging because this system of ordinary differential equations exhibits stiff behavior, which potentially results in numerical instability. This negatively impacts the accuracy of formal solvers, and small step-sizes are often necessary to retrieve physical solutions. This work presents stability analyses of formal solvers for the radiative transfer equation of polarized light, identifies instability issues, and suggests practical remedies. In particular, the assumptions and the limitations of the stability analysis of Runge–Kutta methods play a crucial role. On this basis, a suitable and pragmatic formal solver is outlined and tested. An insightful comparison to the scalar radiative transfer equation is also presented.

  18. Assessment of long-term impact of formal certified cardiopulmonary resuscitation training program among nurses.

    PubMed

    Saramma, P P; Raj, L Suja; Dash, P K; Sarma, P S

    2016-04-01

    Cardiopulmonary resuscitation (CPR) and emergency cardiovascular care guidelines are periodically renewed and published by the American Heart Association. Formal training programs are conducted based on these guidelines. Despite widespread training, CPR is often poorly performed. Hospital educators spend a significant amount of time and money in training health professionals and maintaining basic life support (BLS) and advanced cardiac life support (ACLS) skills among them. However, very little data are available in the literature highlighting the long-term impact of such training. The aims were to evaluate the impact of a formal certified CPR training program on the knowledge and skill of CPR among nurses and to identify self-reported outcomes of attempted CPR and the training needs of nurses. The study used a prospective, repeated-measures design in a tertiary care hospital. A series of certified BLS and ACLS training programs were conducted during 2010 and 2011. Written and practical performance tests were done. Final testing was undertaken 3-4 years after training. The sample included all available, willing CPR certified nurses and experience-matched CPR noncertified nurses. Data were analysed using SPSS for Windows version 21.0. The majority of the 206 nurses (93 CPR certified and 113 noncertified) were females. There was a statistically significant increase in mean knowledge level and overall performance before and after the formal certified CPR training program (P = 0.000). However, the mean knowledge scores were equivalent among the CPR certified and noncertified nurses, although the certified nurses scored a higher mean score (P = 0.140). A formal certified CPR training program increases CPR knowledge and skill. However, significant long-term effects could not be found. There is a need for regular and periodic recertification.

  19. E-assessment of prior learning: a pilot study of interactive assessment of staff with no formal education who are working in Swedish elderly care

    PubMed Central

    2014-01-01

    Background The current paper presents a pilot study of interactive assessment using information and communication technology (ICT) to evaluate the knowledge, skills and abilities of staff with no formal education who are working in Swedish elderly care. Methods Theoretical and practical assessment methods were developed and used with simulated patients and computer-based tests to identify strengths and areas for personal development among staff with no formal education. Results Of the 157 staff with no formal education, 87 began the practical and/or theoretical assessments, and 63 completed both assessments. Several of the staff passed the practical assessments, except the morning hygiene assessment, where several failed. Other areas for staff development, i.e. where several failed (>50%), were the theoretical assessment of the learning objectives: Health, Oral care, Ergonomics, hygiene, esthetic, environmental, Rehabilitation, Assistive technology, Basic healthcare and Laws and organization. None of the staff passed all assessments. Number of years working in elderly care and staff age were not statistically significantly related to the total score of grades on the various learning objectives. Conclusion The interactive assessments were useful in assessing staff members’ practical and theoretical knowledge, skills, and abilities and in identifying areas in need of development. It is important that personnel who lack formal qualifications be clearly identified and given a chance to develop their competence through training, both theoretical and practical. The interactive e-assessment approach analyzed in the present pilot study could serve as a starting point. PMID:24742168

  20. Racial and/or Ethnic Differences in Formal Sex Education and Sex Education by Parents among Young Women in the United States.

    PubMed

    Vanderberg, Rachel H; Farkas, Amy H; Miller, Elizabeth; Sucato, Gina S; Akers, Aletha Y; Borrero, Sonya B

    2016-02-01

    We sought to investigate the associations between race and/or ethnicity and young women's formal sex education and sex education by parents. Cross-sectional analysis of a nationally representative sample of 1768 women aged 15-24 years who participated in the 2011-2013 National Survey of Family Growth. We assessed 6 main outcomes: participants' report of (1) any formal sex education; (2) formal contraceptive education; (3) formal sexually transmitted infection (STI) education; (4) any sex education by parents; (5) contraceptive education by parents; and (6) STI education by parents. The primary independent variable was self-reported race and/or ethnicity. Nearly all participants (95%) reported any formal sex education, 68% reported formal contraceptive education, and 92% reported formal STI education. Seventy-five percent of participants reported not having any sex education by parents, and only 61% and 56% reported contraceptive and STI education by parents, respectively. US-born Hispanic women were more likely than white women to report STI education by parents (adjusted odds ratio = 1.87; 95% confidence interval, 1.17-2.99). No other significant racial and/or ethnic differences in sex education were found. There are few racial and/or ethnic differences in formal sex education and sex education by parents among young women. Copyright © 2016 North American Society for Pediatric and Adolescent Gynecology. All rights reserved.

  1. Basic statistics (the fundamental concepts).

    PubMed

    Lim, Eric

    2014-12-01

    An appreciation and understanding of statistics is important to all practising clinicians, not simply researchers. This is because mathematics is the fundamental basis on which we base clinical decisions, usually with reference to the benefit in relation to risk. Unless a clinician has a basic understanding of statistics, he or she will never be in a position to question healthcare management decisions that have been handed down from generation to generation, will not be able to conduct research effectively or evaluate the validity of published evidence (usually making an assumption that most published work is either all good or all bad). This article provides a brief introduction to basic statistical methods and illustrates their use in common clinical scenarios. In addition, pitfalls of incorrect usage have been highlighted. However, it is not meant to be a substitute for formal training or consultation with a qualified and experienced medical statistician prior to starting any research project.
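
    In the spirit of the article, a small worked example of the benefit-versus-risk arithmetic it advocates: a relative risk with a 95% confidence interval from the standard log-scale normal approximation (all counts are invented):

      import math

      events_trt, n_trt = 30, 200        # hypothetical treated arm
      events_ctl, n_ctl = 45, 200        # hypothetical control arm

      rr = (events_trt / n_trt) / (events_ctl / n_ctl)
      se_log_rr = math.sqrt(1/events_trt - 1/n_trt + 1/events_ctl - 1/n_ctl)
      lo, hi = (math.exp(math.log(rr) + z * se_log_rr) for z in (-1.96, 1.96))
      print(f"RR = {rr:.2f}, 95% CI {lo:.2f} to {hi:.2f}")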

  2. Condensate statistics and thermodynamics of weakly interacting Bose gas: Recursion relation approach

    NASA Astrophysics Data System (ADS)

    Dorfman, K. E.; Kim, M.; Svidzinsky, A. A.

    2011-03-01

    We study condensate statistics and thermodynamics of a weakly interacting Bose gas with a fixed total number N of particles in a cubic box. We find the exact recursion relation for the canonical ensemble partition function. Using this relation, we calculate the distribution function of condensate particles for N=200. We also calculate the distribution function based on multinomial expansion of the characteristic function. Similar to the ideal gas, both approaches give exact statistical moments for all temperatures in the framework of the Bogoliubov model. We compare them with the results of the unconstrained canonical ensemble quasiparticle formalism and the hybrid master equation approach. The present recursion relation can be used for any external potential and boundary conditions. We investigate the temperature dependence of the first few statistical moments of condensate fluctuations as well as thermodynamic potentials and heat capacity analytically and numerically in the whole temperature range.
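
    For orientation, a hedged sketch of the textbook recursion relation for the canonical partition function of an ideal Bose gas in a cubic box, Z_N(β) = (1/N) Σ_{k=1}^{N} Z_1(kβ) Z_{N-k}(β); the paper derives an analogous exact recursion for the weakly interacting (Bogoliubov) case, which is not reproduced here, and the energy units and mode cutoff below are arbitrary:

      import numpy as np
      from itertools import product

      def single_particle_Z(beta, nmax=20):
          # particle-in-a-box levels, in units where hbar^2 pi^2 / (2 m L^2) = 1
          return sum(np.exp(-beta * (nx**2 + ny**2 + nz**2))
                     for nx, ny, nz in product(range(1, nmax + 1), repeat=3))

      def canonical_Z(N, beta):
          Z = [1.0]                                                  # Z_0 = 1
          z1 = [single_particle_Z(k * beta) for k in range(N + 1)]   # z1[k] = Z_1(k*beta)
          for n in range(1, N + 1):
              Z.append(sum(z1[k] * Z[n - k] for k in range(1, n + 1)) / n)
          return Z[N]

      print(canonical_Z(20, beta=0.5))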

  3. Formalized Conflicts Detection Based on the Analysis of Multiple Emails: An Approach Combining Statistics and Ontologies

    NASA Astrophysics Data System (ADS)

    Zakaria, Chahnez; Curé, Olivier; Salzano, Gabriella; Smaïli, Kamel

    In Computer Supported Cooperative Work (CSCW), it is crucial for project leaders to detect conflicting situations as early as possible. Generally, this task is performed manually by studying a set of documents exchanged between team members. In this paper, we propose a full-fledged automatic solution that identifies documents, subjects and actors involved in relational conflicts. Our approach detects conflicts in emails, probably the most popular type of documents in CSCW, but the methods used can handle other text-based documents. These methods rely on the combination of statistical and ontological operations. The proposed solution is decomposed into several steps: (i) we enrich a simple negative emotion ontology with terms occurring in the corpus of emails, (ii) we categorize each conflicting email according to the concepts of this ontology, and (iii) we identify emails, subjects and team members involved in conflicting emails using possibilistic description logic and a set of proposed measures. Each of these steps is evaluated and validated on concrete examples. Moreover, this approach's framework is generic and can be easily adapted to domains other than conflicts, e.g. security issues, and extended with operations making use of our proposed set of measures.

  4. Impact of Temperature and Non-Gaussian Statistics on Electron Transfer in Donor-Bridge-Acceptor Molecules.

    PubMed

    Waskasi, Morteza M; Newton, Marshall D; Matyushov, Dmitry V

    2017-03-30

    A combination of experimental data and theoretical analysis provides evidence of a bell-shaped kinetics of electron transfer in the Arrhenius coordinates ln k vs 1/T. This kinetic law is a temperature analogue of the familiar Marcus bell-shaped dependence based on ln k vs the reaction free energy. These results were obtained for reactions of intramolecular charge shift between the donor and acceptor separated by a rigid spacer studied experimentally by Miller and co-workers. The non-Arrhenius kinetic law is a direct consequence of the solvent reorganization energy and reaction driving force changing approximately as hyperbolic functions with temperature. The reorganization energy decreases and the driving force increases when temperature is increased. The point of equality between them marks the maximum of the activationless reaction rate. Reaching the consistency between the kinetic and thermodynamic experimental data requires the non-Gaussian statistics of the donor-acceptor energy gap described by the Q-model of electron transfer. The theoretical formalism combines the vibrational envelope of quantum vibronic transitions with the Q-model describing the classical component of the Franck-Condon factor and a microscopic solvation model of the solvent reorganization energy and the reaction free energy.
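
    A simplified numerical illustration of the mechanism described above, using the classical (Gaussian) Marcus activation term rather than the paper's Q-model; the hyperbolic temperature laws for the reorganization energy and driving force, and all parameter values, are assumptions chosen only to reproduce the bell shape of ln k versus 1/T:

      import numpy as np

      kB = 8.617e-5                         # Boltzmann constant, eV/K
      T = np.linspace(200, 400, 201)
      lam = 0.8 + 60.0 / T                  # reorganization energy (eV), decreasing with T
      dG = -1.1 + 30.0 / T                  # reaction free energy (eV): driving force -dG grows with T
      lnk = -(lam + dG) ** 2 / (4 * lam * kB * T)   # Marcus activation term, prefactor omitted

      print(f"ln k peaks near T = {T[lnk.argmax()]:.0f} K, where lam = -dG")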

  5. Model for interevent times with long tails and multifractality in human communications: An application to financial trading

    NASA Astrophysics Data System (ADS)

    Perelló, Josep; Masoliver, Jaume; Kasprzak, Andrzej; Kutner, Ryszard

    2008-09-01

    Social, technological, and economic time series are divided by events which are usually assumed to be random, albeit with some hierarchical structure. It is well known that the interevent statistics observed in these contexts differs from the Poissonian profile by being long-tailed distributed with resting and active periods interwoven. Understanding mechanisms generating consistent statistics has therefore become a central issue. The approach we present is taken from the continuous-time random-walk formalism and represents an analytical alternative to models of nontrivial priority that have been recently proposed. Our analysis also goes one step further by looking at the multifractal structure of the interevent times of human decisions. We here analyze the intertransaction time intervals of several financial markets. We observe that empirical data describe a subtle multifractal behavior. Our model explains this structure by taking the pausing-time density in the form of a superstatistics where the integral kernel quantifies the heterogeneous nature of the executed tasks. A stretched exponential kernel provides a multifractal profile valid for a certain limited range. A suggested heuristic analytical profile is capable of covering a broader region.

  6. Non-ad-hoc decision rule for the Dempster-Shafer method of evidential reasoning

    NASA Astrophysics Data System (ADS)

    Cheaito, Ali; Lecours, Michael; Bosse, Eloi

    1998-03-01

    This paper is concerned with the fusion of identity information through the use of statistical analysis rooted in Dempster-Shafer theory of evidence to provide automatic identification aboard a platform. An identity information process for a baseline Multi-Source Data Fusion (MSDF) system is defined. The MSDF system is applied to information sources which include a number of radars, IFF systems, an ESM system, and a remote track source. We use a comprehensive Platform Data Base (PDB) containing all the possible identity values that the potential target may take, and we use fuzzy logic strategies which enable the fusion of subjective attribute information from the sensors and the PDB, so that the target identity can be derived more quickly, more precisely, and with statistically quantifiable measures of confidence. The conventional Dempster-Shafer method lacks a formal basis upon which decisions can be made in the face of ambiguity. We define a non-ad hoc decision rule based on the expected utility interval for pruning the `unessential' propositions which would otherwise overload the real-time data fusion systems. An example has been selected to demonstrate the implementation of our modified Dempster-Shafer method of evidential reasoning.
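
    For context, a minimal generic sketch of Dempster's rule of combination over a toy identity frame (this is the conventional rule, not the modified expected-utility-interval decision rule proposed in the record; the masses and sensor labels are invented):

      from itertools import product

      def dempster_combine(m1, m2):
          """m1, m2: dicts mapping frozenset hypotheses to mass; returns fused masses and conflict K."""
          combined, conflict = {}, 0.0
          for (A, wa), (B, wb) in product(m1.items(), m2.items()):
              inter = A & B
              if inter:
                  combined[inter] = combined.get(inter, 0.0) + wa * wb
              else:
                  conflict += wa * wb
          return {A: w / (1.0 - conflict) for A, w in combined.items()}, conflict

      friend, hostile = frozenset({"friend"}), frozenset({"hostile"})
      theta = friend | hostile                       # full frame (ignorance)
      m_radar = {friend: 0.6, theta: 0.4}            # evidence attributed to one sensor
      m_esm = {hostile: 0.3, theta: 0.7}             # evidence attributed to another sensor
      fused, K = dempster_combine(m_radar, m_esm)
      print(fused, "conflict =", K)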

  7. REQUIREMENTS PATTERNS FOR FORMAL CONTRACTS IN ARCHITECTURAL ANALYSIS AND DESIGN LANGUAGE (AADL) MODELS

    DTIC Science & Technology

    2017-04-17

    Keywords: Cyberphysical Systems, Formal Methods, Requirements Patterns, AADL, Assume Guarantee Reasoning Environment. Rockwell Collins has been addressing these challenges by developing compositional reasoning methods that permit the verification of systems that exceed…

  8. Recognition of Emotions in Autism: A Formal Meta-Analysis

    ERIC Educational Resources Information Center

    Uljarevic, Mirko; Hamilton, Antonia

    2013-01-01

    Determining the integrity of emotion recognition in autistic spectrum disorder is important to our theoretical understanding of autism and to teaching social skills. Previous studies have reported both positive and negative results. Here, we take a formal meta-analytic approach, bringing together data from 48 papers testing over 980 participants…

  9. Establishing the Validity of Recovery from Stuttering without Formal Treatment.

    ERIC Educational Resources Information Center

    Finn, Patrick

    1996-01-01

    This study examined a validation procedure combining self-reports with independent verification to identify cases of recovery from stuttering without formal treatment. A Speech Behavior Checklist was administered to 42 individuals familiar with recovered subjects' past speech. Analysis of subjects' descriptions of their past stuttering was…

  10. Romanian Higher Education as a Facilitator of Romania's Continued Formal and Informal Integration in the European Union

    ERIC Educational Resources Information Center

    Salajan, Florin D.; Chiper, Sorina

    2013-01-01

    This article conducts an exploration of Romania's European integration process through higher education. It contends that integration occurs at "formal" and "informal levels" through institutional norms and human agency, respectively. Through theoretical and empirical analysis, the authors discuss the modalities through which…

  11. Leading the Teacher Team--Balancing between Formal and Informal Power in Program Leadership

    ERIC Educational Resources Information Center

    Högfeldt, Anna-Karin; Malmi, Lauri; Kinnunen, Päivi; Jerbrant, Anna; Strömberg, Emma; Berglund, Anders; Villadsen, Jørgen

    2018-01-01

    This continuous research within Nordic engineering institutions targets the contexts and possibilities for leadership among engineering education program directors. The IFP-model, developed based on analysis of interviews with program leaders in these institutions, visualizes the program director's informal and formal power. The model is presented…

  12. Effectiveness of interventions to promote help-seeking for mental health problems: systematic review and meta-analysis.

    PubMed

    Xu, Ziyan; Huang, Fangfang; Kösters, Markus; Staiger, Tobias; Becker, Thomas; Thornicroft, Graham; Rüsch, Nicolas

    2018-06-01

    Help-seeking is important to access appropriate care and improve mental health. However, individuals often delay or avoid seeking help for mental health problems. Interventions to improve help-seeking have been developed, but their effectiveness is unclear. A systematic review and meta-analysis were therefore conducted to examine the effectiveness of mental health related help-seeking interventions. Nine databases in English, German and Chinese were searched for randomised and non-randomised controlled trials. Effect sizes were calculated for attitudes, intentions and behaviours to seek formal, informal and self-help. Ninety-eight studies with 69 208 participants were included. Interventions yielded significant short-term benefits in terms of formal help-seeking, self-help, as well as mental health literacy and personal stigma. There were also positive long-term effects on formal help-seeking behaviours. The most common intervention types were strategies to increase mental health literacy, destigmatisation (both had positive short-term effects on formal help-seeking behaviours) as well as motivational enhancement (with positive long-term effects on formal help-seeking behaviours). Interventions improved formal help-seeking behaviours if delivered to people with or at risk of mental health problems, but not among children, adolescents or the general public. There was no evidence that interventions increased the use of informal help. Few studies were conducted in low- and middle-income countries (LMICs). This study provides evidence for the effectiveness of help-seeking interventions in terms of improving attitudes, intentions and behaviours to seek formal help for mental health problems among adults. Future research should develop effective interventions to improve informal help-seeking, for specific target groups and in LMICs settings.

  13. Expert judgement and uncertainty quantification for climate change

    NASA Astrophysics Data System (ADS)

    Oppenheimer, Michael; Little, Christopher M.; Cooke, Roger M.

    2016-05-01

    Expert judgement is an unavoidable element of the process-based numerical models used for climate change projections, and the statistical approaches used to characterize uncertainty across model ensembles. Here, we highlight the need for formalized approaches to unifying numerical modelling with expert judgement in order to facilitate characterization of uncertainty in a reproducible, consistent and transparent fashion. As an example, we use probabilistic inversion, a well-established technique used in many other applications outside of climate change, to fuse two recent analyses of twenty-first century Antarctic ice loss. Probabilistic inversion is but one of many possible approaches to formalizing the role of expert judgement, and the Antarctic ice sheet is only one possible climate-related application. We recommend indicators or signposts that characterize successful science-based uncertainty quantification.

  14. Thermodynamics of adaptive molecular resolution

    NASA Astrophysics Data System (ADS)

    Delgado-Buscalioni, R.

    2016-11-01

    A relatively general thermodynamic formalism for adaptive molecular resolution (AMR) is presented. The description is based on the approximation of local thermodynamic equilibrium and considers the alchemic parameter λ as the conjugate variable of the potential energy difference between the atomistic and coarse-grained model Φ=U(1)-U(0). The thermodynamic formalism recovers the relations obtained from statistical mechanics of H-AdResS (Español et al., J. Chem. Phys. 142, 064115, 2015 (doi:10.1063/1.4907006)) and provides relations between the free energy compensation and thermodynamic potentials. Inspired by this thermodynamic analogy, several generalizations of AMR are proposed, such as the exploration of new Maxwell relations and how to treat λ and Φ as `real' thermodynamic variables. This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'.

  15. Mortality in the British rubber industry 1946-85.

    PubMed Central

    Sorahan, T; Parkes, H G; Veys, C A; Waterhouse, J A; Straughan, J K; Nutt, A

    1989-01-01

    The mortality experienced by a cohort of 36,691 rubber workers during 1946-85 has been investigated. These workers were all male operatives first employed in any one of the 13 participating factories in 1946-60; all had worked continuously in the industry for a minimum period of one year. Compared with the general population, statistically significant excesses relating to cancer mortality were found for cancer of the pharynx (E = 20.2, O = 30, SMR = 149), oesophagus (E = 87.6, O = 107, SMR = 122), stomach (E = 316.5, O = 359, SMR = 113), lung (E = 1219.2, O = 1592, SMR = 131), and all neoplasms (E = 2965.6, O = 3344, SMR = 113). Statistically significant deficits were found for cancer of the prostate (E = 128.2, O = 91, SMR = 71), testis (E = 11.0, O = 4, SMR = 36), and Hodgkin's disease (E = 26.9, O = 16, SMR = 59). Involvement of occupational exposures was assessed by the method of regression models and life tables (RMLT). This method was used to compare the duration of employment in the industry, the duration in "dust exposed" jobs, and the duration in "fume and/or solvent exposed" jobs of those dying from causes of interest with those of all matching survivors. Positive associations (approaching formal levels of statistical significance) were found only for cancers of the stomach and the lung. The results of the RMLT analysis are independent of those from the SMR analysis, and the study continues to provide limited evidence of a causal association between the risks of stomach cancer and dust exposures, and the risks of lung cancer and fume or solvent exposures in the rubber industry during the period under study. PMID:2920137
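
    As an illustration of the SMR arithmetic quoted above, a short sketch that recomputes the lung-cancer SMR with an exact Poisson 95% confidence interval (the interval itself is not reported in the record and is shown only as a worked example):

      from scipy.stats import chi2

      O, E = 1592, 1219.2                                   # observed and expected lung-cancer deaths
      smr = 100 * O / E
      lower = 100 * chi2.ppf(0.025, 2 * O) / (2 * E)        # exact Poisson limits
      upper = 100 * chi2.ppf(0.975, 2 * (O + 1)) / (2 * E)
      print(f"SMR = {smr:.0f} (95% CI {lower:.0f} to {upper:.0f})")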

  16. Prevalence of peritonitis and mortality in patients treated with continuous ambulatory peritoneal dialysis (CAPD) in Africa: a protocol for a systematic review and meta-analysis.

    PubMed

    Moloi, Mothusi Walter; Kajawo, Shepherd; Noubiap, Jean Jacques; Mbah, Ikechukwu O; Ekrikpo, Udeme; Kengne, Andre Pascal; Bello, Aminu K; Okpechi, Ikechi G

    2018-05-24

    Continuous ambulatory peritoneal dialysis (CAPD) is the ideal modality for renal replacement therapy in most African settings given that it is relatively cheaper than haemodialysis (HD) and does not require in-centre care. CAPD is, however, not readily utilised as it is often complicated by peritonitis leading to high rates of technique failure. The objective of this study is to assess the prevalence of CAPD-related peritonitis and all-cause mortality in patients treated with CAPD in Africa. We will search PubMed, EMBASE, SCOPUS, Africa Journal Online and Google Scholar for studies conducted in Africa from 1 January 1980 to 30 June 2017 with no language restrictions. Eligible studies will include cross-sectional, prospective observational and cohort studies of patients treated with CAPD. Two authors will independently screen, select studies, extract data and conduct risk of bias assessment. Data consistently reported across studies will be pooled using random-effects meta-analysis. Heterogeneity will be evaluated using Cochran's Q statistic and quantified using the I² statistic. Graphical and formal statistical tests will be used to assess for publication bias. Ethical approval will not be needed for this study as data used will be extracted from already published studies. Results of this review will be published in a peer-reviewed journal and presented at conferences. The Preferred Reporting Items for Systematic reviews and Meta-Analyses for Protocols 2015 (PRISMA-P 2015) framework guided the development of this protocol. CRD42017072966. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
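
    A hedged sketch of the planned random-effects pooling with Cochran's Q and I², using the DerSimonian-Laird estimator; the logit-prevalence inputs below are invented and do not come from the protocol:

      import numpy as np

      def random_effects_pool(estimates, variances):
          w = 1.0 / variances
          fixed = np.sum(w * estimates) / np.sum(w)
          Q = np.sum(w * (estimates - fixed) ** 2)          # Cochran's Q
          dfree = len(estimates) - 1
          tau2 = max(0.0, (Q - dfree) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
          w_star = 1.0 / (variances + tau2)                 # weights including between-study variance
          pooled = np.sum(w_star * estimates) / np.sum(w_star)
          se = np.sqrt(1.0 / np.sum(w_star))
          I2 = max(0.0, (Q - dfree) / Q) * 100 if Q > 0 else 0.0
          return pooled, se, Q, I2

      est = np.array([-1.2, -0.8, -1.5, -1.0, -0.6])        # toy logit-transformed prevalences
      var = np.array([0.04, 0.06, 0.05, 0.03, 0.08])
      pooled, se, Q, I2 = random_effects_pool(est, var)
      print(f"pooled logit = {pooled:.2f} +/- {1.96 * se:.2f}, Q = {Q:.1f}, I2 = {I2:.0f}%")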

  17. Neuromorphic log-domain silicon synapse circuits obey Bernoulli dynamics: a unifying tutorial analysis

    PubMed Central

    Papadimitriou, Konstantinos I.; Liu, Shih-Chii; Indiveri, Giacomo; Drakakis, Emmanuel M.

    2014-01-01

    The field of neuromorphic silicon synapse circuits is revisited and a parsimonious mathematical framework able to describe the dynamics of this class of log-domain circuits in the aggregate and in a systematic manner is proposed. Starting from the Bernoulli Cell Formalism (BCF), originally formulated for the modular synthesis and analysis of externally linear, time-invariant logarithmic filters, and by means of the identification of new types of Bernoulli Cell (BC) operators presented here, a generalized formalism (GBCF) is established. The expanded formalism covers two new possible and practical combinations of a MOS transistor (MOST) and a linear capacitor. The corresponding mathematical relations codifying each case are presented and discussed through the tutorial treatment of three well-known transistor-level examples of log-domain neuromorphic silicon synapses. The proposed mathematical tool unifies past analysis approaches of the same circuits under a common theoretical framework. The speed advantage of the proposed mathematical framework as an analysis tool is also demonstrated by a compelling comparative circuit analysis example of high order, where the GBCF and another well-known log-domain circuit analysis method are used for the determination of the input-output transfer function of the high (4th) order topology. PMID:25653579

  18. Neuromorphic log-domain silicon synapse circuits obey Bernoulli dynamics: a unifying tutorial analysis.

    PubMed

    Papadimitriou, Konstantinos I; Liu, Shih-Chii; Indiveri, Giacomo; Drakakis, Emmanuel M

    2014-01-01

    The field of neuromorphic silicon synapse circuits is revisited and a parsimonious mathematical framework able to describe the dynamics of this class of log-domain circuits in the aggregate and in a systematic manner is proposed. Starting from the Bernoulli Cell Formalism (BCF), originally formulated for the modular synthesis and analysis of externally linear, time-invariant logarithmic filters, and by means of the identification of new types of Bernoulli Cell (BC) operators presented here, a generalized formalism (GBCF) is established. The expanded formalism covers two new possible and practical combinations of a MOS transistor (MOST) and a linear capacitor. The corresponding mathematical relations codifying each case are presented and discussed through the tutorial treatment of three well-known transistor-level examples of log-domain neuromorphic silicon synapses. The proposed mathematical tool unifies past analysis approaches of the same circuits under a common theoretical framework. The speed advantage of the proposed mathematical framework as an analysis tool is also demonstrated by a compelling comparative circuit analysis example of high order, where the GBCF and another well-known log-domain circuit analysis method are used for the determination of the input-output transfer function of the high (4th) order topology.

  19. Identification and characterization of earthquake clusters: a comparative analysis for selected sequences in Italy

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Gentili, Stefania

    2017-04-01

    Identification and statistical characterization of seismic clusters may provide useful insights about the features of seismic energy release and their relation to physical properties of the crust within a given region. Moreover, a number of studies based on spatio-temporal analysis of main-shocks occurrence require preliminary declustering of the earthquake catalogs. Since various methods, relying on different physical/statistical assumptions, may lead to diverse classifications of earthquakes into main events and related events, we aim to investigate the classification differences among different declustering techniques. Accordingly, a formal selection and comparative analysis of earthquake clusters is carried out for the most relevant earthquakes in North-Eastern Italy, as reported in the local OGS-CRS bulletins, compiled at the National Institute of Oceanography and Experimental Geophysics since 1977. The comparison is then extended to selected earthquake sequences associated with a different seismotectonic setting, namely to events that occurred in the region struck by the recent Central Italy destructive earthquakes, making use of INGV data. Various techniques, ranging from classical space-time windows methods to ad hoc manual identification of aftershocks, are applied for detection of earthquake clusters. In particular, a statistical method based on nearest-neighbor distances of events in space-time-energy domain, is considered. Results from clusters identification by the nearest-neighbor method turn out quite robust with respect to the time span of the input catalogue, as well as to minimum magnitude cutoff. The identified clusters for the largest events reported in North-Eastern Italy since 1977 are well consistent with those reported in earlier studies, which were aimed at detailed manual aftershocks identification. The study shows that the data-driven approach, based on the nearest-neighbor distances, can be satisfactorily applied to decompose the seismic catalog into background seismicity and individual sequences of earthquake clusters, also in areas characterized by moderate seismic activity, where the standard declustering techniques may turn out rather gross approximations. With these results acquired, the main statistical features of seismic clusters are explored, including complex interdependence of related events, with the aim to characterize the space-time patterns of earthquakes occurrence in North-Eastern Italy and capture their basic differences with Central Italy sequences.
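
    A hedged sketch of the nearest-neighbour proximity used in such analyses, eta = t * r^df * 10^(-b*m), computed over all earlier events and minimised to pick each event's parent; the synthetic catalogue, b-value and fractal dimension below are placeholders, not the OGS-CRS or INGV data:

      import numpy as np

      b, df = 1.0, 1.6                    # assumed b-value and fractal dimension of epicentres

      def nearest_neighbour_parents(times, x, y, mags):
          """For each event j > 0, return the parent i < j minimising eta_ij and the minimal eta."""
          parents, etas = [], []
          for j in range(1, len(times)):
              dt = times[j] - times[:j]                     # interevent times (days)
              r = np.hypot(x[j] - x[:j], y[j] - y[:j])      # epicentral distances (km)
              eta = dt * np.maximum(r, 0.1) ** df * 10.0 ** (-b * mags[:j])
              parents.append(int(np.argmin(eta)))
              etas.append(float(eta.min()))
          return parents, etas

      rng = np.random.default_rng(0)
      t = np.sort(rng.uniform(0, 365, 50))
      xs, ys = rng.uniform(0, 100, (2, 50))
      m = rng.uniform(2.0, 4.5, 50)
      parents, etas = nearest_neighbour_parents(t, xs, ys, m)
      # a threshold on log10(eta), e.g. from a bimodal histogram, separates clustered from background events
      print(np.round(np.log10(etas), 2))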

  20. The relationship between organizational leadership for safety and learning from patient safety events.

    PubMed

    Ginsburg, Liane R; Chuang, You-Ta; Berta, Whitney Blair; Norton, Peter G; Ng, Peggy; Tregunno, Deborah; Richardson, Julia

    2010-06-01

    To examine the relationship between organizational leadership for patient safety and five types of learning from patient safety events (PSEs). Forty-nine general acute care hospitals in Ontario, Canada. A nonexperimental design using cross-sectional surveys of hospital patient safety officers (PSOs) and patient care managers (PCMs). PSOs provided data on organization-level learning from (a) minor events, (b) moderate events, (c) major near misses, (d) major event analysis, and (e) major event dissemination/communication. PCMs provided data on organizational leadership (formal and informal) for patient safety. Hospitals were the unit of analysis. Seemingly unrelated regression was used to examine the influence of formal and informal leadership for safety on the five types of learning from PSEs. The interaction between leadership and hospital size was also examined. Formal organizational leadership for patient safety is an important predictor of learning from minor, moderate, and major near-miss events, and major event dissemination. This relationship is significantly stronger for small hospitals (<100 beds). We find support for the relationship between patient safety leadership and patient safety behaviors such as learning from safety events. Formal leadership support for safety is of particular importance in small organizations where the economic burden of safety programs is disproportionately large and formal leadership is closer to the front lines.

  1. Linking communities to formal health care providers through village health teams in rural Uganda: lessons from linking social capital.

    PubMed

    Musinguzi, Laban Kashaija; Turinawe, Emmanueil Benon; Rwemisisi, Jude T; de Vries, Daniel H; Mafigiri, David K; Muhangi, Denis; de Groot, Marije; Katamba, Achilles; Pool, Robert

    2017-01-11

    Community-based programmes, particularly community health workers (CHWs), have been portrayed as a cost-effective alternative to the shortage of health workers in low-income countries. Usually, literature emphasises how easily CHWs link and connect communities to formal health care services. There is little evidence in Uganda to support or dispute such claims. Drawing from linking social capital framework, this paper examines the claim that village health teams (VHTs), as an example of CHWs, link and connect communities with formal health care services. Data were collected through ethnographic fieldwork undertaken as part of a larger research program in Luwero District, Uganda, between 2012 and 2014. The main methods of data collection were participant observation in events organised by VHTs. In addition, a total of 91 in-depth interviews and 42 focus group discussions (FGD) were conducted with adult community members as part of the larger project. After preliminary analysis of the data, we conducted an additional six in-depth interviews and three FGD with VHTs and four FGD with community members on the role of VHTs. Key informant interviews were conducted with local government staff, health workers, local leaders, and NGO staff with health programs in Luwero. Thematic analysis was used during data analysis. The ability of VHTs to link communities with formal health care was affected by the stakeholders' perception of their roles. Community members perceive VHTs as working for and under instructions of "others", which makes them powerless in the formal health care system. One of the challenges associated with VHTs' linking roles is support from the government and formal health care providers. Formal health care providers perceived VHTs as interested in special recognition for their services yet they are not "experts". For some health workers, the introduction of VHTs is seen as a ploy by the government to control people and hide its inability to provide health services. Having received training and initial support from an NGO, VHTs suffered transition failure from NGO to the formal public health care structure. As a result, VHTs are entangled in power relations that affect their role of linking community members with formal health care services. We also found that factors such as lack of money for treatment, poor transport networks, the attitudes of health workers and the existence of multiple health care systems, all factors that hinder access to formal health care, cannot be addressed by the VHTs. As linking social capital framework shows, for VHTs to effectively act as links between the community and formal health care and harness the resources that exist in institutions beyond the community, it is important to take into account the power relationships embedded in vertical relationships and forge a partnership between public health providers and the communities they serve. This will ensure strengthened partnerships and the improved capacity of local people to leverage resources embedded in vertical power networks.

  2. Study of the statistical physics bases on superstatistics from the β-fluctuated to the T-fluctuated form

    NASA Astrophysics Data System (ADS)

    Sargolzaeipor, S.; Hassanabadi, H.; Chung, W. S.

    2018-04-01

    In this paper, we study the T-fluctuated form of superstatistics. In this form, thermodynamic quantities such as the Helmholtz energy, the entropy and the internal energy are expressed in terms of the T-fluctuated form for a canonical ensemble. In addition, the partition functions in this formalism are derived for 2-level and 3-level distributions. We then apply T-fluctuated superstatistics to the quantum harmonic oscillator problem and calculate the thermal properties of the system for Bose-Einstein, Maxwell-Boltzmann and Fermi-Dirac statistics. The effect of the deformation parameter on these properties is examined. All the results recover the well-known results when the deformation parameter is removed.
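
    For orientation, the sketch below computes the standard (undeformed) single-mode occupation numbers for the three statistics, i.e. the textbook limit that the T-fluctuated results are stated to recover when the deformation parameter is removed; it is not the paper's deformed formalism.

```python
# Undeformed single-mode occupation numbers for the three statistics:
# n = 1 / (exp(x) - a), with x = hbar*omega / (k_B*T) and
# a = +1 (Bose-Einstein), 0 (Maxwell-Boltzmann), -1 (Fermi-Dirac).
import numpy as np

def mean_occupation(x, a):
    """Mean occupation 1 / (exp(x) - a); the parameter a selects the statistics."""
    return 1.0 / (np.exp(x) - a)

x = np.linspace(0.1, 5.0, 5)   # dimensionless hbar*omega/(k_B*T)
for name, a in [("Bose-Einstein", +1), ("Maxwell-Boltzmann", 0), ("Fermi-Dirac", -1)]:
    print(name, np.round(mean_occupation(x, a), 4))
```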

  3. Bureau of the Census Center for International Research

    NASA Technical Reports Server (NTRS)

    Pinto, Nina Pane

    1994-01-01

    This paper describes the organization and activities of the Center for International Research at the Bureau of the Census. There is a formal publication exchange program with other governments' statistical programs. This has resulted in the Center's collection being one of the world's largest in the area of international census and demographic information. Foreign statistical publications are held in three libraries, one dedicated to the former Soviet Union and one to the People's Republic of China. In addition to the libraries there are two computerized data bases. The International Data Base is a source of demographic and socio-economic statistics for all countries of the world. The second data base is the HIV/AIDS Surveillance Data Base, which contains information related to the publication and dissemination of the results of seroprevalence surveys.

  4. From Informal Safety-Critical Requirements to Property-Driven Formal Validation

    NASA Technical Reports Server (NTRS)

    Cimatti, Alessandro; Roveri, Marco; Susi, Angelo; Tonetta, Stefano

    2008-01-01

    Most of the efforts in formal methods have historically been devoted to comparing a design against a set of requirements. The validation of the requirements themselves, however, has often been disregarded, and it can be considered a largely open problem, which poses several challenges. The first challenge is that requirements are often written in natural language, and may thus contain a high degree of ambiguity. Despite the progress in Natural Language Processing techniques, the task of understanding a set of requirements cannot be automated, and must be carried out by domain experts, who are typically not familiar with formal languages. Furthermore, in order to retain a direct connection with the informal requirements, the formalization cannot follow standard model-based approaches. The second challenge lies in the formal validation of requirements. On one hand, it is not even clear what correctness criteria or high-level properties the requirements must fulfill. On the other hand, the expressivity of the language used in the formalization may go beyond the theoretical and/or practical capacity of state-of-the-art formal verification. In order to address these issues, we propose a new methodology that comprises a chain of steps, each supported by a specific tool. The main steps are the following. First, the informal requirements are split into basic fragments, which are classified into categories, and dependency and generalization relationships among them are identified. Second, the fragments are modeled using a visual language such as UML. The UML diagrams are both syntactically restricted (in order to guarantee a formal semantics) and enriched with a highly controlled natural language (to allow for modeling static and temporal constraints). Third, an automatic formal analysis phase iterates over the modeled requirements, combining several complementary techniques: checking consistency; verifying whether the requirements entail some desirable properties; verifying whether the requirements are consistent with selected scenarios; diagnosing inconsistencies by identifying inconsistent cores; identifying vacuous requirements; constructing multiple explanations by enabling fault-tree analysis for particular fault models; and verifying whether the specification is realizable.
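
    As a toy illustration of two of the formal-analysis checks (consistency and property entailment), the sketch below uses the z3-solver Python bindings on invented boolean requirement fragments; real requirements in this methodology would involve temporal constraints and richer logics.

```python
# Toy illustration of two of the formal-analysis checks on invented boolean
# requirement fragments (real requirements would use temporal logic):
#   consistency : the conjunction of requirements is satisfiable;
#   entailment  : requirements AND NOT(property) is unsatisfiable.
from z3 import Bool, Solver, Implies, Not, And, sat, unsat

brake_cmd = Bool("brake_cmd")
brake_applied = Bool("brake_applied")
door_open = Bool("door_open")
moving = Bool("moving")

requirements = [
    Implies(brake_cmd, brake_applied),   # R1: a brake command is always honoured
    Implies(door_open, Not(moving)),     # R2: doors may be open only when stopped
]

# Consistency: can all requirement fragments hold simultaneously?
s = Solver()
s.add(And(requirements))
print("consistent:", s.check() == sat)

# Entailment: the requirements entail the property iff adding its negation is unsat.
prop = Implies(And(brake_cmd, door_open), And(brake_applied, Not(moving)))
s2 = Solver()
s2.add(And(requirements), Not(prop))
print("entailed:", s2.check() == unsat)
```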

  5. Steroid-antivirals treatment versus steroids alone for the treatment of Bell’s palsy: a meta-analysis

    PubMed Central

    Dong, Yabing; Zhu, Yong; Ma, Chuan; Zhao, Huaqiang

    2015-01-01

    Background: To determine whether steroid-antiviral treatment achieves better recovery in patients with Bell’s palsy than treatment with steroids alone. Materials and methods: We conducted an exhaustive search of the PubMed/Medline, Ovid and Elsevier search engines and the Cochrane Library to collect randomized controlled trials of steroid-antiviral versus steroid treatment of patients with Bell’s palsy. The quality of the relevant articles was assessed with GRADE, which was used to present the overall quality of evidence as recommended by the Cochrane Handbook for Systematic Reviews of Interventions. Results: Two investigators evaluated the papers independently and resolved disagreements by discussion. In the end, 8 eligible papers (1816 patients: 896 treated with steroid-antivirals and 920 treated with steroids alone) matched the criteria. Based on the formal test for heterogeneity (chi2 = 12.57, P = 0.08, I2 = 44%), the fixed-effect meta-analysis model was chosen. Facial muscle recovery differed significantly between the steroid-antivirals group and the steroids-alone group (OR = 1.52, 95% CI: 1.20-1.94), while the difference in adverse effects was not statistically significant (OR = 1.28, 95% CI: 0.71-2.31). Conclusions: The present meta-analysis indicates that steroid-antiviral treatment improves the recovery rate in patients with Bell’s palsy compared with steroids alone. Clinical significance: This meta-analysis showed that steroid-antiviral treatment achieved better outcomes in patients with Bell’s palsy. Clinicians should consider steroid-antiviral therapy as an alternative choice for patients with Bell’s palsy. PMID:25785012
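
    For readers unfamiliar with the pooling step, the sketch below shows fixed-effect (inverse-variance) pooling of log odds ratios together with Cochran's Q and I2; the 2x2 counts are invented placeholders, not the trials from this review.

```python
# Fixed-effect (inverse-variance) pooling of log odds ratios.
# The 2x2 counts below are invented placeholders, not the trials in the review.
import numpy as np
from scipy import stats

# rows: (events_trt, n_trt, events_ctl, n_ctl) per hypothetical trial
trials = np.array([
    [80, 100, 70, 100],
    [150, 200, 140, 200],
    [60, 90, 55, 90],
])

e1, n1, e0, n0 = trials.T
log_or = np.log((e1 * (n0 - e0)) / (e0 * (n1 - e1)))   # per-trial log OR
var = 1/e1 + 1/(n1 - e1) + 1/e0 + 1/(n0 - e0)           # Woolf variance
w = 1 / var                                              # inverse-variance weights

pooled = np.sum(w * log_or) / np.sum(w)
se = np.sqrt(1 / np.sum(w))
ci = np.exp(pooled + np.array([-1.96, 1.96]) * se)
q = np.sum(w * (log_or - pooled) ** 2)                   # Cochran's Q
i2 = max(0.0, (q - (len(trials) - 1)) / q) * 100         # I^2 heterogeneity
p_het = 1 - stats.chi2.cdf(q, df=len(trials) - 1)

print(f"pooled OR {np.exp(pooled):.2f}  95% CI [{ci[0]:.2f}, {ci[1]:.2f}]")
print(f"Q {q:.2f}  p {p_het:.2f}  I2 {i2:.0f}%")
```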

  6. Use of acetaminophen and risk of endometrial cancer: evidence from observational studies.

    PubMed

    Ding, Yuan-Yuan; Yao, Peng; Verma, Surya; Han, Zhen-Kai; Hong, Tao; Zhu, Yong-Qiang; Li, Hong-Xi

    2017-05-23

    Previous meta-analyses suggested that aspirin was associated with a reduced risk of endometrial cancer. However, no study has comprehensively summarized the evidence on acetaminophen use and risk of endometrial cancer from observational studies. We systematically searched electronic databases (PubMed, EMBASE, Web of Science, and Cochrane Library) for relevant cohort or case-control studies up to February 28, 2017. Two independent authors performed the eligibility evaluation and data extraction. All differences were resolved by discussion. A random-effects model was applied to estimate summary relative risks (RRs) with 95% CIs. All statistical tests were two-sided. Seven observational studies, including four prospective cohort studies and three case-control studies with 3874 endometrial cancer cases, were included in the final analysis. Compared with never using acetaminophen, ever using the drug was not associated with risk of endometrial cancer (summarized RR = 1.02; 95% CI: 0.93-1.13, I2 = 0%). A similar null association was observed when comparing the highest category of frequency/duration with never use (summarized RR = 0.88; 95% CI: 0.70-1.11, I2 = 15.2%). Additionally, the finding was robust in subgroup analyses stratified by study characteristics and by adjustment for potential confounders and risk factors. There was no evidence of publication bias by visual inspection of a funnel plot or by formal statistical tests. In summary, the present meta-analysis reveals no association between acetaminophen use and risk of endometrial cancer. More large-scale prospective cohort studies are warranted to confirm our findings and to carry out a dose-response analysis of this association.
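
    The random-effects pooling named above is commonly done with the DerSimonian-Laird estimator; the sketch below shows that calculation on invented effect sizes, not the studies from this meta-analysis.

```python
# DerSimonian-Laird random-effects pooling of log relative risks.
# The effect sizes and standard errors below are invented placeholders.
import numpy as np

log_rr = np.array([0.05, -0.10, 0.12, 0.02, -0.04, 0.08, 0.00])
se = np.array([0.10, 0.15, 0.20, 0.12, 0.18, 0.25, 0.14])

w = 1 / se**2
fixed = np.sum(w * log_rr) / np.sum(w)
q = np.sum(w * (log_rr - fixed) ** 2)                 # Cochran's Q
k = len(log_rr)
tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

w_star = 1 / (se**2 + tau2)                           # random-effects weights
pooled = np.sum(w_star * log_rr) / np.sum(w_star)
se_pooled = np.sqrt(1 / np.sum(w_star))
ci = np.exp(pooled + np.array([-1.96, 1.96]) * se_pooled)
print(f"summary RR {np.exp(pooled):.2f}  95% CI [{ci[0]:.2f}, {ci[1]:.2f}]  tau^2 {tau2:.3f}")
```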

  7. Formalizing the Austrian Procedure Catalogue: A 4-step methodological analysis approach.

    PubMed

    Neururer, Sabrina Barbara; Lasierra, Nelia; Peiffer, Karl Peter; Fensel, Dieter

    2016-04-01

    Due to the lack of an internationally accepted and adopted standard for coding health interventions, Austria has established its own country-specific procedure classification system - the Austrian Procedure Catalogue (APC). Even though the APC is an elaborate coding standard for medical procedures, it has shortcomings that limit its usability. In order to enhance its usability and usefulness, especially for research purposes and e-health applications, we developed an ontologized version of the APC. In this paper we present a novel four-step approach for the ontology engineering process, which enables accurate extraction of relevant concepts for medical ontologies from written text. The proposed approach for formalizing the APC consists of the following four steps: (1) comparative pre-analysis, (2) definition analysis, (3) typological analysis, and (4) ontology implementation. The first step compared the APC to other well-established or elaborate health intervention coding systems in order to identify strengths and weaknesses of the APC. In the second step, a list of definitions of the medical terminology used in the APC was obtained. This list of definitions was used as input for Step 3, in which we identified the most important concepts for describing medical procedures using the qualitative typological analysis approach. The definition analysis as well as the typological analysis are well-known and effective methods used in the social sciences, but not commonly employed in the computer science or ontology engineering domain. Finally, this list of concepts was used in Step 4 to formalize the APC. The pre-analysis highlighted the major shortcomings of the APC, such as the lack of formal definitions, which leaves information implicitly available but not directly accessible (hidden data), and the poor classification of procedural types. After performing the definition and subsequent typological analyses, we were able to identify the following main characteristics of health interventions: (1) Procedural type, (2) Anatomical site, (3) Medical device, (4) Pathology, (5) Access, (6) Body system, (7) Population, (8) Aim, (9) Discipline, (10) Technique, and (11) Body Function. These main characteristics served as the classes for the formalization of the APC. We were also able to identify relevant relations between classes. The proposed four-step approach for formalizing the APC provides a novel, systematically developed, strong framework to semantically enrich procedure classifications. Although this methodology was designed to address the particularities of the APC, the included methods are based on generic analysis tasks, and therefore can be re-used to provide a systematic representation of other procedure catalogues or classification systems and hence contribute towards a universal alignment of such representations, if desired. Copyright © 2015 Elsevier Inc. All rights reserved.
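
    As a rough illustration of Step 4, the sketch below uses rdflib to declare a few of the identified characteristics as OWL classes and link them to a health-intervention class; the namespace, class and property names are hypothetical and not the authors' actual ontology.

```python
# Minimal sketch of turning a few of the identified characteristics into OWL
# classes with rdflib. The namespace, class and property names are hypothetical.
from rdflib import Graph, Namespace
from rdflib.namespace import OWL, RDF, RDFS

APC = Namespace("http://example.org/apc#")
g = Graph()
g.bind("apc", APC)

# Core class plus a few of the eleven characteristics as classes.
for cls in ["HealthIntervention", "ProceduralType", "AnatomicalSite",
            "MedicalDevice", "Technique"]:
    g.add((APC[cls], RDF.type, OWL.Class))

# Object properties linking an intervention to its characteristics.
for prop, rng in [("hasProceduralType", "ProceduralType"),
                  ("hasAnatomicalSite", "AnatomicalSite"),
                  ("usesDevice", "MedicalDevice"),
                  ("appliesTechnique", "Technique")]:
    g.add((APC[prop], RDF.type, OWL.ObjectProperty))
    g.add((APC[prop], RDFS.domain, APC["HealthIntervention"]))
    g.add((APC[prop], RDFS.range, APC[rng]))

print(g.serialize(format="turtle"))
```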

  8. Performance Measurement and Analysis of Certain Search Algorithms

    DTIC Science & Technology

    1979-05-01

    methodology that combines experiment and analysis in complementary and highly specialized and formalized roles, and that the richness of the domains makes it ... it is difficult to determine what fraction of the observed differences between the two sets is due to bias in sample set 1, and what fraction simply... given by its characteristic KMIN and KMAX functions. We posit a formal model of "knowledge" itself in which there are at least as many distinct "states

  9. Software Formal Inspections Standard

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This Software Formal Inspections Standard (hereinafter referred to as Standard) is applicable to NASA software. This Standard defines the requirements that shall be fulfilled by the software formal inspections process whenever this process is specified for NASA software. The objective of this Standard is to define the requirements for a process that inspects software products to detect and eliminate defects as early as possible in the software life cycle. The process also provides for the collection and analysis of inspection data to improve the inspection process as well as the quality of the software.

  10. Photon wave function formalism for analysis of Mach–Zehnder interferometer and sum-frequency generation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ritboon, Atirach, E-mail: atirach.3.14@gmail.com; Department of Physics, Faculty of Science, Prince of Songkla University, Hat Yai 90112; Daengngam, Chalongrat, E-mail: chalongrat.d@psu.ac.th

    2016-08-15

    Bialynicki-Birula introduced a photon wave function similar to the matter wave function that satisfies the Schrödinger equation. Its second-quantization form can be applied to investigate nonlinear optics at a nearly full quantum level. In this paper, we apply the photon wave function formalism to analyze both linear optical processes in the well-known Mach–Zehnder interferometer and nonlinear optical processes for sum-frequency generation in a dispersive and lossless medium. Results from the photon wave function formalism agree with the well-established Maxwell treatments and existing experimental verifications.
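
    For context, the photon wave function of Bialynicki-Birula is commonly written in terms of the Riemann–Silberstein vector; the sketch below gives that standard form (conventions may differ from the paper's).

```latex
% Common Riemann-Silberstein form of the photon wave function (conventions may
% differ from the paper's): a Schroedinger-type evolution equation equivalent
% to Maxwell's equations in vacuum.
\[
  \mathbf{F}(\mathbf{r},t)=\sqrt{\tfrac{\varepsilon_0}{2}}\,
     \bigl(\mathbf{E}+i c\,\mathbf{B}\bigr),\qquad
  i\hbar\,\partial_t \mathbf{F}
     = c\,(\mathbf{S}\cdot\hat{\mathbf{p}})\,\mathbf{F}
  \;\Longleftrightarrow\;
  i\,\partial_t \mathbf{F} = c\,\nabla\times\mathbf{F},
\]
where $(S_k)_{ij}=-i\,\epsilon_{kij}$ are the spin-1 matrices and
$\hat{\mathbf{p}}=-i\hbar\nabla$; the second form is just Maxwell's equations
in vacuum written as a Schr\"odinger-type evolution equation.
```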

  11. Academic productivity among fellowship associated adult total joint reconstruction surgeons.

    PubMed

    Khan, Adam Z; Kelley, Benjamin V; Patel, Ankur D; McAllister, David R; Leong, Natalie L

    2017-12-01

    The Hirsch index (h-index) is a measure that evaluates both research volume and quality, taking into consideration both the publications and the citations of a single author. No prior work has evaluated the academic productivity and literature contributions of adult total joint replacement surgeons. This study uses the h-index to benchmark academic impact and to identify characteristics associated with the productivity of faculty members at joint replacement fellowships. Adult reconstruction fellowship programs were obtained via the American Association of Hip and Knee Surgeons website. Via the San Francisco Match and program-specific websites, program characteristics (Accreditation Council for Graduate Medical Education approval, academic affiliation, region, number of fellows, fellow research requirement), associated faculty members, and faculty-specific characteristics (gender, academic title, formal fellowship training, years in practice) were obtained. The h-index and total faculty publications served as the primary outcome measures. Multivariable linear regression determined statistical significance. Sixty-six adult total joint reconstruction fellowship programs were identified: 30% were Accreditation Council for Graduate Medical Education approved and 73% had an academic affiliation. At these institutions, 375 adult reconstruction surgeons were identified; 98.1% were men and 85.3% had formal arthroplasty fellowship training. The average number of publications per faculty member was 50.1 (standard deviation 76.8; range 0-588); the mean h-index was 12.8 (standard deviation 13.8; range 0-67). Number of fellows, faculty academic title, years in practice, and formal fellowship training had a significant (P < .05) positive correlation with both h-index and total publications. The statistical overview presented in this work can help total joint surgeons quantitatively benchmark their academic performance against that of their peers.
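
    For reference, the h-index itself is simple to compute from an author's citation counts; a minimal sketch on invented data:

```python
# h-index: the largest h such that the author has h papers with >= h citations each.
def h_index(citations):
    cited = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cited, start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

# Invented example: six papers with these citation counts give an h-index of 3.
print(h_index([10, 8, 5, 3, 2, 0]))   # -> 3
```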

  12. Relative mass distributions of neutron-rich thermally fissile nuclei within a statistical model

    NASA Astrophysics Data System (ADS)

    Kumar, Bharat; Kannan, M. T. Senthil; Balasubramaniam, M.; Agrawal, B. K.; Patra, S. K.

    2017-09-01

    We study the binary mass distribution for the recently predicted thermally fissile neutron-rich uranium and thorium nuclei using a statistical model. The level density parameters needed for the study are evaluated from the excitation energies of the temperature-dependent relativistic mean field formalism. The excitation energy and the level density parameter for a given temperature are employed in the convolution integral method to obtain the probability of the particular fragmentation. As representative cases, we present the results for the binary yields of 250U and 254Th. The relative yields are presented for three different temperatures: T =1 , 2, and 3 MeV.
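
    Schematically, such statistical-model yield calculations combine a level density with a convolution over the sharing of excitation energy between the two fragments; the sketch below shows the generic form (a Fermi-gas level density), not necessarily the paper's exact expressions.

```latex
% Generic statistical-model ingredients (a sketch, not the paper's exact
% expressions): Fermi-gas level density and convolution over the division of
% the total excitation energy E* between the two fragments.
\[
  \rho_i(E_i) \propto \exp\!\bigl(2\sqrt{a_i E_i}\bigr), \qquad
  P(A_1,A_2) \;\propto\; \int_0^{E^*} \rho_1(E_1)\,\rho_2(E^*-E_1)\,dE_1 ,
\]
where $a_i$ is the level density parameter of fragment $i$ and the relative
yield of a given binary split is $P$ normalised over all fragmentations.
```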

  13. [Is there life beyond SPSS? Discover R].

    PubMed

    Elosua Oliden, Paula

    2009-11-01

    R is a GNU statistical and programming environment with very high graphical capabilities. It is very powerful for research purposes, but it is also an exceptional tool for teaching. R is composed of more than 1400 packages that allow using it for simple statistics and applying the most complex and most recent formal models. Using graphical interfaces like the Rcommander package, permits working in user-friendly environments which are similar to the graphical environment used by SPSS. This last characteristic allows non-statisticians to overcome the obstacle of accessibility, and it makes R the best tool for teaching. Is there anything better? Open, free, affordable, accessible and always on the cutting edge.

  14. A hierarchical model for probabilistic independent component analysis of multi-subject fMRI studies

    PubMed Central

    Tang, Li

    2014-01-01

    An important goal in fMRI studies is to decompose the observed series of brain images to identify and characterize underlying brain functional networks. Independent component analysis (ICA) has been shown to be a powerful computational tool for this purpose. Classic ICA has been successfully applied to single-subject fMRI data. The extension of ICA to group inferences in neuroimaging studies, however, is challenging due to the unavailability of a pre-specified group design matrix. Existing group ICA methods generally concatenate observed fMRI data across subjects on the temporal domain and then decompose multi-subject data in a similar manner to single-subject ICA. The major limitation of existing methods is that they ignore between-subject variability in spatial distributions of brain functional networks in group ICA. In this paper, we propose a new hierarchical probabilistic group ICA method to formally model subject-specific effects in both temporal and spatial domains when decomposing multi-subject fMRI data. The proposed method provides model-based estimation of brain functional networks at both the population and subject level. An important advantage of the hierarchical model is that it provides a formal statistical framework to investigate similarities and differences in brain functional networks across subjects, e.g., subjects with mental disorders or neurodegenerative diseases such as Parkinson’s as compared to normal subjects. We develop an EM algorithm for model estimation where both the E-step and M-step have explicit forms. We compare the performance of the proposed hierarchical model with that of two popular group ICA methods via simulation studies. We illustrate our method with application to an fMRI study of Zen meditation. PMID:24033125
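
    For contrast with the proposed hierarchical model, the sketch below shows the standard temporal-concatenation group ICA baseline on simulated data, using scikit-learn's FastICA; it is not the authors' method.

```python
# Sketch of standard temporal-concatenation group ICA (the baseline the paper
# improves on), not the proposed hierarchical model. Data are simulated.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n_subjects, n_time, n_voxels, n_components = 4, 120, 500, 3

# Shared spatial maps plus small subject-specific noise.
maps = rng.normal(size=(n_components, n_voxels))
subject_data = []
for _ in range(n_subjects):
    time_courses = rng.normal(size=(n_time, n_components))
    subject_data.append(time_courses @ maps + 0.1 * rng.normal(size=(n_time, n_voxels)))

# Concatenate subjects along the temporal axis, then run a single ICA.
group = np.vstack(subject_data)                    # (n_subjects*n_time, n_voxels)
ica = FastICA(n_components=n_components, random_state=0, max_iter=500)
concatenated_time_courses = ica.fit_transform(group)
group_maps = ica.components_                       # shared spatial maps
print(group_maps.shape)                            # (3, 500)
```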

  15. Learning Kriging by an instructive program.

    NASA Astrophysics Data System (ADS)

    Cuador, José

    2016-04-01

    Problems can be classified into three types: deterministic, approximated, and stochastic. In deterministic problems, the law governing the phenomenon and the data are known in the entire domain and for each instant of time. In approximated problems, the law governing the phenomenon's behavior is unknown, but the data can be known in the entire domain and for each instant of time. In stochastic problems, much of the law and the data are unknown in the domain, so the spatial behavior of the data can only be explained with probabilistic laws. This is the most important reason why students in the geosciences and related fields need to take courses in advanced estimation methods. A good example of this situation is the estimation of grades in ore mineral deposits, for which Geostatistics was formalized by G. Matheron in 1962 [6]. Geostatistics is defined as the application of the theory of Random Functions to the recognition and estimation of natural phenomena [4]. Nowadays, Geostatistics is widely used in several fields of the earth sciences, for example mining, oil exploration, the environment, agriculture, forestry and others [3]. It provides a wide variety of tools for spatial data analysis and allows models subject to degrees of uncertainty to be analysed with the rigor of mathematics and formal statistical analysis [9]. Adequate models for the Kriging interpolator have been developed according to the behavior of the data; however, there are two key steps in applying this interpolator properly: the determination of the semivariogram and the selection of the Kriging neighborhood. The main objective of this paper is to present these two elements using an instructive program.
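
    As an illustration of the first key step, the sketch below computes a classical (Matheron) empirical semivariogram from invented scattered samples; fitting a semivariogram model and choosing the Kriging neighbourhood would follow.

```python
# Classical (Matheron) empirical semivariogram on invented 2-D sample points:
# gamma(h) = 1/(2*N(h)) * sum (z_i - z_j)^2 over pairs whose separation falls
# into the lag bin around h.
import numpy as np

def empirical_semivariogram(coords, values, n_lags=10):
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)          # count each pair once
    dists, sqdiff = d[iu], sq[iu]
    edges = np.linspace(0, dists.max(), n_lags + 1)
    centers, gamma = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (dists >= lo) & (dists < hi)
        if mask.any():
            centers.append(0.5 * (lo + hi))
            gamma.append(0.5 * sqdiff[mask].mean())
    return np.array(centers), np.array(gamma)

rng = np.random.default_rng(1)
coords = rng.uniform(0, 100, size=(200, 2))                       # hypothetical sample locations
values = np.sin(coords[:, 0] / 15) + 0.2 * rng.normal(size=200)   # hypothetical grades
h, g = empirical_semivariogram(coords, values)
print(np.round(h, 1), np.round(g, 3))
```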

  16. Perspectives of policy and political decision makers on access to formal dementia care: expert interviews in eight European countries.

    PubMed

    Broda, Anja; Bieber, Anja; Meyer, Gabriele; Hopper, Louise; Joyce, Rachael; Irving, Kate; Zanetti, Orazio; Portolani, Elisa; Kerpershoek, Liselot; Verhey, Frans; Vugt, Marjolein de; Wolfs, Claire; Eriksen, Siren; Røsvik, Janne; Marques, Maria J; Gonçalves-Pereira, Manuel; Sjölund, Britt-Marie; Woods, Bob; Jelley, Hannah; Orrell, Martin; Stephan, Astrid

    2017-08-03

    As part of the ActifCare (ACcess to Timely Formal Care) project, we conducted expert interviews in eight European countries with policy and political decision makers, or representatives of relevant institutions, to determine their perspectives on access to formal care for people with dementia and their carers. Each ActifCare country (Germany, Ireland, Italy, The Netherlands, Norway, Portugal, Sweden, United Kingdom) conducted semi-structured interviews with 4-7 experts (total N = 38). The interview guide addressed the topics "Complexity and Continuity of Care", "Formal Services", and "Public Awareness". Country-specific analysis of interview transcripts used an inductive qualitative content analysis. Cross-national synthesis focused on similarities in themes across the ActifCare countries. The analysis revealed ten common themes and two additional sub-themes across countries. Among others, the experts highlighted the need for a coordinating role and the necessity of information to address issues of complexity and continuity of care, demanded person-centred, tailored, and multidisciplinary formal services, and referred to education, mass media and campaigns as means to raise public awareness. Policy and political decision makers appear well acquainted with current discussions among both researchers and practitioners of possible approaches to improve access to dementia care. Experts described pragmatic, realistic strategies to influence dementia care. Suggested innovations concerned how to achieve improved dementia care, rather than transforming the nature of the services provided. Knowledge gained in these expert interviews may be useful to national decision makers when they consider reshaping the organisation of dementia care, and may thus help to develop best-practice strategies and recommendations.

  17. The Role of Discrete Global Grid Systems in the Global Statistical Geospatial Framework

    NASA Astrophysics Data System (ADS)

    Purss, M. B. J.; Peterson, P.; Minchin, S. A.; Bermudez, L. E.

    2016-12-01

    The United Nations Committee of Experts on Global Geospatial Information Management (UN-GGIM) has proposed the development of a Global Statistical Geospatial Framework (GSGF) as a mechanism for the establishment of common analytical systems that enable the integration of statistical and geospatial information. Conventional coordinate reference systems address the globe with a continuous field of points suitable for repeatable navigation and analytical geometry. While this continuous field is represented on a computer in a digitized and discrete fashion by tuples of fixed-precision floating point values, it is a non-trivial exercise to relate point observations spatially referenced in this way to areal coverages on the surface of the Earth. The GSGF states the need to move to gridded data delivery and the importance of using common geographies and geocoding. The challenges associated with meeting these goals are not new, and there has been a significant effort within the geospatial community over many years to develop nested gridding standards to tackle these issues. These efforts have recently culminated in the development of a Discrete Global Grid Systems (DGGS) standard, developed under the auspices of the Open Geospatial Consortium (OGC). DGGS provide a fixed, areal-based geospatial reference frame for the persistent location of measured Earth observations, feature interpretations, and modelled predictions. DGGS address the entire planet by partitioning it into a discrete hierarchical tessellation of progressively finer resolution cells, which are referenced by a unique index that facilitates rapid computation, query and analysis. The geometry and location of the cell is the principal aspect of a DGGS. Data integration, decomposition, and aggregation are optimised in the DGGS hierarchical structure and can be exploited for efficient multi-source data processing, storage, discovery, transmission, visualization, computation, analysis, and modelling. During the 6th Session of the UN-GGIM in August 2016, the role of DGGS in the context of the GSGF was formally acknowledged. This paper highlights the synergies and role of DGGS in the Global Statistical Geospatial Framework and shows examples of the use of DGGS to combine geospatial statistics with traditional geoscientific data.
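
    To make the idea of a hierarchical cell index concrete, the toy sketch below indexes latitude/longitude points by recursive quartering; unlike a true OGC DGGS it is not equal-area, but it shows how aggregation to coarser cells reduces to index truncation.

```python
# Toy illustration of the core DGGS idea: a hierarchical cell index obtained by
# recursively quartering a lat/lon extent. Real DGGS use equal-area cells on
# the sphere; this simple quadtree does not, and is only meant to show how a
# hierarchical index supports aggregation by prefix truncation.
def cell_index(lat, lon, resolution):
    lat_lo, lat_hi, lon_lo, lon_hi = -90.0, 90.0, -180.0, 180.0
    digits = []
    for _ in range(resolution):
        lat_mid, lon_mid = (lat_lo + lat_hi) / 2, (lon_lo + lon_hi) / 2
        quad = (2 if lat >= lat_mid else 0) + (1 if lon >= lon_mid else 0)
        digits.append(str(quad))
        lat_lo, lat_hi = (lat_mid, lat_hi) if lat >= lat_mid else (lat_lo, lat_mid)
        lon_lo, lon_hi = (lon_mid, lon_hi) if lon >= lon_mid else (lon_lo, lon_mid)
    return "".join(digits)

# Two nearby observations share a coarse-resolution prefix, so aggregating
# point data to coarser cells is just index truncation.
a = cell_index(-35.28, 149.13, 8)     # hypothetical observation near Canberra
b = cell_index(-35.31, 149.20, 8)
print(a, b, a[:5] == b[:5])
```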

  18. The mathematical bases for qualitative reasoning

    NASA Technical Reports Server (NTRS)

    Kalagnanam, Jayant; Simon, Herbert A.; Iwasaki, Yumi

    1991-01-01

    The practices of researchers in many fields who use qualitative reasoning are summarized and explained. The goal is to gain an understanding of the formal assumptions and mechanisms that underlie this kind of analysis. The explanations given are based on standard mathematical formalisms, particularly on ordinal properties, continuous differentiable functions, and the mathematics of nonlinear dynamic systems.
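
    A minimal example of such a formal mechanism is the sign (ordinal) algebra used in qualitative reasoning, sketched below with hypothetical rules; combining opposing influences yields an ambiguous result, which is exactly what a qualitative analysis reports.

```python
# Sketch of the sign algebra commonly used in qualitative reasoning: quantities
# are abstracted to '+', '-', '0', and combining opposing influences yields the
# ambiguous value '?'. This illustrates the ordinal abstraction in general, not
# a particular system from the paper.
def sign_add(a, b):
    if a == "0":
        return b
    if b == "0" or a == b:
        return a
    return "?"                      # opposite or ambiguous signs: ambiguous sum

def sign_mul(a, b):
    if "0" in (a, b):
        return "0"
    if "?" in (a, b):
        return "?"
    return "+" if a == b else "-"

# d(level)/dt depends on inflow (+) and outflow (-): the net sign is ambiguous
# without magnitude information.
print(sign_add("+", "-"))           # -> ?
print(sign_mul("-", "-"))           # -> +
```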

  19. Proceedings 3rd NASA/IEEE Workshop on Formal Approaches to Agent-Based Systems (FAABS-III)

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael (Editor); Rash, James (Editor); Truszkowski, Walt (Editor); Rouff, Christopher (Editor)

    2004-01-01

    These proceedings contain 18 papers and 4 poster presentations, covering topics such as multi-agent systems, agent-based control, formalism, and norms, as well as physical and biological models of agent-based systems. Applications presented in the proceedings include systems analysis, software engineering, computer networks and robot control.

  20. Communication Patterns in Normal and Disturbed Families.

    ERIC Educational Resources Information Center

    Angermeyer, Matthias C.; Hecker, Hartmut

    A study of formal communication was conducted in Germany with 30 families, each with a schizophrenic son, and 28 families, each with a "normal" son. By means of factor analysis, four types of formal speech behavior were identified and named using musical terminology: "staccato," a highly fragmented flow of conversation with a high turnover rate; "solo" in which…
