Sample records for formal statistical techniques

  1. Great Computational Intelligence in the Formal Sciences via Analogical Reasoning

    DTIC Science & Technology

    2017-05-08

    computational harnessing of traditional mathematical statistics (as e.g. covered in Hogg, Craig & McKean 2005) is used to power statistical learning techniques... [AFRL-AFOSR-VA-TR-2017-0099; Selmer Bringsjord, Rensselaer Polytechnic...; final performance report covering 15 Oct 2011 to 31 Dec 2016]

  2. Applied Statistics: From Bivariate through Multivariate Techniques [with CD-ROM]

    ERIC Educational Resources Information Center

    Warner, Rebecca M.

    2007-01-01

    This book provides a clear introduction to widely used topics in bivariate and multivariate statistics, including multiple regression, discriminant analysis, MANOVA, factor analysis, and binary logistic regression. The approach is applied and does not require formal mathematics; equations are accompanied by verbal explanations. Students are asked…

  3. The Development of Introductory Statistics Students' Informal Inferential Reasoning and Its Relationship to Formal Inferential Reasoning

    ERIC Educational Resources Information Center

    Jacob, Bridgette L.

    2013-01-01

    The difficulties introductory statistics students have with formal statistical inference are well known in the field of statistics education. "Informal" statistical inference has been studied as a means to introduce inferential reasoning well before and without the formalities of formal statistical inference. This mixed methods study…

  4. Integrating Formal Methods and Testing 2002

    NASA Technical Reports Server (NTRS)

    Cukic, Bojan

    2002-01-01

    Traditionally, qualitative program verification methodologies and program testing are studied in separate research communities. Neither of them alone is powerful and practical enough to provide sufficient confidence in ultra-high reliability assessment when used exclusively. Significant advances can be made by accounting not only for formal verification and program testing, but also for the impact of many other standard V&V techniques, in a unified software reliability assessment framework. The first year of this research resulted in a statistical framework that, given the assumptions on the success of the qualitative V&V and QA procedures, significantly reduces the amount of testing needed to confidently assess reliability at so-called high and ultra-high levels (10^-4 or higher). The coming years shall address methodologies to realistically estimate the impacts of various V&V techniques on system reliability and include the impact of operational risk in reliability assessment. The objectives are to: A) combine formal correctness verification, process and product metrics, and other standard qualitative software assurance methods with statistical testing, with the aim of gaining higher confidence in software reliability assessment for high-assurance applications; B) quantify the impact of these methods on software reliability; C) demonstrate that accounting for the effectiveness of these methods reduces the number of tests needed to attain a certain confidence level; and D) quantify and justify the reliability estimate for systems developed using various methods.
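
    As a rough illustration of why crediting other V&V evidence matters at ultra-high reliability levels, the sketch below (not the report's framework; a standard zero-failure test calculation under an assumed Beta prior, with all numbers hypothetical) compares the number of failure-free tests required with and without prior credit:

      import math
      from scipy.stats import beta

      def tests_needed_frequentist(p0, confidence):
          """Failure-free tests needed so that observing zero failures rules out
          a per-demand failure probability above p0 at the given confidence."""
          return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p0))

      def tests_needed_bayesian(p0, confidence, a=1.0, b=1.0):
          """Failure-free tests needed until a Beta(a, b) prior on the failure
          probability yields P(p <= p0) >= confidence; a larger b encodes
          confidence credited from prior V&V and QA activities."""
          n = 0
          while beta.cdf(p0, a, b + n) < confidence:
              n += 1  # simple linear search; adequate for a sketch
          return n

      p0, conf = 1e-4, 0.99
      print(tests_needed_frequentist(p0, conf))          # about 46,050 tests
      print(tests_needed_bayesian(p0, conf))             # uniform prior: similar count
      print(tests_needed_bayesian(p0, conf, b=20000.0))  # prior V&V credit cuts the count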

  5. Behavioral and Social Science Research: A National Resource. Part II.

    ERIC Educational Resources Information Center

    Adams, Robert McC., Ed.; And Others

    Areas of behavioral and social science research that have achieved significant breakthroughs in knowledge or application or that show future promise of achieving such breakthroughs are discussed in 12 papers. For example, the paper on formal demography shows how mathematical or statistical techniques can be used to explain and predict change in…

  6. Landau's statistical mechanics for quasi-particle models

    NASA Astrophysics Data System (ADS)

    Bannur, Vishnu M.

    2014-04-01

    Landau's formalism of statistical mechanics [following L. D. Landau and E. M. Lifshitz, Statistical Physics (Pergamon Press, Oxford, 1980)] is applied to the quasi-particle model of quark-gluon plasma. Here, one starts from the expression for pressure and develops all thermodynamics. It is a general formalism and consistent with our earlier studies [V. M. Bannur, Phys. Lett. B647, 271 (2007)] based on Pathria's formalism [following R. K. Pathria, Statistical Mechanics (Butterworth-Heinemann, Oxford, 1977)]. In Pathria's formalism, one starts from the expression for energy density and develops thermodynamics. Both formalisms are consistent with thermodynamics and statistical mechanics. Under certain conditions, which are wrongly called the thermodynamic consistency relation, we recover other formalisms of quasi-particle systems, like that of M. I. Gorenstein and S. N. Yang, Phys. Rev. D52, 5206 (1995), widely studied in quark-gluon plasma.

  7. Statistical assessment of the learning curves of health technologies.

    PubMed

    Ramsay, C R; Grant, A M; Wallace, S A; Garthwaite, P H; Monk, A F; Russell, I T

    2001-01-01

    (1) To describe systematically studies that directly assessed the learning curve effect of health technologies. (2) Systematically to identify 'novel' statistical techniques applied to learning curve data in other fields, such as psychology and manufacturing. (3) To test these statistical techniques in data sets from studies of varying designs to assess health technologies in which learning curve effects are known to exist. METHODS - STUDY SELECTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): For a study to be included, it had to include a formal analysis of the learning curve of a health technology using a graphical, tabular or statistical technique. METHODS - STUDY SELECTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): For a study to be included, it had to include a formal assessment of a learning curve using a statistical technique that had not been identified in the previous search. METHODS - DATA SOURCES: Six clinical and 16 non-clinical biomedical databases were searched. A limited amount of handsearching and scanning of reference lists was also undertaken. METHODS - DATA EXTRACTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): A number of study characteristics were abstracted from the papers such as study design, study size, number of operators and the statistical method used. METHODS - DATA EXTRACTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): The new statistical techniques identified were categorised into four subgroups of increasing complexity: exploratory data analysis; simple series data analysis; complex data structure analysis, generic techniques. METHODS - TESTING OF STATISTICAL METHODS: Some of the statistical methods identified in the systematic searches for single (simple) operator series data and for multiple (complex) operator series data were illustrated and explored using three data sets. The first was a case series of 190 consecutive laparoscopic fundoplication procedures performed by a single surgeon; the second was a case series of consecutive laparoscopic cholecystectomy procedures performed by ten surgeons; the third was randomised trial data derived from the laparoscopic procedure arm of a multicentre trial of groin hernia repair, supplemented by data from non-randomised operations performed during the trial. RESULTS - HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW: Of 4571 abstracts identified, 272 (6%) were later included in the study after review of the full paper. Some 51% of studies assessed a surgical minimal access technique and 95% were case series. The statistical method used most often (60%) was splitting the data into consecutive parts (such as halves or thirds), with only 14% attempting a more formal statistical analysis. The reporting of the studies was poor, with 31% giving no details of data collection methods. RESULTS - NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH: Of 9431 abstracts assessed, 115 (1%) were deemed appropriate for further investigation and, of these, 18 were included in the study. All of the methods for complex data sets were identified in the non-clinical literature. These were discriminant analysis, two-stage estimation of learning rates, generalised estimating equations, multilevel models, latent curve models, time series models and stochastic parameter models. In addition, eight new shapes of learning curves were identified. RESULTS - TESTING OF STATISTICAL METHODS: No one particular shape of learning curve performed significantly better than another. 
The performance of 'operation time' as a proxy for learning differed between the three procedures. Multilevel modelling using the laparoscopic cholecystectomy data demonstrated and measured surgeon-specific and confounding effects. The inclusion of non-randomised cases, despite the possible limitations of the method, enhanced the interpretation of learning effects. CONCLUSIONS - HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW: The statistical methods used for assessing learning effects in health technology assessment have been crude and the reporting of studies poor. CONCLUSIONS - NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH: A number of statistical methods for assessing learning effects were identified that had not hitherto been used in health technology assessment. There was a hierarchy of methods for the identification and measurement of learning, and the more sophisticated methods for both have had little if any use in health technology assessment. This demonstrated the value of considering fields outside clinical research when addressing methodological issues in health technology assessment. CONCLUSIONS - TESTING OF STATISTICAL METHODS: It has been demonstrated that the portfolio of techniques identified can enhance investigations of learning curve effects. (ABSTRACT TRUNCATED)
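
    For the "simple series" end of the hierarchy described above, a learning-curve fit can be as short as the sketch below: simulated single-operator data (all values hypothetical) fitted with an assumed power-law curve via scipy, contrasted with the crude split-halves comparison that the review found to dominate the literature.

      import numpy as np
      from scipy.optimize import curve_fit

      rng = np.random.default_rng(0)

      # Hypothetical single-operator series: operation time (minutes) vs. case number.
      cases = np.arange(1, 191)
      times = 60.0 + 90.0 * cases ** -0.4 + rng.normal(0.0, 10.0, cases.size)

      def learning_curve(n, asymptote, gain, rate):
          """Power-law learning curve: performance approaches an asymptote."""
          return asymptote + gain * n ** -rate

      (asymptote, gain, rate), _ = curve_fit(learning_curve, cases, times,
                                             p0=(50.0, 100.0, 0.5))
      print(f"asymptote={asymptote:.1f} min, gain={gain:.1f}, rate={rate:.2f}")

      # The crude alternative: split the series into halves and compare means.
      print(f"first half {times[:95].mean():.1f} min, second half {times[95:].mean():.1f} min")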

  8. JIGSAW: Preference-directed, co-operative scheduling

    NASA Technical Reports Server (NTRS)

    Linden, Theodore A.; Gaw, David

    1992-01-01

    Techniques that enable humans and machines to cooperate in the solution of complex scheduling problems have evolved out of work on the daily allocation and scheduling of Tactical Air Force resources. A generalized, formal model of these applied techniques is being developed. It is called JIGSAW by analogy with the multi-agent, constructive process used when solving jigsaw puzzles. JIGSAW begins from this analogy and extends it by propagating local preferences into global statistics that dynamically influence the value and variable ordering decisions. The statistical projections also apply to abstract resources and time periods--allowing more opportunities to find a successful variable ordering by reserving abstract resources and deferring the choice of a specific resource or time period.

  9. The boundary is mixed

    NASA Astrophysics Data System (ADS)

    Bianchi, Eugenio; Haggard, Hal M.; Rovelli, Carlo

    2017-08-01

    We show that in Oeckl's boundary formalism the boundary vectors that do not have a tensor form represent, in a precise sense, statistical states. Therefore the formalism incorporates quantum statistical mechanics naturally. We formulate general-covariant quantum statistical mechanics in this language. We illustrate the formalism by showing how it accounts for the Unruh effect. We observe that the distinction between pure and mixed states weakens in the general covariant context, suggesting that local gravitational processes are naturally statistical without a sharp quantal versus probabilistic distinction.

  10. A Survey of Logic Formalisms to Support Mishap Analysis

    NASA Technical Reports Server (NTRS)

    Johnson, Chris; Holloway, C. M.

    2003-01-01

    Mishap investigations provide important information about adverse events and near miss incidents. They are intended to help avoid any recurrence of previous failures. Over time, they can also yield statistical information about incident frequencies that helps to detect patterns of failure and can validate risk assessments. However, the increasing complexity of many safety critical systems is posing new challenges for mishap analysis. Similarly, the recognition that many failures have complex, systemic causes has helped to widen the scope of many mishap investigations. These two factors have combined to pose new challenges for the analysis of adverse events. A new generation of formal and semi-formal techniques has been proposed to help investigators address these problems. We introduce the term mishap logics to collectively describe these notations that might be applied to support the analysis of mishaps. The proponents of these notations have argued that they can be used to formally prove that certain events created the necessary and sufficient causes for a mishap to occur. These proofs can be used to reduce the bias that is often perceived to affect the interpretation of adverse events. Others have argued that one cannot use logic formalisms to prove causes in the same way that one might prove propositions or theorems. Such mechanisms cannot accurately capture the wealth of inductive, deductive and statistical forms of inference that investigators must use in their analysis of adverse events. This paper provides an overview of these mishap logics. It also identifies several additional classes of logic that might also be used to support mishap analysis.

  11. Putting it all together: Exhumation histories from a formal combination of heat flow and a suite of thermochronometers

    USGS Publications Warehouse

    d'Alessio, M. A.; Williams, C.F.

    2007-01-01

    A suite of new techniques in thermochronometry allows analysis of the thermal history of a sample over a broad range of temperature sensitivities. New analysis tools must be developed that fully and formally integrate these techniques, allowing a single geologic interpretation of the rate and timing of exhumation and burial events consistent with all data. We integrate a thermal model of burial and exhumation, (U-Th)/He age modeling, and fission track age and length modeling. We then use a genetic algorithm to efficiently explore possible time-exhumation histories of a vertical sample profile (such as a borehole), simultaneously solving for exhumation and burial rates as well as changes in background heat flow. We formally combine all data in a rigorous statistical fashion. By parameterizing the model in terms of exhumation rather than time-temperature paths (as traditionally done in fission track modeling), we can ensure that exhumation histories result in a sedimentary basin whose thickness is consistent with the observed basin, a physically based constraint that eliminates otherwise acceptable thermal histories. We apply the technique to heat flow and thermochronometry data from the 2.1-km-deep San Andreas Fault Observatory at Depth pilot hole near the San Andreas fault, California. We find that the site experienced <1 km of exhumation or burial since the onset of San Andreas fault activity ~30 Ma.

  12. Formal Operations and Learning Style Predict Success in Statistics and Computer Science Courses.

    ERIC Educational Resources Information Center

    Hudak, Mary A.; Anderson, David E.

    1990-01-01

    Studies 94 undergraduate students in introductory statistics and computer science courses. Applies the Formal Operations Reasoning Test (FORT) and Kolb's Learning Style Inventory (LSI). Finds that substantial numbers of students have not achieved the formal operation level of cognitive maturity. Emphasizes the need to examine students' learning styles and…

  13. Fast alternative Monte Carlo formalism for a class of problems in biophotonics

    NASA Astrophysics Data System (ADS)

    Miller, Steven D.

    1997-12-01

    A practical and effective, alternative Monte Carlo formalism is presented that rapidly finds flux solutions to the radiative transport equation for a class of problems in biophotonics; namely, wide-beam irradiance of finite, optically anisotropic homogeneous or heterogeneous biomedias, which both strongly scatter and absorb light. Such biomedias include liver, tumors, blood, or highly blood perfused tissues. As Fermat rays comprising a wide coherent (laser) beam enter the tissue, they evolve into a bundle of random optical paths or trajectories due to scattering. Overall, this can be physically interpreted as a bundle of Markov trajectories traced out by a 'gas' of Brownian-like point photons being successively scattered and absorbed. By considering the cumulative flow of a statistical bundle of trajectories through interior data planes, the effective equivalent information of the (generally unknown) analytical flux solutions of the transfer equation rapidly emerges. Unlike the standard Monte Carlo techniques, which evaluate scalar fluence, this technique is faster, more efficient, and simpler to apply for this specific class of optical situations. Other analytical or numerical techniques can either become unwieldy or lack viability or are simply more difficult to apply. Illustrative flux calculations are presented for liver, blood, and tissue-tumor-tissue systems.

  14. Expert judgement and uncertainty quantification for climate change

    NASA Astrophysics Data System (ADS)

    Oppenheimer, Michael; Little, Christopher M.; Cooke, Roger M.

    2016-05-01

    Expert judgement is an unavoidable element of the process-based numerical models used for climate change projections, and the statistical approaches used to characterize uncertainty across model ensembles. Here, we highlight the need for formalized approaches to unifying numerical modelling with expert judgement in order to facilitate characterization of uncertainty in a reproducible, consistent and transparent fashion. As an example, we use probabilistic inversion, a well-established technique used in many other applications outside of climate change, to fuse two recent analyses of twenty-first century Antarctic ice loss. Probabilistic inversion is but one of many possible approaches to formalizing the role of expert judgement, and the Antarctic ice sheet is only one possible climate-related application. We recommend indicators or signposts that characterize successful science-based uncertainty quantification.

  15. Kolmogorov complexity, statistical regularization of inverse problems, and Birkhoff's formalization of beauty

    NASA Astrophysics Data System (ADS)

    Kreinovich, Vladik; Longpre, Luc; Koshelev, Misha

    1998-09-01

    Most practical applications of statistical methods are based on the implicit assumption that if an event has a very small probability, then it cannot occur. For example, the probability that a kettle placed on a cold stove would start boiling by itself is not 0, it is positive, but it is so small that physicists conclude that such an event is simply impossible. This assumption is difficult to formalize in traditional probability theory, because this theory only describes measures on sets and does not allow us to divide functions into 'random' and non-random ones. This distinction was made possible by the idea of algorithmic randomness, introduced by Kolmogorov and his student Martin-Löf in the 1960s. We show that this idea can also be used for inverse problems. In particular, we prove that for every probability measure, the corresponding set of random functions is compact, and, therefore, the corresponding restricted inverse problem is well-defined. The resulting technique turns out to be interestingly related to the qualitative esthetic measure introduced by G. Birkhoff as order/complexity.

  16. Sequential Least-Squares Using Orthogonal Transformations. [spacecraft communication/spacecraft tracking-data smoothing]

    NASA Technical Reports Server (NTRS)

    Bierman, G. J.

    1975-01-01

    Square root information estimation, starting from its beginnings in least-squares parameter estimation, is considered. Special attention is devoted to discussions of sensitivity and perturbation matrices, computed solutions and their formal statistics, consider-parameters and consider-covariances, and the effects of a priori statistics. The constant-parameter model is extended to include time-varying parameters and process noise, and the error analysis capabilities are generalized. Efficient and elegant smoothing results are obtained as easy consequences of the filter formulation. The value of the techniques is demonstrated by the navigation results that were obtained for the Mariner Venus-Mercury (Mariner 10) multiple-planetary space probe and for the Viking Mars space mission.
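
    A minimal numerical sketch of the core idea, sequential least-squares by orthogonal (QR) triangularization of the stacked information array, is given below; it is an illustration only, omits Bierman's consider-parameters, process noise, and smoothing extensions, and uses a hypothetical model and noise levels.

      import numpy as np

      def srif_update(R, z, A, y):
          """Square-root information measurement update: stack the prior
          information equation R x ~ z with new measurements A x ~ y and
          re-triangularize with an orthogonal (QR) transformation."""
          n = R.shape[0]
          stacked = np.vstack([np.hstack([R, z[:, None]]),
                               np.hstack([A, y[:, None]])])
          _, T = np.linalg.qr(stacked)
          return T[:n, :n], T[:n, n]

      rng = np.random.default_rng(1)
      x_true = np.array([2.0, -1.0])
      R = 1e-3 * np.eye(2)          # weak prior information
      z = np.zeros(2)

      for _ in range(50):           # process scalar measurements sequentially
          A = rng.normal(size=(1, 2))
          y = A @ x_true + rng.normal(0.0, 0.01, 1)
          R, z = srif_update(R, z, A, y)

      print(np.linalg.solve(R, z))  # recovers x_true to within the noise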

  17. Variation in reaction norms: Statistical considerations and biological interpretation.

    PubMed

    Morrissey, Michael B; Liefting, Maartje

    2016-09-01

    Analysis of reaction norms, the functions by which the phenotype produced by a given genotype depends on the environment, is critical to studying many aspects of phenotypic evolution. Different techniques are available for quantifying different aspects of reaction norm variation. We examine what biological inferences can be drawn from some of the more readily applicable analyses for studying reaction norms. We adopt a strongly biologically motivated view, but draw on statistical theory to highlight strengths and drawbacks of different techniques. In particular, consideration of some formal statistical theory leads to revision of some recently, and forcefully, advocated opinions on reaction norm analysis. We clarify what simple analysis of the slope between mean phenotype in two environments can tell us about reaction norms, explore the conditions under which polynomial regression can provide robust inferences about reaction norm shape, and explore how different existing approaches may be used to draw inferences about variation in reaction norm shape. We show how mixed model-based approaches can provide more robust inferences than more commonly used multistep statistical approaches, and derive new metrics of the relative importance of variation in reaction norm intercepts, slopes, and curvatures. © 2016 The Author(s). Evolution © 2016 The Society for the Study of Evolution.
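
    As a toy version of the "multistep" analysis the authors compare against mixed models, the sketch below (simulated genotypes and environments, all parameters assumed) fits a quadratic polynomial to each genotype's reaction norm and summarizes variation in intercepts, slopes, and curvatures:

      import numpy as np

      rng = np.random.default_rng(2)

      # Hypothetical data: 20 genotypes phenotyped in 5 environments.
      env = np.linspace(-1.0, 1.0, 5)
      intercepts = rng.normal(10.0, 1.0, 20)
      slopes = rng.normal(2.0, 0.5, 20)
      curvatures = rng.normal(-0.5, 0.3, 20)

      coefs = []
      for i in range(20):
          phenotype = (intercepts[i] + slopes[i] * env + curvatures[i] * env**2
                       + rng.normal(0.0, 0.2, env.size))
          # Per-genotype quadratic fit: coefficients are intercept, slope, curvature.
          coefs.append(np.polynomial.polynomial.polyfit(env, phenotype, 2))

      coefs = np.array(coefs)
      print("variance among genotypes (intercept, slope, curvature):")
      print(coefs.var(axis=0, ddof=1))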

  18. BRST technique for the cosmological density matrix

    NASA Astrophysics Data System (ADS)

    Barvinsky, A. O.

    2013-10-01

    The microcanonical density matrix in closed cosmology has a natural definition as a projector on the space of solutions of Wheeler-DeWitt equations, which is motivated by the absence of global non-vanishing charges and energy in spatially closed gravitational systems. Using the BRST/BFV formalism in relativistic phase space of gauge and ghost variables we derive the path integral representation for this projector and the relevant statistical sum. This derivation circumvents the difficulties associated with the open algebra of noncommutative quantum Dirac constraints and the construction/regularization of the physical inner product in the subspace of BRS singlets. This inner product is achieved via the Batalin-Marnelius gauge fixing in the space of BRS-invariant states, which in its turn is shown to be a result of truncation of the BRST/BFV formalism to the "matter" sector of relativistic phase space.

  19. The Strengths and Weaknesses of Logic Formalisms to Support Mishap Analysis

    NASA Technical Reports Server (NTRS)

    Johnson, C. W.; Holloway, C. M.

    2002-01-01

    The increasing complexity of many safety critical systems poses new problems for mishap analysis. Techniques developed in the sixties and seventies cannot easily scale up to analyze incidents involving tightly integrated software and hardware components. Similarly, the realization that many failures have systemic causes has widened the scope of many mishap investigations. Organizations, including NASA and the NTSB, have responded by starting research and training initiatives to ensure that their personnel are well equipped to meet these challenges. One strand of research has identified a range of mathematically based techniques that can be used to reason about the causes of complex, adverse events. The proponents of these techniques have argued that they can be used to formally prove that certain events created the necessary and sufficient causes for a mishap to occur. Mathematical proofs can reduce the bias that is often perceived to affect the interpretation of adverse events. Others have opposed the introduction of these techniques by identifying social and political aspects to incident investigation that cannot easily be reconciled with a logic-based approach. Traditional theorem proving mechanisms cannot accurately capture the wealth of inductive, deductive and statistical forms of inference that investigators routinely use in their analysis of adverse events. This paper summarizes some of the benefits that logics provide, describes their weaknesses, and proposes a number of directions for future research.

  20. Statistical Irreversible Thermodynamics in the Framework of Zubarev's Nonequilibrium Statistical Operator Method

    NASA Astrophysics Data System (ADS)

    Luzzi, R.; Vasconcellos, A. R.; Ramos, J. G.; Rodrigues, C. G.

    2018-01-01

    We describe the formalism of statistical irreversible thermodynamics constructed based on Zubarev's nonequilibrium statistical operator (NSO) method, which is a powerful and universal tool for investigating the most varied physical phenomena. We present brief overviews of the statistical ensemble formalism and statistical irreversible thermodynamics. The first can be constructed either based on a heuristic approach or in the framework of information theory in the Jeffreys-Jaynes scheme of scientific inference; Zubarev and his school used both approaches in formulating the NSO method. We describe the main characteristics of statistical irreversible thermodynamics and discuss some particular considerations of several authors. We briefly describe how Rosenfeld, Bohr, and Prigogine proposed to derive a thermodynamic uncertainty principle.

  1. Using Informal Inferential Reasoning to Develop Formal Concepts: Analyzing an Activity

    ERIC Educational Resources Information Center

    Weinberg, Aaron; Wiesner, Emilie; Pfaff, Thomas J.

    2010-01-01

    Inferential reasoning is a central component of statistics. Researchers have suggested that students should develop an informal understanding of the ideas that underlie inference before learning the concepts formally. This paper presents a hands-on activity that is designed to help students in an introductory statistics course draw informal…

  2. Number statistics for β-ensembles of random matrices: Applications to trapped fermions at zero temperature.

    PubMed

    Marino, Ricardo; Majumdar, Satya N; Schehr, Grégory; Vivo, Pierpaolo

    2016-09-01

    Let P_{β}^{(V)}(N_{I}) be the probability that an N×N β-ensemble of random matrices with confining potential V(x) has N_{I} eigenvalues inside an interval I=[a,b] on the real line. We introduce a general formalism, based on the Coulomb gas technique and the resolvent method, to compute analytically P_{β}^{(V)}(N_{I}) for large N. We show that this probability scales for large N as P_{β}^{(V)}(N_{I})≈exp[-βN^{2}ψ^{(V)}(N_{I}/N)], where β is the Dyson index of the ensemble. The rate function ψ^{(V)}(k_{I}), independent of β, is computed in terms of single integrals that can be easily evaluated numerically. The general formalism is then applied to the classical β-Gaussian (I=[-L,L]), β-Wishart (I=[1,L]), and β-Cauchy (I=[-L,L]) ensembles. Expanding the rate function around its minimum, we find that generically the number variance var(N_{I}) exhibits a nonmonotonic behavior as a function of the size of the interval, with a maximum that can be precisely characterized. These analytical results, corroborated by numerical simulations, provide the full counting statistics of many systems where random matrix models apply. In particular, we present results for the full counting statistics of zero-temperature one-dimensional spinless fermions in a harmonic trap.
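
    A brute-force check of these counting statistics is easy to run for the β = 1 (Gaussian orthogonal) case: the sketch below simply diagonalizes sampled GOE matrices and tallies eigenvalues in a bulk interval, rather than using the Coulomb gas formalism of the paper; the matrix size and interval are arbitrary choices.

      import numpy as np

      rng = np.random.default_rng(3)

      def goe_eigenvalues(n):
          """Eigenvalues of an n x n GOE matrix (beta = 1); the spectrum
          concentrates on roughly [-sqrt(2 n), sqrt(2 n)]."""
          a = rng.normal(size=(n, n))
          return np.linalg.eigvalsh((a + a.T) / 2.0)

      n, trials = 100, 2000
      half_width = 0.5 * np.sqrt(2 * n)     # interval [-L, L] inside the bulk
      counts = np.array([np.sum(np.abs(goe_eigenvalues(n)) <= half_width)
                         for _ in range(trials)])
      print(f"mean N_I = {counts.mean():.1f}, var(N_I) = {counts.var(ddof=1):.2f}")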

  3. Spatial analysis on future housing markets: economic development and housing implications.

    PubMed

    Liu, Xin; Wang, Lizhe

    2014-01-01

    A coupled projection method combining formal modelling and other statistical techniques was developed to delineate the relationship between economic and social drivers for net new housing allocations. Using the example of employment growth in Tyne and Wear, UK, until 2016, the empirical analysis yields housing projections at the macro- and microspatial levels (e.g., region to subregion to elected ward levels). The results have important implications for the strategic planning of locations for housing and employment, demonstrating both intuitively and quantitatively how local economic developments affect housing demand.

  4. Spatial Analysis on Future Housing Markets: Economic Development and Housing Implications

    PubMed Central

    Liu, Xin; Wang, Lizhe

    2014-01-01

    A coupled projection method combining formal modelling and other statistical techniques was developed to delineate the relationship between economic and social drivers for net new housing allocations. Using the example of employment growth in Tyne and Wear, UK, until 2016, the empirical analysis yields housing projections at the macro- and microspatial levels (e.g., region to subregion to elected ward levels). The results have important implications for the strategic planning of locations for housing and employment, demonstrating both intuitively and quantitatively how local economic developments affect housing demand. PMID:24892097

  5. Experience and Explanation: Using Videogames to Prepare Students for Formal Instruction in Statistics

    ERIC Educational Resources Information Center

    Arena, Dylan A.; Schwartz, Daniel L.

    2014-01-01

    Well-designed digital games can deliver powerful experiences that are difficult to provide through traditional instruction, while traditional instruction can deliver formal explanations that are not a natural fit for gameplay. Combined, they can accomplish more than either can alone. An experiment tested this claim using the topic of statistics,…

  6. On the simulation of indistinguishable fermions in the many-body Wigner formalism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sellier, J.M., E-mail: jeanmichel.sellier@gmail.com; Dimov, I.

    2015-01-01

    The simulation of quantum systems consisting of interacting, indistinguishable fermions is an incredible mathematical problem which poses formidable numerical challenges. Many sophisticated methods addressing this problem are available which are based on the many-body Schrödinger formalism. Recently a Monte Carlo technique for the resolution of the many-body Wigner equation has been introduced and successfully applied to the simulation of distinguishable, spinless particles. This numerical approach presents several advantages over other methods. Indeed, it is based on an intuitive formalism in which quantum systems are described in terms of a quasi-distribution function, and highly scalable due to its Monte Carlo nature. In this work, we extend the many-body Wigner Monte Carlo method to the simulation of indistinguishable fermions. To this end, we first show how fermions are incorporated into the Wigner formalism. Then we demonstrate that the Pauli exclusion principle is intrinsic to the formalism. As a matter of fact, a numerical simulation of two strongly interacting fermions (electrons) is performed which clearly shows the appearance of a Fermi (or exchange–correlation) hole in the phase-space, a clear signature of the presence of the Pauli principle. To conclude, we simulate 4, 8 and 16 non-interacting fermions, isolated in a closed box, and show that, as the number of fermions increases, we gradually recover the Fermi–Dirac statistics, a clear proof of the reliability of our proposed method for the treatment of indistinguishable particles.

  7. Evaluation of Methods Used for Estimating Selected Streamflow Statistics, and Flood Frequency and Magnitude, for Small Basins in North Coastal California

    USGS Publications Warehouse

    Mann, Michael P.; Rizzardo, Jule; Satkowski, Richard

    2004-01-01

    Accurate streamflow statistics are essential to water resource agencies involved in both science and decision-making. When long-term streamflow data are lacking at a site, estimation techniques are often employed to generate streamflow statistics. However, procedures for accurately estimating streamflow statistics often are lacking. When estimation procedures are developed, they often are not evaluated properly before being applied. Use of unevaluated or underevaluated flow-statistic estimation techniques can result in improper water-resources decision-making. The California State Water Resources Control Board (SWRCB) uses two key techniques, a modified rational equation and drainage basin area-ratio transfer, to estimate streamflow statistics at ungaged locations. These techniques have been implemented to varying degrees, but have not been formally evaluated. For estimating peak flows at the 2-, 5-, 10-, 25-, 50-, and 100-year recurrence intervals, the SWRCB uses the U.S. Geological Survey's (USGS) regional peak-flow equations. In this study, done cooperatively by the USGS and SWRCB, the SWRCB estimated several flow statistics at 40 USGS streamflow gaging stations in the north coast region of California. The SWRCB estimates were made without reference to USGS flow data. The USGS used the streamflow data provided by the 40 stations to generate flow statistics that could be compared with SWRCB estimates for accuracy. While some SWRCB estimates compared favorably with USGS statistics, results were subject to varying degrees of error over the region. Flow-based estimation techniques generally performed better than rain-based methods, especially for estimation of December 15 to March 31 mean daily flows. The USGS peak-flow equations also performed well, but tended to underestimate peak flows. The USGS equations performed within reported error bounds, but will require updating in the future as peak-flow data sets grow larger. Little correlation was discovered between estimation errors and geographic locations or various basin characteristics. However, for 25th-percentile year mean-daily-flow estimates for December 15 to March 31, the greatest estimation errors were at east San Francisco Bay area stations with mean annual precipitation less than or equal to 30 inches and estimated 2-year/24-hour rainfall intensity less than 3 inches.
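
    The drainage-basin area-ratio transfer mentioned above has a simple generic form: scale the gaged-site statistic by the ratio of drainage areas raised to a regional exponent. The sketch below uses hypothetical flows, areas, and exponent; the SWRCB's exact implementation is not specified in the abstract.

      def area_ratio_transfer(q_gaged, area_gaged, area_ungaged, exponent=1.0):
          """Transfer a flow statistic from a gaged to an ungaged basin by scaling
          with the drainage-area ratio; exponent = 1 assumes flow scales linearly
          with drainage area, while regional studies often fit exponents below 1."""
          return q_gaged * (area_ungaged / area_gaged) ** exponent

      # Hypothetical example: 120 ft3/s at a 45 mi2 gaged basin,
      # transferred to a nearby 30 mi2 ungaged basin.
      print(area_ratio_transfer(120.0, 45.0, 30.0))        # 80 ft3/s
      print(area_ratio_transfer(120.0, 45.0, 30.0, 0.9))   # ~83 ft3/s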

  8. Model Checking Techniques for Assessing Functional Form Specifications in Censored Linear Regression Models.

    PubMed

    León, Larry F; Cai, Tianxi

    2012-04-01

    In this paper we develop model checking techniques for assessing functional form specifications of covariates in censored linear regression models. These procedures are based on a censored data analog to taking cumulative sums of "robust" residuals over the space of the covariate under investigation. These cumulative sums are formed by integrating certain Kaplan-Meier estimators and may be viewed as "robust" censored data analogs to the processes considered by Lin, Wei & Ying (2002). The null distributions of these stochastic processes can be approximated by the distributions of certain zero-mean Gaussian processes whose realizations can be generated by computer simulation. Each observed process can then be graphically compared with a few realizations from the Gaussian process. We also develop formal test statistics for numerical comparison. Such comparisons enable one to assess objectively whether an apparent trend seen in a residual plot reflects model misspecification or natural variation. We illustrate the methods with a well-known dataset. In addition, we examine the finite sample performance of the proposed test statistics in simulation experiments. In our simulation experiments, the proposed test statistics have good power of detecting misspecification while at the same time controlling the size of the test.
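
    The sketch below illustrates the flavor of the procedure in a deliberately simplified, uncensored setting: residuals from a working linear fit are cumulated over the ordered covariate and compared with zero-mean realizations obtained by multiplying the residuals with standard normal variables. It omits the Kaplan-Meier integration and the correction for estimated coefficients used in the paper, so it is a heuristic illustration only, on simulated data.

      import numpy as np

      rng = np.random.default_rng(4)

      n = 200
      x = rng.uniform(0.0, 2.0, n)
      y = 1.0 + 0.5 * x**2 + rng.normal(0.0, 0.2, n)    # true effect is quadratic

      X = np.column_stack([np.ones(n), x])              # working model: linear in x
      beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
      resid = (y - X @ beta_hat)[np.argsort(x)]

      observed = np.cumsum(resid) / np.sqrt(n)          # cumulative-residual process
      null = np.array([np.cumsum(resid * rng.normal(size=n)) / np.sqrt(n)
                       for _ in range(1000)])           # multiplier realizations

      sup_obs = np.max(np.abs(observed))
      p_value = np.mean(np.max(np.abs(null), axis=1) >= sup_obs)
      print(f"sup|W| = {sup_obs:.2f}, resampling p-value = {p_value:.3f}")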

  9. Data Acquisition and Preprocessing in Studies on Humans: What Is Not Taught in Statistics Classes?

    PubMed

    Zhu, Yeyi; Hernandez, Ladia M; Mueller, Peter; Dong, Yongquan; Forman, Michele R

    2013-01-01

    The aim of this paper is to address issues in research that may be missing from statistics classes and important for (bio-)statistics students. In the context of a case study, we discuss data acquisition and preprocessing steps that fill the gap between research questions posed by subject matter scientists and statistical methodology for formal inference. Issues include participant recruitment, data collection training and standardization, variable coding, data review and verification, data cleaning and editing, and documentation. Despite the critical importance of these details in research, most of these issues are rarely discussed in an applied statistics program. One reason for the lack of more formal training is the difficulty in addressing the many challenges that can possibly arise in the course of a study in a systematic way. This article can help to bridge this gap between research questions and formal statistical inference by using an illustrative case study for a discussion. We hope that reading and discussing this paper and practicing data preprocessing exercises will sensitize statistics students to these important issues and achieve optimal conduct, quality control, analysis, and interpretation of a study.

  10. Teaching medical students ultrasound-guided vascular access - which learning method is best?

    PubMed

    Lian, Alwin; Rippey, James C R; Carr, Peter J

    2017-05-15

    Ultrasound is recommended to guide insertion of peripheral intravenous vascular cannulae (PIVC) where difficulty is experienced. Ultrasound machines are now commonplace and junior doctors are often expected to be able to use them. The educational standards for this skill are highly varied, ranging from no education, to self-guided internet-based education, to formal, face-to-face traditional education. In an attempt to decide which educational technique our institution should introduce, a small pilot trial comparing educational techniques was designed. Thirty medical students were enrolled and allocated to one of three groups. PIVC placing ability was then observed, tested and graded on vascular access phantoms. The formal, face-to-face traditional education was rated best by the students and had the highest success rate in PIVC placement, an improvement that was statistically significant compared to no education (p = 0.01) and trending towards significance compared to self-directed internet-based education (p < 0.06). The group receiving traditional face-to-face teaching on ultrasound-guided vascular access performed significantly better than those not receiving education. As the number of ultrasound machines in clinical areas increases, it is important that education programs to support their safe and appropriate use are developed.

  11. Bio-Inspired Genetic Algorithms with Formalized Crossover Operators for Robotic Applications.

    PubMed

    Zhang, Jie; Kang, Man; Li, Xiaojuan; Liu, Geng-Yang

    2017-01-01

    Genetic algorithms are widely adopted to solve optimization problems in robotic applications. In such safety-critical systems, it is vitally important to formally prove correctness when genetic algorithms are applied. This paper focuses on formal modeling of crossover operations, which are among the most important operations in genetic algorithms. Specifically, we formalize crossover operations for the first time with higher-order logic in HOL4, which is easy to deploy thanks to its user-friendly programming environment. With correctness-guaranteed formalized crossover operations, we can safely apply them in robotic applications. We implement our technique to solve a path planning problem using a genetic algorithm with our formalized crossover operations, and the results show the effectiveness of our technique.
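
    The paper's formalization is carried out in HOL4; as an informal counterpart, the sketch below implements a plain single-point crossover in Python and notes, in a comment, one invariant that such a formal proof might establish. The waypoint chromosomes are hypothetical.

      import random

      def single_point_crossover(parent_a, parent_b, rng=random):
          """Single-point crossover: children swap tails after a random cut.
          Invariant a formal proof could establish: at every locus the two
          children together carry exactly the two parental genes."""
          assert len(parent_a) == len(parent_b) >= 2
          cut = rng.randint(1, len(parent_a) - 1)
          return (parent_a[:cut] + parent_b[cut:],
                  parent_b[:cut] + parent_a[cut:])

      # Hypothetical path-planning chromosomes: sequences of waypoint indices.
      a = [0, 3, 5, 7, 9, 11]
      b = [0, 2, 4, 6, 8, 11]
      print(single_point_crossover(a, b))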

  12. Finding Bounded Rational Equilibria. Part 1; Iterative Focusing

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.

    2004-01-01

    A long-running difficulty with conventional game theory has been how to modify it to accommodate the bounded rationality characterizing all real-world players. A recurring issue in statistical physics is how best to approximate joint probability distributions with decoupled (and therefore far more tractable) distributions. It has recently been shown that the same information theoretic mathematical structure, known as Probability Collectives (PC) underlies both issues. This relationship between statistical physics and game theory allows techniques and insights from the one field to be applied to the other. In particular, PC provides a formal model-independent definition of the degree of rationality of a player and of bounded rationality equilibria. This pair of papers extends previous work on PC by introducing new computational approaches to effectively find bounded rationality equilibria of common-interest (team) games.

  13. The scale invariant generator technique for quantifying anisotropic scale invariance

    NASA Astrophysics Data System (ADS)

    Lewis, G. M.; Lovejoy, S.; Schertzer, D.; Pecknold, S.

    1999-11-01

    Scale invariance is rapidly becoming a new paradigm for geophysics. However, little attention has been paid to the anisotropy that is invariably present in geophysical fields in the form of differential stratification and rotation, texture and morphology. In order to account for scaling anisotropy, the formalism of generalized scale invariance (GSI) was developed. Until now there has existed only a single fairly ad hoc GSI analysis technique valid for studying differential rotation. In this paper, we use a two-dimensional representation of the linear approximation to generalized scale invariance, to obtain a much improved technique for quantifying anisotropic scale invariance called the scale invariant generator technique (SIG). The accuracy of the technique is tested using anisotropic multifractal simulations and error estimates are provided for the geophysically relevant range of parameters. It is found that the technique yields reasonable estimates for simulations with a diversity of anisotropic and statistical characteristics. The scale invariant generator technique can profitably be applied to the scale invariant study of vertical/horizontal and space/time cross-sections of geophysical fields as well as to the study of the texture/morphology of fields.

  14. Statistics of Smoothed Cosmic Fields in Perturbation Theory. I. Formulation and Useful Formulae in Second-Order Perturbation Theory

    NASA Astrophysics Data System (ADS)

    Matsubara, Takahiko

    2003-02-01

    We formulate a general method for perturbative evaluations of statistics of smoothed cosmic fields and provide useful formulae for application of the perturbation theory to various statistics. This formalism is an extensive generalization of the method used by Matsubara, who derived a weakly nonlinear formula of the genus statistic in a three-dimensional density field. After describing the general method, we apply the formalism to a series of statistics, including genus statistics, level-crossing statistics, Minkowski functionals, and a density extrema statistic, regardless of the dimensions in which each statistic is defined. The relation between the Minkowski functionals and other geometrical statistics is clarified. These statistics can be applied to several cosmic fields, including three-dimensional density field, three-dimensional velocity field, two-dimensional projected density field, and so forth. The results are detailed for second-order theory of the formalism. The effect of the bias is discussed. The statistics of smoothed cosmic fields as functions of rescaled threshold by volume fraction are discussed in the framework of second-order perturbation theory. In CDM-like models, their functional deviations from linear predictions plotted against the rescaled threshold are generally much smaller than that plotted against the direct threshold. There is still a slight meatball shift against rescaled threshold, which is characterized by asymmetry in depths of troughs in the genus curve. A theory-motivated asymmetry factor in the genus curve is proposed.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Graesser, Jordan B; Cheriyadat, Anil M; Vatsavai, Raju

    The high rate of global urbanization has resulted in a rapid increase in informal settlements, which can be defined as unplanned, unauthorized, and/or unstructured housing. Techniques for efficiently mapping these settlement boundaries can benefit various decision making bodies. From a remote sensing perspective, informal settlements share unique spatial characteristics that distinguish them from other types of structures (e.g., industrial, commercial, and formal residential). These spatial characteristics are often captured in high spatial resolution satellite imagery. We analyzed the role of spatial, structural, and contextual features (e.g., GLCM, Histogram of Oriented Gradients, Line Support Regions, Lacunarity) for urban neighborhood mapping, and computed several low-level image features at multiple scales to characterize local neighborhoods. The decision parameters to classify formal-, informal-, and non-settlement classes were learned under Decision Trees and a supervised classification framework. Experiments were conducted on high-resolution satellite imagery from the CitySphere collection, and four different cities (i.e., Caracas, Kabul, Kandahar, and La Paz) with varying spatial characteristics were represented. Overall accuracy ranged from 85% in La Paz, Bolivia, to 92% in Kandahar, Afghanistan. While the disparities between formal and informal neighborhoods varied greatly, many of the image statistics tested proved robust.

  16. Granular statistical mechanics - Building on the legacy of Sir Sam Edwards

    NASA Astrophysics Data System (ADS)

    Blumenfeld, Raphael

    When Sir Sam Edwards laid down the foundations for the statistical mechanics of jammed granular materials he opened a new field in soft condensed matter and many followed. In this presentation we review briefly the Edwards formalism and some of its less discussed consequences. We point out that the formalism is useful for other classes of systems - cellular and porous materials. A certain shortcoming of the original formalism is then discussed and a modification to overcome it is proposed. Finally, a derivation of an equation of state with the new formalism is presented; the equation of state is analogous to the PVT relation for thermal gases, relating the volume, the boundary stress and measures of the structural and stress fluctuations.

  17. Social and Spill-Over Benefits as Motivating Factors to Investment in Formal Education in Africa: A Reflection around Ghanaian, Kenyan and Rwandan Contexts

    ERIC Educational Resources Information Center

    Ampofo, S. Y.; Bizimana, B.; Ndayambaje, I.; Karongo, V.; Lawrence, K. Lyn; Orodho, J. A.

    2015-01-01

    This study examined the social and spill-over benefits as motivating factors to investment in formal education in selected countries in Africa. The paper had three objectives, namely i) to profile the key statistics of formal schooling; ii) to examine formal education; and iii) to link national goals of education with expectations in Ghana, Kenya and…

  18. Effectiveness of groundwater governance structures and institutions in Tanzania

    NASA Astrophysics Data System (ADS)

    Gudaga, J. L.; Kabote, S. J.; Tarimo, A. K. P. R.; Mosha, D. B.; Kashaigili, J. J.

    2018-05-01

    This paper examines the effectiveness of groundwater governance structures and institutions in Mbarali District, Mbeya Region. The paper adopts an exploratory sequential research design to collect quantitative and qualitative data. A random sample of 90 groundwater users with 50% women was involved in the survey. Descriptive statistics, the Kruskal-Wallis H test and the Mann-Whitney U test were used to compare the differences in responses between groups, while qualitative data were subjected to content analysis. The results show that the Village Councils and Community Water Supply Organizations (COWSOs) were effective in governing groundwater. The results also show a statistically significant difference in the overall extent of effectiveness of the Village Councils in governing groundwater between villages (P = 0.0001), yet there was no significant difference (P > 0.05) between male and female responses on the effectiveness of Village Councils, village water committees and COWSOs. The Mann-Whitney U test showed a statistically significant difference between male and female responses on the effectiveness of formal and informal institutions (P = 0.0001), such that informal institutions were more effective than formal institutions. The Kruskal-Wallis H test also showed a statistically significant difference (P ≤ 0.05) in the extent of effectiveness of formal institutions, norms and values between low, medium and high categories. The paper concludes that COWSOs were more effective in governing groundwater than other groundwater governance structures. Similarly, norms and values were more effective than formal institutions. The paper recommends sensitization and awareness creation on formal institutions so that they can influence water users' behaviour to govern groundwater.
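
    The two rank-based tests named above are available in scipy; the sketch below runs them on hypothetical ordinal effectiveness scores (the real survey data are not reproduced in the abstract):

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(5)

      # Hypothetical 5-point effectiveness scores from two respondent groups
      # and from three villages.
      male = rng.integers(2, 6, 45)
      female = rng.integers(1, 5, 45)
      villages = [rng.integers(3, 6, 30), rng.integers(2, 5, 30), rng.integers(1, 4, 30)]

      u_stat, p_mw = stats.mannwhitneyu(male, female, alternative="two-sided")
      h_stat, p_kw = stats.kruskal(*villages)
      print(f"Mann-Whitney U = {u_stat:.1f}, p = {p_mw:.4f}")
      print(f"Kruskal-Wallis H = {h_stat:.1f}, p = {p_kw:.4f}")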

  19. Frequency-resolved Monte Carlo.

    PubMed

    López Carreño, Juan Camilo; Del Valle, Elena; Laussy, Fabrice P

    2018-05-03

    We adapt the Quantum Monte Carlo method to the cascaded formalism of quantum optics, allowing us to simulate the emission of photons of known energy. Statistical processing of the photon clicks thus collected agrees with the theory of frequency-resolved photon correlations, extending the range of applications based on correlations of photons of prescribed energy, in particular those of a photon-counting character. We apply the technique to autocorrelations of photon streams from a two-level system under coherent and incoherent pumping, including the Mollow triplet regime where we demonstrate the direct manifestation of leapfrog processes in producing an increased rate of two-photon emission events.

  20. Population forecasts for Bangladesh, using a Bayesian methodology.

    PubMed

    Mahsin, Md; Hossain, Syed Shahadat

    2012-12-01

    Population projection for many developing countries could be quite a challenging task for demographers, mostly due to the lack of enough reliable data. The objective of this paper is to present an overview of the existing methods for population forecasting and to propose an alternative based on Bayesian statistics, combining the formality of inference with expert judgement. The analysis has been made using the Markov Chain Monte Carlo (MCMC) technique for Bayesian methodology available with the software WinBUGS. Convergence diagnostic techniques available with the WinBUGS software have been applied to ensure the convergence of the chains necessary for the implementation of MCMC. The Bayesian approach allows for the use of observed data and expert judgements by means of appropriate priors, and a more realistic population forecast, along with associated uncertainty, has been possible.
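
    The paper's model is implemented in WinBUGS and is not reproduced in the abstract; purely as an illustration of the MCMC machinery involved, the sketch below fits an assumed exponential-growth model to illustrative census-like counts with a hand-rolled random-walk Metropolis sampler and reports a forecast interval.

      import numpy as np

      rng = np.random.default_rng(6)

      # Illustrative census-like counts (millions) at ten-year intervals.
      years = np.array([1981.0, 1991.0, 2001.0, 2011.0])
      pop = np.array([87.0, 106.0, 124.0, 144.0])

      def log_posterior(theta):
          """Exponential growth pop = exp(log_p0 + r t), Gaussian likelihood
          (sigma fixed at 2) and weakly informative normal priors."""
          log_p0, r = theta
          mu = np.exp(log_p0 + r * (years - years[0]))
          log_lik = -0.5 * np.sum(((pop - mu) / 2.0) ** 2)
          log_prior = -0.5 * ((log_p0 - 4.5) / 1.0) ** 2 - 0.5 * (r / 0.05) ** 2
          return log_lik + log_prior

      theta = np.array([4.5, 0.02])            # random-walk Metropolis sampler
      draws = []
      for _ in range(20000):
          proposal = theta + rng.normal(0.0, [0.02, 0.002])
          if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(theta):
              theta = proposal
          draws.append(theta.copy())
      draws = np.array(draws[5000:])           # discard burn-in

      forecast = np.exp(draws[:, 0] + draws[:, 1] * (2031.0 - years[0]))
      lo, mid, hi = np.percentile(forecast, [2.5, 50.0, 97.5])
      print(f"2031 forecast: {mid:.0f}M (95% interval {lo:.0f}-{hi:.0f}M)")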

  1. Modeling, simulation, and estimation of optical turbulence

    NASA Astrophysics Data System (ADS)

    Formwalt, Byron Paul

    This dissertation documents three new contributions to simulation and modeling of optical turbulence. The first contribution is the formalization, optimization, and validation of a modeling technique called successively conditioned rendering (SCR). The SCR technique is empirically validated by comparing the statistical error of random phase screens generated with the technique. The second contribution is the derivation of the covariance delineation theorem, which provides theoretical bounds on the error associated with SCR. It is shown empirically that the theoretical bound may be used to predict relative algorithm performance. Therefore, the covariance delineation theorem is a powerful tool for optimizing SCR algorithms. For the third contribution, we introduce a new method for passively estimating optical turbulence parameters, and demonstrate the method using experimental data. The technique was demonstrated experimentally, using a 100 m horizontal path at 1.25 m above sun-heated tarmac on a clear afternoon. For this experiment, we estimated C_n^2 ≈ 6.01 · 10^-9 m^(-2/3), l_0 ≈ 17.9 mm, and L_0 ≈ 15.5 m.

  2. Statistical normalization techniques for magnetic resonance imaging.

    PubMed

    Shinohara, Russell T; Sweeney, Elizabeth M; Goldsmith, Jeff; Shiee, Navid; Mateen, Farrah J; Calabresi, Peter A; Jarso, Samson; Pham, Dzung L; Reich, Daniel S; Crainiceanu, Ciprian M

    2014-01-01

    While computed tomography and other imaging techniques are measured in absolute units with physical meaning, magnetic resonance images are expressed in arbitrary units that are difficult to interpret and differ between study visits and subjects. Much work in the image processing literature on intensity normalization has focused on histogram matching and other histogram mapping techniques, with little emphasis on normalizing images to have biologically interpretable units. Furthermore, there are no formalized principles or goals for the crucial comparability of image intensities within and across subjects. To address this, we propose a set of criteria necessary for the normalization of images. We further propose simple and robust biologically motivated normalization techniques for multisequence brain imaging that have the same interpretation across acquisitions and satisfy the proposed criteria. We compare the performance of different normalization methods in thousands of images of patients with Alzheimer's disease, hundreds of patients with multiple sclerosis, and hundreds of healthy subjects obtained in several different studies at dozens of imaging centers.
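
    One of the simplest biologically motivated normalizations in this family is a within-brain-mask z-score; the sketch below (with synthetic volumes standing in for real acquisitions) shows how it puts two visits acquired in different arbitrary units onto a common, interpretable scale.

      import numpy as np

      def zscore_normalize(image, mask):
          """Rescale intensities to standard-deviation units about the mean
          computed over a brain mask, so values are comparable across scans."""
          brain = image[mask]
          return (image - brain.mean()) / brain.std()

      rng = np.random.default_rng(7)
      mask = np.ones((16, 16, 16), dtype=bool)
      visit1 = rng.normal(900.0, 120.0, mask.shape)   # arbitrary scanner units
      visit2 = rng.normal(400.0, 60.0, mask.shape)    # same anatomy, rescaled

      for vol in (zscore_normalize(visit1, mask), zscore_normalize(visit2, mask)):
          print(round(vol[mask].mean(), 3), round(vol[mask].std(), 3))  # ~0 and ~1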

  3. Finding Bounded Rational Equilibria. Part 2; Alternative Lagrangians and Uncountable Move Spaces

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.

    2004-01-01

    A long-running difficulty with conventional game theory has been how to modify it to accommodate the bounded rationality characterizing all real-world players. A recurring issue in statistical physics is how best to approximate joint probability distributions with decoupled (and therefore far more tractable) distributions. It has recently been shown that the same information theoretic mathematical structure, known as Probability Collectives (PC) underlies both issues. This relationship between statistical physics and game theory allows techniques and insights from the one field to be applied to the other. In particular, PC provides a formal model-independent definition of the degree of rationality of a player and of bounded rationality equilibria. This pair of papers extends previous work on PC by introducing new computational approaches to effectively find bounded rationality equilibria of common-interest (team) games.

  4. Laser amplification of incoherent radiation

    NASA Technical Reports Server (NTRS)

    Menegozzi, L. N.; Lamb, W. E., Jr.

    1978-01-01

    The amplification of noise in a laser amplifier is treated theoretically. The model for the active medium and its description using density-matrix techniques are taken from the theory of laser operation. The spectral behavior of the radiation in the nonlinear regime is studied and the formalism is written from the outset in the frequency domain. The statistics of the light are gradually modified by the nonlinear amplification process, and expressions are derived for the rate of change of fluctuations in intensity as a measure of statistical changes. In addition, the range of validity of Litvak's Gaussian-statistics approximation is discussed. In the homogeneous-broadening case, the evolution of initially broadband Gaussian radiation toward quasimonochromatic oscillations with laserlike statistics is explored in several numerical examples. The connections of this study with the time-domain work on self-pulsing in a ring-laser configuration are established. Finally, spectral-narrowing and -rebroadening effects in Doppler-broadened media are discussed both analytically and with numerical examples. These examples show the distinct contributions of pulsations in the population ('Raman-type terms') and of saturation phenomena.

  5. Formalizing structured file services for the data storage and retrieval subsystem of the data management system for Spacestation Freedom

    NASA Technical Reports Server (NTRS)

    Jamsek, Damir A.

    1993-01-01

    A brief example of the use of formal methods techniques in the specification of a software system is presented. The report is part of a larger effort targeted at defining a formal methods pilot project for NASA. One possible application domain that may be used to demonstrate the effective use of formal methods techniques within the NASA environment is presented. It is not intended to provide a tutorial on either formal methods techniques or the application being addressed. It should, however, provide an indication that the application being considered is suitable for a formal methods treatment by showing how such a task may be started. The particular system being addressed is the Structured File Services (SFS), which is a part of the Data Storage and Retrieval Subsystem (DSAR), which in turn is part of the Data Management System (DMS) onboard Spacestation Freedom. This is a software system that is currently under development for NASA. An informal mathematical development is presented. Section 3 contains the same development using Penelope (23), an Ada specification and verification system. The complete text of the English-language Software Requirements Specification (SRS) is reproduced in Appendix A.

  6. Decision trees in epidemiological research.

    PubMed

    Venkatasubramaniam, Ashwini; Wolfson, Julian; Mitchell, Nathan; Barnes, Timothy; JaKa, Meghan; French, Simone

    2017-01-01

    In many studies, it is of interest to identify population subgroups that are relatively homogeneous with respect to an outcome. The nature of these subgroups can provide insight into effect mechanisms and suggest targets for tailored interventions. However, identifying relevant subgroups can be challenging with standard statistical methods. We review the literature on decision trees, a family of techniques for partitioning the population, on the basis of covariates, into distinct subgroups that share similar values of an outcome variable. We compare two decision tree methods, the popular Classification and Regression Tree (CART) technique and the newer Conditional Inference Tree (CTree) technique, assessing their performance in a simulation study and using data from the Box Lunch Study, a randomized controlled trial of a portion size intervention. Both CART and CTree identify homogeneous population subgroups and offer improved prediction accuracy relative to regression-based approaches when subgroups are truly present in the data. An important distinction between CART and CTree is that the latter uses a formal statistical hypothesis testing framework in building decision trees, which simplifies the process of identifying and interpreting the final tree model. We also introduce a novel way to visualize the subgroups defined by decision trees; this graphical visualization provides a more scientifically meaningful characterization of the subgroups identified by decision trees. Decision trees are a useful tool for identifying homogeneous subgroups defined by combinations of individual characteristics. While all decision tree techniques generate subgroups, we advocate the use of the newer CTree technique due to its simplicity and ease of interpretation.
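
    CART-style trees are available in scikit-learn (CTree itself is an R method, provided by the partykit package), so the sketch below only illustrates the CART side on simulated data containing a genuine subgroup; the covariate names and the simulated effect are invented and are not the Box Lunch Study data.

      import numpy as np
      from sklearn.tree import DecisionTreeRegressor, export_text

      rng = np.random.default_rng(1)
      n = 500
      age = rng.uniform(20, 70, n)
      bmi = rng.uniform(18, 40, n)
      # outcome with a true subgroup effect: higher values when age > 50 and bmi > 30
      y = 2.0 + 1.5 * ((age > 50) & (bmi > 30)) + rng.normal(0, 0.5, n)

      X = np.column_stack([age, bmi])
      tree = DecisionTreeRegressor(max_depth=2, min_samples_leaf=50)  # CART-style splits
      tree.fit(X, y)

      # the printed rules define the recovered subgroups
      print(export_text(tree, feature_names=["age", "bmi"]))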

  7. Variational Bayesian Parameter Estimation Techniques for the General Linear Model

    PubMed Central

    Starke, Ludger; Ostwald, Dirk

    2017-01-01

    Variational Bayes (VB), variational maximum likelihood (VML), restricted maximum likelihood (ReML), and maximum likelihood (ML) are cornerstone parametric statistical estimation techniques in the analysis of functional neuroimaging data. However, the theoretical underpinnings of these model parameter estimation techniques are rarely covered in introductory statistical texts. Because of the widespread practical use of VB, VML, ReML, and ML in the neuroimaging community, we reasoned that a theoretical treatment of their relationships and their application in a basic modeling scenario may be helpful for both neuroimaging novices and practitioners alike. In this technical study, we thus revisit the conceptual and formal underpinnings of VB, VML, ReML, and ML and provide a detailed account of their mathematical relationships and implementational details. We further apply VB, VML, ReML, and ML to the general linear model (GLM) with non-spherical error covariance as commonly encountered in the first-level analysis of fMRI data. To this end, we explicitly derive the corresponding free energy objective functions and ensuing iterative algorithms. Finally, in the applied part of our study, we evaluate the parameter and model recovery properties of VB, VML, ReML, and ML, first in an exemplary setting and then in the analysis of experimental fMRI data acquired from a single participant under visual stimulation. PMID:28966572
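
    Reproducing the VB, VML and ReML algorithms is beyond a short snippet, but the closed-form maximum likelihood (generalized least squares) estimate of the GLM coefficients for a known error covariance V can be shown directly; the toy design matrix and the AR(1)-style covariance below are invented stand-ins for a first-level fMRI analysis.

      import numpy as np

      rng = np.random.default_rng(2)
      T = 100                                   # number of scans (hypothetical)
      X = np.column_stack([np.ones(T), np.sin(np.linspace(0, 6 * np.pi, T))])  # toy design

      # non-spherical error covariance: AR(1)-like correlation with rho = 0.4
      rho = 0.4
      V = rho ** np.abs(np.subtract.outer(np.arange(T), np.arange(T)))

      beta_true = np.array([1.0, 0.5])
      y = X @ beta_true + np.linalg.cholesky(V) @ rng.normal(size=T)

      # ML / generalized least squares estimate for known V:
      # beta_hat = (X' V^-1 X)^-1 X' V^-1 y
      Vinv = np.linalg.inv(V)
      beta_hat = np.linalg.solve(X.T @ Vinv @ X, X.T @ Vinv @ y)
      print(beta_hat)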

  8. Organizational Decision Making

    DTIC Science & Technology

    1975-08-01

    the lack of formal techniques typically used by large organizations, digress on the advantages of formal over informal... optimization; for example, one might do a number of optimization calculations, each time using a different measure of effectiveness as the optimized... final decision. The next level of computer application involves the use of computerized optimization techniques. Optimization

  9. Creative Process: Its Use and Extent of Formalization by Corporations.

    ERIC Educational Resources Information Center

    Fernald, Lloyd W., Jr.; Nickolenko, Pam

    1993-01-01

    This study reports creativity policies and practices used by Central Florida corporations. Survey responses (n=105) indicated that businesses are using a variety of creativity techniques with usage greater among the newer companies but that these techniques are not yet a formal part of business operations. (DB)

  10. Education Research: The challenge of incorporating formal research methodology training in a neurology residency.

    PubMed

    Leira, E C; Granner, M A; Torner, J C; Callison, R C; Adams, H P

    2008-05-13

    Physicians often do not have a good understanding of research methodology. Unfortunately, the mechanism to achieve this important competency in a busy neurology residency program remains unclear. We tested the value and degree of acceptance by neurology residents of a multimodal educational intervention that consisted of biweekly teaching sessions in place of an existing journal club, as a way to provide formal training in research and statistical techniques. We used a pre- and post-test design with an educational intervention in between, using neurology residents at the University of Iowa as subjects. Each test had 40 questions on research methodology. The educational intervention consisted of a biweekly, structured, topic-centered, research methodology-oriented elective seminar following a year-long predefined curriculum. An exit survey was offered to gather residents' perceptions about the course. While a majority of residents agreed that the intervention enhanced their knowledge of research methodology, only 23% attended more than 40% of the sessions. There was no difference between pretest and post-test scores (p = 0.40). Our experience suggests that, in order to accomplish the Accreditation Council for Graduate Medical Education goals regarding increasing competency of residents in knowledge about research methodology, a major restructuring of the neurology residency curriculum with more intense formal training would be necessary.

  11. Formalizing Space Shuttle Software Requirements

    NASA Technical Reports Server (NTRS)

    Crow, Judith; DiVito, Ben L.

    1996-01-01

    This paper describes two case studies in which requirements for new flight-software subsystems on NASA's Space Shuttle were analyzed, one using standard formal specification techniques, the other using state exploration. These applications serve to illustrate three main theses: (1) formal methods can complement conventional requirements analysis processes effectively, (2) formal methods confer benefits regardless of how extensively they are adopted and applied, and (3) formal methods are most effective when they are judiciously tailored to the application.

  12. From the necessary to the possible: the genesis of the spin-statistics theorem

    NASA Astrophysics Data System (ADS)

    Blum, Alexander

    2014-12-01

    The spin-statistics theorem, which relates the intrinsic angular momentum of a single particle to the type of quantum statistics obeyed by a system of many such particles, is one of the central theorems in quantum field theory and the physics of elementary particles. It was first formulated in 1939/40 by Wolfgang Pauli and his assistant Markus Fierz. This paper discusses the developments that led up to this first formulation, starting from early attempts in the late 1920s to explain why charged matter particles obey Fermi-Dirac statistics, while photons obey Bose-Einstein statistics. It is demonstrated how several important developments paved the way from such general philosophical musings to a general (and provable) theorem, most notably the use of quantum field theory, the discovery of new elementary particles, and the generalization of the notion of spin. It is also discussed how the attempts to prove a spin-statistics connection were driven by Pauli from formal to more physical arguments, culminating in Pauli's 1940 proof. This proof was a major success for the beleaguered theory of quantum field theory and the methods Pauli employed proved essential for the renaissance of quantum field theory and the development of renormalization techniques in the late 1940s.

  13. Mapping of Primary Instructional Methods and Teaching Techniques for Regularly Scheduled, Formal Teaching Sessions in an Anesthesia Residency Program.

    PubMed

    Vested Madsen, Matias; Macario, Alex; Yamamoto, Satoshi; Tanaka, Pedro

    2016-06-01

    In this study, we examined the regularly scheduled, formal teaching sessions in a single anesthesiology residency program to (1) map the most common primary instructional methods, (2) map the use of 10 known teaching techniques, and (3) assess if residents scored sessions that incorporated active learning as higher quality than sessions with little or no verbal interaction between teacher and learner. A modified Delphi process was used to identify useful teaching techniques. A representative sample of each of the formal teaching session types was mapped, and residents anonymously completed a 5-question written survey rating the session. The most common primary instructional methods were computer slides-based classroom lectures (66%), workshops (15%), simulations (5%), and journal club (5%). The number of teaching techniques used per formal teaching session averaged 5.31 (SD, 1.92; median, 5; range, 0-9). Clinical applicability (85%) and attention grabbers (85%) were the 2 most common teaching techniques. Thirty-eight percent of the sessions defined learning objectives, and one-third of sessions engaged in active learning. The overall survey response rate equaled 42%, and passive sessions had a mean score of 8.44 (range, 5-10; median, 9; SD, 1.2) compared with a mean score of 8.63 (range, 5-10; median, 9; SD, 1.1) for active sessions (P = 0.63). Slides-based classroom lectures were the most common instructional method, and faculty used an average of 5 known teaching techniques per formal teaching session. The overall education scores of the sessions as rated by the residents were high.

  14. New Technologies and Learning Environments: A Perspective from Formal and Non-Formal Education in Baja California, Mexico

    ERIC Educational Resources Information Center

    Zamora, Julieta Lopez; Reynaga, Francisco Javier Arriaga

    2010-01-01

    This paper presents results of two research works, the first approaches non-formal education and the second addresses formal education. In both studies in-depth interview techniques were used. There were some points of convergence between them on aspects such as the implementation of learning environments and the integration of ICT. The interview…

  15. Formal verification of AI software

    NASA Technical Reports Server (NTRS)

    Rushby, John; Whitehurst, R. Alan

    1989-01-01

    The application of formal verification techniques to Artificial Intelligence (AI) software, particularly expert systems, is investigated. Constraint satisfaction and model inversion are identified as two formal specification paradigms for different classes of expert systems. A formal definition of consistency is developed, and the notion of approximate semantics is introduced. Examples are given of how these ideas can be applied in both declarative and imperative forms.

  16. Master equation theory applied to the redistribution of polarized radiation in the weak radiation field limit. V. The two-term atom

    NASA Astrophysics Data System (ADS)

    Bommier, Véronique

    2017-11-01

    Context. In previous papers of this series, we presented a formalism able to account for both the statistical equilibrium of a multilevel atom and coherent and incoherent scattering (partial redistribution). Aims: This paper provides theoretical expressions of the redistribution function for the two-term atom. This redistribution function includes both coherent (RII) and incoherent (RIII) scattering contributions with their branching ratios. Methods: The expressions were derived by applying the formalism outlined above. The statistical equilibrium equation for the atomic density matrix is first formally solved in the case of the two-term atom with unpolarized and infinitely sharp lower levels. Then the redistribution function is derived by substituting this solution into the expression for the emissivity. Results: Expressions are provided for both magnetic and non-magnetic cases. Atomic fine structure is taken into account. Expressions are also provided separately for zero and non-zero hyperfine structure. Conclusions: Redistribution functions are widely used in radiative transfer codes. In our formulation, collisional transitions between Zeeman sublevels within an atomic level (the depolarizing collisions effect) are taken into account when possible (i.e., in the non-magnetic case). However, the need for a formal solution of the statistical equilibrium as a preliminary step prevents us from taking into account collisional transfers between the levels of the upper term. Accounting for these collisional transfers could be done via a numerical solution of the statistical equilibrium equation system.

  17. Influence of Culture on Secondary School Students' Understanding of Statistics: A Fijian Perspective

    ERIC Educational Resources Information Center

    Sharma, Sashi

    2014-01-01

    Although we use statistical notions daily in making decisions, research in statistics education has focused mostly on formal statistics. Further, everyday culture may influence informal ideas of statistics. Yet, there appears to be minimal literature that deals with the educational implications of the role of culture. This paper will discuss the…

  18. Formal Methods for Life-Critical Software

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Johnson, Sally C.

    1993-01-01

    The use of computer software in life-critical applications, such as for civil air transports, demands the use of rigorous formal mathematical verification procedures. This paper demonstrates how to apply formal methods to the development and verification of software by leading the reader step-by-step through requirements analysis, design, implementation, and verification of an electronic phone book application. The current maturity and limitations of formal methods tools and techniques are then discussed, and a number of examples of the successful use of formal methods by industry are cited.

  19. Formalisms for user interface specification and design

    NASA Technical Reports Server (NTRS)

    Auernheimer, Brent J.

    1989-01-01

    The application of formal methods to the specification and design of human-computer interfaces is described. A broad outline of human-computer interface problems, a description of the field of cognitive engineering and two relevant research results, the appropriateness of formal specification techniques, and potential NASA application areas are described.

  20. Geometry and Formal Linguistics.

    ERIC Educational Resources Information Center

    Huff, George A.

    This paper presents a method of encoding geometric line-drawings in a way which allows sets of such drawings to be interpreted as formal languages. A characterization of certain geometric predicates in terms of their properties as languages is obtained, and techniques usually associated with generative grammars and formal automata are then applied…

  1. University of California Conference on Statistical Mechanics (4th) Held March 26-28, 1990

    DTIC Science & Technology

    1990-03-28

    and S. Lago, Chem. Phys., Z, 5750 (1983) Shear Viscosity Calculation via Equilibrium Molecular Dynamics: Einsteinian vs. Green-Kubo Formalism by Adel A...through the application of the Green-Kubo approach. Although the theoretical equivalence between both formalisms was demonstrated by Helfand [3], their...like equations and of different expressions based on the Green-Kubo formalism. In contrast to Hoheisel and Vogelsang's conclusions [2], we find that

  2. Statistical mechanics of few-particle systems: exact results for two useful models

    NASA Astrophysics Data System (ADS)

    Miranda, Enrique N.

    2017-11-01

    The statistical mechanics of small clusters (n ~ 10-50 elements) of harmonic oscillators and two-level systems is studied exactly, following the microcanonical, canonical and grand canonical formalisms. For clusters with several hundred particles, the results from the three formalisms coincide with those found in the thermodynamic limit. However, for clusters formed by a few tens of elements, the three ensembles yield different results. For a cluster with a few tens of harmonic oscillators, when the heat capacity per oscillator is evaluated within the canonical formalism, it reaches a limit value equal to k_B, as in the thermodynamic case, while within the microcanonical formalism the limit value is k_B(1 - 1/n). This difference could be measured experimentally. For a cluster with a few tens of two-level systems, the heat capacity evaluated within the canonical and microcanonical ensembles also presents differences that could be detected experimentally. Both the microcanonical and grand canonical formalisms show that the entropy is non-additive for systems this small, while the canonical ensemble reaches the opposite conclusion. These results suggest that the microcanonical ensemble is the most appropriate for dealing with systems with tens of particles.
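
    The quoted limits are easy to evaluate numerically: a per-oscillator heat capacity of k_B in the canonical ensemble versus k_B(1 - 1/n) in the microcanonical one. The short script below simply tabulates the relative gap, which is 1/n and so sits at the percent level for clusters of a few tens of oscillators.

      KB = 1.380649e-23  # Boltzmann constant, J/K

      for n in (10, 20, 50, 200):
          c_canonical = KB                      # limiting per-oscillator value, canonical
          c_microcanonical = KB * (1 - 1 / n)   # limiting per-oscillator value, microcanonical
          rel_diff = (c_canonical - c_microcanonical) / c_canonical
          print(f"n = {n:4d}   relative difference = {rel_diff:.3f}")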

  3. Redshift data and statistical inference

    NASA Technical Reports Server (NTRS)

    Newman, William I.; Haynes, Martha P.; Terzian, Yervant

    1994-01-01

    Frequency histograms and the 'power spectrum analysis' (PSA) method, the latter developed by Yu & Peebles (1969), have been widely employed as techniques for establishing the existence of periodicities. We provide a formal analysis of these two classes of methods, including controlled numerical experiments, to better understand their proper use and application. In particular, we note that typical published applications of frequency histograms commonly employ far greater numbers of class intervals or bins than is advisable by statistical theory, sometimes giving rise to the appearance of spurious patterns. The PSA method generates a sequence of random numbers from observational data which, it is claimed, is exponentially distributed with unit mean and variance, essentially independent of the distribution of the original data. We show that the derived random process is nonstationary and produces a small but systematic bias in the usual estimate of the mean and variance. Although the derived variable may be reasonably described by an exponential distribution, the tail of the distribution is far removed from that of an exponential, thereby rendering statistical inference and confidence testing based on the tail of the distribution completely unreliable. Finally, we examine a number of astronomical examples wherein these methods have been used, giving rise to widespread acceptance of statistically unconfirmed conclusions.
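
    The over-binning point can be made concrete with standard bin-width rules; NumPy implements the Sturges and Freedman-Diaconis rules in histogram_bin_edges. The synthetic recession velocities below are only a placeholder for real redshift data.

      import numpy as np

      rng = np.random.default_rng(3)
      cz = rng.normal(15000.0, 4000.0, size=250)   # synthetic recession velocities, km/s

      for rule in ("sturges", "fd"):
          edges = np.histogram_bin_edges(cz, bins=rule)
          print(f"{rule:8s}: {len(edges) - 1:3d} bins suggested")

      # an over-binned histogram of the same data, of the kind criticized above
      counts, _ = np.histogram(cz, bins=100)
      print("over-binned: 100 bins, mean count per bin =", counts.mean())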

  4. Forensic detection of noise addition in digital images

    NASA Astrophysics Data System (ADS)

    Cao, Gang; Zhao, Yao; Ni, Rongrong; Ou, Bo; Wang, Yongbin

    2014-03-01

    We propose a technique to detect the global addition of noise to a digital image. As an anti-forensics tool, noise addition is typically used to disguise the visual traces of image tampering or to remove the statistical artifacts left behind by other operations. As such, the blind detection of noise addition has become imperative as well as beneficial to authenticate the image content and recover the image processing history, which is the goal of general forensics techniques. Specifically, special image blocks, including constant and strip ones, are used to construct the features for identifying noise addition manipulation. The influence of noising on the blockwise pixel value distribution is formulated and analyzed formally. The methodology of detectability recognition followed by binary decision is proposed to ensure the applicability and reliability of noise detection. Extensive experimental results demonstrate the efficacy of our proposed noise detector.

  5. A New Statistic for Evaluating Item Response Theory Models for Ordinal Data. CRESST Report 839

    ERIC Educational Resources Information Center

    Cai, Li; Monroe, Scott

    2014-01-01

    We propose a new limited-information goodness of fit test statistic C[subscript 2] for ordinal IRT models. The construction of the new statistic lies formally between the M[subscript 2] statistic of Maydeu-Olivares and Joe (2006), which utilizes first and second order marginal probabilities, and the M*[subscript 2] statistic of Cai and Hansen…

  6. Hybrid optical CDMA-FSO communications network under spatially correlated gamma-gamma scintillation.

    PubMed

    Jurado-Navas, Antonio; Raddo, Thiago R; Garrido-Balsells, José María; Borges, Ben-Hur V; Olmos, Juan José Vegas; Monroy, Idelfonso Tafur

    2016-07-25

    In this paper, we propose a new hybrid network solution based on asynchronous optical code-division multiple-access (OCDMA) and free-space optical (FSO) technologies for last-mile access networks, where fiber deployment is impractical. The architecture of the proposed hybrid OCDMA-FSO network is thoroughly described. The users access the network in a fully asynchronous manner by means of assigned fast frequency hopping (FFH)-based codes. In the FSO receiver, an equal gain-combining technique is employed along with intensity modulation and direct detection. New analytical formalisms for evaluating the average bit error rate (ABER) performance are also proposed. These formalisms, based on the spatially correlated gamma-gamma statistical model, are derived considering three distinct scenarios, namely, uncorrelated, totally correlated, and partially correlated channels. Numerical results show that users can successfully achieve error-free ABER levels for the three scenarios considered as long as forward error correction (FEC) algorithms are employed. Therefore, OCDMA-FSO networks can be a prospective alternative to deliver high-speed communication services to access networks with deficient fiber infrastructure.

  7. InGaAs tunnel diodes for the calibration of semi-classical and quantum mechanical band-to-band tunneling models

    NASA Astrophysics Data System (ADS)

    Smets, Quentin; Verreck, Devin; Verhulst, Anne S.; Rooyackers, Rita; Merckling, Clément; Van De Put, Maarten; Simoen, Eddy; Vandervorst, Wilfried; Collaert, Nadine; Thean, Voon Y.; Sorée, Bart; Groeseneken, Guido; Heyns, Marc M.

    2014-05-01

    Promising predictions are made for III-V tunnel-field-effect transistor (FET), but there is still uncertainty on the parameters used in the band-to-band tunneling models. Therefore, two simulators are calibrated in this paper; the first one uses a semi-classical tunneling model based on Kane's formalism, and the second one is a quantum mechanical simulator implemented with an envelope function formalism. The calibration is done for In0.53Ga0.47As using several p+/intrinsic/n+ diodes with different intrinsic region thicknesses. The dopant profile is determined by SIMS and capacitance-voltage measurements. Error bars are used based on statistical and systematic uncertainties in the measurement techniques. The obtained parameters are in close agreement with theoretically predicted values and validate the semi-classical and quantum mechanical models. Finally, the models are applied to predict the input characteristics of In0.53Ga0.47As n- and p-lineTFET, with the n-lineTFET showing competitive performance compared to MOSFET.

  8. Formalizing the definition of meta-analysis in Molecular Ecology.

    PubMed

    ArchMiller, Althea A; Bauer, Eric F; Koch, Rebecca E; Wijayawardena, Bhagya K; Anil, Ammu; Kottwitz, Jack J; Munsterman, Amelia S; Wilson, Alan E

    2015-08-01

    Meta-analysis, the statistical synthesis of pertinent literature to develop evidence-based conclusions, is relatively new to the field of molecular ecology, with the first meta-analysis published in the journal Molecular Ecology in 2003 (Slate & Phua 2003). The goal of this article is to formalize the definition of meta-analysis for the authors, editors, reviewers and readers of Molecular Ecology by completing a review of the meta-analyses previously published in this journal. We also provide a brief overview of the many components required for meta-analysis with a more specific discussion of the issues related to the field of molecular ecology, including the use and statistical considerations of Wright's FST and its related analogues as effect sizes in meta-analysis. We performed a literature review to identify articles published as 'meta-analyses' in Molecular Ecology, which were then evaluated by at least two reviewers. We specifically targeted Molecular Ecology publications because as a flagship journal in this field, meta-analyses published in Molecular Ecology have the potential to set the standard for meta-analyses in other journals. We found that while many of these reviewed articles were strong meta-analyses, others failed to follow standard meta-analytical techniques. One of these unsatisfactory meta-analyses was in fact a secondary analysis. Other studies attempted meta-analyses but lacked the fundamental statistics that are considered necessary for an effective and powerful meta-analysis. By drawing attention to the inconsistency of studies labelled as meta-analyses, we emphasize the importance of understanding the components of traditional meta-analyses to fully embrace the strengths of quantitative data synthesis in the field of molecular ecology. © 2015 John Wiley & Sons Ltd.

  9. EFL Teachers' Formal Assessment Practices Based on Exam Papers

    ERIC Educational Resources Information Center

    Kiliçkaya, Ferit

    2016-01-01

    This study reports initial findings from a small-scale qualitative study aimed at gaining insights into English language teachers' assessment practices in Turkey by examining the formal exam papers. Based on the technique of content analysis, formal exam papers were analyzed in terms of assessment items, language skills tested as well as the…

  10. Formal Attributes of Television Commercials: Subtle Ways of Transmitting Sex Stereotypes.

    ERIC Educational Resources Information Center

    Welch, Renate L.; And Others

    Differences in formal aspects of television commercials aimed at boys and those aimed at girls were investigated. Formal attributes were defined as production techniques such as action, pace, visual effects, dialogue and narration, background music and sound effects. Two aspects of content were also examined: aggressive behavior and the gender of…

  11. Proceedings of the Sixth NASA Langley Formal Methods (LFM) Workshop

    NASA Technical Reports Server (NTRS)

    Rozier, Kristin Yvonne (Editor)

    2008-01-01

    Today's verification techniques are hard-pressed to scale with the ever-increasing complexity of safety critical systems. Within the field of aeronautics alone, we find the need for verification of algorithms for separation assurance, air traffic control, auto-pilot, Unmanned Aerial Vehicles (UAVs), adaptive avionics, automated decision authority, and much more. Recent advances in formal methods have made verifying more of these problems realistic. Thus we need to continually re-assess what we can solve now and identify the next barriers to overcome. Only through an exchange of ideas between theoreticians and practitioners from academia to industry can we extend formal methods for the verification of ever more challenging problem domains. This volume contains the extended abstracts of the talks presented at LFM 2008: The Sixth NASA Langley Formal Methods Workshop held on April 30 - May 2, 2008 in Newport News, Virginia, USA. The topics of interest that were listed in the call for abstracts were: advances in formal verification techniques; formal models of distributed computing; planning and scheduling; automated air traffic management; fault tolerance; hybrid systems/hybrid automata; embedded systems; safety critical applications; safety cases; accident/safety analysis.

  12. Planetary mass function and planetary systems

    NASA Astrophysics Data System (ADS)

    Dominik, M.

    2011-02-01

    With planets orbiting stars, a planetary mass function should not be seen as a low-mass extension of the stellar mass function, but a proper formalism needs to take care of the fact that the statistical properties of planet populations are linked to the properties of their respective host stars. This can be accounted for by describing planet populations by means of a differential planetary mass-radius-orbit function, which together with the fraction of stars with given properties that are orbited by planets and the stellar mass function allows the derivation of all statistics for any considered sample. These fundamental functions provide a framework for comparing statistics that result from different observing techniques and campaigns which all have their very specific selection procedures and detection efficiencies. Moreover, recent results both from gravitational microlensing campaigns and radial-velocity surveys of stars indicate that planets tend to cluster in systems rather than being the lonely child of their respective parent star. While planetary multiplicity in an observed system becomes obvious with the detection of several planets, its quantitative assessment however comes with the challenge to exclude the presence of further planets. Current exoplanet samples begin to give us first hints at the population statistics, whereas pictures of planet parameter space in its full complexity call for samples that are 2-4 orders of magnitude larger. In order to derive meaningful statistics, however, planet detection campaigns need to be designed in such a way that well-defined fully deterministic target selection, monitoring and detection criteria are applied. The probabilistic nature of gravitational microlensing makes this technique an illustrative example of all the encountered challenges and uncertainties.

  13. Correlated electron-nuclear dynamics with conditional wave functions.

    PubMed

    Albareda, Guillermo; Appel, Heiko; Franco, Ignacio; Abedi, Ali; Rubio, Angel

    2014-08-22

    The molecular Schrödinger equation is rewritten in terms of nonunitary equations of motion for the nuclei (or electrons) that depend parametrically on the configuration of an ensemble of generally defined electronic (or nuclear) trajectories. This scheme is exact and does not rely on the tracing out of degrees of freedom. Hence, the use of trajectory-based statistical techniques can be exploited to circumvent the calculation of the computationally demanding Born-Oppenheimer potential-energy surfaces and nonadiabatic coupling elements. The concept of the potential-energy surface is restored by establishing a formal connection with the exact factorization of the full wave function. This connection is used to gain insight from a simplified form of the exact propagation scheme.

  14. Formal Techniques for Synchronized Fault-Tolerant Systems

    NASA Technical Reports Server (NTRS)

    DiVito, Ben L.; Butler, Ricky W.

    1992-01-01

    We present the formal verification of synchronizing aspects of the Reliable Computing Platform (RCP), a fault-tolerant computing system for digital flight control applications. The RCP uses NMR-style redundancy to mask faults and internal majority voting to purge the effects of transient faults. The system design has been formally specified and verified using the EHDM verification system. Our formalization is based on an extended state machine model incorporating snapshots of local processors' clocks.

  15. Nonlinear estimation of parameters in biphasic Arrhenius plots.

    PubMed

    Puterman, M L; Hrboticky, N; Innis, S M

    1988-05-01

    This paper presents a formal procedure for the statistical analysis of data on the thermotropic behavior of membrane-bound enzymes generated using the Arrhenius equation and compares the analysis to several alternatives. Data is modeled by a bent hyperbola. Nonlinear regression is used to obtain estimates and standard errors of the intersection of line segments, defined as the transition temperature, and slopes, defined as energies of activation of the enzyme reaction. The methodology allows formal tests of the adequacy of a biphasic model rather than either a single straight line or a curvilinear model. Examples on data concerning the thermotropic behavior of pig brain synaptosomal acetylcholinesterase are given. The data support the biphasic temperature dependence of this enzyme. The methodology represents a formal procedure for statistical validation of any biphasic data and allows for calculation of all line parameters with estimates of precision.
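
    As an illustration of the estimation step, the sketch below fits a simple broken-stick model (two line segments meeting at a breakpoint) to synthetic Arrhenius data with scipy.optimize.curve_fit; the smoothed bent-hyperbola parameterization used in the paper is not reproduced, and the data and starting values are invented.

      import numpy as np
      from scipy.optimize import curve_fit

      def broken_stick(x, x0, y0, slope1, slope2):
          # two line segments meeting at the breakpoint (x0, y0)
          return np.where(x < x0, y0 + slope1 * (x - x0), y0 + slope2 * (x - x0))

      # synthetic Arrhenius data: ln(rate) vs 1000/T with a change in activation energy
      rng = np.random.default_rng(4)
      inv_T = np.linspace(3.0, 3.6, 40)                    # 1000/T in 1/K
      lnk = broken_stick(inv_T, 3.3, -2.0, -4.0, -9.0) + rng.normal(0, 0.05, inv_T.size)

      popt, pcov = curve_fit(broken_stick, inv_T, lnk, p0=[3.3, -2.0, -4.0, -9.0])
      perr = np.sqrt(np.diag(pcov))                        # standard errors of the estimates
      print("breakpoint (1000/T):", popt[0], "+/-", perr[0])
      print("slopes (proportional to activation energies):", popt[2], popt[3])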

  16. Investigation of pore size and energy distributions by statistical physics formalism applied to agriculture products

    NASA Astrophysics Data System (ADS)

    Aouaini, Fatma; Knani, Salah; Yahia, Manel Ben; Bahloul, Neila; Ben Lamine, Abdelmottaleb; Kechaou, Nabil

    2015-12-01

    In this paper, we present a new investigation that allows determining the pore size distribution (PSD) in a porous medium. This PSD is obtained from the desorption isotherms of four varieties of olive leaves, by means of a statistical physics formalism and Kelvin's law. The results are compared with those obtained with scanning electron microscopy. The effect of temperature on the distribution function of pores has been studied. The influence of each parameter on the PSD is interpreted. A similar function, the adsorption energy distribution (AED), is deduced from the PSD.

  17. Model-Driven Test Generation of Distributed Systems

    NASA Technical Reports Server (NTRS)

    Easwaran, Arvind; Hall, Brendan; Schweiker, Kevin

    2012-01-01

    This report describes a novel test generation technique for distributed systems. Utilizing formal models and formal verification tools, specifically the Symbolic Analysis Laboratory (SAL) tool-suite from SRI, we present techniques to generate concurrent test vectors for distributed systems. These are initially explored within an informal test validation context and later extended to achieve full MC/DC coverage of the TTEthernet protocol operating within a system-centric context.

  18. A formal framework of scenario creation and analysis of extreme hydrological events

    NASA Astrophysics Data System (ADS)

    Lohmann, D.

    2007-12-01

    We present a formal framework for hydrological risk analysis. Different measures of risk are introduced, such as average annual loss or occurrence exceedance probability. These are important measures for, e.g., insurance companies to determine the cost of insurance. One key aspect of investigating the potential consequences of extreme hydrological events (floods and droughts) is the creation of meteorological scenarios that reflect realistic spatial and temporal patterns of precipitation and also have correct local statistics. 100,000 years of these meteorological scenarios are used in a calibrated rainfall-runoff-flood-loss-risk model to produce flood and drought events that have never been observed. The results of this hazard model are statistically analyzed and linked to socio-economic data and vulnerability functions to show the impact of severe flood events. We show results from the Risk Management Solutions (RMS) Europe Flood Model to introduce this formal framework.
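
    To give the two risk measures an operational meaning, the sketch below computes average annual loss and an occurrence exceedance probability from a simulated year-loss table; the Poisson-frequency/lognormal-severity event model is a toy stand-in, not the RMS Europe Flood Model.

      import numpy as np

      rng = np.random.default_rng(5)
      n_years = 100_000                                    # length of the simulated catalog

      # toy event model: Poisson number of floods per year, lognormal loss per event
      n_events = rng.poisson(0.2, n_years)
      annual_loss = np.zeros(n_years)
      max_event_loss = np.zeros(n_years)
      for year, k in enumerate(n_events):
          if k:
              losses = rng.lognormal(mean=15.0, sigma=1.0, size=k)
              annual_loss[year] = losses.sum()
              max_event_loss[year] = losses.max()

      aal = annual_loss.mean()                             # average annual loss

      # occurrence exceedance probability: P(largest event loss in a year > threshold)
      threshold = 50e6
      oep = (max_event_loss > threshold).mean()
      print(f"AAL = {aal:,.0f}   OEP({threshold:,.0f}) = {oep:.5f}")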

  19. Strategies Used by Students to Compare Two Data Sets

    ERIC Educational Resources Information Center

    Reaburn, Robyn

    2012-01-01

    One of the common tasks of inferential statistics is to compare two data sets. Long before formal statistical procedures, however, students can be encouraged to make comparisons between data sets and therefore build up intuitive statistical reasoning. Such tasks also give meaning to the data collection students may do. This study describes the…

  20. For a statistical interpretation of Helmholtz' thermal displacement

    NASA Astrophysics Data System (ADS)

    Podio-Guidugli, Paolo

    2016-11-01

    Starting from the classic papers by Einstein and Langevin on Brownian motion, two consistent statistical interpretations are given for the thermal displacement, a scalar field formally introduced by Helmholtz, whose time derivative is by definition the absolute temperature.

  1. On the statistical distribution in a deformed solid

    NASA Astrophysics Data System (ADS)

    Gorobei, N. N.; Luk'yanenko, A. S.

    2017-09-01

    A modification of the Gibbs distribution in a thermally insulated, mechanically deformed solid is proposed, in which the linear dimensions (shape parameters) of the body are excluded from statistical averaging and included among the macroscopic parameters of state alongside the temperature. Formally, this modification reduces to corresponding additional conditions when calculating the statistical sum. The shape parameters and the temperature themselves are found from the conditions of mechanical and thermal equilibrium of the body, and their change is determined using the first law of thermodynamics. Known thermodynamic phenomena are analyzed for a simple model of a solid, i.e., an ensemble of anharmonic oscillators, within the proposed formalism to first order in the anharmonicity constant. The distribution modification is considered separately for the classical and quantum temperature regions.

  2. An experiment on the impact of a neonicotinoid pesticide on honeybees: the value of a formal analysis of the data.

    PubMed

    Schick, Robert S; Greenwood, Jeremy J D; Buckland, Stephen T

    2017-01-01

    We assess the analysis of the data resulting from a field experiment conducted by Pilling et al. (PLoS ONE. doi: 10.1371/journal.pone.0077193, 5) on the potential effects of thiamethoxam on honeybees. The experiment had low levels of replication, so Pilling et al. concluded that formal statistical analysis would be misleading. This would be true if such an analysis merely comprised tests of statistical significance and if the investigators concluded that lack of significance meant little or no effect. However, an analysis that includes estimation of the size of any effects, with confidence limits, allows one to reach conclusions that are not misleading and that produce useful insights. For the data of Pilling et al., we use straightforward statistical analysis to show that the confidence limits are generally so wide that any effects of thiamethoxam could have been large without being statistically significant. Instead of formal analysis, Pilling et al. simply inspected the data and concluded that they provided no evidence of detrimental effects and from this that thiamethoxam poses a "low risk" to bees. Conclusions derived from the inspection of the data were not just misleading in this case but also are unacceptable in principle, for if data are inadequate for a formal analysis (or only good enough to provide estimates with wide confidence intervals), then they are bound to be inadequate as a basis for reaching any sound conclusions. Given that the data in this case are largely uninformative with respect to the treatment effect, any conclusions reached from such informal approaches can do little more than reflect the prior beliefs of those involved.
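
    The statistical point (report an estimated effect with confidence limits rather than a bare significance verdict) can be illustrated with a two-sample comparison; the numbers below are invented and bear no relation to the Pilling et al. dataset.

      import numpy as np
      from scipy import stats

      # invented small-sample measurements for treated and control groups
      treated = np.array([10.1, 8.7, 11.5, 9.2])
      control = np.array([11.0, 10.4, 12.1, 9.8])

      diff = treated.mean() - control.mean()
      v_t = treated.var(ddof=1) / len(treated)
      v_c = control.var(ddof=1) / len(control)
      se = np.sqrt(v_t + v_c)

      # Welch-Satterthwaite degrees of freedom
      df = (v_t + v_c) ** 2 / (v_t ** 2 / (len(treated) - 1) + v_c ** 2 / (len(control) - 1))

      ci = diff + np.array([-1, 1]) * stats.t.ppf(0.975, df) * se
      print(f"difference = {diff:.2f}, 95% CI = [{ci[0]:.2f}, {ci[1]:.2f}]")
      # a wide interval spanning zero reads as "inconclusive", not as "no effect"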

  3. Symbolic dynamics techniques for complex systems: Application to share price dynamics

    NASA Astrophysics Data System (ADS)

    Xu, Dan; Beck, Christian

    2017-05-01

    The symbolic dynamics technique is well known for low-dimensional dynamical systems and chaotic maps, and lies at the roots of the thermodynamic formalism of dynamical systems. Here we show that this technique can also be successfully applied to time series generated by complex systems of much higher dimensionality. Our main example is the investigation of share price returns in a coarse-grained way. A nontrivial spectrum of Rényi entropies is found. We study how the spectrum depends on the time scale of returns, the sector of stocks considered, as well as the number of symbols used for the symbolic description. Overall our analysis confirms that in the symbol space transition probabilities of observed share price returns depend on the entire history of previous symbols, thus emphasizing the need for a modelling based on non-Markovian stochastic processes. Our method allows for quantitative comparisons of entirely different complex systems, for example the statistics of symbol sequences generated by share price returns using 4 symbols can be compared with that of genomic sequences.
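
    A minimal version of the coarse-graining goes as follows: map returns onto a small alphabet by quantiles, count words of fixed length, and evaluate the Renyi entropies H_q = ln(sum p_i^q) / (1 - q). The Gaussian returns used below are placeholders for real share-price data, and the paper's treatment of time scales and sectors is not reproduced.

      import numpy as np
      from collections import Counter

      rng = np.random.default_rng(6)
      returns = rng.normal(0, 1, 20_000)             # placeholder for share-price returns

      # coarse-grain into 4 symbols using quartiles of the empirical distribution
      edges = np.quantile(returns, [0.25, 0.5, 0.75])
      symbols = np.digitize(returns, edges)          # values in {0, 1, 2, 3}

      # empirical probabilities of words of length 3
      word_len = 3
      words = [tuple(symbols[i:i + word_len]) for i in range(len(symbols) - word_len + 1)]
      counts = np.array(list(Counter(words).values()), dtype=float)
      p = counts / counts.sum()

      def renyi_entropy(p, q):
          if q == 1.0:                               # Shannon limit
              return -np.sum(p * np.log(p))
          return np.log(np.sum(p ** q)) / (1.0 - q)

      for q in (0.5, 1.0, 2.0):
          print(f"H_{q} per word = {renyi_entropy(p, q):.3f}")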

  4. Convolutionless Nakajima-Zwanzig equations for stochastic analysis in nonlinear dynamical systems.

    PubMed

    Venturi, D; Karniadakis, G E

    2014-06-08

    Determining the statistical properties of stochastic nonlinear systems is of major interest across many disciplines. Currently, there are no general efficient methods to deal with this challenging problem that involves high dimensionality, low regularity and random frequencies. We propose a framework for stochastic analysis in nonlinear dynamical systems based on goal-oriented probability density function (PDF) methods. The key idea stems from techniques of irreversible statistical mechanics, and it relies on deriving evolution equations for the PDF of quantities of interest, e.g. functionals of the solution to systems of stochastic ordinary and partial differential equations. Such quantities could be low-dimensional objects in infinite dimensional phase spaces. We develop the goal-oriented PDF method in the context of the time-convolutionless Nakajima-Zwanzig-Mori formalism. We address the question of approximation of reduced-order density equations by multi-level coarse graining, perturbation series and operator cumulant resummation. Numerical examples are presented for stochastic resonance and stochastic advection-reaction problems.

  5. Convolutionless Nakajima–Zwanzig equations for stochastic analysis in nonlinear dynamical systems

    PubMed Central

    Venturi, D.; Karniadakis, G. E.

    2014-01-01

    Determining the statistical properties of stochastic nonlinear systems is of major interest across many disciplines. Currently, there are no general efficient methods to deal with this challenging problem that involves high dimensionality, low regularity and random frequencies. We propose a framework for stochastic analysis in nonlinear dynamical systems based on goal-oriented probability density function (PDF) methods. The key idea stems from techniques of irreversible statistical mechanics, and it relies on deriving evolution equations for the PDF of quantities of interest, e.g. functionals of the solution to systems of stochastic ordinary and partial differential equations. Such quantities could be low-dimensional objects in infinite dimensional phase spaces. We develop the goal-oriented PDF method in the context of the time-convolutionless Nakajima–Zwanzig–Mori formalism. We address the question of approximation of reduced-order density equations by multi-level coarse graining, perturbation series and operator cumulant resummation. Numerical examples are presented for stochastic resonance and stochastic advection–reaction problems. PMID:24910519

  6. Computational logic: its origins and applications.

    PubMed

    Paulson, Lawrence C

    2018-02-01

    Computational logic is the use of computers to establish facts in a logical formalism. Originating in nineteenth century attempts to understand the nature of mathematical reasoning, the subject now comprises a wide variety of formalisms, techniques and technologies. One strand of work follows the 'logic for computable functions (LCF) approach' pioneered by Robin Milner, where proofs can be constructed interactively or with the help of users' code (which does not compromise correctness). A refinement of LCF, called Isabelle, retains these advantages while providing flexibility in the choice of logical formalism and much stronger automation. The main application of these techniques has been to prove the correctness of hardware and software systems, but increasingly researchers have been applying them to mathematics itself.

  7. UML activity diagrams in requirements specification of logic controllers

    NASA Astrophysics Data System (ADS)

    Grobelna, Iwona; Grobelny, Michał

    2015-12-01

    Logic controller specification can be prepared using various techniques. One of them is the widely understood and user-friendly UML language and its activity diagrams. Using formal methods during the design phase increases the assurance that the implemented system meets the project requirements. In our approach, we use the model checking technique to formally verify a specification against user-defined behavioral requirements. The properties are usually defined as temporal logic formulas. In the paper we propose to use UML activity diagrams in requirements definition and then to formalize them as temporal logic formulas. As a result, UML activity diagrams can be used both for logic controller specification and for requirements definition, which simplifies the specification and verification process.
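
    As an example of the kind of behavioral requirement that ends up as a temporal logic formula, two properties for a hypothetical press controller are shown below (G reads "always", F "eventually"); the signal names are invented for illustration.

      \[
        \mathbf{G}\,\bigl(\mathit{GuardOpen} \rightarrow \neg\,\mathit{PressClose}\bigr)
        \qquad
        \mathbf{G}\,\bigl(\mathit{StartRequest} \rightarrow \mathbf{F}\,\mathit{PressCycle}\bigr)
      \]

    The first is a safety property (the press never closes while the guard is open); the second is a liveness property (every start request is eventually followed by a press cycle).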

  8. Stata companion.

    PubMed

    Brennan, Jennifer Sousa

    2010-01-01

    This chapter is an introductory reference guide highlighting some of the most common statistical topics, broken down into both command-line syntax and graphical interface point-and-click commands. This chapter serves to supplement more formal statistics lessons and expedite using Stata to compute basic analyses.

  9. Galaxy Redshifts from Discrete Optimization of Correlation Functions

    NASA Astrophysics Data System (ADS)

    Lee, Benjamin C. G.; Budavári, Tamás; Basu, Amitabh; Rahman, Mubdi

    2016-12-01

    We propose a new method of constraining the redshifts of individual extragalactic sources based on celestial coordinates and their ensemble statistics. Techniques from integer linear programming (ILP) are utilized to optimize simultaneously for the angular two-point cross- and autocorrelation functions. Our novel formalism introduced here not only transforms the otherwise hopelessly expensive, brute-force combinatorial search into a linear system with integer constraints but also is readily implementable in off-the-shelf solvers. We adopt Gurobi, a commercial optimization solver, and use Python to build the cost function dynamically. The preliminary results on simulated data show potential for future applications to sky surveys by complementing and enhancing photometric redshift estimators. Our approach is the first application of ILP to astronomical analysis.
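
    The abstract mentions Gurobi driven from Python; a license-free illustration of the same modelling style is easier to run, so the sketch below sets up a toy one-to-one assignment of three sources to three candidate redshift bins with the open-source PuLP modeller. The cost matrix is invented and the authors' actual correlation-function objective is not reproduced.

      import pulp

      # invented cost of assigning source i to redshift bin j (lower = better match)
      cost = [[1.0, 4.0, 6.0],
              [3.5, 0.5, 2.0],
              [5.0, 2.5, 1.5]]
      sources, bins = range(3), range(3)

      prob = pulp.LpProblem("toy_redshift_assignment", pulp.LpMinimize)
      x = pulp.LpVariable.dicts("x", [(i, j) for i in sources for j in bins], cat="Binary")

      # objective: total assignment cost
      prob += pulp.lpSum(cost[i][j] * x[(i, j)] for i in sources for j in bins)

      # each source goes into exactly one bin, and each bin receives exactly one source
      for i in sources:
          prob += pulp.lpSum(x[(i, j)] for j in bins) == 1
      for j in bins:
          prob += pulp.lpSum(x[(i, j)] for i in sources) == 1

      prob.solve(pulp.PULP_CBC_CMD(msg=False))
      for i in sources:
          for j in bins:
              if pulp.value(x[(i, j)]) > 0.5:
                  print(f"source {i} -> bin {j}")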

  10. Why interdisciplinary research enriches the study of crime. Comment on "Statistical physics of crime: A review" by M.R. D'Orsogna and M. Perc

    NASA Astrophysics Data System (ADS)

    Donnay, Karsten

    2015-03-01

    The past several years have seen a rapidly growing interest in the use of advanced quantitative methodologies and formalisms adapted from the natural sciences to study a broad range of social phenomena. The research field of computational social science [1,2], for example, uses digital artifacts of human online activity to cast a new light on social dynamics. Similarly, the studies reviewed by D'Orsogna and Perc showcase a diverse set of advanced quantitative techniques to study the dynamics of crime. Methods used range from partial differential equations and self-exciting point processes to agent-based models, evolutionary game theory and network science [3].

  11. Animal Social Network Theory Can Help Wildlife Conservation.

    PubMed

    Snijders, Lysanne; Blumstein, Daniel T; Stanley, Christina R; Franks, Daniel W

    2017-08-01

    Many animals preferentially associate with certain other individuals. This social structuring can influence how populations respond to changes to their environment, thus making network analysis a promising technique for understanding, predicting, and potentially manipulating population dynamics. Various network statistics can correlate with individual fitness components and key population-level processes, yet the logical role and formal application of animal social network theory for conservation and management have not been well articulated. We outline how understanding of direct and indirect relationships between animals can be profitably applied by wildlife managers and conservationists. By doing so, we aim to stimulate the development and implementation of practical tools for wildlife conservation and management and to inspire novel behavioral research in this field. Copyright © 2017 Elsevier Ltd. All rights reserved.
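
    For a flavor of the network statistics the authors refer to, the sketch below builds a small association network with networkx and computes degree and betweenness centrality; the edge list is invented.

      import networkx as nx

      # invented association records: (individual_a, individual_b, association strength)
      observations = [("A", "B", 5), ("A", "C", 2), ("B", "C", 4),
                      ("C", "D", 1), ("D", "E", 3), ("E", "F", 2)]

      G = nx.Graph()
      G.add_weighted_edges_from(observations)

      degree = nx.degree_centrality(G)            # how connected each individual is
      betweenness = nx.betweenness_centrality(G)  # how often it bridges other pairs

      for node in G.nodes:
          print(f"{node}: degree = {degree[node]:.2f}, betweenness = {betweenness[node]:.2f}")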

  12. Self-calibration of photometric redshift scatter in weak-lensing surveys

    DOE PAGES

    Zhang, Pengjie; Pen, Ue -Li; Bernstein, Gary

    2010-06-11

    Photo-z errors, especially catastrophic errors, are a major uncertainty for precision weak lensing cosmology. We find that the shear-(galaxy number) density and density-density cross correlation measurements between photo-z bins, available from the same lensing surveys, contain valuable information for self-calibration of the scattering probabilities between the true-z and photo-z bins. The self-calibration technique we propose does not rely on cosmological priors nor parameterization of the photo-z probability distribution function, and preserves all of the cosmological information available from shear-shear measurement. We estimate the calibration accuracy through the Fisher matrix formalism. We find that, for advanced lensing surveys such as the planned stage IV surveys, the rate of photo-z outliers can be determined with statistical uncertainties of 0.01-1% for z < 2 galaxies. Among the several sources of calibration error that we identify and investigate, the galaxy distribution bias is likely the most dominant systematic error, whereby photo-z outliers have different redshift distributions and/or bias than non-outliers from the same bin. This bias affects all photo-z calibration techniques based on correlation measurements. As a result, galaxy bias variations of O(0.1) produce biases in photo-z outlier rates similar to the statistical errors of our method, so this galaxy distribution bias may bias the reconstructed scatters at several-σ level, but is unlikely to completely invalidate the self-calibration technique.

  13. Assessing the predictors for training in management amongst hospital managers and chief executive officers: a cross-sectional study of hospitals in Abuja, Nigeria.

    PubMed

    Ochonma, Ogbonnia Godfrey; Nwatu, Stephen Ikechukwu

    2018-06-14

    There is a compelling need for management training amongst hospital managers in Nigeria, mostly because management was never a part of the curricula in medical schools, and this has resulted in deficiencies in effective policymaking, planning and bottom-line management. There has been, to the best of our knowledge, no study on the need for such training and the factors that may influence its acquisition by hospital managers, and this was the reason for this study. Data for this study came from a cross-sectional survey distributed amongst management staff in twenty-five (25) hospitals that were purposively selected. One hundred and twenty-five (125) questionnaires were distributed, out of which one hundred and four (104) were answered and returned, giving a response rate of 83.2%. Descriptive and inferential statistics were used to summarize the results. Decisions were made at the 5% level of significance. A binary logistic regression was performed on the data to predict the logit of being formally and informally trained in health management. These statistical analyses were done using IBM SPSS version 20. The results revealed a high level of formal and informal training amongst the respondent managers. In formal management training, only a few had no training (27.9%), while in informal management training all had obtained some form of training, of which in-service training predominated (84.6%). Most of the administrators/managers also intended to attend a healthcare management programme within the next five years (62.5%). Socio-demographically, age (p = .032) and academic qualification (p < .001) had a significant influence on training. Number of hospital beds (p < .001) and number of staff (p < .001), as well as managers' current designation (p < .001), also had a significant influence on training. Our work established the critical need for both formal and informal training in health management for health care managers. Emphasis on training should be directed at younger managers, who are the least likely to acquire such training, at smaller and private hospitals, which are less likely to encourage such training amongst their staff, and at the least educated amongst health managers.

  14. Application of multivariate Gaussian detection theory to known non-Gaussian probability density functions

    NASA Astrophysics Data System (ADS)

    Schwartz, Craig R.; Thelen, Brian J.; Kenton, Arthur C.

    1995-06-01

    A statistical parametric multispectral sensor performance model was developed by ERIM to support mine field detection studies, multispectral sensor design/performance trade-off studies, and target detection algorithm development. The model assumes target detection algorithms and their performance models which are based on data assumed to obey multivariate Gaussian probability distribution functions (PDFs). The applicability of these algorithms and performance models can be generalized to data having non-Gaussian PDFs through the use of transforms which convert non-Gaussian data to Gaussian (or near-Gaussian) data. An example of one such transform is the Box-Cox power law transform. In practice, such a transform can be applied to non-Gaussian data prior to the introduction of a detection algorithm that is formally based on the assumption of multivariate Gaussian data. This paper presents an extension of these techniques to the case where the joint multivariate probability density function of the non-Gaussian input data is known, and where the joint estimate of the multivariate Gaussian statistics, under the Box-Cox transform, is desired. The jointly estimated multivariate Gaussian statistics can then be used to predict the performance of a target detection algorithm which has an associated Gaussian performance model.
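
    The Box-Cox step mentioned here is available off the shelf in SciPy; the sketch below Gaussianizes a skewed, strictly positive band of simulated values before any Gaussian-assumption detector would be applied. All data are simulated and the detection algorithm itself is not shown.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(7)
      band = rng.lognormal(mean=0.0, sigma=0.8, size=5000)   # skewed, non-Gaussian values

      transformed, lam = stats.boxcox(band)                  # lambda estimated by ML
      print(f"estimated Box-Cox lambda = {lam:.3f}")

      # normality is far more plausible after the transform (much larger p-value)
      print("raw        :", stats.normaltest(band).pvalue)
      print("transformed:", stats.normaltest(transformed).pvalue)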

  15. Exact Solution of the Two-Level System and the Einstein Solid in the Microcanonical Formalism

    ERIC Educational Resources Information Center

    Bertoldi, Dalia S.; Bringa, Eduardo M.; Miranda, E. N.

    2011-01-01

    The two-level system and the Einstein model of a crystalline solid are taught in every course of statistical mechanics and they are solved in the microcanonical formalism because the number of accessible microstates can be easily evaluated. However, their solutions are usually presented using the Stirling approximation to deal with factorials. In…
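
    The role of the Stirling approximation is easy to check numerically: for a two-level system with N particles and n excitations, the exact microcanonical entropy involves ln C(N, n), which math.lgamma gives without approximation. The comparison below, for illustration only, shows how poor the Stirling form ln x! ≈ x ln x - x is at small N.

      import math

      def ln_binomial_exact(N, n):
          # ln C(N, n) from log-gamma, exact for any N
          return math.lgamma(N + 1) - math.lgamma(n + 1) - math.lgamma(N - n + 1)

      def ln_binomial_stirling(N, n):
          # same quantity with ln x! ~ x ln x - x for each factorial
          s = lambda x: x * math.log(x) - x
          return s(N) - s(n) - s(N - n)

      for N in (10, 20, 50, 1000):
          n = N // 4
          exact = ln_binomial_exact(N, n)
          approx = ln_binomial_stirling(N, n)
          print(f"N = {N:5d}: exact = {exact:9.3f}  Stirling = {approx:9.3f}  "
                f"rel. error = {abs(approx - exact) / exact:.3%}")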

  16. A Comparative Study of Pre-Service Education for Preschool Teachers in China and the United States

    ERIC Educational Resources Information Center

    Gong, Xin; Wang, Pengcheng

    2017-01-01

    This study provides a comparative analysis of the pre-service education system for preschool educators in China and the United States. Based on collected data and materials (literature, policy documents, and statistical data), we compare two areas of pre-service training: (1) the formal system; (2) the informal system. In the formal system, most…

  17. Workplace Skills Taught in a Simulated Analytical Department

    NASA Astrophysics Data System (ADS)

    Sonchik Marine, Susan

    2001-11-01

    Integration of workplace skills into the academic setting is paramount for any chemical technology program. In addition to the expected chemistry content, courses must build proficiency in oral and written communication skills, computer skills, laboratory safety, and logical troubleshooting. Miami University's Chemical Technology II course is set up as a contract analytical laboratory. Students apply the advanced sampling techniques, quality assurance, standard methods, and statistical analyses they have studied. For further integration of workplace skills, weekly "department meetings" are held where the students, as members of the department, report on their work in process, present completed projects, and share what they have learned and what problems they have encountered. Information is shared between the experienced members of the department and those encountering problems or starting a new project. The instructor, as department manager, makes announcements, reviews company and department status, and assigns work for the coming week. The department members report results to clients in formal reports or in short memos. Factors affecting the success of the "department meeting" approach include the formality of the meeting room, use of an official agenda, the frequency, time, and duration of the meeting, and accountability of the students.

  18. An analysis on intersectional collaboration on non-communicable chronic disease prevention and control in China: a cross-sectional survey on main officials of community health service institutions.

    PubMed

    Li, Xing-Ming; Rasooly, Alon; Peng, Bo; JianWang; Xiong, Shu-Yu

    2017-11-10

    Our study aimed to design a tool for evaluating intersectional collaboration on non-communicable chronic disease (NCD) prevention and control, and to understand the current status of such collaboration in community health service institutions in China. We surveyed 444 senior officials of community health service institutions in the Beijing, Tianjin, Hubei and Ningxia regions of China in 2014 using a questionnaire. A collaboration measurement model comprising four relational dimensions (governance, shared goals and vision, formalization and internalization) was used to compare evaluation-scale scores for NCD management procedures across community healthcare institutions and other organizations. The reliability and validity of the evaluation tool for inter-organizational collaboration on NCD prevention and control were verified. The tool showed good reliability and validity (Cronbach's alpha = 0.89, split-half reliability = 0.84, variance contribution of the extracted principal component = 49.70%). Comparing inter-organizational collaboration across departments and management segments, there were statistically significant differences in the formalization dimension for physical examination (p = 0.01), and in the governance dimension, the formalization dimension and the total collaboration score for the health record segment (p = 0.01, 0.00 and 0.00, respectively). Statistically significant differences were also found in the formalization dimension for the exercise and nutrition health education segment (p = 0.01). There were no statistically significant differences in the formalization dimension of medication guidance for psychological consultation, medical referral services or rehabilitation guidance (all p > 0.05). A multi-department collaboration mechanism for NCD prevention and control has been rudimentarily established. Community management institutions and general hospitals participate more actively in community NCD management and score better on collaboration, whereas the CDC shows relatively poor collaboration in China. Xing-ming Li and Alon Rasooly contributed equally to this paper and are listed as joint first authors.
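
    As a hedged illustration of one of the reliability statistics quoted above, Cronbach's alpha can be computed from an item-score matrix as in the Python sketch below; the simulated responses and the eight-item scale are hypothetical, not the survey data from the study.

      import numpy as np

      def cronbach_alpha(items):
          """items: respondents x items matrix; alpha = k/(k-1) * (1 - sum(item var)/var(total))."""
          k = items.shape[1]
          item_vars = items.var(axis=0, ddof=1)
          total_var = items.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

      rng = np.random.default_rng(2)
      latent = rng.normal(size=(444, 1))                   # 444 respondents, as in the survey
      scores = latent + 0.5 * rng.normal(size=(444, 8))    # 8 hypothetical scale items
      print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")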

  19. An investigation of techniques for the measurement and interpretation of cosmic ray isotopic abundances. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Wiedenbeck, M. E.

    1977-01-01

    An instrument, the Caltech High Energy Isotope Spectrometer Telescope, was developed to measure isotopic abundances of cosmic ray nuclei by employing an energy loss - residual energy technique. A detailed analysis was made of the mass resolution capabilities of this instrument. A formalism, based on the leaky box model of cosmic ray propagation, was developed for obtaining isotopic abundance ratios at the cosmic ray sources from abundances measured in local interstellar space for elements having three or more stable isotopes, one of which is believed to be absent at the cosmic ray sources. It was shown that the dominant sources of uncertainty in the derived source ratios are uncorrelated errors in the fragmentation cross sections and statistical uncertainties in measuring local interstellar abundances. These results were applied to estimate the extent to which uncertainties must be reduced in order to distinguish between cosmic ray production in a solar-like environment and in various environments with greater neutron enrichments.

  20. Distinguishing cause from correlation in tokamak experiments to trigger edge-localised plasma instabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Webster, Anthony J.; CCFE, Culham Science Centre, Abingdon OX14 3DB

    2014-11-15

    The generic question is considered: How can we determine the probability of an otherwise quasi-random event, having been triggered by an external influence? A specific problem is the quantification of the success of techniques to trigger, and hence control, edge-localised plasma instabilities (ELMs) in magnetically confined fusion (MCF) experiments. The development of such techniques is essential to ensure tolerable heat loads on components in large MCF fusion devices, and is necessary for their development into economically successful power plants. Bayesian probability theory is used to rigorously formulate the problem and to provide a formal solution. Accurate but pragmatic methods are developed to estimate triggering probabilities, and are illustrated with experimental data. These allow results from experiments to be quantitatively assessed, and rigorously quantified conclusions to be formed. Example applications include assessing whether triggering of ELMs is a statistical or deterministic process, and the establishment of thresholds to ensure that ELMs are reliably triggered.
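
    As a greatly simplified, hedged sketch of the Bayesian flavour of the problem (a Beta-Binomial update for the trigger success probability, not the paper's formalism, and ignoring the coincidence issue the paper addresses):

      import numpy as np
      from scipy import stats

      attempts, triggered = 200, 154        # hypothetical counts of trigger attempts and prompt ELMs
      alpha0, beta0 = 1.0, 1.0              # flat Beta(1, 1) prior on the trigger probability p

      posterior = stats.beta(alpha0 + triggered, beta0 + attempts - triggered)
      lo, hi = posterior.ppf([0.025, 0.975])
      print(f"posterior mean p = {posterior.mean():.3f}, 95% credible interval = ({lo:.3f}, {hi:.3f})")
      # A full treatment would also model the probability that an ELM in the window
      # after each attempt would have occurred anyway (cause versus correlation).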

  1. Properties of a Formal Method to Model Emergence in Swarm-Based Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy; Truszkowski, Walt; Rash, James; Hinchey, Mike

    2004-01-01

    Future space missions will require cooperation between multiple satellites and/or rovers. Developers are proposing intelligent autonomous swarms for these missions, but swarm-based systems are difficult or impossible to test with current techniques. This viewgraph presentation examines the use of formal methods in testing swarm-based systems. The potential usefulness of formal methods in modeling the ANTS asteroid encounter mission is also examined.

  2. Inference as Prediction

    ERIC Educational Resources Information Center

    Watson, Jane

    2007-01-01

    Inference, or decision making, is seen in curriculum documents as the final step in a statistical investigation. For a formal statistical enquiry this may be associated with sophisticated tests involving probability distributions. For young students without the mathematical background to perform such tests, it is still possible to draw informal…

  3. Computational logic: its origins and applications

    PubMed Central

    2018-01-01

    Computational logic is the use of computers to establish facts in a logical formalism. Originating in nineteenth century attempts to understand the nature of mathematical reasoning, the subject now comprises a wide variety of formalisms, techniques and technologies. One strand of work follows the ‘logic for computable functions (LCF) approach’ pioneered by Robin Milner, where proofs can be constructed interactively or with the help of users’ code (which does not compromise correctness). A refinement of LCF, called Isabelle, retains these advantages while providing flexibility in the choice of logical formalism and much stronger automation. The main application of these techniques has been to prove the correctness of hardware and software systems, but increasingly researchers have been applying them to mathematics itself. PMID:29507522

  4. Aspects of First Year Statistics Students' Reasoning When Performing Intuitive Analysis of Variance: Effects of Within- and Between-Group Variability

    ERIC Educational Resources Information Center

    Trumpower, David L.

    2015-01-01

    Making inferences about population differences based on samples of data, that is, performing intuitive analysis of variance (IANOVA), is common in everyday life. However, the intuitive reasoning of individuals when making such inferences (even following statistics instruction), often differs from the normative logic of formal statistics. The…

  5. Two Paradoxes in Linear Regression Analysis.

    PubMed

    Feng, Ge; Peng, Jing; Tu, Dongke; Zheng, Julia Z; Feng, Changyong

    2016-12-25

    Regression is one of the favorite tools in applied statistics. However, misuse and misinterpretation of results from regression analysis are common in biomedical research. In this paper we use statistical theory and simulation studies to clarify some paradoxes around this popular statistical method. In particular, we show that a widely used model selection procedure employed in many publications in top medical journals is wrong. Formal procedures based on solid statistical theory should be used in model selection.

  6. A Linguistic Truth-Valued Temporal Reasoning Formalism and Its Implementation

    NASA Astrophysics Data System (ADS)

    Lu, Zhirui; Liu, Jun; Augusto, Juan C.; Wang, Hui

    Temporality and uncertainty are important features of many real-world systems. Solving problems in such systems requires the use of formal mechanisms such as logic systems, statistical methods or other reasoning and decision-making methods. In this paper, we propose a linguistic truth-valued temporal reasoning formalism to enable the management of both features concurrently, using a linguistic truth-valued logic and a temporal logic. We also provide a backward reasoning algorithm which allows the answering of user queries. A simple but realistic scenario in a smart home application is used to illustrate our work.

  7. Expected values and variances of Bragg peak intensities measured in a nanocrystalline powder diffraction experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Öztürk, Hande; Noyan, I. Cevdet

    A rigorous study of sampling and intensity statistics applicable for a powder diffraction experiment as a function of crystallite size is presented. Our analysis yields approximate equations for the expected value, variance and standard deviations for both the number of diffracting grains and the corresponding diffracted intensity for a given Bragg peak. The classical formalism published in 1948 by Alexander, Klug & Kummer [J. Appl. Phys. (1948), 19, 742–753] appears as a special case, limited to large crystallite sizes, here. It is observed that both the Lorentz probability expression and the statistics equations used in the classical formalism are inapplicable for nanocrystalline powder samples.

  8. Expected values and variances of Bragg peak intensities measured in a nanocrystalline powder diffraction experiment

    DOE PAGES

    Öztürk, Hande; Noyan, I. Cevdet

    2017-08-24

    A rigorous study of sampling and intensity statistics applicable for a powder diffraction experiment as a function of crystallite size is presented. Our analysis yields approximate equations for the expected value, variance and standard deviations for both the number of diffracting grains and the corresponding diffracted intensity for a given Bragg peak. The classical formalism published in 1948 by Alexander, Klug & Kummer [J. Appl. Phys. (1948), 19, 742–753] appears as a special case, limited to large crystallite sizes, here. It is observed that both the Lorentz probability expression and the statistics equations used in the classical formalism are inapplicable for nanocrystalline powder samples.

  9. A brief history of numbers and statistics with cytometric applications.

    PubMed

    Watson, J V

    2001-02-15

    A brief history of numbers and statistics traces the development of numbers from prehistory to completion of our current system of numeration with the introduction of the decimal fraction by Viete, Stevin, Burgi, and Galileo at the turn of the 16th century. This was followed by the development of what we now know as probability theory by Pascal, Fermat, and Huygens in the mid-17th century which arose in connection with questions in gambling with dice and can be regarded as the origin of statistics. The three main probability distributions on which statistics depend were introduced and/or formalized between the mid-17th and early 19th centuries: the binomial distribution by Pascal; the normal distribution by de Moivre, Gauss, and Laplace, and the Poisson distribution by Poisson. The formal discipline of statistics commenced with the works of Pearson, Yule, and Gosset at the turn of the 19th century when the first statistical tests were introduced. Elementary descriptions of the statistical tests most likely to be used in conjunction with cytometric data are given and it is shown how these can be applied to the analysis of difficult immunofluorescence distributions when there is overlap between the labeled and unlabeled cell populations. Copyright 2001 Wiley-Liss, Inc.

  10. Statistical inference and Aristotle's Rhetoric.

    PubMed

    Macdonald, Ranald R

    2004-11-01

    Formal logic operates in a closed system where all the information relevant to any conclusion is present, whereas this is not the case when one reasons about events and states of the world. Pollard and Richardson drew attention to the fact that the reasoning behind statistical tests does not lead to logically justifiable conclusions. In this paper statistical inferences are defended not by logic but by the standards of everyday reasoning. Aristotle invented formal logic, but argued that people mostly get at the truth with the aid of enthymemes--incomplete syllogisms which include arguing from examples, analogies and signs. It is proposed that statistical tests work in the same way--in that they are based on examples, invoke the analogy of a model and use the size of the effect under test as a sign that the chance hypothesis is unlikely. Of existing theories of statistical inference only a weak version of Fisher's takes this into account. Aristotle anticipated Fisher by producing an argument of the form that there were too many cases in which an outcome went in a particular direction for that direction to be plausibly attributed to chance. We can therefore conclude that Aristotle would have approved of statistical inference and there is a good reason for calling this form of statistical inference classical.

  11. Automatic Estimation of Verified Floating-Point Round-Off Errors via Static Analysis

    NASA Technical Reports Server (NTRS)

    Moscato, Mariano; Titolo, Laura; Dutle, Aaron; Munoz, Cesar A.

    2017-01-01

    This paper introduces a static analysis technique for computing formally verified round-off error bounds of floating-point functional expressions. The technique is based on a denotational semantics that computes a symbolic estimation of floating-point round-off errors along with a proof certificate that ensures its correctness. The symbolic estimation can be evaluated on concrete inputs using rigorous enclosure methods to produce formally verified numerical error bounds. The proposed technique is implemented in the prototype research tool PRECiSA (Program Round-off Error Certifier via Static Analysis) and used in the verification of floating-point programs of interest to NASA.
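
    As a hedged, much-simplified illustration of what a round-off error bound looks like (a generic first-order analysis for a dot product, not PRECiSA's denotational semantics or certificates):

      from fractions import Fraction
      import numpy as np

      # Compare the observed rounding error of a binary64 dot product against the
      # classical first-order bound  |err| <= n * u * sum(|x_i * y_i|),  u = 2**-53.
      rng = np.random.default_rng(3)
      n = 1000
      x = rng.uniform(-1.0, 1.0, size=n)
      y = rng.uniform(-1.0, 1.0, size=n)

      computed = float(np.dot(x, y))                                               # rounded binary64 result
      exact = sum(Fraction(float(a)) * Fraction(float(b)) for a, b in zip(x, y))   # exact rational value
      observed = abs(float(Fraction(computed) - exact))

      bound = n * 2.0 ** -53 * float(np.abs(x * y).sum())
      print(f"observed error = {observed:.3e}, first-order bound = {bound:.3e}")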

  12. Development of a Software Safety Process and a Case Study of Its Use

    NASA Technical Reports Server (NTRS)

    Knight, J. C.

    1997-01-01

    Research in the year covered by this reporting period has been primarily directed toward the following areas: (1) Formal specification of user interfaces; (2) Fault-tree analysis including software; (3) Evaluation of formal specification notations; (4) Evaluation of formal verification techniques; (5) Expanded analysis of the shell architecture concept; (6) Development of techniques to address the problem of information survivability; and (7) Development of a sophisticated tool for the manipulation of formal specifications written in Z. This report summarizes activities under the grant. The technical results relating to this grant and the remainder of the principal investigator's research program are contained in various reports and papers. The remainder of this report is organized as follows. In the next section, an overview of the project is given. This is followed by a summary of accomplishments during the reporting period and details of students funded. Seminars presented describing work under this grant are listed in the following section, and the final section lists publications resulting from this grant.

  13. A Random Variable Approach to Nuclear Targeting and Survivability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Undem, Halvor A.

    We demonstrate a common mathematical formalism for analyzing problems in nuclear survivability and targeting. This formalism, beginning with a random variable approach, can be used to interpret past efforts in nuclear-effects analysis, including targeting analysis. It can also be used to analyze new problems brought about by the post Cold War Era, such as the potential effects of yield degradation in a permanently untested nuclear stockpile. In particular, we illustrate the formalism through four natural case studies or illustrative problems, linking these to actual past data, modeling, and simulation, and suggesting future uses. In the first problem, we illustrate the case of a deterministically modeled weapon used against a deterministically responding target. Classic "Cookie Cutter" damage functions result. In the second problem, we illustrate, with actual target test data, the case of a deterministically modeled weapon used against a statistically responding target. This case matches many of the results of current nuclear targeting modeling and simulation tools, including the result of distance damage functions as complementary cumulative lognormal functions in the range variable. In the third problem, we illustrate the case of a statistically behaving weapon used against a deterministically responding target. In particular, we show the dependence of target damage on weapon yield for an untested nuclear stockpile experiencing yield degradation. Finally, and using actual unclassified weapon test data, we illustrate in the fourth problem the case of a statistically behaving weapon used against a statistically responding target.
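
    As a hedged sketch of the second case above, a distance damage function shaped as a complementary cumulative lognormal in range can be evaluated directly; the parameter values are illustrative and not drawn from any weapons data.

      import numpy as np
      from scipy.stats import lognorm

      # Probability of damage as a complementary cumulative lognormal in range.
      r50 = 1000.0     # hypothetical range (m) at which the damage probability is 0.5
      sigma = 0.3      # hypothetical lognormal shape (spread) parameter

      ranges = np.array([250.0, 500.0, 1000.0, 2000.0, 4000.0])
      p_damage = lognorm.sf(ranges, s=sigma, scale=r50)     # sf = 1 - CDF
      for r, p in zip(ranges, p_damage):
          print(f"range {r:6.0f} m -> P(damage) = {p:.3f}")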

  14. Formative Use of Intuitive Analysis of Variance

    ERIC Educational Resources Information Center

    Trumpower, David L.

    2013-01-01

    Students' informal inferential reasoning (IIR) is often inconsistent with the normative logic underlying formal statistical methods such as Analysis of Variance (ANOVA), even after instruction. In two experiments reported here, student's IIR was assessed using an intuitive ANOVA task at the beginning and end of a statistics course. In both…

  15. Potential-of-mean-force description of ionic interactions and structural hydration in biomolecular systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hummer, G.; Garcia, A.E.; Soumpasis, D.M.

    1994-10-01

    To understand the functioning of living organisms on a molecular level, it is crucial to dissect the intricate interplay of the immense number of biological molecules. Most of the biochemical processes in cells occur in a liquid environment formed mainly by water and ions. This solvent environment plays an important role in biological systems. The potential-of-mean-force (PMF) formalism attempts to describe quantitatively the interactions of the solvent with biological macromolecules on the basis of an approximate statistical-mechanical representation. At its current stage of development, it deals with ionic effects on the biomolecular structure and with the structural hydration of biomolecules. The underlying idea of the PMF formalism is to identify the dominant sources of interactions and incorporate these interactions into the theoretical formalism using PMFs (or particle correlation functions) extracted from bulk-liquid systems. In the following, the authors shall briefly outline the statistical-mechanical foundation of the PMF formalism and introduce the PMF expansion formalism, which is intimately linked to superposition approximations for higher-order particle correlation functions. The authors shall then sketch applications, which describe the effects of the ionic environment on nucleic-acid structure. Finally, the authors shall present the more recent extension of the PMF idea to describe quantitatively the structural hydration of biomolecules. Results for the interface of ice and water and for the hydration of deoxyribonucleic acid (DNA) will be discussed.

  16. Gaussian-based techniques for quantum propagation from the time-dependent variational principle: Formulation in terms of trajectories of coupled classical and quantum variables

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shalashilin, Dmitrii V.; Burghardt, Irene

    2008-08-28

    In this article, two coherent-state based methods of quantum propagation, namely, coupled coherent states (CCS) and Gaussian-based multiconfiguration time-dependent Hartree (G-MCTDH), are put on the same formal footing, using a derivation from a variational principle in Lagrangian form. By this approach, oscillations of the classical-like Gaussian parameters and oscillations of the quantum amplitudes are formally treated in an identical fashion. We also suggest a new approach denoted here as coupled coherent states trajectories (CCST), which completes the family of Gaussian-based methods. Using the same formalism for all related techniques allows their systematization and a straightforward comparison of their mathematical structure and cost.

  17. An elementary tutorial on formal specification and verification using PVS

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.

    1993-01-01

    A tutorial on the development of a formal specification and its verification using the Prototype Verification System (PVS) is presented. The tutorial presents the formal specification and verification techniques by way of specific example - an airline reservation system. The airline reservation system is modeled as a simple state machine with two basic operations. These operations are shown to preserve a state invariant using the theorem proving capabilities of PVS. The technique of validating a specification via 'putative theorem proving' is also discussed and illustrated in detail. This paper is intended for the novice and assumes only some of the basic concepts of logic. A complete description of user inputs and the PVS output is provided and thus it can be effectively used while one is sitting at a computer terminal.

  18. Two Paradoxes in Linear Regression Analysis

    PubMed Central

    FENG, Ge; PENG, Jing; TU, Dongke; ZHENG, Julia Z.; FENG, Changyong

    2016-01-01

    Summary Regression is one of the favorite tools in applied statistics. However, misuse and misinterpretation of results from regression analysis are common in biomedical research. In this paper we use statistical theory and simulation studies to clarify some paradoxes around this popular statistical method. In particular, we show that a widely used model selection procedure employed in many publications in top medical journals is wrong. Formal procedures based on solid statistical theory should be used in model selection. PMID:28638214

  19. Open and Distance Learning and Information and Communication Technologies--Implications for Formal and Non-Formal Education: A Kenyan Case

    ERIC Educational Resources Information Center

    Situma, David Barasa

    2015-01-01

    The female population in Kenya was reported at 50.05% in 2011, according to a World Bank report published in 2012. Despite this slightly higher percentage over males, women in Kenya are not well represented in education and training compared to their male counterparts (Kenya National Bureau of Statistics, 2012). The need to empower girls and women…

  20. Applying dynamic Bayesian networks to perturbed gene expression data.

    PubMed

    Dojer, Norbert; Gambin, Anna; Mizera, Andrzej; Wilczyński, Bartek; Tiuryn, Jerzy

    2006-05-08

    A central goal of molecular biology is to understand the regulatory mechanisms of gene transcription and protein synthesis. Because of their solid basis in statistics, which allows the stochastic aspects of gene expression and noisy measurements to be handled in a natural way, Bayesian networks appear attractive in the field of inferring gene interaction structure from microarray experiment data. However, the basic formalism has some disadvantages, e.g. it is sometimes hard to distinguish between the origin and the target of an interaction. Two kinds of microarray experiments yield data particularly rich in information regarding the direction of interactions: time series and perturbation experiments. In order to correctly handle them, the basic formalism must be modified. For example, dynamic Bayesian networks (DBN) apply to time series microarray data. To our knowledge the DBN technique has not been applied in the context of perturbation experiments. We extend the framework of dynamic Bayesian networks in order to incorporate perturbations. Moreover, an exact algorithm for inferring an optimal network is proposed and a discretization method specialized for time series data from perturbation experiments is introduced. We apply our procedure to realistic simulated data. The results are compared with those obtained by standard DBN learning techniques. Moreover, the advantages of using an exact learning algorithm instead of heuristic methods are analyzed. We show that the quality of inferred networks dramatically improves when using data from perturbation experiments. We also conclude that the exact algorithm should be used when it is possible, i.e. when the considered set of genes is small enough.

  1. Secure and scalable deduplication of horizontally partitioned health data for privacy-preserving distributed statistical computation.

    PubMed

    Yigzaw, Kassaye Yitbarek; Michalas, Antonis; Bellika, Johan Gustav

    2017-01-03

    Techniques have been developed to compute statistics on distributed datasets without revealing private information except the statistical results. However, duplicate records in a distributed dataset may lead to incorrect statistical results. Therefore, to increase the accuracy of the statistical analysis of a distributed dataset, secure deduplication is an important preprocessing step. We designed a secure protocol for the deduplication of horizontally partitioned datasets with deterministic record linkage algorithms. We provided a formal security analysis of the protocol in the presence of semi-honest adversaries. The protocol was implemented and deployed across three microbiology laboratories located in Norway, and we ran experiments on the datasets in which the number of records for each laboratory varied. Experiments were also performed on simulated microbiology datasets and data custodians connected through a local area network. The security analysis demonstrated that the protocol protects the privacy of individuals and data custodians under a semi-honest adversarial model. More precisely, the protocol remains secure with the collusion of up to N - 2 corrupt data custodians. The total runtime for the protocol scales linearly with the addition of data custodians and records. One million simulated records distributed across 20 data custodians were deduplicated within 45 s. The experimental results showed that the protocol is more efficient and scalable than previous protocols for the same problem. The proposed deduplication protocol is efficient and scalable for practical uses while protecting the privacy of patients and data custodians.
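
    As a heavily simplified, hedged illustration of the deduplication goal (keyed pseudonymization of linkage keys; the paper's actual protocol relies on secure multi-party computation rather than a shared secret):

      import hmac, hashlib

      SHARED_KEY = b"agreed-out-of-band"    # hypothetical key shared by the data custodians

      def pseudonym(linkage_key: str) -> str:
          # Deterministic record-linkage key -> keyed pseudonym, so duplicates can be
          # counted across sites without exchanging identifying values in the clear.
          return hmac.new(SHARED_KEY, linkage_key.lower().encode(), hashlib.sha256).hexdigest()

      site_a = {"1990-01-02|jane|doe", "1985-06-30|john|smith"}
      site_b = {"1990-01-02|jane|doe", "1979-11-11|ana|li"}

      tokens_a = {pseudonym(k) for k in site_a}
      tokens_b = {pseudonym(k) for k in site_b}
      print("duplicate records across sites:", len(tokens_a & tokens_b))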

  2. Precision, Reliability, and Effect Size of Slope Variance in Latent Growth Curve Models: Implications for Statistical Power Analysis

    PubMed Central

    Brandmaier, Andreas M.; von Oertzen, Timo; Ghisletta, Paolo; Lindenberger, Ulman; Hertzog, Christopher

    2018-01-01

    Latent Growth Curve Models (LGCM) have become a standard technique to model change over time. Prediction and explanation of inter-individual differences in change are major goals in lifespan research. The major determinants of statistical power to detect individual differences in change are the magnitude of true inter-individual differences in linear change (LGCM slope variance), design precision, alpha level, and sample size. Here, we show that design precision can be expressed as the inverse of effective error. Effective error is determined by instrument reliability and the temporal arrangement of measurement occasions. However, it also depends on another central LGCM component, the variance of the latent intercept and its covariance with the latent slope. We derive a new reliability index for LGCM slope variance—effective curve reliability (ECR)—by scaling slope variance against effective error. ECR is interpretable as a standardized effect size index. We demonstrate how effective error, ECR, and statistical power for a likelihood ratio test of zero slope variance formally relate to each other and how they function as indices of statistical power. We also provide a computational approach to derive ECR for arbitrary intercept-slope covariance. With practical use cases, we argue for the complementary utility of the proposed indices of a study's sensitivity to detect slope variance when making a priori longitudinal design decisions or communicating study designs. PMID:29755377
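
    Read generically, and only as an assumed form based on the description of scaling slope variance against effective error (the precise definition is given in the paper), the index has the shape of a reliability coefficient,

      \mathrm{ECR} = \frac{\sigma^2_{\mathrm{slope}}}{\sigma^2_{\mathrm{slope}} + \sigma^2_{\mathrm{eff}}},

    where \sigma^2_{\mathrm{slope}} is the true inter-individual variance in linear change and \sigma^2_{\mathrm{eff}} is the effective error variance determined by instrument reliability, the measurement schedule, and the intercept variance and intercept-slope covariance.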

  3. Differentially Private Synthesization of Multi-Dimensional Data using Copula Functions

    PubMed Central

    Li, Haoran; Xiong, Li; Jiang, Xiaoqian

    2014-01-01

    Differential privacy has recently emerged in private statistical data release as one of the strongest privacy guarantees. Most of the existing techniques that generate differentially private histograms or synthetic data only work well for single dimensional or low-dimensional histograms. They become problematic for high dimensional and large domain data due to increased perturbation error and computation complexity. In this paper, we propose DPCopula, a differentially private data synthesization technique using Copula functions for multi-dimensional data. The core of our method is to compute a differentially private copula function from which we can sample synthetic data. Copula functions are used to describe the dependence between multivariate random vectors and allow us to build the multivariate joint distribution using one-dimensional marginal distributions. We present two methods for estimating the parameters of the copula functions with differential privacy: maximum likelihood estimation and Kendall’s τ estimation. We present formal proofs for the privacy guarantee as well as the convergence property of our methods. Extensive experiments using both real datasets and synthetic datasets demonstrate that DPCopula generates highly accurate synthetic multi-dimensional data with significantly better utility than state-of-the-art techniques. PMID:25405241

  4. [Repetitive transcranial magnetic stimulation: A potential therapy for cognitive disorders?

    PubMed

    Nouhaud, C; Sherrard, R M; Belmin, J

    2017-03-01

    Considering the limited effectiveness of drug treatments in cognitive disorders, the emergence of noninvasive techniques to modify brain function is very interesting. Among these techniques, repetitive transcranial magnetic stimulation (rTMS) can modulate cortical excitability and has potential therapeutic effects on cognition and behaviour. These effects are due to physiological modifications in the stimulated cortical tissue and its associated circuits, which depend on the parameters of stimulation. The objective of this article is to summarize current knowledge on the efficacy of rTMS in cognitive disorders. Previous studies found very encouraging results, with significant improvement of higher brain functions. Nevertheless, these few studies have limitations: small numbers of enrolled patients, lack of control of the mechanisms of action by brain imaging, insufficiently formalized technique, and variability of the cognitive tests used. Further studies are therefore needed to identify statistically significant improvement, to clarify the underlying mechanisms of action and to specify the parameters of rTMS use before rTMS can be offered as a routine therapy for cognitive dysfunction. Copyright © 2016 Société Nationale Française de Médecine Interne (SNFMI). Published by Elsevier SAS. All rights reserved.

  5. On the Need for Practical Formal Methods

    DTIC Science & Technology

    1998-01-01

    additional research and engineering that is needed to make the current set of formal methods more practical. To illustrate the ideas, I present several examples ... either a good violin or a highly talented violinist. Light-weight techniques offer software developers good violins. A user need not be a talented ...

  6. Webuye Health and Demographic Surveillance Systems Baseline Survey of Soil-Transmitted Helminths and Intestinal Protozoa among Children up to Five Years

    PubMed Central

    Obala, A. A.; Simiyu, C. J.; Odhiambo, D. O.; Nanyu, V.; Chege, P.; Downing, R.; Mwaliko, E.; Mwangi, A. W.; Menya, D.; Chelagat, D.; Nyamogoba, H. D. N.; Ayuo, P. O.; O'Meara, W. P.; Twagirumukiza, M.; Vandenbroek, D.; Otsyula, B. B. O.; de Maeseneer, J.

    2013-01-01

    Background. The intestinal parasitic infections (IPIs) are globally endemic, and they constitute the greatest cause of illness and disease worldwide. Transmission of IPIs occurs as a result of inadequate sanitation, inaccessibility to potable water, and poor living conditions. Objectives. To determine a baseline prevalence of IPIs among children of five years and below at Webuye Health and Demographic Surveillance (HDSS) area in western Kenya. Methods. Cross-sectional survey was used to collect data. Direct saline and formal-ether-sedimentation techniques were used to process the specimens. Descriptive and inferential statistics such as Chi-square statistics were used to analyze the data. Results. A prevalence of 52.3% (417/797) was obtained with the male child slightly more infected than the female (53.5% versus 51%), but this was not significant (χ2 = 0.482, P > 0.05). Giardia lamblia and Entamoeba histolytica were the most common pathogenic IPIs with a prevalence of 26.1% (208/797) and 11.2% (89/797), respectively. Soil-transmitted helminths (STHs) were less common with a prevalence of 4.8% (38/797), 3.8% (30/797), and 0.13% (1/797) for Ascaris lumbricoides, hookworms, and Trichuris trichiura, respectively. Conclusions. Giardia lamblia and E. histolytica were the most prevalent pathogenic intestinal protozoa, while STHs were less common. Community-based health promotion techniques are recommended for controlling these parasites. PMID:23533444

  7. Advanced Tools and Techniques for Formal Techniques in Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Knight, John C.

    2005-01-01

    This is the final technical report for grant number NAG-1-02101. The title of this grant was "Advanced Tools and Techniques for Formal Techniques In Aerospace Systems". The principal investigator on this grant was Dr. John C. Knight of the Computer Science Department, University of Virginia, Charlottesville, Virginia 22904-4740. This report summarizes activities under the grant during the period 7/01/2002 to 9/30/2004. This report is organized as follows. In section 2, the technical background of the grant is summarized. Section 3 lists accomplishments and section 4 lists students funded under the grant. In section 5, we present a list of presentations given at various academic and research institutions about the research conducted. Finally, a list of publications generated under this grant is included in section 6.

  8. A rigorous approach to self-checking programming

    NASA Technical Reports Server (NTRS)

    Hua, Kien A.; Abraham, Jacob A.

    1986-01-01

    Self-checking programming is shown to be an effective concurrent error detection technique. The reliability of a self-checking program, however, relies on the quality of its assertion statements. A self-checking program written without formal guidelines could provide poor coverage of the errors. A constructive technique for self-checking programming is presented. A Structured Program Design Language (SPDL) suitable for self-checking software development is defined. A set of formal rules was also developed that allows the transformation of SPDL designs into self-checking designs to be done in a systematic manner.
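
    As a hedged, language-agnostic illustration of the executable-assertion idea behind self-checking programming (not the SPDL notation or transformation rules of the paper):

      from collections import Counter

      def self_checking_insertion_sort(values):
          result = list(values)
          for i in range(1, len(result)):
              key, j = result[i], i - 1
              while j >= 0 and result[j] > key:
                  result[j + 1] = result[j]
                  j -= 1
              result[j + 1] = key
          # Executable assertions (the self-check): the output must be ordered and a
          # permutation of the input; a fault in the loop above is caught at run time.
          assert all(a <= b for a, b in zip(result, result[1:])), "ordering check failed"
          assert Counter(result) == Counter(values), "permutation check failed"
          return result

      print(self_checking_insertion_sort([3, 1, 2, 2, 0]))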

  9. A High-Level Language for Modeling Algorithms and Their Properties

    NASA Astrophysics Data System (ADS)

    Akhtar, Sabina; Merz, Stephan; Quinson, Martin

    Designers of concurrent and distributed algorithms usually express them using pseudo-code. In contrast, most verification techniques are based on more mathematically oriented formalisms such as state transition systems. This conceptual gap contributes to hindering the use of formal verification techniques. Leslie Lamport introduced PlusCal, a high-level algorithmic language that has the "look and feel" of pseudo-code, but is equipped with a precise semantics and includes a high-level expression language based on set theory. PlusCal models can be compiled to TLA+ and verified using the TLC model checker.

  10. Universal calculational recipe for solvent-mediated potential: based on a combination of integral equation theory and density functional theory

    NASA Astrophysics Data System (ADS)

    Zhou, Shiqi

    2004-07-01

    A universal formalism, which enables calculation of the solvent-mediated potential (SMP) between two equal or non-equal solute particles of any shape immersed in a solvent reservoir consisting of atomic particles and/or polymer chains or their mixture, is proposed by importing a density functional theory externally into the OZ equation systems. Provided the size asymmetry of the solvent bath components is moderate, the present formalism can calculate the SMP in any complex fluid at the present stage of development of statistical mechanics, and therefore avoids all of the limitations of previous approaches to the SMP. Preliminary calculation indicates the reliability of the present formalism.

  11. Statistical mechanics of the Huxley-Simmons model

    NASA Astrophysics Data System (ADS)

    Caruel, M.; Truskinovsky, L.

    2016-06-01

    The chemomechanical model of Huxley and Simmons (HS) [A. F. Huxley and R. M. Simmons, Nature 233, 533 (1971), 10.1038/233533a0] provides a paradigmatic description of mechanically induced collective conformational changes relevant in a variety of biological contexts, from muscles power stroke and hair cell gating to integrin binding and hairpin unzipping. We develop a statistical mechanical perspective on the HS model by exploiting a formal analogy with a paramagnetic Ising model. We first study the equilibrium HS model with a finite number of elements and compute explicitly its mechanical and thermal properties. To model kinetics, we derive a master equation and solve it for several loading protocols. The developed formalism is applicable to a broad range of allosteric systems with mean-field interactions.

  12. On the statistical significance of excess events: Remarks of caution and the need for a standard method of calculation

    NASA Technical Reports Server (NTRS)

    Staubert, R.

    1985-01-01

    Methods for calculating the statistical significance of excess events and the interpretation of the formally derived values are discussed. It is argued that a simple formula for a conservative estimate should generally be used in order to provide a common understanding of quoted values.
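
    For orientation only (the specific formula recommended in the note is not reproduced here), one simple conservative estimate in common use is

      S \approx \frac{N_{\mathrm{obs}} - N_{\mathrm{bkg}}}{\sqrt{N_{\mathrm{obs}}}},

    where N_obs is the number of observed events and N_bkg the expected background; placing N_obs rather than N_bkg under the square root yields the smaller, hence more conservative, significance whenever an excess is present.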

  13. Blended particle filters for large-dimensional chaotic dynamical systems

    PubMed Central

    Majda, Andrew J.; Qi, Di; Sapsis, Themistoklis P.

    2014-01-01

    A major challenge in contemporary data science is the development of statistically accurate particle filters to capture non-Gaussian features in large-dimensional chaotic dynamical systems. Blended particle filters that capture non-Gaussian features in an adaptively evolving low-dimensional subspace through particles interacting with evolving Gaussian statistics on the remaining portion of phase space are introduced here. These blended particle filters are constructed in this paper through a mathematical formalism involving conditional Gaussian mixtures combined with statistically nonlinear forecast models compatible with this structure developed recently with high skill for uncertainty quantification. Stringent test cases for filtering involving the 40-dimensional Lorenz 96 model with a 5-dimensional adaptive subspace for nonlinear blended filtering in various turbulent regimes with at least nine positive Lyapunov exponents are used here. These cases demonstrate the high skill of the blended particle filter algorithms in capturing both highly non-Gaussian dynamical features as well as crucial nonlinear statistics for accurate filtering in extreme filtering regimes with sparse infrequent high-quality observations. The formalism developed here is also useful for multiscale filtering of turbulent systems and a simple application is sketched below. PMID:24825886

  14. Identification and characterization of earthquake clusters: a comparative analysis for selected sequences in Italy

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Gentili, Stefania

    2017-04-01

    Identification and statistical characterization of seismic clusters may provide useful insights about the features of seismic energy release and their relation to physical properties of the crust within a given region. Moreover, a number of studies based on spatio-temporal analysis of main-shocks occurrence require preliminary declustering of the earthquake catalogs. Since various methods, relying on different physical/statistical assumptions, may lead to diverse classifications of earthquakes into main events and related events, we aim to investigate the classification differences among different declustering techniques. Accordingly, a formal selection and comparative analysis of earthquake clusters is carried out for the most relevant earthquakes in North-Eastern Italy, as reported in the local OGS-CRS bulletins, compiled at the National Institute of Oceanography and Experimental Geophysics since 1977. The comparison is then extended to selected earthquake sequences associated with a different seismotectonic setting, namely to events that occurred in the region struck by the recent Central Italy destructive earthquakes, making use of INGV data. Various techniques, ranging from classical space-time windows methods to ad hoc manual identification of aftershocks, are applied for detection of earthquake clusters. In particular, a statistical method based on nearest-neighbor distances of events in space-time-energy domain, is considered. Results from clusters identification by the nearest-neighbor method turn out quite robust with respect to the time span of the input catalogue, as well as to minimum magnitude cutoff. The identified clusters for the largest events reported in North-Eastern Italy since 1977 are well consistent with those reported in earlier studies, which were aimed at detailed manual aftershocks identification. The study shows that the data-driven approach, based on the nearest-neighbor distances, can be satisfactorily applied to decompose the seismic catalog into background seismicity and individual sequences of earthquake clusters, also in areas characterized by moderate seismic activity, where the standard declustering techniques may turn out rather gross approximations. With these results acquired, the main statistical features of seismic clusters are explored, including complex interdependence of related events, with the aim to characterize the space-time patterns of earthquakes occurrence in North-Eastern Italy and capture their basic differences with Central Italy sequences.
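
    As a hedged sketch of the nearest-neighbor idea referred to above (a Zaliapin-style space-time-energy proximity; the exact metric and parameter values used in the study are not assumed here):

      import numpy as np

      def nearest_neighbor_parent(times, x_km, y_km, mags, b=1.0, d_f=1.6):
          """For each event j, return the index of its closest earlier event (candidate
          parent) under eta_ij = t_ij * r_ij**d_f * 10**(-b * m_i), and the eta value."""
          n = len(times)
          parents = np.full(n, -1)
          etas = np.full(n, np.inf)
          for j in range(1, n):
              dt = times[j] - times[:j]                               # inter-event times (days)
              dr = np.hypot(x_km[j] - x_km[:j], y_km[j] - y_km[:j])   # epicentral distances (km)
              eta = dt * np.maximum(dr, 0.1) ** d_f * 10.0 ** (-b * mags[:j])
              parents[j] = int(np.argmin(eta))
              etas[j] = eta[parents[j]]
          return parents, etas   # thresholding eta separates clustered from background events

      # Tiny hypothetical catalog: a mainshock, two nearby aftershocks, one distant event.
      t = np.array([0.0, 1.0, 2.5, 40.0])
      x = np.array([0.0, 1.0, 0.5, 80.0])
      y = np.array([0.0, 0.5, 1.0, 90.0])
      m = np.array([5.5, 3.0, 3.2, 4.0])
      print(nearest_neighbor_parent(t, x, y, m))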

  15. Properties of a Formal Method for Prediction of Emergent Behaviors in Swarm-based Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy; Hinchey, Mike; Truszkowski, Walt; Rash, James

    2004-01-01

    Autonomous intelligent swarms of satellites are being proposed for NASA missions that have complex behaviors and interactions. The emergent properties of swarms make these missions powerful, but at the same time more difficult to design and assure that proper behaviors will emerge. This paper gives the results of research into formal methods techniques for verification and validation of NASA swarm-based missions. Multiple formal methods were evaluated to determine their effectiveness in modeling and assuring the behavior of swarms of spacecraft. The NASA ANTS mission was used as an example of swarm intelligence for which to apply the formal methods. This paper will give the evaluation of these formal methods and give partial specifications of the ANTS mission using four selected methods. We then give an evaluation of the methods and the needed properties of a formal method for effective specification and prediction of emergent behavior in swarm-based systems.

  16. Formal verification of an avionics microprocessor

    NASA Technical Reports Server (NTRS)

    Srivas, Mandayam, K.; Miller, Steven P.

    1995-01-01

    Formal specification combined with mechanical verification is a promising approach for achieving the extremely high levels of assurance required of safety-critical digital systems. However, many questions remain regarding their use in practice: Can these techniques scale up to industrial systems, where are they likely to be useful, and how should industry go about incorporating them into practice? This report discusses a project undertaken to answer some of these questions, the formal verification of the AAMP5 microprocessor. This project consisted of formally specifying in the PVS language a Rockwell proprietary microprocessor at both the instruction-set and register-transfer levels and using the PVS theorem prover to show that the microcode correctly implemented the instruction-level specification for a representative subset of instructions. Notable aspects of this project include the use of a formal specification language by practicing hardware and software engineers, the integration of traditional inspections with formal specifications, and the use of a mechanical theorem prover to verify a portion of a commercial, pipelined microprocessor that was not explicitly designed for formal verification.

  17. Experience report: Using formal methods for requirements analysis of critical spacecraft software

    NASA Technical Reports Server (NTRS)

    Lutz, Robyn R.; Ampo, Yoko

    1994-01-01

    Formal specification and analysis of requirements continues to gain support as a method for producing more reliable software. However, the introduction of formal methods to a large software project is difficult, due in part to the unfamiliarity of the specification languages and the lack of graphics. This paper reports results of an investigation into the effectiveness of formal methods as an aid to the requirements analysis of critical, system-level fault-protection software on a spacecraft currently under development. Our experience indicates that formal specification and analysis can enhance the accuracy of the requirements and add assurance prior to design development in this domain. The work described here is part of a larger, NASA-funded research project whose purpose is to use formal-methods techniques to improve the quality of software in space applications. The demonstration project described here is part of the effort to evaluate experimentally the effectiveness of supplementing traditional engineering approaches to requirements specification with the more rigorous specification and analysis available with formal methods.

  18. Informal Evaluation.

    ERIC Educational Resources Information Center

    Engel, Brenda S.

    Intended for non-experts in evaluative techniques, this monograph presents suggestions and examples for assessing: (1) the child; (2) the classroom; and (3) the program or the school. Illustrative techniques of recordkeeping are presented. Methods of collecting data include documentation and formal records. Techniques to be used during evaluation…

  19. Hunting Solomonoff's Swans: Exploring the Boundary Between Physics and Statistics in Hydrological Modeling

    NASA Astrophysics Data System (ADS)

    Nearing, G. S.

    2014-12-01

    Statistical models consistently out-perform conceptual models in the short term, however to account for a nonstationary future (or an unobserved past) scientists prefer to base predictions on unchanging and commutable properties of the universe - i.e., physics. The problem with physically-based hydrology models is, of course, that they aren't really based on physics - they are based on statistical approximations of physical interactions, and we almost uniformly lack an understanding of the entropy associated with these approximations. Thermodynamics is successful precisely because entropy statistics are computable for homogeneous (well-mixed) systems, and ergodic arguments explain the success of Newton's laws to describe systems that are fundamentally quantum in nature. Unfortunately, similar arguments do not hold for systems like watersheds that are heterogeneous at a wide range of scales. Ray Solomonoff formalized the situation in 1968 by showing that given infinite evidence, simultaneously minimizing model complexity and entropy in predictions always leads to the best possible model. The open question in hydrology is about what happens when we don't have infinite evidence - for example, when the future will not look like the past, or when one watershed does not behave like another. How do we isolate stationary and commutable components of watershed behavior? I propose that one possible answer to this dilemma lies in a formal combination of physics and statistics. In this talk I outline my recent analogue (Solomonoff's theorem was digital) of Solomonoff's idea that allows us to quantify the complexity/entropy tradeoff in a way that is intuitive to physical scientists. I show how to formally combine "physical" and statistical methods for model development in a way that allows us to derive the theoretically best possible model given any given physics approximation(s) and available observations. Finally, I apply an analogue of Solomonoff's theorem to evaluate the tradeoff between model complexity and prediction power.

  20. Multi-objective calibration and uncertainty analysis of hydrologic models; A comparative study between formal and informal methods

    NASA Astrophysics Data System (ADS)

    Shafii, M.; Tolson, B.; Matott, L. S.

    2012-04-01

    Hydrologic modeling has benefited from significant developments over the past two decades. This has resulted in building of higher levels of complexity into hydrologic models, which eventually makes the model evaluation process (parameter estimation via calibration and uncertainty analysis) more challenging. In order to avoid unreasonable parameter estimates, many researchers have suggested implementation of multi-criteria calibration schemes. Furthermore, for predictive hydrologic models to be useful, proper consideration of uncertainty is essential. Consequently, recent research has emphasized comprehensive model assessment procedures in which multi-criteria parameter estimation is combined with statistically-based uncertainty analysis routines such as Bayesian inference using Markov Chain Monte Carlo (MCMC) sampling. Such a procedure relies on the use of formal likelihood functions based on statistical assumptions, and moreover, the Bayesian inference structured on MCMC samplers requires a considerably large number of simulations. Due to these issues, especially in complex non-linear hydrological models, a variety of alternative informal approaches have been proposed for uncertainty analysis in the multi-criteria context. This study aims at exploring a number of such informal uncertainty analysis techniques in multi-criteria calibration of hydrological models. The informal methods addressed in this study are (i) Pareto optimality which quantifies the parameter uncertainty using the Pareto solutions, (ii) DDS-AU which uses the weighted sum of objective functions to derive the prediction limits, and (iii) GLUE which describes the total uncertainty through identification of behavioral solutions. The main objective is to compare such methods with MCMC-based Bayesian inference with respect to factors such as computational burden, and predictive capacity, which are evaluated based on multiple comparative measures. The measures for comparison are calculated both for calibration and evaluation periods. The uncertainty analysis methodologies are applied to a simple 5-parameter rainfall-runoff model, called HYMOD.
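
    As a hedged, toy-scale sketch of the GLUE procedure mentioned above (informal likelihood, behavioral threshold, likelihood-weighted prediction limits), applied to a made-up one-parameter model rather than to HYMOD:

      import numpy as np

      rng = np.random.default_rng(4)
      rain = rng.gamma(2.0, 2.0, size=100)                  # synthetic forcing
      q_obs = 0.6 * rain + rng.normal(0.0, 0.5, size=100)   # synthetic "observed" runoff

      samples = rng.uniform(0.0, 1.5, size=5000)            # Monte Carlo sample of the parameter k
      sims = np.array([k * rain for k in samples])          # toy model: runoff = k * rain
      nse = 1.0 - ((sims - q_obs) ** 2).sum(axis=1) / ((q_obs - q_obs.mean()) ** 2).sum()

      behavioral = nse > 0.5                                # informal behavioral threshold
      beh_sims = sims[behavioral]
      weights = nse[behavioral] / nse[behavioral].sum()     # informal likelihood weights

      lower, upper = [], []
      for step in range(len(q_obs)):                        # 5%-95% weighted prediction limits
          order = np.argsort(beh_sims[:, step])
          cdf = np.cumsum(weights[order])
          lower.append(beh_sims[order, step][np.searchsorted(cdf, 0.05)])
          upper.append(beh_sims[order, step][np.searchsorted(cdf, 0.95)])

      coverage = np.mean((np.array(lower) <= q_obs) & (q_obs <= np.array(upper)))
      print(f"fraction of observations inside the 5-95% GLUE limits: {coverage:.2f}")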

  1. An Ill-Structured PBL-Based Microprocessor Course without Formal Laboratory

    ERIC Educational Resources Information Center

    Kim, Jungkuk

    2012-01-01

    This paper introduces a problem-based learning (PBL) microprocessor application course designed according to the following strategies: 1) hands-on training without having a formal laboratory, and 2) intense student-centered cooperative learning through an ill-structured problem. PBL was adopted as the core educational technique of the course to…

  2. Developing Formal Object-oriented Requirements Specifications: A Model, Tool and Technique.

    ERIC Educational Resources Information Center

    Jackson, Robert B.; And Others

    1995-01-01

    Presents a formal object-oriented specification model (OSS) for computer software system development that is supported by a tool that automatically generates a prototype from an object-oriented analysis model (OSA) instance, lets the user examine the prototype, and permits the user to refine the OSA model instance to generate a requirements…

  3. Peer Review of a Formal Verification/Design Proof Methodology

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The role of formal verification techniques in system validation was examined. The value and the state of the art of performance proving for fault-tolerant computers were assessed. The investigation, development, and evaluation of performance proving tools were reviewed. The technical issues related to proof methodologies are examined and summarized.

  4. Bedside Ultrasound in the Emergency Department to Detect Hydronephrosis for the Evaluation of Suspected Ureteric Colic.

    PubMed

    Shrestha, R; Shakya, R M; Khan A, A

    2016-01-01

    Background: Renal colic is a common emergency department presentation. Hydronephrosis is an indirect sign of urinary obstruction, which may be due to an obstructing ureteric calculus, and can be detected easily by bedside ultrasound with minimal training. Objective: To compare the accuracy of detection of hydronephrosis performed by the emergency physician with that of the radiologist in suspected renal colic cases. Method: This was a prospective observational study performed over a period of 6 months. Patients >8 years of age with a provisional diagnosis of renal colic who underwent both bedside ultrasound and formal ultrasound were included. The presence of hydronephrosis on both ultrasounds, and the size and location of any ureteric stone seen on formal ultrasound, were recorded. The accuracy of the emergency physician's detection of hydronephrosis was determined using the scan reported by the radiologists as the "gold standard", as computed tomography was unavailable. Statistical analysis was performed using SPSS 17.0. Result: Among the 111 included patients, 56.7% had a ureteric stone detected on formal ultrasound. The overall sensitivity, specificity, positive predictive value and negative predictive value of bedside ultrasound performed by the emergency physician for detection of hydronephrosis, relative to formal ultrasound performed by the radiologist, were 90.8%, 78.3%, 85.5% and 85.7%, respectively. Both bedside and formal ultrasound detected hydronephrosis more often in patients with larger stones, and the difference was statistically significant (p = 0.000). Conclusion: Bedside ultrasound can potentially be used as an important tool for detecting clinically significant hydronephrosis in the emergency department when evaluating suspected ureteric colic. Focused training in ultrasound could greatly improve the emergency management of these patients.

  5. Assessing Requirements Quality through Requirements Coverage

    NASA Technical Reports Server (NTRS)

    Rajan, Ajitha; Heimdahl, Mats; Woodham, Kurt

    2008-01-01

    In model-based development, the development effort is centered around a formal description of the proposed software system: the model. This model is derived from some high-level requirements describing the expected behavior of the software. For validation and verification purposes, this model can then be subjected to various types of analysis, for example, completeness and consistency analysis [6], model checking [3], theorem proving [1], and test-case generation [4, 7]. This development paradigm is making rapid inroads in certain industries, e.g., automotive, avionics, space applications, and medical technology. This shift towards model-based development naturally leads to changes in the verification and validation (V&V) process. The model validation problem, that is, determining that the model accurately captures the customer's high-level requirements, has received little attention, and the sufficiency of the validation activities has largely been determined through ad-hoc methods. Since the model serves as the central artifact, its correctness with respect to the user's needs is absolutely crucial. In our investigation, we attempt to answer the following two questions with respect to validation: (1) Are the requirements sufficiently defined for the system? and (2) How well does the model implement the behaviors specified by the requirements? The second question can be addressed using formal verification. Nevertheless, the size and complexity of many industrial systems make formal verification infeasible even if we have a formal model and formalized requirements. Thus, presently, there is no objective way of answering these two questions. To this end, we propose an approach based on testing that, when given a set of formal requirements, explores the relationship between requirements-based structural test-adequacy coverage and model-based structural test-adequacy coverage. The proposed technique uses requirements coverage metrics defined in [9] on formal high-level software requirements and existing model coverage metrics such as the Modified Condition and Decision Coverage (MC/DC) used when testing highly critical software in the avionics industry [8]. Our work is related to Chockler et al. [2], but we base our work on traditional testing techniques as opposed to verification techniques.
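
    As a small illustration of the MC/DC criterion mentioned above (and not of the paper's requirements coverage metrics), the sketch below takes a hypothetical three-condition decision and searches for pairs of test cases in which a single condition independently toggles the decision outcome:

        from itertools import product

        # Hypothetical guard condition taken from a model transition: (a and b) or c.
        def decision(a, b, c):
            return (a and b) or c

        conditions = ["a", "b", "c"]
        tests = list(product([False, True], repeat=3))

        # MC/DC: for every condition, find a pair of tests in which only that condition
        # changes and the decision outcome changes with it (the condition "independently
        # affects" the outcome).
        def mcdc_pairs(cond_index):
            pairs = []
            for t1 in tests:
                for t2 in tests:
                    others_equal = all(t1[i] == t2[i] for i in range(3) if i != cond_index)
                    flipped = t1[cond_index] != t2[cond_index]
                    if others_equal and flipped and decision(*t1) != decision(*t2):
                        pairs.append((t1, t2))
            return pairs

        for i, name in enumerate(conditions):
            print(name, "independence pairs:", mcdc_pairs(i)[:1])  # one pair per condition suffices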

  6. A round-robin gamma stereotactic radiosurgery dosimetry interinstitution comparison of calibration protocols

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Drzymala, R. E., E-mail: drzymala@wustl.edu; Alvarez, P. E.; Bednarz, G.

    2015-11-15

    Purpose: Absorbed dose calibration for gamma stereotactic radiosurgery is challenging due to the unique geometric conditions, dosimetry characteristics, and nonstandard field size of these devices. Members of the American Association of Physicists in Medicine (AAPM) Task Group 178 on Gamma Stereotactic Radiosurgery Dosimetry and Quality Assurance have participated in a round-robin exchange of calibrated measurement instrumentation and phantoms exploring two approved and two proposed calibration protocols or formalisms on ten gamma radiosurgery units. The objectives of this study were to benchmark and compare new formalisms to existing calibration methods, while maintaining traceability to U.S. primary dosimetry calibration laboratory standards. Methods: Nine institutions made measurements using ten gamma stereotactic radiosurgery units in three different 160 mm diameter spherical phantoms [acrylonitrile butadiene styrene (ABS) plastic, Solid Water, and liquid water] and in air using a positioning jig. Two calibrated miniature ionization chambers and one calibrated electrometer were circulated for all measurements. Reference dose-rates at the phantom center were determined using the well-established AAPM TG-21 or TG-51 dose calibration protocols and using two proposed dose calibration protocols/formalisms: an in-air protocol and a formalism proposed by the International Atomic Energy Agency (IAEA) working group for small and nonstandard radiation fields. Each institution’s results were normalized to the dose-rate determined at that institution using the TG-21 protocol in the ABS phantom. Results: Percentages of dose-rates within 1.5% of the reference dose-rate (TG-21 + ABS phantom) for the eight chamber-protocol-phantom combinations were the following: 88% for TG-21, 70% for TG-51, 93% for the new IAEA nonstandard-field formalism, and 65% for the new in-air protocol. Averages and standard deviations for dose-rates over all measurements relative to the TG-21 + ABS dose-rate were 0.999 ± 0.009 (TG-21), 0.991 ± 0.013 (TG-51), 1.000 ± 0.009 (IAEA), and 1.009 ± 0.012 (in-air). There were no statistically significant differences (i.e., p > 0.05) between the two ionization chambers for the TG-21 protocol applied to all dosimetry phantoms. The mean results using the TG-51 protocol were notably lower than those for the other dosimetry protocols, with a standard deviation 2–3 times larger. The in-air protocol was not statistically different from TG-21 for the A16 chamber in the liquid water or ABS phantoms (p = 0.300 and p = 0.135) but was statistically different from TG-21 for the PTW chamber in all phantoms (p = 0.006 for Solid Water, 0.014 for liquid water, and 0.020 for ABS). Results of IAEA formalism were statistically different from TG-21 results only for the combination of the A16 chamber with the liquid water phantom (p = 0.017). In the latter case, dose-rates measured with the two protocols differed by only 0.4%. For other phantom-ionization-chamber combinations, the new IAEA formalism was not statistically different from TG-21. Conclusions: Although further investigation is needed to validate the new protocols for other ionization chambers, these results can serve as a reference to quantitatively compare different calibration protocols and ionization chambers if a particular method is chosen by a professional society to serve as a standardized calibration protocol.

  7. Formal Methods Specification and Verification Guidebook for Software and Computer Systems. Volume 1; Planning and Technology Insertion

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The Formal Methods Specification and Verification Guidebook for Software and Computer Systems describes a set of techniques called Formal Methods (FM), and outlines their use in the specification and verification of computer systems and software. Development of increasingly complex systems has created a need for improved specification and verification techniques. NASA's Safety and Mission Quality Office has supported the investigation of techniques such as FM, which are now an accepted method for enhancing the quality of aerospace applications. The guidebook provides information for managers and practitioners who are interested in integrating FM into an existing systems development process. Information includes technical and administrative considerations that must be addressed when establishing the use of FM on a specific project. The guidebook is intended to aid decision makers in the successful application of FM to the development of high-quality systems at reasonable cost. This is the first volume of a planned two-volume set. The current volume focuses on administrative and planning considerations for the successful application of FM.

  8. Developing Formal Correctness Properties from Natural Language Requirements

    NASA Technical Reports Server (NTRS)

    Nikora, Allen P.

    2006-01-01

    This viewgraph presentation reviews the rationale of the program to transform natural language specifications into formal notation, specifically, to automate the generation of Linear Temporal Logic (LTL) correctness properties from natural language temporal specifications. There are several reasons for this approach: (1) model-based techniques are becoming more widely accepted; (2) analytical verification techniques (e.g., model checking, theorem proving) are significantly more effective at detecting certain types of specification and design errors (e.g., race conditions, deadlock) than manual inspection; (3) many requirements are still written in natural language, which results in a high learning curve for specification languages and associated tools, while increased schedule and budget pressure on projects reduces training opportunities for engineers; and (4) formulation of correctness properties for system models can be a difficult problem. This has relevance to NASA in that it would simplify development of formal correctness properties, lead to more widespread use of model-based specification and design techniques, assist in earlier identification of defects, and reduce residual defect content for space mission software systems. The presentation also discusses potential applications, accomplishments and/or technological transfer potential, and the next steps.
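
    As a concrete illustration of the kind of output such a translation targets, consider a hypothetical natural-language requirement and two candidate LTL properties; the predicate names are invented for illustration and are not drawn from the presentation:

        Requirement: "Whenever a command is received, the system shall eventually send
        an acknowledgement, and the system shall never be in the ERROR state while a
        command is pending."

        LTL:  G (cmd_received -> F ack_sent)
              G (cmd_pending -> !error_state)

    Here G is the "globally/always" operator and F is "eventually"; properties of this form can be checked directly by model checkers against a system model.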

  9. Using ICT techniques for improving mechatronic systems' dependability

    NASA Astrophysics Data System (ADS)

    Miron, Emanuel; Silva, João P. M. A.; Machado, José; Olaru, Dumitru; Prisacaru, Gheorghe

    2013-10-01

    The use of analysis techniques such as simulation and formal verification for industrial controllers is complex in an industrial context. This complexity is due to the fact that such techniques sometimes require a high investment in specifically skilled human resources with sufficient theoretical knowledge of those domains. This paper aims, mainly, to show that it is possible to obtain a timed automata model for formal verification purposes from the CAD model of a mechanical component. This systematic approach can be used by companies for the analysis of industrial controller programs. For this purpose, the paper discusses the best way to systematize these procedures; it describes only the first step of a complex process and promotes a discussion of the main difficulties that can be found and a possibility for handling those difficulties. A library for formal verification purposes is obtained from original 3D CAD models using a Software as a Service (SaaS) platform, which nowadays has become a common delivery model for many applications because SaaS is typically accessed by users via internet access.

  10. How Framing Statistical Statements Affects Subjective Veracity: Validation and Application of a Multinomial Model for Judgments of Truth

    ERIC Educational Resources Information Center

    Hilbig, Benjamin E.

    2012-01-01

    Extending the well-established negativity bias in human cognition to truth judgments, it was recently shown that negatively framed statistical statements are more likely to be considered true than formally equivalent statements framed positively. However, the underlying processes responsible for this effect are insufficiently understood.…

  11. Using a Discussion about Scientific Controversy to Teach Central Concepts in Experimental Design

    ERIC Educational Resources Information Center

    Bennett, Kimberley Ann

    2015-01-01

    Students may need explicit training in informal statistical reasoning in order to design experiments or use formal statistical tests effectively. By using scientific scandals and media misinterpretation, we can explore the need for good experimental design in an informal way. This article describes the use of a paper that reviews the measles mumps…

  12. Determining the Number of Component Clusters in the Standard Multivariate Normal Mixture Model Using Model-Selection Criteria.

    DTIC Science & Technology

    1983-06-16

    has been advocated by Gnanadesikan and Wilk (1969), and others in the literature. This suggests that, if we use the formal significance test type...American Statistical Asso., 62, 1159-1178. Gnanadesikan, R., and Wilk, M. B. (1969). Data Analytic Methods in Multivariate Statistical Analysis. In

  13. Spin formalism and applications to new physics searches

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haber, H.E.

    1994-12-01

    An introduction to spin techniques in particle physics is given. Among the topics covered are: helicity formalism and its applications to the decay and scattering of spin-1/2 and spin-1 particles, techniques for evaluating helicity amplitudes (including projection operator methods and the spinor helicity method), and density matrix techniques. The utility of polarization and spin correlations for untangling new physics beyond the Standard Model at future colliders such as the LHC and a high energy e^+ e^- linear collider is then considered. A number of detailed examples are explored including the search for low-energy supersymmetry, a non-minimal Higgs boson sector, and new gauge bosons beyond the W^± and Z.

  14. Modeling of tool path for the CNC sheet cutting machines

    NASA Astrophysics Data System (ADS)

    Petunin, Aleksandr A.

    2015-11-01

    In the paper, the problem of tool path optimization for CNC (Computer Numerical Control) cutting machines is considered. A classification of the cutting techniques is offered, and we also propose a new classification of tool path problems. The tasks of cost minimization and time minimization for the standard cutting technique (Continuous Cutting Problem, CCP) and for one of the non-standard cutting techniques (Segment Continuous Cutting Problem, SCCP) are formalized. We show that these optimization tasks can be interpreted as discrete optimization problems (a generalized traveling salesman problem with additional constraints, GTSP). Formalization of some constraints for these tasks is described. To solve the GTSP, we propose using the mathematical model of Prof. Chentsov, based on the concept of a megalopolis and dynamic programming.
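
    By way of illustration only, the sketch below orders hypothetical contour pierce points with a greedy nearest-neighbour heuristic to shorten idle (rapid) travel; this is a naive baseline for the routing aspect of the problem and is not the GTSP/dynamic-programming model of Prof. Chentsov used in the paper:

        import numpy as np

        rng = np.random.default_rng(1)

        # Hypothetical pierce points of cutting contours on a sheet (x, y in mm).
        points = rng.uniform(0, 1000, size=(12, 2))

        # Greedy nearest-neighbour ordering of the idle (rapid) moves between contours.
        # A real solver must also respect precedence and cutting-technique constraints.
        def nearest_neighbour_tour(pts, start=0):
            unvisited = set(range(len(pts)))
            tour = [start]
            unvisited.remove(start)
            while unvisited:
                last = pts[tour[-1]]
                nxt = min(unvisited, key=lambda j: np.linalg.norm(pts[j] - last))
                tour.append(nxt)
                unvisited.remove(nxt)
            return tour

        tour = nearest_neighbour_tour(points)
        idle = sum(np.linalg.norm(points[tour[i + 1]] - points[tour[i]])
                   for i in range(len(tour) - 1))
        print("visit order:", tour, "total idle travel (mm): %.1f" % idle)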

  15. Investigation on the use of optimization techniques for helicopter airframe vibrations design studies

    NASA Technical Reports Server (NTRS)

    Sreekanta Murthy, T.

    1992-01-01

    Results of an investigation of formal nonlinear programming-based numerical optimization techniques for helicopter airframe vibration reduction are summarized. The objective and constraint functions and the sensitivity expressions used in the formulation of airframe vibration optimization problems are presented and discussed. Implementation of a new computational procedure based on MSC/NASTRAN and CONMIN in a computer program system called DYNOPT for optimizing airframes subject to strength, frequency, dynamic response, and dynamic stress constraints is described. An optimization methodology is proposed which is thought to provide a new way of applying formal optimization techniques during the various phases of the airframe design process. Numerical results obtained from the application of the DYNOPT optimization code to a helicopter airframe are discussed.
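
    The flavour of such a formal optimization setup (an objective, behaviour constraints, and side constraints on the design variables) can be sketched with a deliberately toy problem in SciPy; the response, weight, and frequency surrogates below are invented stand-ins and bear no relation to DYNOPT, MSC/NASTRAN, or CONMIN:

        import numpy as np
        from scipy.optimize import minimize

        # Toy stand-in: x are two stiffener cross-section areas; the objective mimics a
        # vibration response that decreases with stiffness, and the constraints mimic a
        # weight budget and a minimum natural frequency.
        def vibration_response(x):
            return 100.0 / (1.0 + 3.0 * x[0] + 2.0 * x[1])

        def weight(x):
            return 4.0 * x[0] + 5.0 * x[1]

        constraints = [
            {"type": "ineq", "fun": lambda x: 20.0 - weight(x)},                  # weight <= 20
            {"type": "ineq", "fun": lambda x: np.sqrt(1.0 + x[0] + x[1]) - 1.8},  # frequency >= 1.8
        ]
        bounds = [(0.1, 5.0), (0.1, 5.0)]   # side constraints on the design variables

        result = minimize(vibration_response, x0=np.array([1.0, 1.0]),
                          method="SLSQP", bounds=bounds, constraints=constraints)
        print("optimal sizes:", result.x, "response:", result.fun, "weight:", weight(result.x))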

  16. Statistics of primordial density perturbations from discrete seed masses

    NASA Technical Reports Server (NTRS)

    Scherrer, Robert J.; Bertschinger, Edmund

    1991-01-01

    The statistics of density perturbations for general distributions of seed masses with arbitrary matter accretion is examined. Formal expressions for the power spectrum, the N-point correlation functions, and the density distribution function are derived. These results are applied to the case of uncorrelated seed masses, and power spectra are derived for accretion of both hot and cold dark matter plus baryons. The reduced moments (cumulants) of the density distribution are computed and used to obtain a series expansion for the density distribution function. Analytic results are obtained for the density distribution function in the case of a distribution of seed masses with a spherical top-hat accretion pattern. More generally, the formalism makes it possible to give a complete characterization of the statistical properties of any random field generated from a discrete linear superposition of kernels. In particular, the results can be applied to density fields derived by smoothing a discrete set of points with a window function.
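
    For orientation, in the simplest special case mentioned above, uncorrelated (Poisson-distributed) seeds of mean number density n, each accreting matter with a fixed profile W, the density field is a shot-noise superposition and its power spectrum takes the familiar form (conventions vary; this is one common normalization, stated as background rather than as the paper's result):

        \delta\rho(x) = \sum_i W(x - x_i) - n \int W(y) \, d^3y ,
        P(k) = n \, |\tilde{W}(k)|^2 ,

    where \tilde{W}(k) is the Fourier transform of the accretion kernel; correlated seeds and more general accretion patterns modify this expression, as worked out in the paper.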

  17. Experience and Explanation: Using Videogames to Prepare Students for Formal Instruction in Statistics

    NASA Astrophysics Data System (ADS)

    Arena, Dylan A.; Schwartz, Daniel L.

    2014-08-01

    Well-designed digital games can deliver powerful experiences that are difficult to provide through traditional instruction, while traditional instruction can deliver formal explanations that are not a natural fit for gameplay. Combined, they can accomplish more than either can alone. An experiment tested this claim using the topic of statistics, where people's everyday experiences often conflict with normative statistical theories and a videogame might provide an alternate set of experiences for students to draw upon. The research used a game called Stats Invaders!, a variant of the classic videogame Space Invaders. In Stats Invaders!, the locations of descending alien invaders follow probability distributions, and players need to infer the shape of the distributions to play well. The experiment tested whether the game developed participants' intuitions about the structure of random events and thereby prepared them for future learning from a subsequent written passage on probability distributions. Community-college students who played the game and then read the passage learned more than participants who only read the passage.

  18. Superstatistics with different kinds of distributions in the deformed formalism

    NASA Astrophysics Data System (ADS)

    Sargolzaeipor, S.; Hassanabadi, H.; Chung, W. S.

    2018-03-01

    In this article, after first introducing superstatistics, the effective Boltzmann factor in a deformed formalism for modified Dirac delta, uniform, two-level and Gamma distributions is derived. Then we make use of the superstatistics for four important problems in physics and the thermodynamic properties of the system are calculated. All results in the limit case are reduced to ordinary statistical mechanics. Furthermore, effects of all parameters in the problems are calculated and shown graphically.
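
    For reference, the undeformed starting point that the article generalizes is the superstatistical effective Boltzmann factor, an average of the ordinary Boltzmann factor over a distribution f(\beta) of the fluctuating inverse temperature:

        B(E) = \int_0^\infty f(\beta) \, e^{-\beta E} \, d\beta .

    A Dirac-delta choice f(\beta) = \delta(\beta - \beta_0) recovers the ordinary Boltzmann factor, consistent with the limiting behaviour described above; the article then derives the corresponding effective factor in the deformed formalism for the distributions listed.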

  19. Sub-grid scale models for discontinuous Galerkin methods based on the Mori-Zwanzig formalism

    NASA Astrophysics Data System (ADS)

    Parish, Eric; Duraisamy, Karthik

    2017-11-01

    The optimal prediction framework of Chorin et al., which is a reformulation of the Mori-Zwanzig (M-Z) formalism of non-equilibrium statistical mechanics, provides a framework for the development of mathematically-derived closure models. The M-Z formalism provides a methodology to reformulate a high-dimensional Markovian dynamical system as a lower-dimensional, non-Markovian (non-local) system. In this lower-dimensional system, the effects of the unresolved scales on the resolved scales are non-local and appear as a convolution integral. The non-Markovian system is an exact statement of the original dynamics and is used as a starting point for model development. In this work, we investigate the development of M-Z-based closures model within the context of the Variational Multiscale Method (VMS). The method relies on a decomposition of the solution space into two orthogonal subspaces. The impact of the unresolved subspace on the resolved subspace is shown to be non-local in time and is modeled through the M-Z-formalism. The models are applied to hierarchical discontinuous Galerkin discretizations. Commonalities between the M-Z closures and conventional flux schemes are explored. This work was supported in part by AFOSR under the project ''LES Modeling of Non-local effects using Statistical Coarse-graining'' with Dr. Jean-Luc Cambier as the technical monitor.
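
    Schematically, and with details suppressed, the M-Z decomposition referred to above rewrites the evolution of the resolved variables u as a Markovian term, a memory (convolution) term carrying the influence of the unresolved scales, and an orthogonal-dynamics term; this generic form is standard background, not the specific VMS closure of the paper:

        du/dt = M(u(t)) + \int_0^t K(t - s, u(s)) \, ds + F(t) ,

    with M the Markovian contribution, the integral the non-local memory term, and F(t) the noise-like orthogonal dynamics.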

  20. An efficient method for removing point sources from full-sky radio interferometric maps

    NASA Astrophysics Data System (ADS)

    Berger, Philippe; Oppermann, Niels; Pen, Ue-Li; Shaw, J. Richard

    2017-12-01

    A new generation of wide-field radio interferometers designed for 21-cm surveys is being built as drift scan instruments allowing them to observe large fractions of the sky. With large numbers of antennas and frequency channels, the enormous instantaneous data rates of these telescopes require novel, efficient, data management and analysis techniques. The m-mode formalism exploits the periodicity of such data with the sidereal day, combined with the assumption of statistical isotropy of the sky, to achieve large computational savings and render optimal analysis methods computationally tractable. We present an extension to that work that allows us to adopt a more realistic sky model and treat objects such as bright point sources. We develop a linear procedure for deconvolving maps, using a Wiener filter reconstruction technique, which simultaneously allows filtering of these unwanted components. We construct an algorithm, based on the Sherman-Morrison-Woodbury formula, to efficiently invert the data covariance matrix, as required for any optimal signal-to-noise ratio weighting. The performance of our algorithm is demonstrated using simulations of a cylindrical transit telescope.
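
    The Sherman-Morrison-Woodbury step can be illustrated in isolation: for a covariance written as a cheaply invertible part plus a low-rank term (as arises, for example, when a handful of bright point sources is added to the model), the identity gives the inverse without factoring the full matrix. The numpy check below uses small random matrices, not the telescope's data covariance:

        import numpy as np

        rng = np.random.default_rng(2)

        # Covariance of the form C = N + U S U^T, with N cheap to invert (diagonal here)
        # and U S U^T a low-rank term (e.g., a handful of bright point sources).
        n, k = 500, 5
        N_diag = rng.uniform(1.0, 2.0, size=n)              # diagonal noise covariance
        U = rng.normal(size=(n, k))
        S = np.diag(rng.uniform(5.0, 10.0, size=k))

        # Woodbury: (N + U S U^T)^{-1} = N^{-1} - N^{-1} U (S^{-1} + U^T N^{-1} U)^{-1} U^T N^{-1}
        Ninv_U = U / N_diag[:, None]
        core = np.linalg.inv(np.linalg.inv(S) + U.T @ Ninv_U)
        C_inv_woodbury = np.diag(1.0 / N_diag) - Ninv_U @ core @ Ninv_U.T

        C_inv_direct = np.linalg.inv(np.diag(N_diag) + U @ S @ U.T)
        print("max deviation:", np.max(np.abs(C_inv_woodbury - C_inv_direct)))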

  1. A demonstration of a transportable radio interferometric surveying system with 3-cm accuracy on a 307-m base line

    NASA Technical Reports Server (NTRS)

    Ong, K. M.; Macdoran, P. F.; Thomas, J. B.; Fliegel, H. F.; Skjerve, L. J.; Spitzmesser, D. J.; Batelaan, P. D.; Paine, S. R.; Newsted, M. G.

    1976-01-01

    A precision geodetic measurement system (Aries, for Astronomical Radio Interferometric Earth Surveying) based on the technique of very long base line interferometry has been designed and implemented through the use of a 9-m transportable antenna and the NASA 64-m antenna of the Deep Space Communications Complex at Goldstone, California. A series of experiments designed to demonstrate the inherent accuracy of a transportable interferometer was performed on a 307-m base line during the period from December 1973 to June 1974. This short base line was chosen in order to obtain a comparison with a conventional survey with a few-centimeter accuracy and to minimize Aries errors due to transmission media effects, source locations, and earth orientation parameters. The base-line vector derived from a weighted average of the measurements, representing approximately 24 h of data, possessed a formal uncertainty of about 3 cm in all components. This average interferometry base-line vector was in good agreement with the conventional survey vector within the statistical range allowed by the combined uncertainties (3-4 cm) of the two techniques.

  2. Identifying significant gene‐environment interactions using a combination of screening testing and hierarchical false discovery rate control

    PubMed Central

    Shen, Li; Saykin, Andrew J.; Williams, Scott M.; Moore, Jason H.

    2016-01-01

    Although gene‐environment (G×E) interactions play an important role in many biological systems, detecting these interactions within genome‐wide data can be challenging due to the loss in statistical power incurred by multiple hypothesis correction. To address the challenge of poor power and the limitations of existing multistage methods, we recently developed a screening‐testing approach for G×E interaction detection that combines elastic net penalized regression with joint estimation to support a single omnibus test for the presence of G×E interactions. In our original work on this technique, however, we did not assess type I error control or power and evaluated the method using just a single, small bladder cancer data set. In this paper, we extend the original method in two important directions and provide a more rigorous performance evaluation. First, we introduce a hierarchical false discovery rate approach to formally assess the significance of individual G×E interactions. Second, to support the analysis of truly genome‐wide data sets, we incorporate a score statistic‐based prescreening step to reduce the number of single nucleotide polymorphisms prior to fitting the first stage penalized regression model. To assess the statistical properties of our method, we compare the type I error rate and statistical power of our approach with competing techniques using both simple simulation designs as well as designs based on real disease architectures. Finally, we demonstrate the ability of our approach to identify biologically plausible SNP‐education interactions relative to Alzheimer's disease status using genome‐wide association study data from the Alzheimer's Disease Neuroimaging Initiative (ADNI). PMID:27578615
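
    A stripped-down sketch of the screening idea is given below: fit an elastic net to main effects plus candidate G×E product terms on simulated data and read off the interactions with nonzero coefficients. The simulated genotypes, sample sizes, and ElasticNetCV settings are illustrative assumptions and do not reproduce the authors' penalized-regression/joint-estimation pipeline, omnibus test, or hierarchical FDR step:

        import numpy as np
        from sklearn.linear_model import ElasticNetCV

        rng = np.random.default_rng(3)

        # Simulated genotypes (0/1/2 minor-allele counts), one environment, and a
        # phenotype driven by SNP 0, the environment, and a SNP 3 x environment interaction.
        n, p = 400, 50
        G = rng.integers(0, 3, size=(n, p)).astype(float)
        E = rng.normal(size=n)
        y = 0.8 * G[:, 0] + 0.5 * E + 0.9 * G[:, 3] * E + rng.normal(size=n)

        # Design matrix: main effects for all SNPs and E, plus all candidate G x E terms.
        X = np.column_stack([G, E[:, None], G * E[:, None]])

        model = ElasticNetCV(l1_ratio=0.5, cv=5).fit(X, y)
        interaction_coefs = model.coef_[p + 1:]            # coefficients of the G x E block
        selected = np.nonzero(np.abs(interaction_coefs) > 1e-6)[0]
        print("selected G x E interactions (SNP indices):", selected)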

  3. On verifying a high-level design. [cost and error analysis

    NASA Technical Reports Server (NTRS)

    Mathew, Ben; Wehbeh, Jalal A.; Saab, Daniel G.

    1993-01-01

    An overview of design verification techniques is presented, and some of the current research in high-level design verification is described. Formal hardware description languages that are capable of adequately expressing the design specifications have been developed, but some time will be required before they can have the expressive power needed to be used in real applications. Simulation-based approaches are more useful in finding errors in designs than they are in proving the correctness of a certain design. Hybrid approaches that combine simulation with other formal design verification techniques are argued to be the most promising over the short term.

  4. Probability density function formalism for optical coherence tomography signal analysis: a controlled phantom study.

    PubMed

    Weatherbee, Andrew; Sugita, Mitsuro; Bizheva, Kostadinka; Popov, Ivan; Vitkin, Alex

    2016-06-15

    The distribution of backscattered intensities as described by the probability density function (PDF) of tissue-scattered light contains information that may be useful for tissue assessment and diagnosis, including characterization of its pathology. In this Letter, we examine the PDF description of the light scattering statistics in a well characterized tissue-like particulate medium using optical coherence tomography (OCT). It is shown that for low scatterer density, the governing statistics depart considerably from a Gaussian description and follow the K distribution for both OCT amplitude and intensity. The PDF formalism is shown to be independent of the scatterer flow conditions; this is expected from theory, and suggests robustness and motion independence of the OCT amplitude (and OCT intensity) PDF metrics in the context of potential biomedical applications.
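
    One convenient way to see where the K distribution comes from is its compound representation: exponentially distributed speckle intensity whose local mean is itself gamma-distributed. The numpy sketch below (synthetic draws, not the phantom measurements) shows how the normalized second moment of intensity departs from the Gaussian-statistics value of 2 as the effective scatterer density drops:

        import numpy as np

        rng = np.random.default_rng(4)

        def k_distributed_intensity(nu, mean_intensity, size):
            # Compound representation: gamma-distributed local mean ("texture")
            # modulating exponentially distributed speckle intensity.
            texture = rng.gamma(shape=nu, scale=mean_intensity / nu, size=size)
            return rng.exponential(scale=texture)

        low_density = k_distributed_intensity(nu=0.5, mean_intensity=1.0, size=200_000)
        high_density = k_distributed_intensity(nu=50.0, mean_intensity=1.0, size=200_000)

        # Normalized second moment <I^2>/<I>^2: equals 2 for Gaussian (fully developed
        # speckle) statistics and grows as nu decreases, i.e., at low scatterer density.
        for name, I in [("low scatterer density", low_density),
                        ("high scatterer density", high_density)]:
            print(name, " <I^2>/<I>^2 = %.2f" % (np.mean(I**2) / np.mean(I)**2))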

  5. From non-trivial geometries to power spectra and vice versa

    NASA Astrophysics Data System (ADS)

    Brooker, D. J.; Tsamis, N. C.; Woodard, R. P.

    2018-04-01

    We review a recent formalism which derives the functional forms of the primordial—tensor and scalar—power spectra of scalar potential inflationary models. The formalism incorporates the case of geometries with non-constant first slow-roll parameter. Analytic expressions for the power spectra are given that explicitly display the dependence on the geometric properties of the background. Moreover, we present the full algorithm for using our formalism, to reconstruct the model from the observed power spectra. Our techniques are applied to models possessing "features" in their potential with excellent agreement.

  6. Keldysh formalism for multiple parallel worlds

    NASA Astrophysics Data System (ADS)

    Ansari, M.; Nazarov, Y. V.

    2016-03-01

    We present a compact and self-contained review of the recently developed Keldysh formalism for multiple parallel worlds. The formalism has been applied to consistent quantum evaluation of the flows of informational quantities, in particular, to the evaluation of Renyi and Shannon entropy flows. We start with the formulation of the standard and extended Keldysh techniques in a single world in a form convenient for our presentation. We explain the use of Keldysh contours encompassing multiple parallel worlds. In the end, we briefly summarize the concrete results obtained with the method.

  7. Defining the IEEE-854 floating-point standard in PVS

    NASA Technical Reports Server (NTRS)

    Miner, Paul S.

    1995-01-01

    A significant portion of the ANSI/IEEE-854 Standard for Radix-Independent Floating-Point Arithmetic is defined in PVS (Prototype Verification System). Since IEEE-854 is a generalization of the ANSI/IEEE-754 Standard for Binary Floating-Point Arithmetic, the definition of IEEE-854 in PVS also formally defines much of IEEE-754. This collection of PVS theories provides a basis for machine-checked verification of floating-point systems. This formal definition illustrates that formal specification techniques are sufficiently advanced that it is reasonable to consider their use in the development of future standards.

  8. Keldysh formalism for multiple parallel worlds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ansari, M.; Nazarov, Y. V., E-mail: y.v.nazarov@tudelft.nl

    We present a compact and self-contained review of the recently developed Keldysh formalism for multiple parallel worlds. The formalism has been applied to consistent quantum evaluation of the flows of informational quantities, in particular, to the evaluation of Renyi and Shannon entropy flows. We start with the formulation of the standard and extended Keldysh techniques in a single world in a form convenient for our presentation. We explain the use of Keldysh contours encompassing multiple parallel worlds. In the end, we briefly summarize the concrete results obtained with the method.

  9. Quantum formalism for classical statistics

    NASA Astrophysics Data System (ADS)

    Wetterich, C.

    2018-06-01

    In static classical statistical systems the problem of information transport from a boundary to the bulk finds a simple description in terms of wave functions or density matrices. While the transfer matrix formalism is a type of Heisenberg picture for this problem, we develop here the associated Schrödinger picture that keeps track of the local probabilistic information. The transport of the probabilistic information between neighboring hypersurfaces obeys a linear evolution equation, and therefore the superposition principle for the possible solutions. Operators are associated to local observables, with rules for the computation of expectation values similar to quantum mechanics. We discuss how non-commutativity naturally arises in this setting. Also other features characteristic of quantum mechanics, such as complex structure, change of basis or symmetry transformations, can be found in classical statistics once formulated in terms of wave functions or density matrices. We construct for every quantum system an equivalent classical statistical system, such that time in quantum mechanics corresponds to the location of hypersurfaces in the classical probabilistic ensemble. For suitable choices of local observables in the classical statistical system one can, in principle, compute all expectation values and correlations of observables in the quantum system from the local probabilistic information of the associated classical statistical system. Realizing a static memory material as a quantum simulator for a given quantum system is not a matter of principle, but rather of practical simplicity.

  10. Tsallis and Kaniadakis statistics from a point of view of the holographic equipartition law

    NASA Astrophysics Data System (ADS)

    Abreu, Everton M. C.; Ananias Neto, Jorge; Mendes, Albert C. R.; Bonilla, Alexander

    2018-02-01

    In this work, we illustrate the difference between the Tsallis and Kaniadakis entropies through cosmological models obtained from the formalism proposed by Padmanabhan, called the holographic equipartition law. Similarly to the formalism proposed by Komatsu, we obtain an extra constant driving term in the Friedmann equation if we deform the Tsallis entropy by Kaniadakis' formalism. We initially consider the Tsallis entropy as the black-hole (BH) area entropy. This constant term may lead the universe to an accelerated or decelerated mode. On the other hand, if we start with the Kaniadakis entropy as the BH area entropy and then modify the kappa expression by Tsallis' formalism, the same absolute value but with opposite sign is obtained. In the opposite limit, no driving inflation term for the early universe is derived from either deformation.
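
    For reference, the two entropies being contrasted are usually written (with the Boltzmann constant set to 1) as:

        S_q = (1 - \sum_i p_i^q) / (q - 1)                                                   (Tsallis)
        S_\kappa = -\sum_i p_i \ln_\kappa p_i ,   \ln_\kappa x = (x^\kappa - x^{-\kappa}) / (2\kappa)   (Kaniadakis)

    Both reduce to the Boltzmann-Gibbs entropy -\sum_i p_i \ln p_i as q -> 1 and \kappa -> 0.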

  11. A new and trustworthy formalism to compute entropy in quantum systems

    NASA Astrophysics Data System (ADS)

    Ansari, Mohammad

    Entropy is nonlinear in the density matrix, and as such its evaluation in open quantum systems has not been fully understood. Recently, a quantum formalism was proposed by Ansari and Nazarov that evaluates entropy using parallel time evolutions of multiple worlds. We can use this formalism to evaluate entropy flow in photovoltaic cells coupled to thermal reservoirs and cavity modes. Recently, we studied the full counting statistics of energy transfers in such systems. This rigorously proves a nontrivial correspondence between energy exchanges and entropy changes in quantum systems, which only in systems without entanglement simplifies to the textbook second law of thermodynamics. We evaluate the flow of entropy using this formalism. In the presence of entanglement, however, interestingly less information is exchanged than expected. This increases the upper limit on the capacity for information transfer and its conversion to energy for next-generation devices in mesoscopic physics.

  12. Cost implications of organizing nursing home workforce in teams.

    PubMed

    Mukamel, Dana B; Cai, Shubing; Temkin-Greener, Helena

    2009-08-01

    To estimate the costs associated with formal and self-managed daily practice teams in nursing homes. Medicaid cost reports for 135 nursing homes in New York State in 2006 and survey data for 6,137 direct care workers. A retrospective statistical analysis: We estimated hybrid cost functions that include team penetration variables. Inference was based on robust standard errors. Formal and self-managed team penetration (i.e., percent of staff working in a team) were calculated from survey responses. Annual variable costs, beds, case mix-adjusted days, admissions, home care visits, outpatient clinic visits, day care days, wages, and ownership were calculated from the cost reports. Formal team penetration was significantly associated with costs, while self-managed teams penetration was not. Costs declined with increasing penetration up to 13 percent of formal teams, and increased above this level. Formal teams in nursing homes in the upward sloping range of the curve were more diverse, with a larger number of participating disciplines and more likely to include physicians. Organization of workforce in formal teams may offer nursing homes a cost-saving strategy. More research is required to understand the relationship between team composition and costs.

  13. Formality of the Chinese collective leadership.

    PubMed

    Li, Haiying; Graesser, Arthur C

    2016-09-01

    We investigated the linguistic patterns in the discourse of four generations of the collective leadership of the Communist Party of China (CPC) from 1921 to 2012. The texts of Mao Zedong, Deng Xiaoping, Jiang Zemin, and Hu Jintao were analyzed using computational linguistic techniques (a Chinese formality score) to explore the persuasive linguistic features of the leaders in the contexts of power phase, the nation's education level, power duration, and age. The study was guided by the elaboration likelihood model of persuasion, which includes a central route (represented by formal discourse) versus a peripheral route (represented by informal discourse) to persuasion. The results revealed that these leaders adopted the formal, central route more when they were in power than before they came into power. The nation's education level was a significant factor in the leaders' adoption of the persuasion strategy. The leaders' formality also decreased with their increasing age and in-power times. However, the predictability of these factors for formality had subtle differences among the different types of leaders. These results enhance our understanding of the Chinese collective leadership and the role of formality in politically persuasive messages.

  14. Microprocessor Simulation: A Training Technique.

    ERIC Educational Resources Information Center

    Oscarson, David J.

    1982-01-01

    Describes the design and application of a microprocessor simulation using BASIC for formal training of technicians and managers and as a management tool. Illustrates the utility of the modular approach for the instruction and practice of decision-making techniques. (SK)

  15. A survey of the sociodemographic and educational characteristics of oral health technicians in public primary health care teams in Minas Gerais, Brazil.

    PubMed

    Abreu, Mauro Henrique Nogueira Guimarães; Sanglard-Oliveira, Carla Aparecida; Jaruche, Abdul Rahman Mustafá; Mambrini, Juliana Vaz de Melo; Werneck, Marcos Azeredo Furquim; Lucas, Simone Dutra

    2013-12-23

    To describe some sociodemographic and educational characteristics of oral health technicians (OHTs) in public primary health care teams in the state of Minas Gerais, Brazil. A cross-sectional descriptive study was performed based on the telephone survey of a representative sample comprising 231 individuals. A pre-tested instrument was used for the data collection, including questions on gender, age in years, years of work as an OHT, years since graduation as an OHT, formal schooling, individual income in a month, and participation in continuing educational programmes. Descriptive statistics were computed, and clusters were formed using the agglomerative hierarchical technique based on the furthest neighbour, considering age, years of work as an OHT, time since graduation as an OHT, formal schooling, individual income in a month, and participation in continuing educational programmes. Most interviewees (97.1%) were female. A monthly income of USD 300.00 to 600.00 was reported by 77.5% of the sample. Having educational qualifications in excess of their role was reported by approximately 20% of the participants. The median time since graduation was six years, and half of the sample had worked for four years as an OHT. Most interviewees (67.6%) reported having participated in professional continuing educational programmes. Two different clusters were identified based on the sociodemographic and educational characteristics of the sample. The Brazilian OHTs in public primary health care teams in the state of Minas Gerais are mostly female, with little time since graduation and working experience, and with formal schooling sufficient for professional practice.

  16. A survey of the sociodemographic and educational characteristics of oral health technicians in public primary health care teams in Minas Gerais, Brazil

    PubMed Central

    2013-01-01

    Background: To describe some sociodemographic and educational characteristics of oral health technicians (OHTs) in public primary health care teams in the state of Minas Gerais, Brazil. Methods: A cross-sectional descriptive study was performed based on the telephone survey of a representative sample comprising 231 individuals. A pre-tested instrument was used for the data collection, including questions on gender, age in years, years of work as an OHT, years since graduation as an OHT, formal schooling, individual income in a month, and participation in continuing educational programmes. Descriptive statistics were computed, and clusters were formed using the agglomerative hierarchical technique based on the furthest neighbour, considering age, years of work as an OHT, time since graduation as an OHT, formal schooling, individual income in a month, and participation in continuing educational programmes. Results: Most interviewees (97.1%) were female. A monthly income of USD 300.00 to 600.00 was reported by 77.5% of the sample. Having educational qualifications in excess of their role was reported by approximately 20% of the participants. The median time since graduation was six years, and half of the sample had worked for four years as an OHT. Most interviewees (67.6%) reported having participated in professional continuing educational programmes. Two different clusters were identified based on the sociodemographic and educational characteristics of the sample. Conclusions: The Brazilian OHTs in public primary health care teams in the state of Minas Gerais are mostly female, with little time since graduation and working experience, and with formal schooling sufficient for professional practice. PMID:24365451

  17. Applying formal methods and object-oriented analysis to existing flight software

    NASA Technical Reports Server (NTRS)

    Cheng, Betty H. C.; Auernheimer, Brent

    1993-01-01

    Correctness is paramount for safety-critical software control systems. Critical software failures in medical radiation treatment, communications, and defense are familiar to the public. The significant quantity of software malfunctions regularly reported to the software engineering community, the laws concerning liability, and a recent NRC Aeronautics and Space Engineering Board report additionally motivate the use of error-reducing and defect detection software development techniques. The benefits of formal methods in requirements driven software development ('forward engineering') is well documented. One advantage of rigorously engineering software is that formal notations are precise, verifiable, and facilitate automated processing. This paper describes the application of formal methods to reverse engineering, where formal specifications are developed for a portion of the shuttle on-orbit digital autopilot (DAP). Three objectives of the project were to: demonstrate the use of formal methods on a shuttle application, facilitate the incorporation and validation of new requirements for the system, and verify the safety-critical properties to be exhibited by the software.

  18. "Of Course I'm Communicating; I Lecture Every Day": Enhancing Teaching and Learning in Introductory Statistics. Scholarship of Teaching and Learning

    ERIC Educational Resources Information Center

    Wulff, Shaun S.; Wulff, Donald H.

    2004-01-01

    This article focuses on one instructor's evolution from formal lecturing to interactive teaching and learning in a statistics course. Student perception data are used to demonstrate the instructor's use of communication to align the content, students, and instructor throughout the course. Results indicate that the students learned, that…

  19. Assessment of long-term impact of formal certified cardiopulmonary resuscitation training program among nurses

    PubMed Central

    Saramma, P. P.; Raj, L. Suja; Dash, P. K.; Sarma, P. S.

    2016-01-01

    Context: Cardiopulmonary resuscitation (CPR) and emergency cardiovascular care guidelines are periodically renewed and published by the American Heart Association. Formal training programs are conducted based on these guidelines. Despite widespread training CPR is often poorly performed. Hospital educators spend a significant amount of time and money in training health professionals and maintaining basic life support (BLS) and advanced cardiac life support (ACLS) skills among them. However, very little data are available in the literature highlighting the long-term impact of these training. Aims: To evaluate the impact of formal certified CPR training program on the knowledge and skill of CPR among nurses, to identify self-reported outcomes of attempted CPR and training needs of nurses. Setting and Design: Tertiary care hospital, Prospective, repeated-measures design. Subjects and Methods: A series of certified BLS and ACLS training programs were conducted during 2010 and 2011. Written and practical performance tests were done. Final testing was undertaken 3–4 years after training. The sample included all available, willing CPR certified nurses and experience matched CPR noncertified nurses. Statistical Analysis Used: SPSS for Windows version 21.0. Results: The majority of the 206 nurses (93 CPR certified and 113 noncertified) were females. There was a statistically significant increase in mean knowledge level and overall performance before and after the formal certified CPR training program (P = 0.000). However, the mean knowledge scores were equivalent among the CPR certified and noncertified nurses, although the certified nurses scored a higher mean score (P = 0.140). Conclusions: Formal certified CPR training program increases CPR knowledge and skill. However, significant long-term effects could not be found. There is a need for regular and periodic recertification. PMID:27303137

  20. A Fourier-based compressed sensing technique for accelerated CT image reconstruction using first-order methods.

    PubMed

    Choi, Kihwan; Li, Ruijiang; Nam, Haewon; Xing, Lei

    2014-06-21

    As a solution to iterative CT image reconstruction, first-order methods are prominent for their large-scale capability and fast convergence rate [Formula: see text]. In practice, the CT system matrix with a large condition number may lead to slow convergence despite the theoretically promising upper bound. The aim of this study is to develop a Fourier-based scaling technique to enhance the convergence speed of first-order methods applied to CT image reconstruction. Instead of working in the projection domain, we transform the projection data and construct a data fidelity model in Fourier space. Inspired by the filtered backprojection formalism, the data are appropriately weighted in Fourier space. We formulate an optimization problem based on weighted least-squares in the Fourier space and total-variation (TV) regularization in image space for parallel-beam, fan-beam and cone-beam CT geometry. To achieve the maximum computational speed, the optimization problem is solved using a fast iterative shrinkage-thresholding algorithm with backtracking line search and GPU implementation of projection/backprojection. The performance of the proposed algorithm is demonstrated through a series of digital simulations and experimental phantom studies. The results are compared with the existing TV regularized techniques based on statistics-based weighted least-squares as well as the basic algebraic reconstruction technique. The proposed Fourier-based compressed sensing (CS) method significantly improves both the image quality and the convergence rate compared to the existing CS techniques.
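
    The accelerated first-order structure can be illustrated with a generic FISTA iteration on a toy sparse-recovery problem, using a constant step size and an l1 penalty with a closed-form soft-thresholding proximal step; this is a simplified stand-in, since the paper uses Fourier-weighted least-squares, a TV penalty, backtracking line search, and GPU projectors:

        import numpy as np

        rng = np.random.default_rng(5)

        # Toy inverse problem: recover a sparse x from y = A x + noise.
        m, n = 80, 200
        A = rng.normal(size=(m, n)) / np.sqrt(m)
        x_true = np.zeros(n)
        x_true[rng.choice(n, 10, replace=False)] = rng.normal(size=10)
        y = A @ x_true + 0.01 * rng.normal(size=m)

        lam = 0.02
        L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth gradient
        soft = lambda v, thr: np.sign(v) * np.maximum(np.abs(v) - thr, 0.0)

        # FISTA: gradient step on the smooth term, proximal (soft-threshold) step on the
        # l1 term, plus the momentum extrapolation that gives the accelerated rate.
        x = z = np.zeros(n)
        t = 1.0
        for _ in range(300):
            x_new = soft(z - (A.T @ (A @ z - y)) / L, lam / L)
            t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
            z = x_new + ((t - 1.0) / t_new) * (x_new - x)
            x, t = x_new, t_new

        print("recovery error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))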

  1. Submental liposuction versus formal cervicoplasty: which one to choose?

    PubMed

    Fattahi, Tirbod

    2012-12-01

    Esthetic rejuvenation of the submental area is a fairly common concern of patients seeking cosmetic surgery. There are several techniques used to obtain esthetic results. A common dilemma is the proper determination as to which procedure, liposuction versus formal cervicoplasty, is more appropriate. This manuscript describes the factors involved in the aging process of the submental area, as well as the inherent advantages of formal cervicoplasty over liposuction. A comprehensive review of the intrinsic and extrinsic aging process is described, and advantages and disadvantages of liposuction as well as cervicoplasty are detailed. On the basis of the specific factors leading to the fullness of the anterior neck/submental area, proper rejuvenation technique must include platysmaplasty, in addition to liposuction. Isolated liposuction is only beneficial in an isolated group of cosmetic patients. Formal cervicoplasty, including open liposuction and platysmaplasty, is a superior operation compared with isolated liposuction of the submental area. Whereas liposuction does have a role in cosmetic surgery of the submental area, it is not a comprehensive procedure and does not address all of the anatomic components leading to submental fullness. Copyright © 2012 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ni, Xiaotong; Van den Nest, Maarten; Buerschaper, Oliver

    We propose a non-commutative extension of the Pauli stabilizer formalism. The aim is to describe a class of many-body quantum states which is richer than the standard Pauli stabilizer states. In our framework, stabilizer operators are tensor products of single-qubit operators drawn from the group 〈αI, X, S〉, where α = e^{iπ/4} and S = diag(1, i). We provide techniques to efficiently compute various properties related to bipartite entanglement, expectation values of local observables, preparation by means of quantum circuits, parent Hamiltonians, etc. We also highlight significant differences compared to the Pauli stabilizer formalism. In particular, we give examples of states in our formalism which cannot arise in the Pauli stabilizer formalism, such as topological models that support non-Abelian anyons.

  3. Genetic programming assisted stochastic optimization strategies for optimization of glucose to gluconic acid fermentation.

    PubMed

    Cheema, Jitender Jit Singh; Sankpal, Narendra V; Tambe, Sanjeev S; Kulkarni, Bhaskar D

    2002-01-01

    This article presents two hybrid strategies for the modeling and optimization of the glucose to gluconic acid batch bioprocess. In the hybrid approaches, first a novel artificial intelligence formalism, namely, genetic programming (GP), is used to develop a process model solely from the historic process input-output data. In the next step, the input space of the GP-based model, representing process operating conditions, is optimized using two stochastic optimization (SO) formalisms, viz., genetic algorithms (GAs) and simultaneous perturbation stochastic approximation (SPSA). These SO formalisms possess certain unique advantages over the commonly used gradient-based optimization techniques. The principal advantage of the GP-GA and GP-SPSA hybrid techniques is that process modeling and optimization can be performed exclusively from the process input-output data without invoking the detailed knowledge of the process phenomenology. The GP-GA and GP-SPSA techniques have been employed for modeling and optimization of the glucose to gluconic acid bioprocess, and the optimized process operating conditions obtained thereby have been compared with those obtained using two other hybrid modeling-optimization paradigms integrating artificial neural networks (ANNs) and GA/SPSA formalisms. Finally, the overall optimized operating conditions given by the GP-GA method, when verified experimentally resulted in a significant improvement in the gluconic acid yield. The hybrid strategies presented here are generic in nature and can be employed for modeling and optimization of a wide variety of batch and continuous bioprocesses.

  4. From Informal Safety-Critical Requirements to Property-Driven Formal Validation

    NASA Technical Reports Server (NTRS)

    Cimatti, Alessandro; Roveri, Marco; Susi, Angelo; Tonetta, Stefano

    2008-01-01

    Most of the efforts in formal methods have historically been devoted to comparing a design against a set of requirements. The validation of the requirements themselves, however, has often been disregarded, and it can be considered a largely open problem, which poses several challenges. The first challenge is given by the fact that requirements are often written in natural language, and may thus contain a high degree of ambiguity. Despite the progress in Natural Language Processing techniques, the task of understanding a set of requirements cannot be automated, and must be carried out by domain experts, who are typically not familiar with formal languages. Furthermore, in order to retain a direct connection with the informal requirements, the formalization cannot follow standard model-based approaches. The second challenge lies in the formal validation of requirements. On one hand, it is not even clear which are the correctness criteria or the high-level properties that the requirements must fulfill. On the other hand, the expressivity of the language used in the formalization may go beyond the theoretical and/or practical capacity of state-of-the-art formal verification. In order to solve these issues, we propose a new methodology that comprises a chain of steps, each supported by a specific tool. The main steps are the following. First, the informal requirements are split into basic fragments, which are classified into categories, and dependency and generalization relationships among them are identified. Second, the fragments are modeled using a visual language such as UML. The UML diagrams are both syntactically restricted (in order to guarantee a formal semantics), and enriched with a highly controlled natural language (to allow for modeling static and temporal constraints). Third, an automatic formal analysis phase iterates over the modeled requirements, by combining several complementary techniques: checking consistency; verifying whether the requirements entail some desirable properties; verifying whether the requirements are consistent with selected scenarios; diagnosing inconsistencies by identifying inconsistent cores; identifying vacuous requirements; constructing multiple explanations by enabling the fault-tree analysis related to particular fault models; verifying whether the specification is realizable.

  5. 21 CFR 820.250 - Statistical techniques.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Statistical techniques. 820.250 Section 820.250...) MEDICAL DEVICES QUALITY SYSTEM REGULATION Statistical Techniques § 820.250 Statistical techniques. (a... statistical techniques required for establishing, controlling, and verifying the acceptability of process...

  6. 21 CFR 820.250 - Statistical techniques.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Statistical techniques. 820.250 Section 820.250...) MEDICAL DEVICES QUALITY SYSTEM REGULATION Statistical Techniques § 820.250 Statistical techniques. (a... statistical techniques required for establishing, controlling, and verifying the acceptability of process...

  7. Condensate statistics in interacting and ideal dilute bose gases

    PubMed

    Kocharovsky; Kocharovsky; Scully

    2000-03-13

    We obtain analytical formulas for the statistics, in particular, for the characteristic function and all cumulants, of the Bose-Einstein condensate in dilute weakly interacting and ideal equilibrium gases in the canonical ensemble via the particle-number-conserving operator formalism of Girardeau and Arnowitt. We prove that the ground-state occupation statistics is not Gaussian even in the thermodynamic limit. We calculate the effect of Bogoliubov coupling on suppression of ground-state occupation fluctuations and show that they are governed by a pair-correlation, squeezing mechanism.

  8. Experiences applying Formal Approaches in the Development of Swarm-Based Space Exploration Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher A.; Hinchey, Michael G.; Truszkowski, Walter F.; Rash, James L.

    2006-01-01

    NASA is researching advanced technologies for future exploration missions using intelligent swarms of robotic vehicles. One of these missions is the Autonomous Nano Technology Swarm (ANTS) mission that will explore the asteroid belt using 1,000 cooperative autonomous spacecraft. The emergent properties of intelligent swarms make them a potentially powerful concept, but at the same time more difficult to design and to ensure that the proper behaviors will emerge. NASA is investigating formal methods and techniques for verification of such missions. The advantage of using formal methods is the ability to mathematically verify the behavior of a swarm, emergent or otherwise. Using the ANTS mission as a case study, we have evaluated multiple formal methods to determine their effectiveness in modeling and ensuring desired swarm behavior. This paper discusses the results of this evaluation and proposes an integrated formal method for ensuring correct behavior of future NASA intelligent swarms.

  9. Chemodetection in fluctuating environments: receptor coupling, buffering, and antagonism.

    PubMed

    Lalanne, Jean-Benoît; François, Paul

    2015-02-10

    Variability in the chemical composition of the extracellular environment can significantly degrade the ability of cells to detect rare cognate ligands. Using concepts from statistical detection theory, we formalize the generic problem of detection of small concentrations of ligands in a fluctuating background of biochemically similar ligands binding to the same receptors. We discover that in contrast with expectations arising from considerations of signal amplification, inhibitory interactions between receptors can improve detection performance in the presence of substantial environmental variability, providing an adaptive interpretation to the phenomenon of ligand antagonism. Our results suggest that the structure of signaling pathways responsible for chemodetection in fluctuating and heterogeneous environments might be optimized with respect to the statistics and dynamics of environmental composition. The developed formalism stresses the importance of characterizing nonspecific interactions to understand function in signaling pathways.

  10. Comparing perceived self-management practices of adult type 2 diabetic patients after completion of a structured ADA certified diabetes self-management education program with unstructured individualized nurse practitioner led diabetes self-management education.

    PubMed

    Wooley, Dennis S; Kinner, Tracy J

    2016-11-01

    The purpose was to compare perceived self-management practices of adult type 2 diabetic patients after completing an American Diabetes Association (ADA) certified diabetes self-management education (DSME) program with unstructured individualized nurse practitioner led DSME. Demographic questions and the Self-Care Inventory-Revised (SCI-R) were given to two convenience sample patient groups comprising a formal DSME program group and a group within a clinical setting who received informal and unstructured individual education during patient encounters. A t-test was executed between the formal ADA certified education sample and the informal sample's SCI-R individual scores. A second t-test was performed between the two samples' SCI-R mean scores. A t-test determined no statistically significant difference between the formal ADA structured education and informal education samples' SCI-R individual scores. There was not a statistically significant difference between the samples' SCI-R mean scores. The study results suggest that there are not superior DSME settings and instructional approaches. Copyright © 2016 Elsevier Inc. All rights reserved.

  11. Formal semantic specifications as implementation blueprints for real-time programming languages

    NASA Technical Reports Server (NTRS)

    Feyock, S.

    1981-01-01

    Formal definitions of language and system semantics provide highly desirable checks on the correctness of implementations of programming languages and their runtime support systems. If these definitions can give concrete guidance to the implementor, major increases in implementation accuracy and decreases in implementation effort can be achieved. It is shown that of the wide variety of available methods the Hgraph (hypergraph) definitional technique (Pratt, 1975), is best suited to serve as such an implementation blueprint. A discussion and example of the Hgraph technique is presented, as well as an overview of the growing body of implementation experience of real-time languages based on Hgraph semantic definitions.

  12. Formal optimization of hovering performance using free wake lifting surface theory

    NASA Technical Reports Server (NTRS)

    Chung, S. Y.

    1986-01-01

    Free wake techniques for performance prediction and optimization of a hovering rotor are discussed. The influence functions due to vortex rings, vortex cylinders, and source or vortex sheets are presented. The vortex core sizes of rotor wake vortices are calculated and their importance is discussed. Lifting body theory for a finite-thickness body is developed for pressure calculation, and hence for performance prediction of hovering rotors. A numerical optimization technique based on free wake lifting line theory is presented and discussed. It is demonstrated that formal optimization can be used with an implicit, nonlinear objective or cost function such as the hovering rotor performance considered in this report.

  13. Techniques for Forecasting Air Passenger Traffic

    NASA Technical Reports Server (NTRS)

    Taneja, N.

    1972-01-01

    The basic techniques of forecasting the air passenger traffic are outlined. These techniques can be broadly classified into four categories: judgmental, time-series analysis, market analysis and analytical. The differences between these methods exist, in part, due to the degree of formalization of the forecasting procedure. Emphasis is placed on describing the analytical method.

  14. Statistical inference for tumor growth inhibition T/C ratio.

    PubMed

    Wu, Jianrong

    2010-09-01

    The tumor growth inhibition T/C ratio is commonly used to quantify treatment effects in drug screening tumor xenograft experiments. The T/C ratio is converted to an antitumor activity rating using an arbitrary cutoff point and often without any formal statistical inference. Here, we applied a nonparametric bootstrap method and a small-sample likelihood ratio statistic to make a statistical inference about the T/C ratio, including both hypothesis testing and a confidence interval estimate. Furthermore, sample size and power are also discussed for the statistical design of tumor xenograft experiments. Tumor xenograft data from an actual experiment were analyzed to illustrate the application.
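
    A minimal sketch of the nonparametric bootstrap for the T/C ratio is shown below, with hypothetical tumor-volume data and an illustrative activity cutoff; the paper's small-sample likelihood ratio statistic is not reproduced here.

        import numpy as np

        rng = np.random.default_rng(42)

        # Hypothetical final tumor volumes (mm^3) from a xenograft experiment.
        treated = np.array([210., 180., 250., 190., 230., 205., 175., 220.])
        control = np.array([520., 610., 480., 555., 590., 505., 630., 570.])

        def tc_ratio(t, c):
            return t.mean() / c.mean()

        B = 10000
        boot = np.empty(B)
        for b in range(B):
            t_b = rng.choice(treated, size=treated.size, replace=True)
            c_b = rng.choice(control, size=control.size, replace=True)
            boot[b] = tc_ratio(t_b, c_b)

        point = tc_ratio(treated, control)
        lo, hi = np.percentile(boot, [2.5, 97.5])   # percentile bootstrap 95% CI
        print(f"T/C = {point:.3f}, 95% CI = ({lo:.3f}, {hi:.3f})")
        # One could, e.g., declare antitumor activity at an illustrative cutoff of
        # 0.40 only if the whole confidence interval lies below it.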

  15. Review of Estelle and LOTOS with respect to critical computer applications

    NASA Technical Reports Server (NTRS)

    Bown, Rodney L.

    1991-01-01

    Man rated NASA space vehicles seem to represent a set of ultimate critical computer applications. These applications require a high degree of security, integrity, and safety. A variety of formal and/or precise modeling techniques are becoming available for the designer of critical systems. The design phase of the software engineering life cycle includes the modification of non-development components. A review of the Estelle and LOTOS formal description languages is presented. Details of the languages and a set of references are provided. The languages were used to formally describe some of the Open System Interconnect (OSI) protocols.

  16. 11.2 YIP Human In the Loop Statistical Relational Learners

    DTIC Science & Technology

    2017-10-23

    learning formalisms including inverse reinforcement learning [4] and statistical relational learning [7, 5, 8]. We have also applied our algorithms in ... one introduced for label preferences. [Figure 2: Active Advice Seeking for Inverse Reinforcement Learning.] ... active advice seeking is in selecting the ... learning tasks. Sequential Decision-Making: Our previous work on advice for inverse reinforcement learning (IRL) defined advice as action ...

  17. Functional equivalency inferred from "authoritative sources" in networks of homologous proteins.

    PubMed

    Natarajan, Shreedhar; Jakobsson, Eric

    2009-06-12

    A one-on-one mapping of protein functionality across different species is a critical component of comparative analysis. This paper presents a heuristic algorithm for discovering the Most Likely Functional Counterparts (MoLFunCs) of a protein, based on simple concepts from network theory. A key feature of our algorithm is utilization of the user's knowledge to assign high confidence to selected functional identification. We show the use of the algorithm to retrieve functional equivalents for 7 membrane proteins, from an exploration of almost 40 genomes from multiple online resources. We verify the functional equivalency of our dataset through a series of tests that include sequence, structure and function comparisons. Comparison is made to the OMA methodology, which also identifies one-on-one mapping between proteins from different species. Based on that comparison, we believe that incorporation of the user's knowledge as a key aspect of the technique adds value to purely statistical formal methods.

  18. Spatial correlations in driven-dissipative photonic lattices

    NASA Astrophysics Data System (ADS)

    Biondi, Matteo; Lienhard, Saskia; Blatter, Gianni; Türeci, Hakan E.; Schmidt, Sebastian

    2017-12-01

    We study the nonequilibrium steady-state of interacting photons in cavity arrays as described by the driven-dissipative Bose–Hubbard and spin-1/2 XY model. For this purpose, we develop a self-consistent expansion in the inverse coordination number of the array (∼ 1/z) to solve the Lindblad master equation of these systems beyond the mean-field approximation. Our formalism is compared and benchmarked with exact numerical methods for small systems based on an exact diagonalization of the Liouvillian and a recently developed corner-space renormalization technique. We then apply this method to obtain insights beyond mean-field in two particular settings: (i) we show that the gas–liquid transition in the driven-dissipative Bose–Hubbard model is characterized by large density fluctuations and bunched photon statistics. (ii) We study the antibunching–bunching transition of the nearest-neighbor correlator in the driven-dissipative spin-1/2 XY model and provide a simple explanation of this phenomenon.

  19. Precise interferometric tracking of the DSCS II geosynchronous orbiter

    NASA Astrophysics Data System (ADS)

    Border, J. S.; Donivan, F. F., Jr.; Shiomi, T.; Kawano, N.

    1986-01-01

    A demonstration of the precise tracking of a geosynchronous orbiter by radio metric techniques based on very-long-baseline interferometry (VLBI) has been jointly conducted by the Jet Propulsion Laboratory and Japan's Radio Research Laboratory. Simultaneous observations of a U.S. Air Force communications satellite from tracking stations in California, Australia, and Japan have determined the satellite's position with an accuracy of a few meters. Accuracy claims are based on formal statistics, which include the effects of errors in non-estimated parameters and which are supported by a chi-squared of less than one, and on the consistency of orbit solutions from disjoint data sets. A study made to assess the impact of shorter baselines and reduced data noise concludes that with a properly designed system, similar accuracy could be obtained for either a satellite viewed from stations located within the continental U.S. or for a satellite viewed from stations within Japanese territory.

  20. Functional Equivalency Inferred from “Authoritative Sources” in Networks of Homologous Proteins

    PubMed Central

    Natarajan, Shreedhar; Jakobsson, Eric

    2009-01-01

    A one-on-one mapping of protein functionality across different species is a critical component of comparative analysis. This paper presents a heuristic algorithm for discovering the Most Likely Functional Counterparts (MoLFunCs) of a protein, based on simple concepts from network theory. A key feature of our algorithm is utilization of the user's knowledge to assign high confidence to selected functional identification. We show the use of the algorithm to retrieve functional equivalents for 7 membrane proteins, from an exploration of almost 40 genomes from multiple online resources. We verify the functional equivalency of our dataset through a series of tests that include sequence, structure and function comparisons. Comparison is made to the OMA methodology, which also identifies one-on-one mapping between proteins from different species. Based on that comparison, we believe that incorporation of the user's knowledge as a key aspect of the technique adds value to purely statistical formal methods. PMID:19521530

  1. The dose-response of time served in prison on mortality: New York State, 1989-2003.

    PubMed

    Patterson, Evelyn J

    2013-03-01

    I investigated the differential impact of the dose-response of length of stay on postprison mortality among parolees. Using 1989-2003 New York State parole administrative data from the Bureau of Justice Statistics on state correctional facilities, I employed multinomial logistic regression analyses and formal demographic techniques that used the life table of the populations to deduce changes in life expectancy. Each additional year in prison produced a 15.6% increase in the odds of death for parolees, which translated to a 2-year decline in life expectancy for each year served in prison. The risk was highest upon release from prison and declined over time. The time to recovery, or the lowest risk level, was approximately two thirds of the time served in prison. Incarceration reduces life span. Future research should investigate the pathways to this higher mortality and the possibilities of recovery.

  2. The Role of Non-Formal Education in Combating the HIV Epidemic in the Philippines and Taiwan

    ERIC Educational Resources Information Center

    Morisky, Donald E.; Lyu, Shu-Yu; Urada, Lianne A.

    2009-01-01

    The Philippines is experiencing a low but slowly growing prevalence of HIV, with a UN estimate of 6,000-11,000 cases out of a population of 91 million, and a 150% increase in new cases in 2008 compared to previous years. Earlier education programmes employed non-formal educational training techniques in the southern Philippines to target high-risk…

  3. Causality and headache triggers

    PubMed Central

    Turner, Dana P.; Smitherman, Todd A.; Martin, Vincent T.; Penzien, Donald B.; Houle, Timothy T.

    2013-01-01

    Objective: The objective of this study was to explore the conditions necessary to assign causal status to headache triggers. Background: The term “headache trigger” is commonly used to label any stimulus that is assumed to cause headaches. However, the assumptions required for determining if a given stimulus in fact has a causal-type relationship in eliciting headaches have not been explicated. Methods: A synthesis and application of Rubin’s Causal Model is applied to the context of headache causes. From this application the conditions necessary to infer that one event (trigger) causes another (headache) are outlined using basic assumptions and examples from relevant literature. Results: Although many conditions must be satisfied for a causal attribution, three basic assumptions are identified for determining causality in headache triggers: 1) constancy of the sufferer; 2) constancy of the trigger effect; and 3) constancy of the trigger presentation. A valid evaluation of a potential trigger’s effect can only be undertaken once these three basic assumptions are satisfied during formal or informal studies of headache triggers. Conclusions: Evaluating these assumptions is extremely difficult or infeasible in clinical practice, and satisfying them during natural experimentation is unlikely. Researchers, practitioners, and headache sufferers are encouraged to avoid natural experimentation to determine the causal effects of headache triggers. Instead, formal experimental designs or retrospective diary studies using advanced statistical modeling techniques provide the best approaches to satisfy the required assumptions and inform causal statements about headache triggers. PMID:23534872

  4. Analytical group decision making in natural resources: Methodology and application

    USGS Publications Warehouse

    Schmoldt, D.L.; Peterson, D.L.

    2000-01-01

    Group decision making is becoming increasingly important in natural resource management and associated scientific applications, because multiple values are treated coincidentally in time and space, multiple resource specialists are needed, and multiple stakeholders must be included in the decision process. Decades of social science research on decision making in groups have provided insights into the impediments to effective group processes and on techniques that can be applied in a group context. Nevertheless, little integration and few applications of these results have occurred in resource management decision processes, where formal groups are integral, either directly or indirectly. A group decision-making methodology is introduced as an effective approach for temporary, formal groups (e.g., workshops). It combines the following three components: (1) brainstorming to generate ideas; (2) the analytic hierarchy process to produce judgments, manage conflict, enable consensus, and plan for implementation; and (3) a discussion template (straw document). Resulting numerical assessments of alternative decision priorities can be analyzed statistically to indicate where group member agreement occurs and where priority values are significantly different. An application of this group process to fire research program development in a workshop setting indicates that the process helps focus group deliberations; mitigates groupthink, nondecision, and social loafing pitfalls; encourages individual interaction; identifies irrational judgments; and provides a large amount of useful quantitative information about group preferences. This approach can help facilitate scientific assessments and other decision-making processes in resource management.
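
    The numerical core of the analytic hierarchy process step can be sketched as the principal-eigenvector calculation below. The pairwise comparison matrix is hypothetical, and the random-index value is taken from Saaty's published table for n = 3.

        import numpy as np

        # Hypothetical pairwise comparison matrix (Saaty 1-9 scale) for three
        # candidate priorities produced during a workshop brainstorming session.
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3., 1.0, 2.0],
                      [1/5., 1/2., 1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                      # normalized priority weights

        lam_max = eigvals.real[k]
        n = A.shape[0]
        ci = (lam_max - n) / (n - 1)      # consistency index
        ri = 0.58                         # random index for n = 3
        print("priorities:", np.round(w, 3), " consistency ratio:", round(ci / ri, 3))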

  5. Biased Tracers in Redshift Space in the EFT of Large-Scale Structure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perko, Ashley; Senatore, Leonardo; Jennings, Elise

    2016-10-28

    The Effective Field Theory of Large-Scale Structure (EFTofLSS) provides a novel formalism that is able to accurately predict the clustering of large-scale structure (LSS) in the mildly non-linear regime. Here we provide the first computation of the power spectrum of biased tracers in redshift space at one loop order, and we make the associated code publicly available. We compare the multipoles $\ell=0,2$ of the redshift-space halo power spectrum, together with the real-space matter and halo power spectra, with data from numerical simulations at $z=0.67$. For the samples we compare to, which have number densities of $\bar n=3.8 \cdot 10^{-2}\,(h\,{\rm Mpc}^{-1})^3$ and $\bar n=3.9 \cdot 10^{-4}\,(h\,{\rm Mpc}^{-1})^3$, we find that the calculation at one-loop order matches numerical measurements to within a few percent up to $k\simeq 0.43\ h\,{\rm Mpc}^{-1}$, a significant improvement with respect to former techniques. By performing the so-called IR-resummation, we find that the Baryon Acoustic Oscillation peak is accurately reproduced. Based on the results presented here, long-wavelength statistics that are routinely observed in LSS surveys can finally be computed in the EFTofLSS. This formalism is thus ready to be compared directly to observational data.

  6. Formal Methods Specification and Analysis Guidebook for the Verification of Software and Computer Systems. Volume 2; A Practitioner's Companion

    NASA Technical Reports Server (NTRS)

    1995-01-01

    This guidebook, the second of a two-volume series, is intended to facilitate the transfer of formal methods to the avionics and aerospace community. The 1st volume concentrates on administrative and planning issues [NASA-95a], and the second volume focuses on the technical issues involved in applying formal methods to avionics and aerospace software systems. Hereafter, the term "guidebook" refers exclusively to the second volume of the series. The title of this second volume, A Practitioner's Companion, conveys its intent. The guidebook is written primarily for the nonexpert and requires little or no prior experience with formal methods techniques and tools. However, it does attempt to distill some of the more subtle ingredients in the productive application of formal methods. To the extent that it succeeds, those conversant with formal methods will also find the guidebook useful. The discussion is illustrated through the development of a realistic example, relevant fragments of which appear in each chapter. The guidebook focuses primarily on the use of formal methods for analysis of requirements and high-level design, the stages at which formal methods have been most productively applied. Although much of the discussion applies to low-level design and implementation, the guidebook does not discuss issues involved in the later life cycle application of formal methods.

  7. The Verification-based Analysis of Reliable Multicast Protocol

    NASA Technical Reports Server (NTRS)

    Wu, Yunqing

    1996-01-01

    Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP Multicasting. In this paper, we develop formal models for RMP using existing automatic verification systems, and perform verification-based analysis on the formal RMP specifications. We also use the formal models of the RMP specifications to generate a test suite for conformance testing of the RMP implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress between the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding for the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.

  8. Verification of Emergent Behaviors in Swarm-based Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy; Hinchey, Mike; Truszkowski, Walt; Rash, James

    2004-01-01

    The emergent properties of swarms make swarm-based missions powerful, but at the same time more difficult to design and to assure that the proper behaviors will emerge. We are currently investigating formal methods and techniques for verification and validation of swarm-based missions. The Autonomous Nano-Technology Swarm (ANTS) mission is being used as an example and case study for swarm-based missions to experiment and test current formal methods with intelligent swarms. Using the ANTS mission, we have evaluated multiple formal methods to determine their effectiveness in modeling and assuring swarm behavior. This paper introduces how intelligent swarm technology is being proposed for NASA missions, and gives the results of a comparison of several formal methods and approaches for specifying intelligent swarm-based systems and their effectiveness for predicting emergent behavior.

  9. Cost Implications of Organizing Nursing Home Workforce in Teams

    PubMed Central

    Mukamel, Dana B; Cai, Shubing; Temkin-Greener, Helena

    2009-01-01

    Objective: To estimate the costs associated with formal and self-managed daily practice teams in nursing homes. Data Sources/Study Setting: Medicaid cost reports for 135 nursing homes in New York State in 2006 and survey data for 6,137 direct care workers. Study Design: A retrospective statistical analysis: We estimated hybrid cost functions that include team penetration variables. Inference was based on robust standard errors. Data Collection: Formal and self-managed team penetration (i.e., percent of staff working in a team) were calculated from survey responses. Annual variable costs, beds, case mix-adjusted days, admissions, home care visits, outpatient clinic visits, day care days, wages, and ownership were calculated from the cost reports. Principal Findings: Formal team penetration was significantly associated with costs, while self-managed teams penetration was not. Costs declined with increasing penetration up to 13 percent of formal teams, and increased above this level. Formal teams in nursing homes in the upward sloping range of the curve were more diverse, with a larger number of participating disciplines and more likely to include physicians. Conclusions: Organization of workforce in formal teams may offer nursing homes a cost-saving strategy. More research is required to understand the relationship between team composition and costs. PMID:19486181
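
    A rough sketch of the kind of hybrid cost regression described: synthetic facility-level data are constructed to mimic the reported U-shaped team-penetration effect, and ordinary least squares with heteroskedasticity-robust standard errors recovers the cost-minimizing penetration. All variables and magnitudes are illustrative, not the study's data.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(11)
        n = 135                                  # one observation per facility

        # Hypothetical facility-level data: output, wages, formal-team penetration.
        days = rng.uniform(2e4, 8e4, n)          # case-mix-adjusted resident days
        wage = rng.uniform(15, 30, n)            # average hourly wage
        pen = rng.uniform(0, 0.4, n)             # share of staff in formal teams
        cost = (40 * days + 2e4 * wage
                - 6e5 * pen + 2.3e6 * pen ** 2   # assumed U-shaped penetration effect
                + rng.normal(0, 1e5, n))

        X = sm.add_constant(np.column_stack([days, wage, pen, pen ** 2]))
        fit = sm.OLS(cost, X).fit(cov_type='HC1')     # robust standard errors
        b = fit.params
        print(fit.summary(xname=['const', 'days', 'wage', 'pen', 'pen2']))
        print("estimated cost-minimizing penetration:", round(-b[3] / (2 * b[4]), 3))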

  10. Thomas-Fermi model for a bulk self-gravitating stellar object in two dimensions

    NASA Astrophysics Data System (ADS)

    De, Sanchari; Chakrabarty, Somenath

    2015-09-01

    In this article we have solved a hypothetical problem related to the stability and gross properties of two-dimensional self-gravitating stellar objects using the Thomas-Fermi model. The formalism presented here is an extension of the standard three-dimensional problem discussed in Statistical Physics, Part I, by Landau and Lifshitz. Further, the formalism presented in this article may be considered a class problem for post-graduate-level students of physics or may be assigned as a part of their dissertation project.

  11. Cost model validation: a technical and cultural approach

    NASA Technical Reports Server (NTRS)

    Hihn, J.; Rosenberg, L.; Roust, K.; Warfield, K.

    2001-01-01

    This paper summarizes how JPL's parametric mission cost model (PMCM) has been validated using both formal statistical methods and a variety of peer and management reviews in order to establish organizational acceptance of the cost model estimates.

  12. Unified quantitative characterization of epithelial tissue development

    PubMed Central

    Guirao, Boris; Rigaud, Stéphane U; Bosveld, Floris; Bailles, Anaïs; López-Gay, Jesús; Ishihara, Shuji; Sugimura, Kaoru

    2015-01-01

    Understanding the mechanisms regulating development requires a quantitative characterization of cell divisions, rearrangements, cell size and shape changes, and apoptoses. We developed a multiscale formalism that relates the characterizations of each cell process to tissue growth and morphogenesis. Having validated the formalism on computer simulations, we quantified separately all morphogenetic events in the Drosophila dorsal thorax and wing pupal epithelia to obtain comprehensive statistical maps linking cell and tissue scale dynamics. While globally cell shape changes, rearrangements and divisions all significantly participate in tissue morphogenesis, locally, their relative participations display major variations in space and time. By blocking division we analyzed the impact of division on rearrangements, cell shape changes and tissue morphogenesis. Finally, by combining the formalism with mechanical stress measurement, we evidenced unexpected interplays between patterns of tissue elongation, cell division and stress. Our formalism provides a novel and rigorous approach to uncover mechanisms governing tissue development. DOI: http://dx.doi.org/10.7554/eLife.08519.001 PMID:26653285

  13. Quantum-like model for the adaptive dynamics of the genetic regulation of E. coli's metabolism of glucose/lactose.

    PubMed

    Asano, Masanari; Basieva, Irina; Khrennikov, Andrei; Ohya, Masanori; Tanaka, Yoshiharu; Yamato, Ichiro

    2012-06-01

    We developed a quantum-like model describing the gene regulation of glucose/lactose metabolism in a bacterium, Escherichia coli. Our quantum-like model can be considered a kind of operational formalism for microbiology and genetics. Instead of trying to describe processes in a cell in full detail, we propose a formal operator description. Such a description may be very useful in situations in which the detailed description of processes is impossible or extremely complicated. We analyze statistical data obtained from experiments, and we compute the degree of E. coli's preference within adaptive dynamics. It is known that there are several types of E. coli characterized by their metabolic systems. We demonstrate that the same type of E. coli can be described by well-determined operators; we find invariant operator quantities characterizing each type. Such invariant quantities can be calculated from the obtained statistical data.

  14. Strong correlations between the exponent α and the particle number for a Renyi monoatomic gas in Gibbs' statistical mechanics.

    PubMed

    Plastino, A; Rocca, M C

    2017-06-01

    Appealing to the 1902 Gibbs formalism for classical statistical mechanics (SM), the first axiomatic SM theory to successfully explain equilibrium thermodynamics, we show that already at the classical level there is a strong correlation between Renyi's exponent α and the number of particles for very simple systems. No reference to heat baths is needed for such a purpose.

  15. Gulf Coast Clean Energy Application Center

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dillingham, Gavin

    The Gulf Coast Clean Energy Application Center was initiated to significantly improve market and regulatory conditions for the implementation of combined heat and power technologies. The GC CEAC was responsible for the development of CHP in Texas, Louisiana and Oklahoma. Through this program we employed a variety of outreach and education techniques, developed and deployed assessment tools and conducted market assessments. These efforts resulted in the growth of the combined heat and power market in the Gulf Coast region with a realization of more efficient energy generation, reduced emissions and a more resilient infrastructure. Specific to research, we did not formally investigate any techniques with any formal research design or methodology.

  16. Equitability, mutual information, and the maximal information coefficient.

    PubMed

    Kinney, Justin B; Atwal, Gurinder S

    2014-03-04

    How should one quantify the strength of association between two random variables without bias for relationships of a specific form? Despite its conceptual simplicity, this notion of statistical "equitability" has yet to receive a definitive mathematical formalization. Here we argue that equitability is properly formalized by a self-consistency condition closely related to the Data Processing Inequality. Mutual information, a fundamental quantity in information theory, is shown to satisfy this equitability criterion. These findings are at odds with the recent work of Reshef et al. [Reshef DN, et al. (2011) Science 334(6062):1518-1524], which proposed an alternative definition of equitability and introduced a new statistic, the "maximal information coefficient" (MIC), said to satisfy equitability in contradistinction to mutual information. These conclusions, however, were supported only with limited simulation evidence, not with mathematical arguments. Upon revisiting these claims, we prove that the mathematical definition of equitability proposed by Reshef et al. cannot be satisfied by any (nontrivial) dependence measure. We also identify artifacts in the reported simulation evidence. When these artifacts are removed, estimates of mutual information are found to be more equitable than estimates of MIC. Mutual information is also observed to have consistently higher statistical power than MIC. We conclude that estimating mutual information provides a natural (and often practical) way to equitably quantify statistical associations in large datasets.
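
    A small sketch of estimating mutual information for noisy relationships of different functional form, using scikit-learn's nearest-neighbor-based estimator on synthetic data; comparing such estimates across forms and noise levels is the kind of numerical exercise behind the equitability comparisons discussed above.

        import numpy as np
        from sklearn.feature_selection import mutual_info_regression

        rng = np.random.default_rng(1)
        n = 5000
        x = rng.uniform(-1, 1, n)
        noise = rng.normal(0, 0.2, n)

        # Two different functional forms with the same additive noise level.
        y_linear = x + noise
        y_sine = np.sin(3 * np.pi * x) + noise

        mi_lin = mutual_info_regression(x.reshape(-1, 1), y_linear)[0]
        mi_sin = mutual_info_regression(x.reshape(-1, 1), y_sine)[0]
        print(f"MI(linear) = {mi_lin:.2f} nats, MI(sine) = {mi_sin:.2f} nats")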

  17. Student Incivility in Radiography Education.

    PubMed

    Clark, Kevin R

    2017-07-01

    To examine student incivility in radiography classrooms by exploring the prevalence of uncivil behaviors along with the classroom management strategies educators use to manage and prevent classroom disruptions. A survey was designed to collect data on the severity and frequency of uncivil student behaviors, classroom management strategies used to address minor and major behavioral issues, and techniques to prevent student incivility. The participants were educators in radiography programs accredited by the Joint Review Committee on Education in Radiologic Technology. Findings indicated that severe uncivil student behaviors in radiography classrooms do not occur as often as behaviors classified as less severe. Radiography educators in this study used a variety of strategies and techniques to manage and prevent student incivility; however, radiography educators who received formal training in classroom management reported fewer incidents of student incivility than those who had not received formal training. The participants in this study took a proactive approach to addressing severe behavioral issues in the classroom. Many radiography educators transition from the clinical environment to the classroom setting with little to no formal training in classroom management. Radiography educators are encouraged to attend formal training sessions to learn how to manage the higher education classroom effectively. Student incivility is present in radiography classrooms. This study provides a foundation for future research on incivility. ©2017 American Society of Radiologic Technologists.

  18. Formal Requirements-Based Programming for Complex Systems

    NASA Technical Reports Server (NTRS)

    Rash, James L.; Hinchey, Michael G.; Rouff, Christopher A.; Gracanin, Denis

    2005-01-01

    Computer science as a field has not yet produced a general method to mechanically transform complex computer system requirements into a provably equivalent implementation. Such a method would be one major step towards dealing with complexity in computing, yet it remains the elusive holy grail of system development. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that such tools and methods leave unfilled is that the formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of complex systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations. While other techniques are available, this method is unique in offering full mathematical tractability while using notations and techniques that are well known and well trusted. We illustrate the application of the method to an example procedure from the Hubble Robotic Servicing Mission currently under study and preliminary formulation at NASA Goddard Space Flight Center.

  19. Gluon Bremsstrahlung in Weakly-Coupled Plasmas

    NASA Astrophysics Data System (ADS)

    Arnold, Peter

    2009-11-01

    I report on some theoretical progress concerning the calculation of gluon bremsstrahlung for very high energy particles crossing a weakly-coupled quark-gluon plasma. (i) I advertise that two of the several formalisms used to study this problem, the BDMPS-Zakharov formalism and the AMY formalism (the latter used only for infinite, uniform media), can be made equivalent when appropriately formulated. (ii) A standard technique to simplify calculations is to expand in inverse powers of logarithms ln(E/T). I give an example where such expansions are found to work well for ω/T≳10 where ω is the bremsstrahlung gluon energy. (iii) Finally, I report on perturbative calculations of q̂.

  20. Used battery collection in central Mexico: metal content, legislative/management situation and statistical analysis.

    PubMed

    Guevara-García, José Antonio; Montiel-Corona, Virginia

    2012-03-01

    A statistical analysis of a used battery collection campaign in the state of Tlaxcala, Mexico, is presented. This included a study of the metal composition of spent batteries from formal and informal markets, and a critical discussion about the management of spent batteries in Mexico with respect to legislation. A six-month collection campaign was statistically analyzed: 77% of the battery types were "AA" and 30% of the batteries were from the informal market. A substantial percentage (36%) of batteries had residual voltage in the range 1.2-1.4 V, and 70% had more than 1.0 V; this may reflect underutilization. Metal content analysis and recovery experiments were performed with the five formal and the four most frequent informal trademarks. The analysis of Hg, Cd and Pb showed there is no significant difference in content between formally and informally commercialized batteries. All of the analyzed trademarks were under the permissible limit levels of the proposed Mexican Official Norm (NOM) NMX-AA-104-SCFI-2006 and would be classified as non-hazardous residues (which may be discarded with domestic rubbish); however, compared with the EU directive 2006/66/EC, 8 out of 9 of the selected battery trademarks would be rejected, since the Mexican Norm content limit is 20-, 7.5- and 5-fold higher in Hg, Cd and Pb, respectively, than the EU directive. These results outline the necessity for better regulatory criteria in the proposed Mexican NOM in order to minimize the impact on human health and the environment of this type of residue. Copyright © 2010 Elsevier Ltd. All rights reserved.

  1. Quantum Statistics of the Toda Oscillator in the Wigner Function Formalism

    NASA Astrophysics Data System (ADS)

    Vojta, Günter; Vojta, Matthias

    Classical and quantum mechanical Toda systems (Toda molecules, Toda lattices, Toda quantum fields) have recently attracted growing interest as nonlinear systems showing solitons and chaos. In this paper the statistical thermodynamics of a system of quantum mechanical Toda oscillators characterized by a potential energy V(q) = Vo cosh q is treated within the Wigner function formalism (phase space formalism of quantum statistics). The partition function is given as a Wigner-Kirkwood series expansion in terms of powers of h^2 (semiclassical expansion). The partition function and all thermodynamic functions are written, with considerable exactness, as simple closed expressions containing only the modified Hankel functions Ko and K1 of the purely imaginary argument i Vo/kT.

  2. A Mode-Shape-Based Fault Detection Methodology for Cantilever Beams

    NASA Technical Reports Server (NTRS)

    Tejada, Arturo

    2009-01-01

    An important goal of NASA's Integrated Vehicle Health Management (IVHM) program is to develop and verify methods and technologies for fault detection in critical airframe structures. A particularly promising new technology under development at NASA Langley Research Center is distributed Bragg fiber optic strain sensors. These sensors can be embedded in, for instance, aircraft wings to continuously monitor surface strain during flight. Strain information can then be used in conjunction with well-known vibrational techniques to detect faults due to changes in the wing's physical parameters or to the presence of incipient cracks. To verify the benefits of this technology, the Formal Methods Group at NASA LaRC has proposed the use of formal verification tools such as PVS. The verification process, however, requires knowledge of the physics and mathematics of the vibrational techniques and a clear understanding of the particular fault detection methodology. This report presents a succinct review of the physical principles behind the modeling of vibrating structures such as cantilever beams (the natural model of a wing). It also reviews two different classes of fault detection techniques and proposes a particular detection method for cracks in wings, which is amenable to formal verification. A prototype implementation of these methods using Matlab scripts is also described and is related to the fundamental theoretical concepts.
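
    A brief sketch of the textbook vibrational model referred to: the natural frequencies of a uniform Euler-Bernoulli cantilever follow from the roots of its characteristic equation. The beam properties below are hypothetical, and this is not the report's detection algorithm itself; it only shows the quantities (frequencies, mode shapes) whose shifts a crack would produce.

        import numpy as np
        from scipy.optimize import brentq

        # Characteristic equation of a uniform Euler-Bernoulli cantilever beam:
        # 1 + cos(beta*L) * cosh(beta*L) = 0; its roots set the natural frequencies.
        f = lambda x: 1.0 + np.cos(x) * np.cosh(x)
        roots = [brentq(f, a, b) for a, b in [(1, 2), (4, 5), (7, 8)]]  # beta_n * L

        # Hypothetical aluminium strip: E (Pa), I (m^4), rho*A (kg/m), L (m).
        E, I, rhoA, L = 69e9, 2.1e-10, 0.27, 0.5
        omega = [(bl / L) ** 2 * np.sqrt(E * I / rhoA) for bl in roots]  # rad/s
        print("first natural frequencies (Hz):",
              [round(w / (2 * np.pi), 1) for w in omega])
        # A crack lowers the local stiffness, shifting these frequencies and the
        # associated mode shapes; monitoring such shifts is the basis of
        # vibration-based fault detection.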

  3. The Statistical Mechanics of Solar Wind Hydroxylation at the Moon, Within Lunar Magnetic Anomalies, and at Phobos

    NASA Technical Reports Server (NTRS)

    Farrell, W. M.; Hurley, D. M.; Esposito, V. J.; Mclain, J. L.; Zimmerman, M. I.

    2017-01-01

    We present a new formalism to describe the outgassing of hydrogen initially implanted by solar wind protons into exposed soils on airless bodies. The formalism applies a statistical mechanics approach similar to that applied recently to molecular adsorption onto activated surfaces. The key element enabling this formalism is the recognition that the interatomic potential between the implanted H and regolith-residing oxides is not of singular value but possesses a distribution of trapped energy values at a given temperature, F(U,T). All subsequent derivations of the outward diffusion and H retention rely on the specific properties of this distribution. We find that solar wind hydrogen can be retained if there are sites in the implantation layer with activation energy values exceeding 0.5 eV. We examine in particular the dependence of H retention using characteristic energy values found previously for irradiated silica and mature lunar samples. We also apply the formalism to two cases that differ from typical solar wind implantation at the Moon. First, we test a case of implantation in magnetic anomaly regions, where significantly lower-energy ions of solar wind origin are expected to be incident on the surface. In magnetic anomalies, H retention is found to be reduced due to the reduced ion flux and shallower depth of implantation. Second, we also apply the model to Phobos, where the surface temperature range is not as extreme as at the Moon. We find the H atom retention in this second case is higher than in the lunar case due to the reduced thermal extremes (which reduce outgassing).
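
    A toy illustration of why a distribution of trap energies matters, assuming simple first-order thermal release with a single attempt frequency; the Gaussian F(U), temperature, and residence time below are placeholders rather than the paper's fitted parameters.

        import numpy as np

        kB = 8.617e-5          # Boltzmann constant, eV/K
        nu = 1e13              # attempt frequency, 1/s (typical order of magnitude)
        T = 300.0              # illustrative surface temperature, K
        t = 3600.0 * 24.0      # illustrative residence time of one Earth day, s

        # Hypothetical Gaussian distribution of trap activation energies F(U), in eV.
        U = np.linspace(0.1, 2.0, 2000)
        dU = U[1] - U[0]
        F = np.exp(-0.5 * ((U - 0.9) / 0.25) ** 2)
        F /= F.sum() * dU

        # First-order thermal release: a trap of depth U empties at rate nu*exp(-U/(kB*T)),
        # so only traps deeper than roughly U_c = kB*T*ln(nu*t) still hold H after time t.
        surviving = np.exp(-t * nu * np.exp(-U / (kB * T)))
        retention = np.sum(F * surviving) * dU
        U_c = kB * T * np.log(nu * t)
        print(f"retained fraction: {retention:.2f}  (cutoff U_c ~ {U_c:.2f} eV)")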

  4. The specification-based validation of reliable multicast protocol: Problem Report. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Wu, Yunqing

    1995-01-01

    Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP multicasting. In this report, we develop formal models for RMP using existing automated verification systems, and perform validation on the formal RMP specifications. The validation analysis helped identify some minor specification and design problems. We also use the formal models of RMP to generate a test suite for conformance testing of the implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress of implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding for the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.

  5. The trephine colostomy: a permanent left iliac fossa end colostomy without recourse to laparotomy.

    PubMed Central

    Senapati, A.; Phillips, R. K.

    1991-01-01

    An operative technique for performing a permanent end sigmoid colostomy without recourse to laparotomy is presented. The results from 16 patients have shown a very low morbidity. The technique was unsuccessful in three patients, each needing a formal laparotomy. PMID:1929133

  6. Simulation Techniques in Training College Administrators.

    ERIC Educational Resources Information Center

    Fincher, Cameron

    Traditional methods of recruitment and selection in academic administration have not placed an emphasis on formal training or preparation but have relied heavily on informal notions of experiential learning. Simulation as a device for representing complex processes in a manageable form, gaming as an organizing technique for training and…

  7. Rewriting Modulo SMT and Open System Analysis

    NASA Technical Reports Server (NTRS)

    Rocha, Camilo; Meseguer, Jose; Munoz, Cesar

    2014-01-01

    This paper proposes rewriting modulo SMT, a new technique that combines the power of SMT solving, rewriting modulo theories, and model checking. Rewriting modulo SMT is ideally suited to model and analyze infinite-state open systems, i.e., systems that interact with a non-deterministic environment. Such systems exhibit both internal non-determinism, which is proper to the system, and external non-determinism, which is due to the environment. In a reflective formalism, such as rewriting logic, rewriting modulo SMT can be reduced to standard rewriting. Hence, rewriting modulo SMT naturally extends rewriting-based reachability analysis techniques, which are available for closed systems, to open systems. The proposed technique is illustrated with the formal analysis of: (i) a real-time system that is beyond the scope of timed-automata methods and (ii) automatic detection of reachability violations in a synchronous language developed to support autonomous spacecraft operations.

  8. An Integrated Environment for Efficient Formal Design and Verification

    NASA Technical Reports Server (NTRS)

    1998-01-01

    The general goal of this project was to improve the practicality of formal methods by combining techniques from model checking and theorem proving. At the time the project was proposed, the model checking and theorem proving communities were applying different tools to similar problems, but there was not much cross-fertilization. This project involved a group from SRI that had substantial experience in the development and application of theorem-proving technology, and a group at Stanford that specialized in model checking techniques. Now, over five years after the proposal was submitted, there are many research groups working on combining theorem-proving and model checking techniques, and much more communication between the model checking and theorem proving research communities. This project contributed significantly to this research trend. The research work under this project covered a variety of topics: new theory and algorithms; prototype tools; verification methodology; and applications to problems in particular domains.

  9. What Influences Mental Illness? Discrepancies Between Medical Education and Conception.

    PubMed

    Einstein, Evan Hy; Klepacz, Lidia

    2017-01-01

    This preliminary study examined the differences between what was taught during a formal medical education and medical students' and psychiatry residents' conceptions of notions regarding the causes and determinants of mental illness. The authors surveyed 74 medical students and 11 residents via convenience sampling. The survey contained 18 statements which were rated twice based on truthfulness in terms of a participant's formal education and conception, respectively. Descriptive statistics and a Wilcoxon signed rank test determined differences between education and conception. Results showed that students were less likely to perceive a neurotransmitter imbalance to cause mental illness, as opposed to what was emphasized during a formal medical education. Students and residents also understood the importance of factors such as systemic racism and socioeconomic status in the development of mental illness, which were factors that did not receive heavy emphasis during medical education. Furthermore, students and residents believed that not only did mental illnesses have nonuniform pathologies, but that the Diagnostic and Statistical Manual of Mental Disorders also had the propensity to sometimes arbitrarily categorize individuals with potentially negative consequences. If these notions are therefore part of students' and residents' conceptions, as well as documented in the literature, then it seems appropriate for medical education to be further developed to emphasize these ideas.

  10. Towards a formal semantics for Ada 9X

    NASA Technical Reports Server (NTRS)

    Guaspari, David; Mchugh, John; Wolfgang, Polak; Saaltink, Mark

    1995-01-01

    The Ada 9X language precision team was formed during the revisions of Ada 83, with the goal of analyzing the proposed design, identifying problems, and suggesting improvements, through the use of mathematical models. This report defines a framework for formally describing Ada 9X, based on Kahn's 'natural semantics', and applies the framework to portions of the language. The proposals for exceptions and optimization freedoms are also analyzed, using a different technique.

  11. Toward a Formal Evaluation of Refactorings

    NASA Technical Reports Server (NTRS)

    Paul, John; Kuzmina, Nadya; Gamboa, Ruben; Caldwell, James

    2008-01-01

    Refactoring is a software development strategy that characteristically alters the syntactic structure of a program without changing its external behavior [2]. In this talk we present a methodology for extracting formal models from programs in order to evaluate how incremental refactorings affect the verifiability of their structural specifications. We envision that this same technique may be applicable to other types of properties such as those that concern the design and maintenance of safety-critical systems.

  12. (abstract) Formal Inspection Technology Transfer Program

    NASA Technical Reports Server (NTRS)

    Welz, Linda A.; Kelly, John C.

    1993-01-01

    A Formal Inspection Technology Transfer Program, based on the inspection process developed by Michael Fagan at IBM, has been developed at JPL. The goal of this program is to support organizations wishing to use Formal Inspections to improve the quality of software and system level engineering products. The Technology Transfer Program provides start-up materials and assistance to help organizations establish their own Formal Inspection program. The course materials and certified instructors associated with the Technology Transfer Program have proven to be effective in classes taught at other NASA centers as well as at JPL. Formal Inspections (NASA tailored Fagan Inspections) are a set of technical reviews whose objective is to increase quality and reduce the cost of software development by detecting and correcting errors early. A primary feature of inspections is the removal of engineering errors before they amplify into larger and more costly problems downstream in the development process. Note that the word 'inspection' is used differently in software than in a manufacturing context. A Formal Inspection is a front-end quality enhancement technique, rather than a task conducted just prior to product shipment for the purpose of sorting defective systems (manufacturing usage). Formal Inspections are supporting and in agreement with the 'total quality' approach being adopted by many NASA centers.

  13. Neuroimaging Week: A Novel, Engaging, and Effective Curriculum for Teaching Neuroimaging to Junior Psychiatric Residents

    ERIC Educational Resources Information Center

    Downar, Jonathan; Krizova, Adriana; Ghaffar, Omar; Zaretsky, Ari

    2010-01-01

    Objective: Neuroimaging techniques are increasingly important in psychiatric research and clinical practice, but few postgraduate psychiatry programs offer formal training in neuroimaging. To address this need, the authors developed a course to prepare psychiatric residents to use neuroimaging techniques effectively in independent practice.…

  14. INTERLABORATORY STUDY OF THE COLD VAPOR TECHNIQUE FOR TOTAL MERCURY IN WATER

    EPA Science Inventory

    The American Society for Testing and Materials (ASTM) and the U.S. Environmental Protection Agency (EPA) conducted a joint study of the cold vapor technique for total mercury in water, before formal acceptance of the method by each organization. The method employs an acid-permang...

  15. 41 CFR 60-3.6 - Use of selection procedures which have not been validated.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... validation techniques contemplated by these guidelines. In such circumstances, the user should utilize... a formal and scored selection procedure is used which has an adverse impact, the validation... user cannot or need not follow the validation techniques anticipated by these guidelines, the user...

  16. 41 CFR 60-3.6 - Use of selection procedures which have not been validated.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... validation techniques contemplated by these guidelines. In such circumstances, the user should utilize... a formal and scored selection procedure is used which has an adverse impact, the validation... user cannot or need not follow the validation techniques anticipated by these guidelines, the user...

  17. 41 CFR 60-3.6 - Use of selection procedures which have not been validated.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... validation techniques contemplated by these guidelines. In such circumstances, the user should utilize... a formal and scored selection procedure is used which has an adverse impact, the validation... user cannot or need not follow the validation techniques anticipated by these guidelines, the user...

  18. 41 CFR 60-3.6 - Use of selection procedures which have not been validated.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... validation techniques contemplated by these guidelines. In such circumstances, the user should utilize... a formal and scored selection procedure is used which has an adverse impact, the validation... user cannot or need not follow the validation techniques anticipated by these guidelines, the user...

  19. A Bayesian analysis of redshifted 21-cm H I signal and foregrounds: simulations for LOFAR

    NASA Astrophysics Data System (ADS)

    Ghosh, Abhik; Koopmans, Léon V. E.; Chapman, E.; Jelić, V.

    2015-09-01

    Observations of the epoch of reionization (EoR) using the 21-cm hyperfine emission of neutral hydrogen (H I) promise to open an entirely new window on the formation of the first stars, galaxies and accreting black holes. In order to characterize the weak 21-cm signal, we need to develop imaging techniques that can reconstruct the extended emission very precisely. Here, we present an inversion technique for LOw Frequency ARray (LOFAR) baselines at the North Celestial Pole (NCP), based on a Bayesian formalism with optimal spatial regularization, which is used to reconstruct the diffuse foreground map directly from the simulated visibility data. We notice that the spatial regularization de-noises the images to a large extent, allowing one to recover the 21-cm power spectrum over a considerable k⊥-k∥ space in the range 0.03 Mpc^-1 < k⊥ < 0.19 Mpc^-1 and 0.14 Mpc^-1 < k∥ < 0.35 Mpc^-1 without subtracting the noise power spectrum. We find that, in combination with using generalized morphological component analysis (GMCA), a non-parametric foreground removal technique, we can mostly recover the spherical average power spectrum within 2σ statistical fluctuations for an input Gaussian random root-mean-square noise level of 60 mK in the maps after 600 h of integration over a 10-MHz bandwidth.
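
    The structure of such a regularized Bayesian map estimate can be sketched in one dimension: a maximum a posteriori solution of noisy Fourier-type "visibilities" with a smoothness prior. This toy model is only an analogue of the LOFAR NCP inversion; every quantity below is synthetic.

        import numpy as np

        rng = np.random.default_rng(7)

        # Toy 1-D analogue of map making: visibilities are noisy Fourier-type
        # measurements of a smooth "sky" signal defined on npix pixels.
        npix, nvis, sigma = 64, 40, 0.5
        x = np.linspace(0, 1, npix)
        sky = np.sin(2 * np.pi * x) + 0.5 * np.sin(6 * np.pi * x)

        u = rng.uniform(-16, 16, nvis)                   # sampled baselines
        A = np.exp(-2j * np.pi * np.outer(u, x)) / npix  # measurement operator
        vis = A @ sky + sigma * (rng.normal(size=nvis) + 1j * rng.normal(size=nvis))

        Ninv = np.eye(nvis) / sigma**2                   # inverse noise covariance
        # Spatial regularization: penalize second differences (a smoothness prior).
        D = np.diff(np.eye(npix), n=2, axis=0)
        lam = 1e-2
        R = lam * D.T @ D

        lhs = A.conj().T @ Ninv @ A + R
        rhs = A.conj().T @ Ninv @ vis
        map_hat = np.real(np.linalg.solve(lhs, rhs))     # MAP / Tikhonov estimate
        print("reconstruction rms error:", np.round(np.std(map_hat - sky), 3))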

  20. Towards representing human behavior and decision making in Earth system models - an overview of techniques and approaches

    NASA Astrophysics Data System (ADS)

    Müller-Hansen, Finn; Schlüter, Maja; Mäs, Michael; Donges, Jonathan F.; Kolb, Jakob J.; Thonicke, Kirsten; Heitzig, Jobst

    2017-11-01

    Today, humans have a critical impact on the Earth system and vice versa, which can generate complex feedback processes between social and ecological dynamics. Integrating human behavior into formal Earth system models (ESMs), however, requires crucial modeling assumptions about actors and their goals, behavioral options, and decision rules, as well as modeling decisions regarding human social interactions and the aggregation of individuals' behavior. Here, we review existing modeling approaches and techniques from various disciplines and schools of thought dealing with human behavior at different levels of decision making. We demonstrate modelers' often vast degrees of freedom but also seek to make modelers aware of the often crucial consequences of seemingly innocent modeling assumptions. After discussing which socioeconomic units are potentially important for ESMs, we compare models of individual decision making that correspond to alternative behavioral theories and that make diverse modeling assumptions about individuals' preferences, beliefs, decision rules, and foresight. We review approaches to model social interaction, covering game theoretic frameworks, models of social influence, and network models. Finally, we discuss approaches to studying how the behavior of individuals, groups, and organizations can aggregate to complex collective phenomena, discussing agent-based, statistical, and representative-agent modeling and economic macro-dynamics. We illustrate the main ingredients of modeling techniques with examples from land-use dynamics as one of the main drivers of environmental change bridging local to global scales.
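    As a minimal illustration of one class of techniques surveyed here (models of social influence on networks), the sketch below implements a DeGroot-style averaging process in which agents repeatedly adopt a weighted mean of their neighbours' opinions. The network, weights, and the interpretation of the opinion variable are all hypothetical.

```python
import numpy as np

# DeGroot-style social-influence model on a small random network:
# each agent repeatedly averages its opinion with its neighbours'.
rng = np.random.default_rng(5)
n_agents = 20
adjacency = (rng.random((n_agents, n_agents)) < 0.2).astype(float)
np.fill_diagonal(adjacency, 1.0)                             # agents also weight themselves
weights = adjacency / adjacency.sum(axis=1, keepdims=True)   # row-stochastic influence weights

opinions = rng.random(n_agents)   # e.g. willingness to adopt a new land use
for step in range(50):
    opinions = weights @ opinions  # repeated neighbourhood averaging

print("opinion spread after influence dynamics:",
      float(opinions.max() - opinions.min()))
```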

  1. A statistical rain attenuation prediction model with application to the advanced communication technology satellite project. 1: Theoretical development and application to yearly predictions for selected cities in the United States

    NASA Technical Reports Server (NTRS)

    Manning, Robert M.

    1986-01-01

    A rain attenuation prediction model is described for use in calculating satellite communication link availability for any specific location in the world that is characterized by an extended record of rainfall. Such a formalism is necessary for the accurate assessment of such availability predictions in the case of the small user-terminal concept of the Advanced Communication Technology Satellite (ACTS) Project. The model employs the theory of extreme value statistics to generate the necessary statistical rainrate parameters from rain data in the form compiled by the National Weather Service. These location dependent rain statistics are then applied to a rain attenuation model to obtain a yearly prediction of the occurrence of attenuation on any satellite link at that location. The predictions of this model are compared to those of the Crane Two-Component Rain Model and some empirical data and found to be very good. The model is then used to calculate rain attenuation statistics at 59 locations in the United States (including Alaska and Hawaii) for the 20 GHz downlinks and 30 GHz uplinks of the proposed ACTS system. The flexibility of this modeling formalism is such that it allows a complete and unified treatment of the temporal aspects of rain attenuation that leads to the design of an optimum stochastic power control algorithm, the purpose of which is to efficiently counter such rain fades on a satellite link.
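    The extreme-value step described above can be illustrated with a short sketch: fitting a Gumbel (Type I extreme value) distribution to hypothetical annual-maximum rain rates and reading off exceedance probabilities and return levels. This is a generic illustration of extreme value statistics with SciPy, not the ACTS model itself; the data values and thresholds are made up.

```python
import numpy as np
from scipy import stats

# Hypothetical annual-maximum rain rates (mm/h) for one site;
# real inputs would come from National Weather Service records.
annual_max_rain_rate = np.array([48., 61., 55., 73., 50., 66., 58., 81.,
                                 69., 54., 62., 77., 59., 70., 65.])

# Fit a Gumbel (Type I extreme value) distribution to the annual maxima.
loc, scale = stats.gumbel_r.fit(annual_max_rain_rate)

# Probability that a given rain rate is exceeded by the annual maximum.
r = 100.0  # mm/h
p_exceed = stats.gumbel_r.sf(r, loc=loc, scale=scale)
print(f"P(annual max > {r} mm/h) = {p_exceed:.3f}")

# Rain rate with a 10-year return period (exceeded on average once per decade).
r_10yr = stats.gumbel_r.isf(1.0 / 10.0, loc=loc, scale=scale)
print(f"10-year return-level rain rate ~ {r_10yr:.1f} mm/h")
```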

  2. Differential gene expression detection and sample classification using penalized linear regression models.

    PubMed

    Wu, Baolin

    2006-02-15

    Differential gene expression detection and sample classification using microarray data have received much research interest recently. Owing to the large number of genes p and small number of samples n (p > n), microarray data analysis poses big challenges for statistical analysis. An obvious problem owing to the 'large p small n' setting is over-fitting: just by chance, we are likely to find some non-differentially expressed genes that can classify the samples very well. The idea of shrinkage is to regularize the model parameters to reduce the effects of noise and produce reliable inferences. Shrinkage has been successfully applied in microarray data analysis. The SAM statistics proposed by Tusher et al. and the 'nearest shrunken centroid' proposed by Tibshirani et al. are ad hoc shrinkage methods. Both methods are simple, intuitive, and prove to be useful in empirical studies. Recently Wu proposed the penalized t/F-statistics with shrinkage by formally using L1-penalized linear regression models for two-class microarray data, showing good performance. In this paper we systematically discuss the use of penalized regression models for analyzing microarray data. We generalize the two-class penalized t/F-statistics proposed by Wu to multi-class microarray data. We formally derive the ad hoc shrunken centroid used by Tibshirani et al. using L1-penalized regression models, and we show that penalized linear regression models provide a rigorous and unified statistical framework for sample classification and differential gene expression detection.
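    A minimal sketch of the general idea, an L1 (lasso-type) penalty that shrinks most gene coefficients exactly to zero and thereby couples classification with differential-gene selection, is given below using scikit-learn's logistic regression rather than the paper's own penalized t/F-statistics; the simulated expression matrix and all parameter choices are assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_samples, n_genes = 40, 2000            # "large p, small n"
X = rng.standard_normal((n_samples, n_genes))
y = rng.integers(0, 2, size=n_samples)   # two-class labels
X[y == 1, :10] += 1.5                    # 10 truly differential genes

# L1 (lasso-type) penalty shrinks most gene coefficients exactly to zero,
# giving simultaneous classification and differential-gene selection.
clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
clf.fit(X, y)

selected = np.flatnonzero(clf.coef_[0])
print(f"{selected.size} genes kept out of {n_genes}")
print("cross-validated accuracy:",
      cross_val_score(LogisticRegression(penalty="l1", solver="liblinear", C=0.1),
                      X, y, cv=5).mean())
```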

  3. Immersive Theater - a Proven Way to Enhance Learning Retention

    NASA Astrophysics Data System (ADS)

    Reiff, P. H.; Zimmerman, L.; Spillane, S.; Sumners, C.

    2014-12-01

    The portable immersive theater has gone from our first demonstration at fall AGU 2003 to a product offered by multiple companies in various versions to literally millions of users per year. As part of our NASA funded outreach program, we conducted a test of learning in a portable Discovery Dome as contrasted with learning the same materials (visuals and sound track) on a computer screen. We tested 200 middle school students (primarily underserved minorities). Paired t-tests and an independent t-test were used to compare the amount of learning that students achieved. Interest questionnaires were administered to participants in formal (public school) settings and focus groups were conducted in informal (museum camp and educational festival) settings. Overall results from the informal and formal educational setting indicated that there was a statistically significant increase in test scores after viewing We Choose Space. There was a statistically significant increase in test scores for students who viewed We Choose Space in the portable Discovery Dome (9.75) as well as with the computer (8.88). However, long-term retention of the material tested on the questionnaire indicated that for students who watched We Choose Space in the portable Discovery Dome, there was a statistically significant long-term increase in test scores (10.47), whereas, six weeks after learning on the computer, the improvements over the initial baseline (3.49) were far less and were not statistically significant. The test score improvement six weeks after learning in the dome was essentially the same as the post test immediately after watching the show, demonstrating virtually no loss of gained information in the six week interval. In the formal educational setting, approximately 34% of the respondents indicated that they wanted to learn more about becoming a scientist, while 35% expressed an interest in a career in space science. In the informal setting, 26% indicated that they were interested in pursuing a career in space science.
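    For readers unfamiliar with the tests mentioned above, the sketch below shows paired and independent t-tests on hypothetical pre/post scores with SciPy; the score distributions and effect sizes are invented and do not reproduce the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Hypothetical pre/post test scores for the two viewing conditions.
pre_dome  = rng.normal(12, 3, 100)
post_dome = pre_dome + rng.normal(10, 3, 100)   # large gain in the dome
pre_pc    = rng.normal(12, 3, 100)
post_pc   = pre_pc + rng.normal(9, 3, 100)      # similar immediate gain on a computer

# Paired t-test: did scores improve within each condition?
t_dome, p_dome = stats.ttest_rel(post_dome, pre_dome)
t_pc,   p_pc   = stats.ttest_rel(post_pc, pre_pc)

# Independent t-test: do the gains differ between conditions?
t_between, p_between = stats.ttest_ind(post_dome - pre_dome, post_pc - pre_pc)

print(f"dome gain:             t={t_dome:.2f}, p={p_dome:.1e}")
print(f"computer gain:         t={t_pc:.2f}, p={p_pc:.1e}")
print(f"dome vs computer gain: t={t_between:.2f}, p={p_between:.3f}")
```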

  4. Superstatistics of the Klein-Gordon equation in deformed formalism for modified Dirac delta distribution

    NASA Astrophysics Data System (ADS)

    Sargolzaeipor, S.; Hassanabadi, H.; Chung, W. S.

    2018-04-01

    The Klein-Gordon equation is extended in the presence of an Aharonov-Bohm magnetic field for the Cornell potential, and the corresponding wave functions as well as the spectra are obtained. After introducing superstatistics in statistical mechanics, we first derive the effective Boltzmann factor in the deformed formalism with a modified Dirac delta distribution. We then use the concepts of superstatistics to calculate the thermodynamic properties of the system. The well-known results are recovered when the deformation parameter vanishes, and some graphs are plotted to clarify the results.

  5. Investigating Actuation Force Fight with Asynchronous and Synchronous Redundancy Management Techniques

    NASA Technical Reports Server (NTRS)

    Hall, Brendan; Driscoll, Kevin; Schweiker, Kevin; Dutertre, Bruno

    2013-01-01

    Within distributed fault-tolerant systems the term force-fight is colloquially used to describe the level of command disagreement present at redundant actuation interfaces. This report details an investigation of force-fight using three distributed system case-study architectures. Each case study architecture is abstracted and formally modeled using the Symbolic Analysis Laboratory (SAL) tool chain from the Stanford Research Institute (SRI). We use the formal SAL models to produce k-induction based proofs of a bounded actuation agreement property. We also present a mathematically derived bound of redundant actuation agreement for sine-wave stimulus. The report documents our experiences and lessons learned developing the formal models and the associated proofs.

  6. Teaching Astronomy in non-formal education: stars workshop

    NASA Astrophysics Data System (ADS)

    Hernán-Obispo, M.; Crespo-Chacón, I.; Gálvez, M. C.; López-Santiago, J.

    One of the fields in which teaching Astronomy is most in demand is non-formal education. The Stars Workshop we present in this contribution consisted of an introduction to Astronomy and observation methods. The main objectives were: to know the main components of the Universe, their characteristics, and the scales of size and time between them; to understand the movement of the different celestial objects; to know the different observational techniques; and to appreciate the different historical explanations of the Earth and the position of Humanity in the Universe. This Stars Workshop was a collaboration with the Escuela de Tiempo Libre Jumavi, a school dedicated to training and non-formal education in the leisure field.

  7. Espacial.com : a cooperative learning model in internet

    NASA Astrophysics Data System (ADS)

    Perez-Poch, A.; Solans, R.

    Espacial.com is the oldest and leading Spanish-language internet site reporting around the clock on space exploration. It is also the only specialized site that has broadcast the main space events of recent years live, with expert commentary in Spanish. From its first day, education has been the main purpose of the site, always with an international and multidisciplinary approach. Fernando Caldeiro, a Class 16 NASA Astronaut, is the leading figure in the project, with a constant presence in the forums making valuable comments and answering questions from its young audience. We analyse the ongoing dynamics in the forum and how a virtual community of space enthusiasts is created. We show that, because of the presence of some key factors (leadership, commitment to excel, motivation, communicative skills, etc.), it is possible to establish a high degree of commitment to learning, albeit in a non-formal way. Cooperative learning is a well-known pedagogical technique which has proven its efficacy in different formal and non-formal areas. Using internet capabilities, this technique proves to be an excellent approach to educational outreach on space-related subjects.

  8. Composing, Analyzing and Validating Software Models

    NASA Astrophysics Data System (ADS)

    Sheldon, Frederick T.

    1998-10-01

    This research has been conducted at the Computational Sciences Division of the Information Sciences Directorate at Ames Research Center (Automated Software Engineering Group). The principal work this summer has been to review and refine the agenda carried forward from last summer. Formal specifications provide good support for designing a functionally correct system; however, they are weak at incorporating non-functional performance requirements (like reliability). Techniques which utilize stochastic Petri nets (SPNs) are good for evaluating the performance and reliability of a system, but they may be too abstract and cumbersome from the standpoint of specifying and evaluating functional behavior. Therefore, one major objective of this research is to provide an integrated approach to assist the user in specifying both functionality (qualitative: mutual exclusion and synchronization) and performance requirements (quantitative: reliability and execution deadlines). In this way, the merits of a powerful modeling technique for performability analysis (using SPNs) can be combined with a well-defined formal specification language. In doing so, we can come closer to providing a formal approach to designing a functionally correct system that meets reliability and performance goals.

  9. Composing, Analyzing and Validating Software Models

    NASA Technical Reports Server (NTRS)

    Sheldon, Frederick T.

    1998-01-01

    This research has been conducted at the Computational Sciences Division of the Information Sciences Directorate at Ames Research Center (Automated Software Engineering Group). The principal work this summer has been to review and refine the agenda carried forward from last summer. Formal specifications provide good support for designing a functionally correct system; however, they are weak at incorporating non-functional performance requirements (like reliability). Techniques which utilize stochastic Petri nets (SPNs) are good for evaluating the performance and reliability of a system, but they may be too abstract and cumbersome from the standpoint of specifying and evaluating functional behavior. Therefore, one major objective of this research is to provide an integrated approach to assist the user in specifying both functionality (qualitative: mutual exclusion and synchronization) and performance requirements (quantitative: reliability and execution deadlines). In this way, the merits of a powerful modeling technique for performability analysis (using SPNs) can be combined with a well-defined formal specification language. In doing so, we can come closer to providing a formal approach to designing a functionally correct system that meets reliability and performance goals.

  10. Verification of NASA Emergent Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy K. C. S.; Truszkowski, Walt; Rash, James; Hinchey, Mike

    2004-01-01

    NASA is studying advanced technologies for a future robotic exploration mission to the asteroid belt. This mission, the prospective ANTS (Autonomous Nano Technology Swarm) mission, will comprise 1,000 autonomous robotic agents designed to cooperate in asteroid exploration. The emergent properties of swarm-type missions make them powerful, but at the same time more difficult to design and to assure that the proper behaviors will emerge. We are currently investigating formal methods and techniques for verification and validation of future swarm-based missions. The advantage of using formal methods is their ability to mathematically assure the behavior of a swarm, emergent or otherwise. The ANTS mission is being used as an example and case study of swarm-based missions on which to experiment with and test current formal methods for intelligent swarms. Using the ANTS mission, we have evaluated multiple formal methods to determine their effectiveness in modeling and assuring swarm behavior.

  11. A Formal Approach to Requirements-Based Programming

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2005-01-01

    No significant general-purpose method is currently available to mechanically transform system requirements into a provably equivalent model. The widespread use of such a method represents a necessary step toward high-dependability system engineering for numerous application domains. Current tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The "gap" unfilled by such tools and methods is that the formal models cannot be proven to be equivalent to the requirements. We offer a method for mechanically transforming requirements into a provably equivalent formal model that can be used as the basis for code generation and other transformations. This method is unique in offering full mathematical tractability while using notations and techniques that are well known and well trusted. Finally, we describe further application areas we are investigating for use of the approach.

  12. Twenty-five years of maximum-entropy principle

    NASA Astrophysics Data System (ADS)

    Kapur, J. N.

    1983-04-01

    The strengths and weaknesses of the maximum entropy principle (MEP) are examined and some challenging problems that remain outstanding at the end of the first quarter century of the principle are discussed. The original formalism of the MEP is presented and its relationship to statistical mechanics is set forth. The use of MEP for characterizing statistical distributions, in statistical inference, nonlinear spectral analysis, transportation models, population density models, models for brand-switching in marketing and vote-switching in elections is discussed. Its application to finance, insurance, image reconstruction, pattern recognition, operations research and engineering, biology and medicine, and nonparametric density estimation is considered.
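    A minimal numerical sketch of the MEP is given below: among all distributions on a finite support with a prescribed mean, the entropy-maximizing one has the Gibbs/exponential form, and the Lagrange multiplier can be found with a one-dimensional root solve. The support and target mean are arbitrary illustrative choices.

```python
import numpy as np
from scipy.optimize import brentq

# Maximum-entropy distribution on the support {1,...,6} (a loaded die)
# subject only to a prescribed mean -- the classic MEP exercise.
x = np.arange(1, 7)
target_mean = 4.5

def mean_given_lambda(lam):
    # MEP with a mean constraint yields p_i proportional to exp(-lam * x_i).
    w = np.exp(-lam * x)
    p = w / w.sum()
    return p @ x - target_mean

# Solve for the Lagrange multiplier that reproduces the observed mean.
lam = brentq(mean_given_lambda, -5.0, 5.0)
p = np.exp(-lam * x)
p /= p.sum()

entropy = -(p * np.log(p)).sum()
print("MaxEnt probabilities:", np.round(p, 4))
print(f"mean = {p @ x:.3f}, entropy = {entropy:.3f} nats")
```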

  13. Interpreter composition issues in the formal verification of a processor-memory module

    NASA Technical Reports Server (NTRS)

    Fura, David A.; Cohen, Gerald C.

    1994-01-01

    This report describes interpreter composition techniques suitable for the formal specification and verification of a processor-memory module using the HOL theorem proving system. The processor-memory module is a multichip subsystem within a fault-tolerant embedded system under development within the Boeing Defense and Space Group. Modeling and verification methods were developed that permit provably secure composition at the transaction-level of specification, significantly reducing the complexity of the hierarchical verification of the system.

  14. Abstract Model of the SATS Concept of Operations: Initial Results and Recommendations

    NASA Technical Reports Server (NTRS)

    Dowek, Gilles; Munoz, Cesar; Carreno, Victor A.

    2004-01-01

    An abstract mathematical model of the concept of operations for the Small Aircraft Transportation System (SATS) is presented. The Concept of Operations consists of several procedures that describe nominal operations for SATS. Several safety properties of the system are proven using formal techniques. The final goal of the verification effort is to show that, under nominal operations, aircraft are safely separated. The abstract model was written and formally verified in the Prototype Verification System (PVS).

  15. Formal Verification at System Level

    NASA Astrophysics Data System (ADS)

    Mazzini, S.; Puri, S.; Mari, F.; Melatti, I.; Tronci, E.

    2009-05-01

    System Level Analysis calls for a language comprehensible to experts with different backgrounds and yet precise enough to support meaningful analyses. SysML is emerging as an effective balance between such conflicting goals. In this paper we outline some of the results obtained on SysML-based system-level functional formal verification in an ESA/ESTEC study carried out in collaboration between INTECS and La Sapienza University of Rome. The study focuses on SysML-based techniques for system-level functional requirements.

  16. An Analytical Framework for Soft and Hard Data Fusion: A Dempster-Shafer Belief Theoretic Approach

    DTIC Science & Technology

    2012-08-01

    fusion. Therefore, we provide a detailed discussion on uncertain data types, their origins and three uncertainty processing formalisms that are popular...suitable membership functions corresponding to the fuzzy sets. 3.2.3 DS Theory The DS belief theory, originally proposed by Dempster, can be thought of as... originated and various imperfections of the source. Uncertainty handling formalisms provide techniques for modeling and working with these uncertain data types

  17. Quality assurance software inspections at NASA Ames: Metrics for feedback and modification

    NASA Technical Reports Server (NTRS)

    Wenneson, G.

    1985-01-01

    Software inspections, a set of formal technical review procedures held at selected key points during software development to find defects in software documents, are described in terms of history, participants, tools, procedures, statistics, and database analysis.

  18. Probability distribution and statistical properties of spherically compensated cosmic regions in ΛCDM cosmology

    NASA Astrophysics Data System (ADS)

    Alimi, Jean-Michel; de Fromont, Paul

    2018-04-01

    The statistical properties of cosmic structures are well known to be strong probes for cosmology. In particular, several studies have tried to use cosmic void number counts to obtain tight constraints on dark energy. In this paper, we model the statistical properties of these regions using the CoSphere formalism (de Fromont & Alimi) in both the primordial and the non-linearly evolved Universe in the standard Λ cold dark matter model. This formalism applies similarly to minima (voids) and maxima (such as DM haloes), which are here considered symmetrically. We first derive the full joint Gaussian distribution of CoSphere's parameters in the Gaussian random field. We recover the results of Bardeen et al. only in the limit where the compensation radius becomes very large, i.e. when the central extremum decouples from its cosmic environment. We compute the probability distribution of the compensation size in this primordial field. We show that this distribution is redshift independent and can be used to model the cosmic void size distribution. We also derive the statistical distribution of the peak parameters introduced by Bardeen et al. and discuss their correlation with the cosmic environment. We show that small central extrema with low density are associated with narrow compensation regions with deep compensation density, while higher central extrema are preferentially located in larger but smoother over/under-massive regions.

  19. The Influence of 16-year-old Students' Gender, Mental Abilities, and Motivation on their Reading and Drawing Submicrorepresentations Achievements

    NASA Astrophysics Data System (ADS)

    Devetak, Iztok; Aleksij Glažar, Saša

    2010-08-01

    Submicrorepresentations (SMRs) are a powerful tool for identifying misconceptions of chemical concepts and for generating proper mental models of chemical phenomena in students' long-term memory during chemical education. The main purpose of the study was to determine which independent variables (gender, formal reasoning abilities, visualization abilities, and intrinsic motivation for learning chemistry) have the greatest influence on students' reading and drawing of SMRs. A total of 386 secondary school students (aged 16.3 years) participated in the study. The instruments used in the study were: a test of Chemical Knowledge, the Test of Logical Thinking, two tests of visualization abilities (Patterns and Rotations), and a questionnaire on Intrinsic Motivation for Learning Science. The results show moderate but statistically significant correlations between students' intrinsic motivation, formal reasoning abilities, and chemical knowledge at the submicroscopic level based on reading and drawing SMRs. Visualization abilities are not statistically significantly correlated with students' success on items that comprise reading or drawing SMRs. It can also be concluded that there is a statistically significant difference between male and female students in solving problems that include reading or drawing SMRs. Based on these statistical results and a content analysis of the sample problems, several educational strategies can be implemented to help students develop adequate mental models of chemical concepts on all three levels of representation.

  20. Wildland Arson as Clandestine Resource Management: A Space-Time Permutation Analysis and Classification of Informal Fire Management Regimes in Georgia, USA

    NASA Astrophysics Data System (ADS)

    Coughlan, Michael R.

    2016-05-01

    Forest managers are increasingly recognizing the value of disturbance-based land management techniques such as prescribed burning. Unauthorized, "arson" fires are common in the southeastern United States where a legacy of agrarian cultural heritage persists amidst an increasingly forest-dominated landscape. This paper reexamines unauthorized fire-setting in the state of Georgia, USA from a historical ecology perspective that aims to contribute to historically informed, disturbance-based land management. A space-time permutation analysis is employed to discriminate systematic, management-oriented unauthorized fires from more arbitrary or socially deviant fire-setting behaviors. This paper argues that statistically significant space-time clusters of unauthorized fire occurrence represent informal management regimes linked to the legacy of traditional land management practices. Recent scholarship has pointed out that traditional management has actively promoted sustainable resource use and, in some cases, enhanced biodiversity often through the use of fire. Despite broad-scale displacement of traditional management during the 20th century, informal management practices may locally circumvent more formal and regionally dominant management regimes. Space-time permutation analysis identified 29 statistically significant fire regimes for the state of Georgia. The identified regimes are classified by region and land cover type and their implications for historically informed disturbance-based resource management are discussed.
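    The study relies on a space-time permutation scan statistic; the sketch below illustrates the underlying permutation logic in a simplified, Knox-style form: hold event locations fixed, permute event times, and compare the observed number of space-time-close pairs with its permutation distribution. The thresholds, coordinates, and the test statistic itself are simplified assumptions, not the scan statistic actually used in the paper.

```python
import numpy as np

def space_time_permutation_test(xy, t, d_max, t_max, n_perm=999, seed=0):
    """Knox-style permutation test for space-time interaction.

    xy    : (n, 2) event coordinates (e.g. km)
    t     : (n,)   event times (e.g. days)
    d_max : spatial closeness threshold
    t_max : temporal closeness threshold
    Returns the observed count of close pairs and its permutation p-value.
    """
    rng = np.random.default_rng(seed)
    d_space = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    close_space = np.triu(d_space <= d_max, k=1)   # each pair counted once

    def close_pairs(times):
        d_time = np.abs(times[:, None] - times[None, :])
        return int(np.sum(close_space & (d_time <= t_max)))

    observed = close_pairs(t)
    # Permuting times while holding locations fixed breaks any space-time
    # linkage but preserves the purely spatial and purely temporal patterns.
    null = np.array([close_pairs(rng.permutation(t)) for _ in range(n_perm)])
    p_value = (1 + np.sum(null >= observed)) / (n_perm + 1)
    return observed, p_value

# Toy usage with synthetic fire records
rng = np.random.default_rng(42)
xy = rng.uniform(0, 100, size=(120, 2))   # ignition points in a 100x100 km area
t = rng.uniform(0, 365, size=120)         # day of year
print(space_time_permutation_test(xy, t, d_max=5.0, t_max=14.0))
```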

  1. Information gains from cosmic microwave background experiments

    NASA Astrophysics Data System (ADS)

    Seehars, Sebastian; Amara, Adam; Refregier, Alexandre; Paranjape, Aseem; Akeret, Joël

    2014-07-01

    To shed light on the fundamental problems posed by dark energy and dark matter, a large number of experiments have been performed and combined to constrain cosmological models. We propose a novel way of quantifying the information gained by updates on the parameter constraints from a series of experiments which can either complement earlier measurements or replace them. For this purpose, we use the Kullback-Leibler divergence or relative entropy from information theory to measure differences in the posterior distributions in model parameter space from a pair of experiments. We apply this formalism to a historical series of cosmic microwave background experiments ranging from Boomerang to WMAP, SPT, and Planck. Considering different combinations of these experiments, we thus estimate the information gain in units of bits and distinguish contributions from the reduction of statistical errors and the "surprise" corresponding to a significant shift of the parameters' central values. For this experiment series, we find individual relative entropy gains ranging from about 1 to 30 bits. In some cases, e.g. when comparing WMAP and Planck results, we find that the gains are dominated by the surprise rather than by improvements in statistical precision. We discuss how this technique provides a useful tool for both quantifying the constraining power of data from cosmological probes and detecting the tensions between experiments.
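    When both posteriors are well approximated by Gaussians, the relative entropy has a closed form; the sketch below evaluates it in bits for two hypothetical two-parameter posteriors. The means and covariances are illustrative numbers, not the constraints from any particular experiment pair.

```python
import numpy as np

def gaussian_relative_entropy_bits(mu1, cov1, mu2, cov2):
    """KL divergence D(P1 || P2) between two multivariate Gaussians, in bits.

    P1 plays the role of the updated posterior and P2 the earlier posterior
    over the same set of model parameters.
    """
    k = len(mu1)
    cov2_inv = np.linalg.inv(cov2)
    diff = np.asarray(mu2) - np.asarray(mu1)
    nats = 0.5 * (np.trace(cov2_inv @ cov1)
                  + diff @ cov2_inv @ diff
                  - k
                  + np.log(np.linalg.det(cov2) / np.linalg.det(cov1)))
    return nats / np.log(2.0)

# Toy 2-parameter example: the update shrinks the errors and shifts the mean.
mu_old, cov_old = np.array([0.96, 0.81]),  np.diag([0.02**2, 0.03**2])
mu_new, cov_new = np.array([0.965, 0.83]), np.diag([0.007**2, 0.012**2])
gain = gaussian_relative_entropy_bits(mu_new, cov_new, mu_old, cov_old)
print(f"information gain ~ {gain:.1f} bits")
```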

  2. Wildland Arson as Clandestine Resource Management: A Space-Time Permutation Analysis and Classification of Informal Fire Management Regimes in Georgia, USA.

    PubMed

    Coughlan, Michael R

    2016-05-01

    Forest managers are increasingly recognizing the value of disturbance-based land management techniques such as prescribed burning. Unauthorized, "arson" fires are common in the southeastern United States where a legacy of agrarian cultural heritage persists amidst an increasingly forest-dominated landscape. This paper reexamines unauthorized fire-setting in the state of Georgia, USA from a historical ecology perspective that aims to contribute to historically informed, disturbance-based land management. A space-time permutation analysis is employed to discriminate systematic, management-oriented unauthorized fires from more arbitrary or socially deviant fire-setting behaviors. This paper argues that statistically significant space-time clusters of unauthorized fire occurrence represent informal management regimes linked to the legacy of traditional land management practices. Recent scholarship has pointed out that traditional management has actively promoted sustainable resource use and, in some cases, enhanced biodiversity often through the use of fire. Despite broad-scale displacement of traditional management during the 20th century, informal management practices may locally circumvent more formal and regionally dominant management regimes. Space-time permutation analysis identified 29 statistically significant fire regimes for the state of Georgia. The identified regimes are classified by region and land cover type and their implications for historically informed disturbance-based resource management are discussed.

  3. Does History Repeat Itself? Wavelets and the Phylodynamics of Influenza A

    PubMed Central

    Tom, Jennifer A.; Sinsheimer, Janet S.; Suchard, Marc A.

    2012-01-01

    Unprecedented global surveillance of viruses will result in massive sequence data sets that require new statistical methods. These data sets press the limits of Bayesian phylogenetics as the high-dimensional parameters that comprise a phylogenetic tree increase the already sizable computational burden of these techniques. This burden often results in partitioning the data set, for example, by gene, and inferring the evolutionary dynamics of each partition independently, a compromise that results in stratified analyses that depend only on data within a given partition. However, parameter estimates inferred from these stratified models are likely strongly correlated, considering they rely on data from a single data set. To overcome this shortfall, we exploit the existing Monte Carlo realizations from stratified Bayesian analyses to efficiently estimate a nonparametric hierarchical wavelet-based model and learn about the time-varying parameters of effective population size that reflect levels of genetic diversity across all partitions simultaneously. Our methods are applied to complete genome influenza A sequences that span 13 years. We find that broad peaks and trends, as opposed to seasonal spikes, in the effective population size history distinguish individual segments from the complete genome. We also address hypotheses regarding intersegment dynamics within a formal statistical framework that accounts for correlation between segment-specific parameters. PMID:22160768
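    As a loose illustration of why wavelets suit "broad peaks and trends", the sketch below applies an ordinary discrete wavelet decomposition and soft thresholding to a synthetic noisy trajectory; the paper's actual method is a Bayesian hierarchical model on wavelet coefficients, so this only conveys the multiresolution smoothing idea. The trajectory, wavelet family, level, and threshold are all assumptions.

```python
import numpy as np
import pywt

# Synthetic "effective population size" trajectory: broad multi-year trend
# plus noise standing in for uncertain point estimates.
rng = np.random.default_rng(6)
t = np.linspace(0, 13, 256)                          # 13 years of observations
true_ne = 1000 + 400 * np.sin(2 * np.pi * t / 6.5)   # broad multi-year trend
noisy_ne = true_ne + rng.normal(0, 150, t.size)

coeffs = pywt.wavedec(noisy_ne, "db4", level=5)
# Shrink fine-scale detail coefficients toward zero (soft thresholding),
# which keeps broad peaks and trends while suppressing short-scale noise.
threshold = 100.0
coeffs = [coeffs[0]] + [pywt.threshold(c, threshold, mode="soft") for c in coeffs[1:]]
smoothed_ne = pywt.waverec(coeffs, "db4")

rms = float(np.sqrt(np.mean((smoothed_ne[:t.size] - true_ne) ** 2)))
print("residual rms after wavelet smoothing:", round(rms, 1))
```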

  4. The predictive ability of the CHADS2 and CHA2DS2-VASc scores for bleeding risk in atrial fibrillation: the MAQI(2) experience.

    PubMed

    Barnes, Geoffrey D; Gu, Xiaokui; Haymart, Brian; Kline-Rogers, Eva; Almany, Steve; Kozlowski, Jay; Besley, Dennis; Krol, Gregory D; Froehlich, James B; Kaatz, Scott

    2014-08-01

    Guidelines recommend the assessment of stroke and bleeding risk before initiating warfarin anticoagulation in patients with atrial fibrillation. Many of the elements used to predict stroke also overlap with bleeding risk in atrial fibrillation patients, and it is tempting to use stroke risk scores to efficiently estimate bleeding risk. The comparison of stroke risk scores with bleeding risk scores for predicting bleeding has not been thoroughly assessed. 2600 patients at seven anticoagulation clinics were followed from October 2009 to May 2013. Five risk models (CHADS2, CHA2DS2-VASc, HEMORR2HAGES, HAS-BLED and ATRIA) were retrospectively applied to each patient. The primary outcome was the first major bleeding event. Areas under the ROC curves were compared using the C statistic, and net reclassification improvement (NRI) analysis was performed. 110 patients experienced a major bleeding event in 2581.6 patient-years (4.5%/year). Mean follow-up was 1.0±0.8 years. All of the formal bleeding risk scores had a modest predictive value for first major bleeding events (C statistic 0.66-0.69), performing better than the CHADS2 and CHA2DS2-VASc scores (C statistic difference 0.10-0.16). NRI analysis demonstrated a 52-69% and 47-64% improvement of the formal bleeding risk scores over the CHADS2 score and CHA2DS2-VASc score, respectively. The CHADS2 and CHA2DS2-VASc scores did not perform as well as formal bleeding risk scores for prediction of major bleeding in non-valvular atrial fibrillation patients treated with warfarin. All three bleeding risk scores (HAS-BLED, ATRIA and HEMORR2HAGES) performed moderately well. Copyright © 2014 Elsevier Ltd. All rights reserved.
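    A minimal sketch of the headline comparison, the C statistic (area under the ROC curve) of a bleeding-specific score versus a stroke score for a binary bleeding outcome, is shown below with scikit-learn; the simulated scores and event rate are assumptions and only mimic the direction of the reported result.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
n = 2600
bled = rng.binomial(1, 0.045, size=n)       # ~4.5 % with a major bleed

# Hypothetical integer risk scores; the dedicated bleeding score is made
# slightly more informative about the outcome than the stroke score.
bleeding_score = rng.poisson(2 + 1.2 * bled)   # stands in for a HAS-BLED-like score
stroke_score   = rng.poisson(2 + 0.4 * bled)   # stands in for a CHADS2-like score

# C statistic = area under the ROC curve for a binary outcome.
print(f"C statistic, bleeding score: {roc_auc_score(bled, bleeding_score):.2f}")
print(f"C statistic, stroke score:   {roc_auc_score(bled, stroke_score):.2f}")
```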

  5. Does labour epidural slow the progress of labour and lead to complications? Obstetricians' perception working in private and public sector teaching hospitals in a developing country.

    PubMed

    Sohaib, Muhammad; Ismail, Samina

    2015-12-01

    Obstetricians play a major role in decision making about the provision of analgesia for women in labour. As epidural analgesia (EA) is the most preferred technique, it is important to know obstetricians' perception of its effect on the progress of labour and associated complications. The 6-month cross-sectional study included 114 obstetricians from teaching hospitals. After informed consent, obstetricians were asked to fill in a predesigned questionnaire containing 13 closed-ended questions on their perception of the effect of EA on the progress of labour, EA complications, and whether they would recommend EA to their patients or not. Other variables included age, gender, training in EA, practice type and hospital setting (private or public sector). The majority of the obstetricians perceived EA as prolonging the first stage (89.5%) and second stage (98.2%) of labour, increasing the rate of caesarean section (87.7%) and instrumental delivery (58.8%), and increasing the incidence of backache (85.5%). None of the obstetricians had received any formal training in EA. The majority (84.2%) were not sure if they would recommend EA to their patients. When these responses were compared between the public and private sectors, a statistically higher percentage (P < 0.001) of public sector obstetricians had a negative perception of EA. Obstetricians' perception of EA is contrary to the current evidence. There is a need to introduce a formal curriculum on EA in obstetric training programmes and to conduct regular refresher courses.

  6. Methods of information geometry in computational system biology (consistency between chemical and biological evolution).

    PubMed

    Astakhov, Vadim

    2009-01-01

    Interest in simulation of large-scale metabolic networks, species development, and genesis of various diseases requires new simulation techniques to accommodate the high complexity of realistic biological networks. Information geometry and topological formalisms are proposed to analyze information processes. We analyze the complexity of large-scale biological networks as well as transition of the system functionality due to modification in the system architecture, system environment, and system components. The dynamic core model is developed. The term dynamic core is used to define a set of causally related network functions. Delocalization of dynamic core model provides a mathematical formalism to analyze migration of specific functions in biosystems which undergo structure transition induced by the environment. The term delocalization is used to describe these processes of migration. We constructed a holographic model with self-poetic dynamic cores which preserves functional properties under those transitions. Topological constraints such as Ricci flow and Pfaff dimension were found for statistical manifolds which represent biological networks. These constraints can provide insight on processes of degeneration and recovery which take place in large-scale networks. We would like to suggest that therapies which are able to effectively implement estimated constraints, will successfully adjust biological systems and recover altered functionality. Also, we mathematically formulate the hypothesis that there is a direct consistency between biological and chemical evolution. Any set of causal relations within a biological network has its dual reimplementation in the chemistry of the system environment.

  7. Educational principles and techniques for interpreters.

    Treesearch

    F. David Boulanger; John P. Smith

    1973-01-01

    Interpretation is in large part education, since it attempts to convey information, concepts, and principles while creating attitude changes and such emotional states as wonder, delight, and appreciation. Although interpreters might profit greatly by formal training in the principles and techniques of teaching, many have not had such training. Some means of making the...

  8. Imagery mnemonics and memory remediation.

    PubMed

    Richardson, J T

    1992-02-01

    This paper evaluates the claim that imagery mnemonic techniques are useful in remediation of memory disorders in brain-damaged patients. Clinical research has confirmed that such techniques can lead to improved performance on formal testing in a number of neurologic disease populations and following lesions of either the left or right hemisphere. However, those patients with more severe forms of amnesia and those with medial or bilateral damage do not improve unless the learning task is highly structured. Even among patients who show improvement on formal testing, there is little evidence that they maintain the use of these techniques in similar learning tasks or generalize the use to new learning situations. Imagery mnemonics also appear to be of little practical value in the daily activities that are of most concern to brain-damaged patients themselves. The effectiveness of imagery mnemonics appears to depend upon the patients' motivation and insight rather than upon their intelligence or educational level. Instead of training patients in specific mnemonic techniques, clinicians should promote the development of "meta-cognitive" skills and the acquisition of knowledge about domains of practical significance.

  9. Formal and Informal Learning and First-Year Psychology Students’ Development of Scientific Thinking: A Two-Wave Panel Study

    PubMed Central

    Soyyılmaz, Demet; Griffin, Laura M.; Martín, Miguel H.; Kucharský, Šimon; Peycheva, Ekaterina D.; Vaupotič, Nina; Edelsbrunner, Peter A.

    2017-01-01

    Scientific thinking is a predicate for scientific inquiry, and thus important to develop early in psychology students as potential future researchers. The present research is aimed at fathoming the contributions of formal and informal learning experiences to psychology students’ development of scientific thinking during their 1st-year of study. We hypothesize that informal experiences are relevant beyond formal experiences. First-year psychology student cohorts from various European countries will be assessed at the beginning and again at the end of the second semester. Assessments of scientific thinking will include scientific reasoning skills, the understanding of basic statistics concepts, and epistemic cognition. Formal learning experiences will include engagement in academic activities which are guided by university authorities. Informal learning experiences will include non-compulsory, self-guided learning experiences. Formal and informal experiences will be assessed with a newly developed survey. As dispositional predictors, students’ need for cognition and self-efficacy in psychological science will be assessed. In a structural equation model, students’ learning experiences and personal dispositions will be examined as predictors of their development of scientific thinking. Commonalities and differences in predictive weights across universities will be tested. The project is aimed at contributing information for designing university environments to optimize the development of students’ scientific thinking. PMID:28239363

  10. Structure of multiphoton quantum optics. I. Canonical formalism and homodyne squeezed states

    NASA Astrophysics Data System (ADS)

    dell'Anno, Fabio; de Siena, Silvio; Illuminati, Fabrizio

    2004-03-01

    We introduce a formalism of nonlinear canonical transformations for general systems of multiphoton quantum optics. For single-mode systems the transformations depend on a tunable free parameter, the homodyne local-oscillator angle; for n-mode systems they depend on n heterodyne mixing angles. The canonical formalism realizes nontrivial mixing of pairs of conjugate quadratures of the electromagnetic field in terms of homodyne variables for single-mode systems, and in terms of heterodyne variables for multimode systems. In the first instance the transformations yield nonquadratic model Hamiltonians of degenerate multiphoton processes and define a class of non-Gaussian, nonclassical multiphoton states that exhibit properties of coherence and squeezing. We show that such homodyne multiphoton squeezed states are generated by unitary operators with a nonlinear time evolution that realizes the homodyne mixing of a pair of conjugate quadratures. Tuning of the local-oscillator angle allows us to vary at will the statistical properties of such states. We discuss the relevance of the formalism for the study of degenerate (up-)down-conversion processes. In a companion paper [F. Dell'Anno, S. De Siena, and F. Illuminati, 69, 033813 (2004)], we provide the extension of the nonlinear canonical formalism to multimode systems, we introduce the associated heterodyne multiphoton squeezed states, and we discuss their possible experimental realization.

  11. Structure of multiphoton quantum optics. I. Canonical formalism and homodyne squeezed states

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dell'Anno, Fabio; De Siena, Silvio; Illuminati, Fabrizio

    2004-03-01

    We introduce a formalism of nonlinear canonical transformations for general systems of multiphoton quantum optics. For single-mode systems the transformations depend on a tunable free parameter, the homodyne local-oscillator angle; for n-mode systems they depend on n heterodyne mixing angles. The canonical formalism realizes nontrivial mixing of pairs of conjugate quadratures of the electromagnetic field in terms of homodyne variables for single-mode systems, and in terms of heterodyne variables for multimode systems. In the first instance the transformations yield nonquadratic model Hamiltonians of degenerate multiphoton processes and define a class of non-Gaussian, nonclassical multiphoton states that exhibit properties of coherence and squeezing. We show that such homodyne multiphoton squeezed states are generated by unitary operators with a nonlinear time evolution that realizes the homodyne mixing of a pair of conjugate quadratures. Tuning of the local-oscillator angle allows us to vary at will the statistical properties of such states. We discuss the relevance of the formalism for the study of degenerate (up-)down-conversion processes. In a companion paper [F. Dell'Anno, S. De Siena, and F. Illuminati, 69, 033813 (2004)], we provide the extension of the nonlinear canonical formalism to multimode systems, we introduce the associated heterodyne multiphoton squeezed states, and we discuss their possible experimental realization.

  12. Formal and Informal Learning and First-Year Psychology Students' Development of Scientific Thinking: A Two-Wave Panel Study.

    PubMed

    Soyyılmaz, Demet; Griffin, Laura M; Martín, Miguel H; Kucharský, Šimon; Peycheva, Ekaterina D; Vaupotič, Nina; Edelsbrunner, Peter A

    2017-01-01

    Scientific thinking is a predicate for scientific inquiry, and thus important to develop early in psychology students as potential future researchers. The present research is aimed at fathoming the contributions of formal and informal learning experiences to psychology students' development of scientific thinking during their 1st-year of study. We hypothesize that informal experiences are relevant beyond formal experiences. First-year psychology student cohorts from various European countries will be assessed at the beginning and again at the end of the second semester. Assessments of scientific thinking will include scientific reasoning skills, the understanding of basic statistics concepts, and epistemic cognition. Formal learning experiences will include engagement in academic activities which are guided by university authorities. Informal learning experiences will include non-compulsory, self-guided learning experiences. Formal and informal experiences will be assessed with a newly developed survey. As dispositional predictors, students' need for cognition and self-efficacy in psychological science will be assessed. In a structural equation model, students' learning experiences and personal dispositions will be examined as predictors of their development of scientific thinking. Commonalities and differences in predictive weights across universities will be tested. The project is aimed at contributing information for designing university environments to optimize the development of students' scientific thinking.

  13. A spatial exploration of informal trail networks within Great Falls Park, VA

    USGS Publications Warehouse

    Wimpey, Jeremy; Marion, Jeffrey L.

    2011-01-01

    Informal (visitor-created) trails represent a threat to the natural resources of protected natural areas around the globe. These trails can remove vegetation, displace wildlife, alter hydrology, alter habitat, spread invasive species, and fragment landscapes. This study examines informal and formal trails within Great Falls Park, VA, a sub-unit of the George Washington Memorial Parkway, managed by the U.S. National Park Service. This study sought to answer three specific questions: 1) Are the physical characteristics and topographic alignments of informal trails significantly different from those of formal trails? 2) Can landscape fragmentation metrics be used to summarize the relative impacts of formal and informal trail networks on a protected natural area? and 3) What can we learn from examining the spatial distribution of informal trails within protected natural areas? Statistical comparisons between formal and informal trails in this park indicate that informal trails have less sustainable topographic alignments than their formal counterparts. Spatial summaries of the lineal and areal extent and fragmentation associated with the trail networks, by park management zone, compare park management goals to the assessed attributes. Hot spot analyses highlight areas of high trail density within the park, and findings provide insights regarding potential causes for the development of dense informal trail networks.

  14. SHER: a colored petri net based random mobility model for wireless communications.

    PubMed

    Khan, Naeem Akhtar; Ahmad, Farooq; Khan, Sher Afzal

    2015-01-01

    In wireless network research, simulation is the most important technique for investigating and validating a network's behavior. Wireless networks typically consist of mobile hosts; therefore, the degree of validation is influenced by the underlying mobility model, and synthetic models are implemented in simulators because real-life traces are not widely available. In wireless communications, mobility is an integral part, and the key role of a mobility model is to mimic real-life traveling patterns. The performance of routing protocols and of mobility management strategies, e.g. paging, registration and handoff, is highly dependent on the selected mobility model. In this paper, we devise and evaluate Show Home and Exclusive Regions (SHER), a novel two-dimensional (2-D) Colored Petri net (CPN) based formal random mobility model, which exhibits the sociological behavior of a user. The model captures hotspots where a user frequently visits and spends time. Our solution eliminates six key issues of random mobility models, i.e., sudden stops, memoryless movements, border effect, temporal dependency of velocity, pause time dependency, and speed decay, in a single model. The proposed model is able to predict the future location of a mobile user and ultimately improves the performance of wireless communication networks. The model follows a uniform nodal distribution and is a mini simulator which exhibits interesting mobility patterns. The model is also helpful to those who are not familiar with formal modeling, and users can extract meaningful information with a single mouse-click. It is noteworthy that capturing dynamic mobility patterns through CPN is the most challenging and demanding activity of the presented research. Statistical and reachability analysis techniques are presented to elucidate and validate the performance of our proposed mobility model. The state space methods allow us to algorithmically derive the system behavior and rectify the errors of our proposed model.

  15. SHER: A Colored Petri Net Based Random Mobility Model for Wireless Communications

    PubMed Central

    Khan, Naeem Akhtar; Ahmad, Farooq; Khan, Sher Afzal

    2015-01-01

    In wireless network research, simulation is the most important technique for investigating and validating a network's behavior. Wireless networks typically consist of mobile hosts; therefore, the degree of validation is influenced by the underlying mobility model, and synthetic models are implemented in simulators because real-life traces are not widely available. In wireless communications, mobility is an integral part, and the key role of a mobility model is to mimic real-life traveling patterns. The performance of routing protocols and of mobility management strategies, e.g. paging, registration and handoff, is highly dependent on the selected mobility model. In this paper, we devise and evaluate Show Home and Exclusive Regions (SHER), a novel two-dimensional (2-D) Colored Petri net (CPN) based formal random mobility model, which exhibits the sociological behavior of a user. The model captures hotspots where a user frequently visits and spends time. Our solution eliminates six key issues of random mobility models, i.e., sudden stops, memoryless movements, border effect, temporal dependency of velocity, pause time dependency, and speed decay, in a single model. The proposed model is able to predict the future location of a mobile user and ultimately improves the performance of wireless communication networks. The model follows a uniform nodal distribution and is a mini simulator which exhibits interesting mobility patterns. The model is also helpful to those who are not familiar with formal modeling, and users can extract meaningful information with a single mouse-click. It is noteworthy that capturing dynamic mobility patterns through CPN is the most challenging and demanding activity of the presented research. Statistical and reachability analysis techniques are presented to elucidate and validate the performance of our proposed mobility model. The state space methods allow us to algorithmically derive the system behavior and rectify the errors of our proposed model. PMID:26267860

  16. Who Cares? Infant Educators' Responses to Professional Discourses of Care

    ERIC Educational Resources Information Center

    Davis, Belinda; Degotardi, Sheila

    2015-01-01

    This paper explores the construction of "care" in early childhood curriculum and practice. An increasing number of infants are attending formal early childhood settings in Australia (Australian Bureau of Statistics, 2011. "Childhood education and care, Australia, June 2011." (4402.0). Retrieved from…

  17. Tertiary Education and Training in Australia, 2010

    ERIC Educational Resources Information Center

    National Centre for Vocational Education Research (NCVER), 2012

    2012-01-01

    This publication presents information on tertiary education and training during 2010, including statistics on participation and outcomes. The definition of tertiary education and training adopted for this publication is formal study in vocational education and training (VET) and higher education, including enrolments in Australian Qualifications…

  18. Proceedings of the First NASA Formal Methods Symposium

    NASA Technical Reports Server (NTRS)

    Denney, Ewen (Editor); Giannakopoulou, Dimitra (Editor); Pasareanu, Corina S. (Editor)

    2009-01-01

    Topics covered include: Model Checking - My 27-Year Quest to Overcome the State Explosion Problem; Applying Formal Methods to NASA Projects: Transition from Research to Practice; TLA+: Whence, Wherefore, and Whither; Formal Methods Applications in Air Transportation; Theorem Proving in Intel Hardware Design; Building a Formal Model of a Human-Interactive System: Insights into the Integration of Formal Methods and Human Factors Engineering; Model Checking for Autonomic Systems Specified with ASSL; A Game-Theoretic Approach to Branching Time Abstract-Check-Refine Process; Software Model Checking Without Source Code; Generalized Abstract Symbolic Summaries; A Comparative Study of Randomized Constraint Solvers for Random-Symbolic Testing; Component-Oriented Behavior Extraction for Autonomic System Design; Automated Verification of Design Patterns with LePUS3; A Module Language for Typing by Contracts; From Goal-Oriented Requirements to Event-B Specifications; Introduction of Virtualization Technology to Multi-Process Model Checking; Comparing Techniques for Certified Static Analysis; Towards a Framework for Generating Tests to Satisfy Complex Code Coverage in Java Pathfinder; jFuzz: A Concolic Whitebox Fuzzer for Java; Machine-Checkable Timed CSP; Stochastic Formal Correctness of Numerical Algorithms; Deductive Verification of Cryptographic Software; Coloured Petri Net Refinement Specification and Correctness Proof with Coq; Modeling Guidelines for Code Generation in the Railway Signaling Context; Tactical Synthesis Of Efficient Global Search Algorithms; Towards Co-Engineering Communicating Autonomous Cyber-Physical Systems; and Formal Methods for Automated Diagnosis of Autosub 6000.

  19. Robot-assisted laparoscopic skills development: formal versus informal training.

    PubMed

    Benson, Aaron D; Kramer, Brandan A; Boehler, Margaret; Schwind, Cathy J; Schwartz, Bradley F

    2010-08-01

    The learning curve for robotic surgery is not completely defined, and ideal training components have not yet been identified. We attempted to determine whether skill development would be accelerated with formal, organized instruction in robotic surgical techniques versus informal practice alone. Forty-three medical students naive to robotic surgery were randomized into two groups and tested on three tasks using the robotic platform. Between the testing sessions, the students were given equally timed practice sessions. The formal training group participated in an organized, formal training session with instruction from an attending robotic surgeon, whereas the informal training group participated in an equally timed unstructured practice session with the robot. The results were compared based on technical score and time to completion of each task. There was no difference between groups in prepractice testing for any task. In postpractice testing, there was no difference between groups for the ring transfer tasks. However, for the suture placement and knot-tying task, the technical score of the formal training group was significantly better than that of the informal training group (p < 0.001), yet time to completion was not different. Although formal training may not be necessary for basic skills, formal instruction for more advanced skills, such as suture placement and knot tying, is important in developing skills needed for effective robotic surgery. These findings may be important in formulating potential skills labs or training courses for robotic surgery.

  20. Use of healthcare services by injured people in Khartoum State, Sudan.

    PubMed

    El Tayeb, Sally; Abdalla, Safa; Van den Bergh, Graziella; Heuch, Ivar

    2015-05-01

    Trauma care is an important factor in preventing death and reducing disability. Injured persons in low- and middle-income countries are expected to use the formal healthcare system in increasing numbers. The objective of this paper is to examine use of healthcare services after injury in Khartoum State, Sudan. A community-based survey using a stratified two-stage cluster sampling technique in Khartoum State was performed. Information on healthcare utilisation was taken from injured people. A logistic regression analysis was used to explore factors affecting the probability of using formal healthcare services. During the 12 months preceding the survey a total of 441 cases of non-fatal injuries occurred, with 260 patients accessing formal healthcare. About a quarter of the injured persons were admitted to hospital. Injured people with primary education were less likely to use formal healthcare compared to those with no education. Formal health services were most used by males and in cases of road traffic injuries. The lowest socio-economic strata were least likely to use formal healthcare. Public health measures and social security should be strengthened by identifying other real barriers that prevent low socio-economic groups from making use of formal healthcare facilities. Integration and collaboration with traditional orthopaedic practitioners are important aspects that need further attention. © The Author 2014. Published by Oxford University Press on behalf of Royal Society of Tropical Medicine and Hygiene.

  1. Large deviation principle at work: Computation of the statistical properties of the exact one-point aperture mass

    NASA Astrophysics Data System (ADS)

    Reimberg, Paulo; Bernardeau, Francis

    2018-01-01

    We present a formalism based on the large deviation principle (LDP) applied to cosmological density fields, and more specifically to arbitrary functionals of density profiles, and we apply it to the derivation of the cumulant generating function and one-point probability distribution function (PDF) of the aperture mass (Map), a common observable for cosmic shear observations. We show that the LDP can indeed be used in practice for a much larger family of observables than previously envisioned, such as those built from continuous and nonlinear functionals of density profiles. Taking advantage of this formalism, we can extend previous results, which were based on crude definitions of the aperture mass, with top-hat windows and the use of the reduced shear approximation (replacing the reduced shear with the shear itself). We are able to quantify precisely how this latter approximation affects the Map statistical properties. In particular, we derive the corrective term for the skewness of the Map and reconstruct its one-point PDF.
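
    For orientation, a standard (filter-dependent) definition of the aperture mass, written here in generic form rather than in this paper's exact construction, expresses it as a compensated integral of the convergence field κ, or equivalently of the tangential shear γ_t:

        M_{\rm ap}(\theta) = \int d^2\vartheta\, U_\theta(|\vartheta|)\, \kappa(\vartheta)
                           = \int d^2\vartheta\, Q_\theta(|\vartheta|)\, \gamma_t(\vartheta),
        \qquad \int_0^\infty d\vartheta\, \vartheta\, U_\theta(\vartheta) = 0,

    where U_θ is a compensated window and Q_θ the associated shear filter; the reduced-shear approximation mentioned above amounts to evaluating the second integral with γ_t in place of the reduced shear g_t = γ_t/(1-κ).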

  2. Apes are intuitive statisticians.

    PubMed

    Rakoczy, Hannes; Clüver, Annette; Saucke, Liane; Stoffregen, Nicole; Gräbener, Alice; Migura, Judith; Call, Josep

    2014-04-01

    Inductive learning and reasoning, as we use it both in everyday life and in science, is characterized by flexible inferences based on statistical information: inferences from populations to samples and vice versa. Many forms of such statistical reasoning have been found to develop late in human ontogeny, depending on formal education and language, and to be fragile even in adults. Recent research, however, suggests that even preverbal human infants make use of intuitive statistics. Here, we conducted the first investigation of such intuitive statistical reasoning with non-human primates. In a series of 7 experiments, bonobos, chimpanzees, gorillas and orangutans drew flexible statistical inferences from populations to samples. These inferences, furthermore, were truly based on statistical information regarding the relative frequency distributions in a population, and not on absolute frequencies. Intuitive statistics in its most basic form is thus an evolutionarily more ancient rather than a uniquely human capacity. Copyright © 2014 Elsevier B.V. All rights reserved.

  3. An empirical Bayes method for updating inferences in analysis of quantitative trait loci using information from related genome scans.

    PubMed

    Zhang, Kui; Wiener, Howard; Beasley, Mark; George, Varghese; Amos, Christopher I; Allison, David B

    2006-08-01

    Individual genome scans for quantitative trait loci (QTL) mapping often suffer from low statistical power and imprecise estimates of QTL location and effect. This lack of precision yields large confidence intervals for QTL location, which are problematic for subsequent fine mapping and positional cloning. In prioritizing areas for follow-up after an initial genome scan and in evaluating the credibility of apparent linkage signals, investigators typically examine the results of other genome scans of the same phenotype and informally update their beliefs about which linkage signals in their scan most merit confidence and follow-up via a subjective-intuitive integration approach. A method that acknowledges the wisdom of this general paradigm but formally borrows information from other scans to increase confidence in objectivity would be a benefit. We developed an empirical Bayes analytic method to integrate information from multiple genome scans. The linkage statistic obtained from a single genome scan study is updated by incorporating statistics from other genome scans as prior information. This technique does not require that all studies have an identical marker map or a common estimated QTL effect. The updated linkage statistic can then be used for the estimation of QTL location and effect. We evaluate the performance of our method by using extensive simulations based on actual marker spacing and allele frequencies from available data. Results indicate that the empirical Bayes method can account for between-study heterogeneity, estimate the QTL location and effect more precisely, and provide narrower confidence intervals than results from any single individual study. We also compared the empirical Bayes method with a method originally developed for meta-analysis (a closely related but distinct purpose). In the face of marked heterogeneity among studies, the empirical Bayes method outperforms the comparator.
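
    The core idea, shrinking each locus's linkage statistic toward a prior built from the other scans in proportion to their relative precisions, can be sketched in a few lines. This is a minimal, hypothetical normal-normal illustration on a shared marker grid, not the authors' algorithm; all function and variable names are invented for the example.

        import numpy as np

        def empirical_bayes_update(z_own, se_own, z_other, se_other):
            """Shrink per-locus linkage statistics from one scan toward a prior
            estimated from other scans (illustrative normal-normal model).
            z_other and se_other have shape (n_other_scans, n_loci)."""
            # Prior mean: precision-weighted average of the other scans
            prior_mean = np.average(z_other, axis=0, weights=1.0 / se_other**2)
            # Prior variance: between-study spread plus the averaging uncertainty
            between = np.maximum(np.var(z_other, axis=0, ddof=1) - np.mean(se_other**2, axis=0), 0.0)
            prior_var = between + 1.0 / np.sum(1.0 / se_other**2, axis=0)
            # Updated statistic: weight own scan by its precision relative to the prior
            w = (1.0 / se_own**2) / (1.0 / se_own**2 + 1.0 / np.maximum(prior_var, 1e-12))
            return w * z_own + (1.0 - w) * prior_mean

    In this toy version, loci where the other scans agree (small prior variance) are shrunk more strongly, which mirrors the borrowing of strength described above while still letting between-study heterogeneity inflate the prior variance.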

  4. Coincidence and covariance data acquisition in photoelectron and -ion spectroscopy. I. Formal theory

    NASA Astrophysics Data System (ADS)

    Mikosch, Jochen; Patchkovskii, Serguei

    2013-10-01

    We derive a formal theory of noisy Poisson processes with multiple outcomes. We obtain simple, compact expressions for the probability distribution function of arbitrarily complex composite events and its moments. We illustrate the utility of the theory by analyzing properties of coincidence and covariance photoelectron-photoion detection involving single-ionization events. The results and techniques introduced in this work are directly applicable to more general coincidence and covariance experiments, including multiple ionization and multiple-ion fragmentation pathways.
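
    A worked special case of the kind of result such a theory yields, stated here under textbook thinning assumptions rather than copied from the paper: if each laser shot produces N ~ Poisson(λ) ionization events and each event independently leads to outcome a or b with probabilities p_a and p_b, then conditionally on λ the counts N_a and N_b are independent Poisson variables with means λp_a and λp_b. If the source intensity λ fluctuates from shot to shot (a "noisy" Poisson process), the unconditional moments become

        \langle N_a \rangle = \langle\lambda\rangle\, p_a, \qquad
        \mathrm{Var}(N_a) = \langle\lambda\rangle\, p_a + \mathrm{Var}(\lambda)\, p_a^2, \qquad
        \mathrm{Cov}(N_a, N_b) = \mathrm{Var}(\lambda)\, p_a p_b \quad (a \neq b),

    so a nonzero covariance between the two detection channels directly measures the source-noise contribution, which is the basic rationale behind covariance detection schemes.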

  5. Informal and formal trail monitoring protocols and baseline conditions: Acadia National Park

    USGS Publications Warehouse

    Marion, Jeffrey L.; Wimpey, Jeremy F.; Park, L.

    2011-01-01

    At Acadia National Park, changing visitor use levels and patterns have contributed to an increasing degree of visitor use impacts to natural and cultural resources. To better understand the extent and severity of these resource impacts and identify effective management techniques, the park sponsored this research to develop monitoring protocols, collect baseline data, and identify suggestions for management strategies. Formal and informal trails were surveyed and their resource conditions were assessed and characterized to support park planning and management decision-making.

  6. Impedance spectroscopy and electric modulus behavior of Molybdenum doped Cobalt-Zinc ferrite

    NASA Astrophysics Data System (ADS)

    Pradhan, A. K.; Nath, T. K.; Saha, S.

    2017-07-01

    The complex impedance spectroscopy and the electric modulus of Mo doped Cobalt-Zinc inverse spinel ferrite (CZMO) have been investigated in detail. The conventional ceramic technique has been used to prepare the CZMO. The HRXRD technique has been used for the structural analysis, which confirms the inverse spinel structure of the material and suggests that it belongs to the Fd3m space group. The complex impedance spectroscopic data and the electric modulus formalism have been used to understand the dielectric relaxation and conduction process. The contribution of grain and grain boundary to the electrical conduction process of CZMO has been confirmed from the Cole-Cole plot. The activation energy has been calculated from both the IS (Impedance Spectroscopy) and electric modulus formalisms and found to be nearly the same for the material.

  7. Analysis and improvements of Adaptive Particle Refinement (APR) through CPU time, accuracy and robustness considerations

    NASA Astrophysics Data System (ADS)

    Chiron, L.; Oger, G.; de Leffe, M.; Le Touzé, D.

    2018-02-01

    While smoothed-particle hydrodynamics (SPH) simulations are usually performed using uniform particle distributions, local particle refinement techniques have been developed to concentrate fine spatial resolutions in identified areas of interest. Although the formalism of this method is relatively easy to implement, its robustness at coarse/fine interfaces can be problematic. Analysis performed in [16] shows that the radius of refined particles should be greater than half the radius of unrefined particles to ensure robustness. In this article, the basics of an Adaptive Particle Refinement (APR) technique, inspired by AMR in mesh-based methods, are presented. This approach ensures robustness with alleviated constraints. Simulations applying the new formalism proposed achieve accuracy comparable to fully refined spatial resolutions, together with robustness, low CPU times and maintained parallel efficiency.

  8. Product-oriented Software Certification Process for Software Synthesis

    NASA Technical Reports Server (NTRS)

    Nelson, Stacy; Fischer, Bernd; Denney, Ewen; Schumann, Johann; Richardson, Julian; Oh, Phil

    2004-01-01

    The purpose of this document is to propose a product-oriented software certification process to facilitate use of software synthesis and formal methods. Why is such a process needed? Currently, software is tested until deemed bug-free rather than proving that certain software properties exist. This approach has worked well in most cases, but unfortunately, deaths still occur due to software failure. Using formal methods (techniques from logic and discrete mathematics like set theory, automata theory and formal logic as opposed to continuous mathematics like calculus) and software synthesis, it is possible to reduce this risk by proving certain software properties. Additionally, software synthesis makes it possible to automate some phases of the traditional software development life cycle resulting in a more streamlined and accurate development process.

  9. On the Formal Verification of Conflict Detection Algorithms

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar; Butler, Ricky W.; Carreno, Victor A.; Dowek, Gilles

    2001-01-01

    Safety assessment of new air traffic management systems is a main issue for civil aviation authorities. Standard techniques such as testing and simulation have serious limitations in new systems that are significantly more autonomous than the older ones. In this paper, we present an innovative approach, based on formal verification, for establishing the correctness of conflict detection systems. Fundamental to our approach is the concept of trajectory, which is a continuous path in the x-y plane constrained by physical laws and operational requirements. From the model of trajectories, we extract, and formally prove, high-level properties that can serve as a framework to analyze conflict scenarios. We use the Airborne Information for Lateral Spacing (AILS) alerting algorithm as a case study of our approach.

  10. Accurate 238U(n,2n)237U reaction cross-section measurements from 6.5 to 14.8 MeV

    NASA Astrophysics Data System (ADS)

    Krishichayan, Bhike, M.; Tornow, W.; Tonchev, A. P.; Kawano, T.

    2017-10-01

    The cross section for the 238U(n,2n)237U reaction has been measured in the incident neutron energy range from 6.5 to 14.8 MeV in small energy steps using an activation technique. Monoenergetic neutron beams were produced via the 2H(d,n)3He and 3H(d,n)4He reactions. 238U targets were activated along with Au and Al monitor foils to determine the incident neutron flux. The activity of the reaction products was measured in TUNL's low-background counting facility using high-resolution γ-ray spectroscopy. The results are compared with previous measurements and latest data evaluations. Statistical-model calculations, based on the Hauser-Feshbach formalism, have been carried out using the CoH3 code and are compared with the experimental results. The present self-consistent and high-quality data are important for stockpile stewardship and nuclear forensic purposes as well as for the design and operation of fast reactors.

  11. Statistical moments in superposition models and strongly intensive measures

    NASA Astrophysics Data System (ADS)

    Broniowski, Wojciech; Olszewski, Adam

    2017-06-01

    First, we present a concise glossary of formulas for composition of standard, cumulant, factorial, and factorial cumulant moments in superposition (compound) models, where final particles are created via independent emission from a collection of sources. Explicit mathematical formulas for the composed moments are given to all orders. We discuss the composition laws for various types of moments via the generating-function methods and list the formulas for the unfolding of the unwanted fluctuations. Second, the technique is applied to the difference of the scaled multiplicities of two particle types. This allows for a systematic derivation and a simple algebraic interpretation of the so-called strongly intensive fluctuation measures. With the help of the formalism we obtain several new strongly intensive measures involving higher-rank moments. The reviewed as well as the new results may be useful in investigations of mechanisms of particle production and event-by-event fluctuations in high-energy nuclear and hadronic collisions, and in particular in the search for signatures of the QCD phase transition at a finite baryon density.
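
    As a concrete illustration of such a composition law (a standard compound-distribution identity, stated generically rather than quoted from the paper): if the number of sources M has probability generating function G_M(z) and each source independently emits n particles with generating function G_n(z), then the final multiplicity N = n_1 + ... + n_M has

        H(z) = G_M\big(G_n(z)\big), \qquad
        \langle N \rangle = \langle M \rangle \langle n \rangle, \qquad
        \mathrm{Var}(N) = \langle M \rangle\, \mathrm{Var}(n) + \mathrm{Var}(M)\, \langle n \rangle^2,

    with higher moments following by repeated differentiation of H(z). Strongly intensive measures are constructed so that the term carrying Var(M), i.e. the fluctuations in the number of sources, cancels.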

  12. Big data to smart data in Alzheimer's disease: Real-world examples of advanced modeling and simulation.

    PubMed

    Haas, Magali; Stephenson, Diane; Romero, Klaus; Gordon, Mark Forrest; Zach, Neta; Geerts, Hugo

    2016-09-01

    Many disease-modifying clinical development programs in Alzheimer's disease (AD) have failed to date, and development of new and advanced preclinical models that generate actionable knowledge is desperately needed. This review reports on computer-based modeling and simulation approach as a powerful tool in AD research. Statistical data-analysis techniques can identify associations between certain data and phenotypes, such as diagnosis or disease progression. Other approaches integrate domain expertise in a formalized mathematical way to understand how specific components of pathology integrate into complex brain networks. Private-public partnerships focused on data sharing, causal inference and pathway-based analysis, crowdsourcing, and mechanism-based quantitative systems modeling represent successful real-world modeling examples with substantial impact on CNS diseases. Similar to other disease indications, successful real-world examples of advanced simulation can generate actionable support of drug discovery and development in AD, illustrating the value that can be generated for different stakeholders. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  13. Error tolerance analysis of wave diagnostic based on coherent modulation imaging in high power laser system

    NASA Astrophysics Data System (ADS)

    Pan, Xingchen; Liu, Cheng; Zhu, Jianqiang

    2018-02-01

    Coherent modulation imaging, providing fast convergence and high resolution with a single diffraction pattern, is a promising technique to satisfy the urgent demand for on-line multiple-parameter diagnostics with a single setup in high power laser facilities (HPLF). However, the influence of noise on the final calculated parameters of interest has not yet been investigated. Based on a series of simulations with twenty different sampling beams generated from the practical parameters and performance of the HPLF, a quantitative analysis of the statistical results was carried out for five different error sources. We found that the detector background noise and high quantization error seriously affect the final accuracy, and that different parameters have different sensitivities to different noise sources. The simulation results and the corresponding analysis point to directions for further improving the accuracy of the parameter diagnostics, which is critically important for their formal application in the daily routines of the HPLF.

  14. Impacts of extension access and cooperative membership on technology adoption and household welfare.

    PubMed

    Wossen, Tesfamicheal; Abdoulaye, Tahirou; Alene, Arega; Haile, Mekbib G; Feleke, Shiferaw; Olanrewaju, Adetunji; Manyong, Victor

    2017-08-01

    This paper examines the impacts of access to extension services and cooperative membership on technology adoption, asset ownership and poverty using household-level data from rural Nigeria. Using different matching techniques and endogenous switching regression approach, we find that both extension access and cooperative membership have a positive and statistically significant effect on technology adoption and household welfare. Moreover, we find that both extension access and cooperative membership have heterogeneous impacts. In particular, we find evidence of a positive selection as the average treatment effects of extension access and cooperative membership are higher for farmers with the highest propensity to access extension and cooperative services. The impact of extension services on poverty reduction and of cooperatives on technology adoption is significantly stronger for smallholders with access to formal credit than for those without access. This implies that expanding rural financial markets can maximize the potential positive impacts of extension and cooperative services on farmers' productivity and welfare.
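
    One of the matching techniques referred to above, nearest-neighbour matching on an estimated propensity score, can be sketched in a few lines. The data, variable names and coefficients below are entirely synthetic and hypothetical; the sketch only illustrates the generic estimator, not the authors' endogenous switching regression or their survey data.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.neighbors import NearestNeighbors

        rng = np.random.default_rng(0)
        n = 2000
        x = rng.normal(size=(n, 3))                                    # hypothetical covariates
        treat = rng.binomial(1, 1.0 / (1.0 + np.exp(-(x @ np.array([0.8, -0.5, 0.3])))))  # e.g. membership
        y = 2.0 * treat + x @ np.array([1.0, 0.5, -0.2]) + rng.normal(size=n)             # welfare outcome

        # 1. Propensity scores from a logistic model of treatment on covariates
        ps = LogisticRegression().fit(x, treat).predict_proba(x)[:, 1]

        # 2. Match each treated unit to the control unit with the closest propensity score
        nn = NearestNeighbors(n_neighbors=1).fit(ps[treat == 0].reshape(-1, 1))
        _, idx = nn.kneighbors(ps[treat == 1].reshape(-1, 1))
        att = np.mean(y[treat == 1] - y[treat == 0][idx.ravel()])
        print(f"Matched estimate of the treatment effect on the treated: {att:.2f}")

    In practice one would also check covariate balance after matching and, as in the study above, complement matching with estimators that allow for selection on unobservables.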

  15. Breast milk donation and social support: reports of women donors.

    PubMed

    De Alencar, Lucienne Christine Estevez; Seidl, Eliane Maria Fleury

    2010-01-01

    The study aimed to characterize the behavior of human milk donation and to describe the informal social and formal institutional support, according to reports from women donors. It is an exploratory, cross-sectional, descriptive study using domicile interviews based on structured and semi-structured scripts. The participants were 36 women enrolled in two human milk banks of the public health system of the Federal District. Statistical analysis of quantitative data and categorical content analysis of qualitative data were performed. Categories of reasons that most influenced the frequency of expressing were: food, time availability, negative emotions and fluid intake. The manual expressing technique was reported as predominant. The use of breast shells was cited by almost a third of the donors. Most frequent suggestions for improving institutional support were more attention and support from the milk banks for the donor. The study may serve as a stimulus for the implementation of technical and political strategies to encourage this practice.

  16. Geoscience Education Research Methods: Thinking About Sample Size

    NASA Astrophysics Data System (ADS)

    Slater, S. J.; Slater, T. F.; Center for Astronomy & Physics Education Research

    2011-12-01

    Geoscience education research is at a critical point at which conditions are sufficient to propel our field forward toward meaningful improvements in geosciences education practices. Our field has now reached a point where the outcomes of our research are deemed important to end-users and funding agencies, and where we now have a large number of scientists who are either formally trained in geosciences education research or who have dedicated themselves to excellence in this domain. At this point we must collectively work through our epistemology, our rules for which methodologies will be considered sufficiently rigorous, and which data and analysis techniques will be acceptable for constructing evidence. In particular, we have to work out our answer to that most difficult of research questions: "How big should my 'N' be?" This paper presents a very brief answer to that question, addressing both quantitative and qualitative methodologies. Research question/methodology alignment, effect size and statistical power will be discussed, in addition to a defense of the notion that bigger is not always better.
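
    For the quantitative side, the usual formal answer comes from a power analysis. The sketch below uses statsmodels and assumes a simple two-group comparison; the design and effect size are assumptions of this example, not taken from the abstract.

        from statsmodels.stats.power import TTestIndPower

        # Sample size per group needed to detect a medium effect (Cohen's d = 0.5)
        # with 80% power at a two-sided alpha of 0.05
        n_per_group = TTestIndPower().solve_power(effect_size=0.5, alpha=0.05,
                                                  power=0.8, ratio=1.0,
                                                  alternative='two-sided')
        print(round(n_per_group))   # about 64 students per group

    Smaller anticipated effects or clustered designs push the required N upward, which is one reason the quantitative and qualitative answers to the question differ so sharply.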

  17. Transformation of general binary MRF minimization to the first-order case.

    PubMed

    Ishikawa, Hiroshi

    2011-06-01

    We introduce a transformation of a general higher-order Markov random field with binary labels into a first-order one that has the same minima as the original. Moreover, we formalize a framework for approximately minimizing higher-order multi-label MRF energies that combines the new reduction with the fusion-move and QPBO algorithms. While many computer vision problems today are formulated as energy minimization problems, they have mostly been limited to using first-order energies, which consist of unary and pairwise clique potentials, with a few exceptions that consider triples. This is because of the lack of efficient algorithms to optimize energies with higher-order interactions. Our algorithm challenges this restriction, which limits the representational power of the models, so that higher-order energies can be used to capture the rich statistics of natural scenes. We also show that some minimization methods can be considered special cases of the present framework, and we compare the new method experimentally with other such techniques.
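
    A standard identity of this kind, shown here as one building block of such reductions (it is not necessarily the exact transformation introduced in the paper): a cubic pseudo-Boolean term with a negative coefficient can be replaced, using one auxiliary binary variable w, by

        -a\, x_1 x_2 x_3 \;=\; \min_{w \in \{0,1\}} a\, w\, (2 - x_1 - x_2 - x_3), \qquad a > 0,

    so after introducing w the energy contains only a unary term in w and pairwise products w x_i, and minimizing over w recovers the value of the original third-order term.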

  18. Measuring and monitoring biological diversity: Standard methods for mammals

    USGS Publications Warehouse

    Wilson, Don E.; Cole, F. Russell; Nichols, James D.; Rudran, Rasanayagam; Foster, Mercedes S.

    1996-01-01

    Measuring and Monitoring Biological Diversity: Standard Methods for Mammals provides a comprehensive manual for designing and implementing inventories of mammalian biodiversity anywhere in the world and for any group, from rodents to open-country grazers. The book emphasizes formal estimation approaches, which supply data that can be compared across habitats and over time. Beginning with brief natural histories of the twenty-six orders of living mammals, the book details the field techniques—observation, capture, and sign interpretation—appropriate to different species. The contributors provide guidelines for study design, discuss survey planning, describe statistical techniques, and outline methods of translating field data into electronic formats. Extensive appendixes address such issues as the ethical treatment of animals in research, human health concerns, preserving voucher specimens, and assessing age, sex, and reproductive condition in mammals. Useful in both developed and developing countries, this volume and the Biological Diversity Handbook Series as a whole establish essential standards for a key aspect of conservation biology and resource management.

  19. A quantum perturbative pair distribution for determining interatomic potentials from extended x-ray absorption spectroscopy

    NASA Astrophysics Data System (ADS)

    Piazza, F.

    2002-11-01

    In this paper we develop a technique for determining interatomic potentials in materials in the quantum regime from single-shell extended x-ray absorption spectroscopy (EXAFS) spectra. We introduce a pair distribution function, based on ordinary quantum time-independent perturbation theory. In the proposed scheme, the model potential parameters enter the distribution through a fourth-order Taylor expansion of the potential, and are directly refined in the fit of the model signal to the experimental spectrum. We discuss in general the validity of our theoretical framework, namely the quantum regime and perturbative treatment, and work out a simple tool for monitoring the sensitivity of our theory in determining lattice anharmonicities based on the statistical F-test. As an example, we apply our formalism to an EXAFS spectrum at the Ag K edge of AgI at T = 77 K. We determine the Ag-I potential parameters and find good agreement with previous studies.
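
    In this context the fourth-order Taylor expansion refers to an effective single-shell pair potential of the schematic form (written generically here; the paper's exact parametrization and notation may differ):

        V(r) \;\approx\; \tfrac{1}{2} k_2 (r - r_0)^2 + k_3 (r - r_0)^3 + k_4 (r - r_0)^4,

    where r_0 is the equilibrium Ag-I distance and the anharmonic coefficients k_3 and k_4 are the fit parameters whose significance the F-test mentioned above is designed to monitor.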

  20. Model selection and assessment for multi-species occupancy models

    USGS Publications Warehouse

    Broms, Kristin M.; Hooten, Mevin B.; Fitzpatrick, Ryan M.

    2016-01-01

    While multi-species occupancy models (MSOMs) are emerging as a popular method for analyzing biodiversity data, formal checking and validation approaches for this class of models have lagged behind. Concurrent with the rise in application of MSOMs among ecologists, a quiet regime shift is occurring in Bayesian statistics where predictive model comparison approaches are experiencing a resurgence. Unlike single-species occupancy models that use integrated likelihoods, MSOMs are usually couched in a Bayesian framework and contain multiple levels. Standard model checking and selection methods are often unreliable in this setting and there is only limited guidance in the ecological literature for this class of models. We examined several different contemporary Bayesian hierarchical approaches for checking and validating MSOMs and applied these methods to a freshwater aquatic study system in Colorado, USA, to better understand the diversity and distributions of plains fishes. Our findings indicated distinct differences among model selection approaches, with cross-validation techniques performing the best in terms of prediction.

  1. Zipf's law from scale-free geometry.

    PubMed

    Lin, Henry W; Loeb, Abraham

    2016-03-01

    The spatial distribution of people exhibits clustering across a wide range of scales, from household (∼10⁻² km) to continental (∼10⁴ km) scales. Empirical data indicate simple power-law scalings for the size distribution of cities (known as Zipf's law) and the population density fluctuations as a function of scale. Using techniques from random field theory and statistical physics, we show that these power laws are fundamentally a consequence of the scale-free spatial clustering of human populations and the fact that humans inhabit a two-dimensional surface. In this sense, the symmetries of scale invariance in two spatial dimensions are intimately connected to urban sociology. We test our theory by empirically measuring the power spectrum of population density fluctuations and show that the logarithmic slope α=2.04 ± 0.09, in excellent agreement with our theoretical prediction α=2. The model enables the analytic computation of many new predictions by importing the mathematical formalism of random fields.
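
    Measuring such a logarithmic slope from a gridded density map is straightforward. The sketch below uses synthetic data only, assumes a square map, and adopts the convention P(k) ∝ k^{-α}, so the fitted log-log slope corresponds to -α; it illustrates the generic procedure (radially average a 2-D power spectrum, then fit a line in log-log space), not the paper's specific pipeline.

        import numpy as np

        def power_spectrum_slope(density, nbins=30):
            """Log-log slope of the radially averaged power spectrum of a square 2-D density map."""
            n = density.shape[0]
            delta = density / density.mean() - 1.0              # density contrast
            pk2d = np.abs(np.fft.fft2(delta))**2 / n**2          # 2-D power
            kx = np.fft.fftfreq(n)
            k = np.sqrt(kx[:, None]**2 + kx[None, :]**2).ravel()
            p = pk2d.ravel()
            bins = np.logspace(np.log10(k[k > 0].min()), np.log10(k.max()), nbins)
            idx = np.digitize(k, bins)
            kbar = np.array([k[idx == i].mean() for i in range(1, nbins) if np.any(idx == i)])
            pbar = np.array([p[idx == i].mean() for i in range(1, nbins) if np.any(idx == i)])
            slope, _ = np.polyfit(np.log(kbar), np.log(pbar), 1)
            return slope

        # White-noise test map; gridded census counts would be used for the real measurement
        rho = np.random.default_rng(1).lognormal(size=(256, 256))
        print(power_spectrum_slope(rho))   # near 0 for white noise; P(k) ~ k^-2 would give about -2

    Applying the same function to gridded population counts is what would allow a comparison with the α ≈ 2 result quoted in the abstract.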

  2. Medical statistics and hospital medicine: the case of the smallpox vaccination.

    PubMed

    Rusnock, Andrea

    2007-01-01

    Between 1799 and 1806, trials of vaccination to determine its safety and efficacy were undertaken in hospitals in London, Paris, Vienna, and Boston. These trials were among the first instances of formal hospital evaluations of a medical procedure and signal a growing acceptance of a relatively new approach to medical practice. These early evaluations of smallpox vaccination also relied on descriptive and quantitative accounts, as well as probabilistic analyses, and thus occupy a significant, yet hitherto unexamined, place in the history of medical statistics.

  3. Orchestrating high-throughput genomic analysis with Bioconductor

    PubMed Central

    Huber, Wolfgang; Carey, Vincent J.; Gentleman, Robert; Anders, Simon; Carlson, Marc; Carvalho, Benilton S.; Bravo, Hector Corrada; Davis, Sean; Gatto, Laurent; Girke, Thomas; Gottardo, Raphael; Hahne, Florian; Hansen, Kasper D.; Irizarry, Rafael A.; Lawrence, Michael; Love, Michael I.; MacDonald, James; Obenchain, Valerie; Oleś, Andrzej K.; Pagès, Hervé; Reyes, Alejandro; Shannon, Paul; Smyth, Gordon K.; Tenenbaum, Dan; Waldron, Levi; Morgan, Martin

    2015-01-01

    Bioconductor is an open-source, open-development software project for the analysis and comprehension of high-throughput data in genomics and molecular biology. The project aims to enable interdisciplinary research, collaboration and rapid development of scientific software. Based on the statistical programming language R, Bioconductor comprises 934 interoperable packages contributed by a large, diverse community of scientists. Packages cover a range of bioinformatic and statistical applications. They undergo formal initial review and continuous automated testing. We present an overview for prospective users and contributors. PMID:25633503

  4. The Statistical Consulting Center for Astronomy (SCCA)

    NASA Technical Reports Server (NTRS)

    Akritas, Michael

    2001-01-01

    The process by which raw astronomical data acquisition is transformed into scientifically meaningful results and interpretation typically involves many statistical steps. Traditional astronomy limits itself to a narrow range of old and familiar statistical methods: means and standard deviations; least-squares methods like χ² minimization; and simple nonparametric procedures such as the Kolmogorov-Smirnov tests. These tools are often inadequate for the complex problems and datasets under investigation, and recent years have witnessed an increased usage of maximum-likelihood, survival analysis, multivariate analysis, wavelet and advanced time-series methods. The Statistical Consulting Center for Astronomy (SCCA) assisted astronomers with the use of sophisticated tools and helped match these tools to specific problems. The SCCA operated with two professors of statistics and a professor of astronomy working together. Questions were received by e-mail and were discussed in detail with the questioner. Summaries of those questions and answers leading to new approaches were posted on the Web (www.state.psu.edu/ mga/SCCA). In addition to serving individual astronomers, the SCCA established a Web site for general use that provides hypertext links to selected on-line public-domain statistical software and services. The StatCodes site (www.astro.psu.edu/statcodes) provides over 200 links in the areas of: Bayesian statistics; censored and truncated data; correlation and regression; density estimation and smoothing; general statistics packages and information; image analysis; interactive Web tools; multivariate analysis; multivariate clustering and classification; nonparametric analysis; software written by astronomers; spatial statistics; statistical distributions; time series analysis; and visualization tools. StatCodes has received a remarkably high and constant hit rate of 250 hits/week (over 10,000/year) since its inception in mid-1997. It is of interest to scientists both within and outside of astronomy. The most popular sections are multivariate techniques, image analysis, and time series analysis. Hundreds of copies of the ASURV, SLOPES and CENS-TAU codes developed by SCCA scientists were also downloaded from the StatCodes site. In addition to formal SCCA duties, SCCA scientists continued a variety of related activities in astrostatistics, including refereeing of statistically oriented papers submitted to the Astrophysical Journal, talks at meetings (including Feigelson's talk to science journalists entitled "The reemergence of astrostatistics" at the American Association for the Advancement of Science meeting), and published papers of astrostatistical content.

  5. Software Safety Analysis of a Flight Guidance System

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W. (Technical Monitor); Tribble, Alan C.; Miller, Steven P.; Lempia, David L.

    2004-01-01

    This document summarizes the safety analysis performed on a Flight Guidance System (FGS) requirements model. In particular, the safety properties desired of the FGS model are identified and the presence of the safety properties in the model is formally verified. Chapter 1 provides an introduction to the entire project, while Chapter 2 gives a brief overview of the problem domain, the nature of accidents, model based development, and the four-variable model. Chapter 3 outlines the approach. Chapter 4 presents the results of the traditional safety analysis techniques and illustrates how the hazardous conditions associated with the system trace into specific safety properties. Chapter 5 presents the results of the formal methods analysis technique, model checking, which was used to verify the presence of the safety properties in the requirements model. Finally, Chapter 6 summarizes the main conclusions of the study, first and foremost that model checking is a very effective verification technique to use on discrete models with reasonable state spaces. Additional supporting details are provided in the appendices.

  6. [Formal sample size calculation and its limited validity in animal studies of medical basic research].

    PubMed

    Mayer, B; Muche, R

    2013-01-01

    Animal studies are highly relevant for basic medical research, although their usage is discussed controversially in public. Thus, an optimal sample size for these projects should be aimed at from a biometrical point of view. Statistical sample size calculation is usually the appropriate methodology in planning medical research projects. However, required information is often not valid or only available during the course of an animal experiment. This article critically discusses the validity of formal sample size calculation for animal studies. Within the discussion, some requirements are formulated to fundamentally regulate the process of sample size determination for animal experiments.

  7. Collective Bargaining: Its Impact on Educational Cost.

    ERIC Educational Resources Information Center

    Atherton, P. J.

    Since the Ontario (Canada) legislation in 1975 that formalized collective bargaining for teachers, public concern has focused on collective bargaining as the possible cause of recent enrollment declines and increases in schooling costs. However, according to Ontario provincial statistics, enrollment in elementary schools had begun to decline…

  8. Heuristic Elements of Plausible Reasoning.

    ERIC Educational Resources Information Center

    Dudczak, Craig A.

    At least some of the reasoning processes involved in argumentation rely on inferences which do not fit within the traditional categories of inductive or deductive reasoning. The reasoning processes involved in plausibility judgments have neither the formal certainty of deduction nor the imputed statistical probability of induction. When utilizing…

  9. Building Intuitions about Statistical Inference Based on Resampling

    ERIC Educational Resources Information Center

    Watson, Jane; Chance, Beth

    2012-01-01

    Formal inference, which makes theoretical assumptions about distributions and applies hypothesis testing procedures with null and alternative hypotheses, is notoriously difficult for tertiary students to master. The debate about whether this content should appear in Years 11 and 12 of the "Australian Curriculum: Mathematics" has gone on…

  10. LSD Now: 1973

    ERIC Educational Resources Information Center

    Chunko, John A.

    1973-01-01

    LSD NOW is a nationwide, statistical survey and analysis of hallucinogenic drug use by individuals presently in formal educational surroundings. Analysis, concentrating on the extent and rationale related to the use of such drugs, now offers a deeper and more meaningful understanding of a particular facet of the drug culture. This understanding…

  11. Confirmatory and Competitive Evaluation of Alternative Gene-Environment Interaction Hypotheses

    ERIC Educational Resources Information Center

    Belsky, Jay; Pluess, Michael; Widaman, Keith F.

    2013-01-01

    Background: Most gene-environment interaction (GXE) research, though based on clear, vulnerability-oriented hypotheses, is carried out using exploratory rather than hypothesis-informed statistical tests, limiting power and making formal evaluation of competing GXE propositions difficult. Method: We present and illustrate a new regression technique…

  12. Love and Sex: Can We Talk About That in School?

    ERIC Educational Resources Information Center

    Vance, Paul C.

    1985-01-01

    Gives statistical information on the "national epidemic" of teenage sexual activity and pregnancy and its consequences. Discusses social causes of this problem. Proposes that schools can help solve the problem by providing a formal sex education curriculum for pupils in kindergarten through grade 12. (CB)

  13. Approaching Bose-Einstein Condensation

    ERIC Educational Resources Information Center

    Ferrari, Loris

    2011-01-01

    Bose-Einstein condensation (BEC) is discussed at the level of an advanced course of statistical thermodynamics, clarifying some formal and physical aspects that are usually not covered by the standard pedagogical literature. The non-conventional approach adopted starts by showing that the continuum limit, in certain cases, cancels out the crucial…

  14. Impact of Managerial Skills Learnt through MA Educational Planning Management Programme of AIOU on the Performance of Institutional Heads

    ERIC Educational Resources Information Center

    Chuadhry, Muhammad Asif; Shah, Syed Manzoor Hussain

    2012-01-01

    Management provides formal coordination in an organization for achieving pre-determined goals. The educational manager particularly performs his duties by using different planning and management techniques. These techniques are equally important for managers in other sectors. The present study was focused on the impact of managerial skills…

  15. Lee Silverman Voice Treatment for People with Parkinson's: Audit of Outcomes in a Routine Clinic

    ERIC Educational Resources Information Center

    Wight, Sheila; Miller, Nick

    2015-01-01

    Background: Speaking louder/more intensely represents a longstanding technique employed to manage voice and intelligibility changes in people with Parkinson's. This technique has been formalized into a treatment approach and marketed as the Lee Silverman Voice Treatment (LSVT®) programme. Evidence for its efficacy has been published. Studies…

  16. Elicitation Techniques: Getting People to Talk about Ideas They Don't Usually Talk About

    ERIC Educational Resources Information Center

    Barton, Keith C.

    2015-01-01

    Elicitation techniques are a category of research tasks that use visual, verbal, or written stimuli to encourage participants to talk about their ideas. These tasks are particularly useful for exploring topics that may be difficult to discuss in formal interviews, such as those that involve sensitive issues or rely on tacit knowledge. Elicitation…

  17. A Delphi Study on Staff Bereavement Training in the Intellectual and Developmental Disabilities Field

    ERIC Educational Resources Information Center

    Gray, Jennifer A.; Truesdale, Jesslyn

    2015-01-01

    The Delphi technique was used to obtain expert panel consensus to prioritize content areas and delivery methods for developing staff grief and bereavement curriculum training in the intellectual and developmental disabilities (IDD) field. The Delphi technique was conducted with a panel of 18 experts from formal and informal disability caregiving,…

  18. The changing landscape of astrostatistics and astroinformatics

    NASA Astrophysics Data System (ADS)

    Feigelson, Eric D.

    2017-06-01

    The history and current status of the cross-disciplinary fields of astrostatistics and astroinformatics are reviewed. Astronomers need a wide range of statistical methods for both data reduction and science analysis. With the proliferation of high-throughput telescopes, efficient large scale computational methods are also becoming essential. However, astronomers receive only weak training in these fields during their formal education. Interest in the fields is rapidly growing with conferences organized by scholarly societies, textbooks and tutorial workshops, and research studies pushing the frontiers of methodology. R, the premier language of statistical computing, can provide an important software environment for the incorporation of advanced statistical and computational methodology into the astronomical community.

  19. Statistical science: a grammar for research.

    PubMed

    Cox, David R

    2017-06-01

    I greatly appreciate the invitation to give this lecture with its century-long history. The title is a warning that the lecture is rather discursive and not highly focused and technical. The theme is simple: statistical thinking provides a unifying set of general ideas and specific methods relevant whenever appreciable natural variation is present. To be most fruitful these ideas should merge seamlessly with subject-matter considerations. By contrast, there is sometimes a temptation to regard formal statistical analysis as a ritual to be added after the serious work has been done, a ritual to satisfy convention, referees, and regulatory agencies. I want implicitly to refute that idea.

  20. On Advanced Estimation Techniques for Exoplanet Detection and Characterization Using Ground-based Coronagraphs

    PubMed Central

    Lawson, Peter R.; Poyneer, Lisa; Barrett, Harrison; Frazin, Richard; Caucci, Luca; Devaney, Nicholas; Furenlid, Lars; Gładysz, Szymon; Guyon, Olivier; Krist, John; Maire, Jérôme; Marois, Christian; Mawet, Dimitri; Mouillet, David; Mugnier, Laurent; Pearson, Iain; Perrin, Marshall; Pueyo, Laurent; Savransky, Dmitry

    2015-01-01

    The direct imaging of planets around nearby stars is exceedingly difficult. Only about 14 exoplanets have been imaged to date that have masses less than 13 times that of Jupiter. The next generation of planet-finding coronagraphs, including VLT-SPHERE, the Gemini Planet Imager, Palomar P1640, and Subaru HiCIAO have predicted contrast performance of roughly a thousand times less than would be needed to detect Earth-like planets. In this paper we review the state of the art in exoplanet imaging, most notably the method of Locally Optimized Combination of Images (LOCI), and we investigate the potential of improving the detectability of faint exoplanets through the use of advanced statistical methods based on the concepts of the ideal observer and the Hotelling observer. We propose a formal comparison of techniques using a blind data challenge with an evaluation of performance using the Receiver Operating Characteristic (ROC) and Localization ROC (LROC) curves. We place particular emphasis on the understanding and modeling of realistic sources of measurement noise in ground-based AO-corrected coronagraphs. The work reported in this paper is the result of interactions between the co-authors during a week-long workshop on exoplanet imaging that was held in Squaw Valley, California, in March of 2012. PMID:26347393
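
    In the simplest idealized case, the detection framework referred to here reduces to thresholding a linear test statistic and tracing out an ROC curve. The sketch below is only that idealization: white Gaussian residual noise, a known PSF, and the injected amplitude are assumptions of this example (real AO-corrected coronagraph residuals are speckle-dominated, which is precisely what the paper emphasizes), and it uses a plain matched filter, which coincides with the Hotelling observer only when the noise covariance is the identity.

        import numpy as np
        from sklearn.metrics import roc_curve, auc

        rng = np.random.default_rng(2)
        npix, ntrials = 32, 2000

        # Gaussian PSF template at a known location, normalized to unit energy
        yy, xx = np.mgrid[:npix, :npix]
        psf = np.exp(-0.5 * ((xx - npix // 2)**2 + (yy - npix // 2)**2) / 2.0)
        psf /= np.linalg.norm(psf)

        labels = rng.integers(0, 2, ntrials)                  # 1 = planet present
        images = rng.normal(size=(ntrials, npix, npix))       # idealized white residual noise
        images[labels == 1] += 0.5 * psf                      # faint injected companion

        # Matched-filter scores (equal to the Hotelling observer for identity noise covariance)
        scores = np.tensordot(images, psf, axes=([1, 2], [0, 1]))
        fpr, tpr, _ = roc_curve(labels, scores)
        print("area under the ROC curve:", round(auc(fpr, tpr), 3))

    With correlated speckle noise the Hotelling observer instead prewhitens with the estimated noise covariance before applying the template, and performance comparisons are read off the resulting ROC and LROC curves.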

  1. On Advanced Estimation Techniques for Exoplanet Detection and Characterization using Ground-Based Coronagraphs

    NASA Technical Reports Server (NTRS)

    Lawson, Peter R.; Frazin, Richard; Barrett, Harrison; Caucci, Luca; Devaney, Nicholas; Furenlid, Lars; Gladysz, Szymon; Guyon, Olivier; Krist, John; Maire, Jerome

    2012-01-01

    The direct imaging of planets around nearby stars is exceedingly difficult. Only about 14 exoplanets have been imaged to date that have masses less than 13 times that of Jupiter. The next generation of planet-finding coronagraphs, including VLT-SPHERE, the Gemini Planet Imager, Palomar P1640, and Subaru HiCIAO have predicted contrast performance of roughly a thousand times less than would be needed to detect Earth-like planets. In this paper we review the state of the art in exoplanet imaging, most notably the method of Locally Optimized Combination of Images (LOCI), and we investigate the potential of improving the detectability of faint exoplanets through the use of advanced statistical methods based on the concepts of the ideal observer and the Hotelling observer. We provide a formal comparison of techniques through a blind data challenge and evaluate performance using the Receiver Operating Characteristic (ROC) and Localization ROC (LROC) curves. We place particular emphasis on the understanding and modeling of realistic sources of measurement noise in ground-based AO-corrected coronagraphs. The work reported in this paper is the result of interactions between the co-authors during a week-long workshop on exoplanet imaging that was held in Squaw Valley, California, in March of 2012.

  2. On Advanced Estimation Techniques for Exoplanet Detection and Characterization Using Ground-based Coronagraphs.

    PubMed

    Lawson, Peter R; Poyneer, Lisa; Barrett, Harrison; Frazin, Richard; Caucci, Luca; Devaney, Nicholas; Furenlid, Lars; Gładysz, Szymon; Guyon, Olivier; Krist, John; Maire, Jérôme; Marois, Christian; Mawet, Dimitri; Mouillet, David; Mugnier, Laurent; Pearson, Iain; Perrin, Marshall; Pueyo, Laurent; Savransky, Dmitry

    2012-07-01

    The direct imaging of planets around nearby stars is exceedingly difficult. Only about 14 exoplanets have been imaged to date that have masses less than 13 times that of Jupiter. The next generation of planet-finding coronagraphs, including VLT-SPHERE, the Gemini Planet Imager, Palomar P1640, and Subaru HiCIAO have predicted contrast performance of roughly a thousand times less than would be needed to detect Earth-like planets. In this paper we review the state of the art in exoplanet imaging, most notably the method of Locally Optimized Combination of Images (LOCI), and we investigate the potential of improving the detectability of faint exoplanets through the use of advanced statistical methods based on the concepts of the ideal observer and the Hotelling observer. We propose a formal comparison of techniques using a blind data challenge with an evaluation of performance using the Receiver Operating Characteristic (ROC) and Localization ROC (LROC) curves. We place particular emphasis on the understanding and modeling of realistic sources of measurement noise in ground-based AO-corrected coronagraphs. The work reported in this paper is the result of interactions between the co-authors during a week-long workshop on exoplanet imaging that was held in Squaw Valley, California, in March of 2012.

  3. On advanced estimation techniques for exoplanet detection and characterization using ground-based coronagraphs

    NASA Astrophysics Data System (ADS)

    Lawson, Peter R.; Poyneer, Lisa; Barrett, Harrison; Frazin, Richard; Caucci, Luca; Devaney, Nicholas; Furenlid, Lars; Gładysz, Szymon; Guyon, Olivier; Krist, John; Maire, Jérôme; Marois, Christian; Mawet, Dimitri; Mouillet, David; Mugnier, Laurent; Pearson, Iain; Perrin, Marshall; Pueyo, Laurent; Savransky, Dmitry

    2012-07-01

    The direct imaging of planets around nearby stars is exceedingly difficult. Only about 14 exoplanets have been imaged to date that have masses less than 13 times that of Jupiter. The next generation of planet-finding coronagraphs, including VLT-SPHERE, the Gemini Planet Imager, Palomar P1640, and Subaru HiCIAO have predicted contrast performance of roughly a thousand times less than would be needed to detect Earth-like planets. In this paper we review the state of the art in exoplanet imaging, most notably the method of Locally Optimized Combination of Images (LOCI), and we investigate the potential of improving the detectability of faint exoplanets through the use of advanced statistical methods based on the concepts of the ideal observer and the Hotelling observer. We propose a formal comparison of techniques using a blind data challenge with an evaluation of performance using the Receiver Operating Characteristic (ROC) and Localization ROC (LROC) curves. We place particular emphasis on the understanding and modeling of realistic sources of measurement noise in ground-based AO-corrected coronagraphs. The work reported in this paper is the result of interactions between the co-authors during a week-long workshop on exoplanet imaging that was held in Squaw Valley, California, in March of 2012.

  4. Efficiently establishing concepts of inferential statistics and hypothesis decision making through contextually controlled equivalence classes

    PubMed Central

    Fienup, Daniel M; Critchfield, Thomas S

    2010-01-01

    Computerized lessons that reflect stimulus equivalence principles were used to teach college students concepts related to inferential statistics and hypothesis decision making. Lesson 1 taught participants concepts related to inferential statistics, and Lesson 2 taught them to base hypothesis decisions on a scientific hypothesis and the direction of an effect. Lesson 3 taught the conditional influence of inferential statistics over decisions regarding the scientific and null hypotheses. Participants entered the study with low scores on the targeted skills and left the study demonstrating a high level of accuracy on these skills, which involved mastering more relations than were taught formally. This study illustrates the efficiency of equivalence-based instruction in establishing academic skills in sophisticated learners. PMID:21358904

  5. Testing the statistical compatibility of independent data sets

    NASA Astrophysics Data System (ADS)

    Maltoni, M.; Schwetz, T.

    2003-08-01

    We discuss a goodness-of-fit method which tests the compatibility between statistically independent data sets. The method gives sensible results even in cases where the χ2 minima of the individual data sets are very low or when several parameters are fitted to a large number of data points. In particular, it avoids the problem that a possible disagreement between data sets becomes diluted by data points which are insensitive to the crucial parameters. A formal derivation of the probability distribution function for the proposed test statistics is given, based on standard theorems of statistics. The application of the method is illustrated on data from neutrino oscillation experiments, and its complementarity to the standard goodness-of-fit is discussed.
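
    In a common formulation consistent with the description above (the notation here is generic), the test statistic is the difference between the χ² minimum of the combined fit and the sum of the χ² minima of the individual data sets,

        \bar{\chi}^2 \;=\; \chi^2_{\min,\mathrm{glob}} \;-\; \sum_r \chi^2_{\min,r},

    which under the usual regularity conditions follows a χ² distribution with \sum_r P_r - P degrees of freedom, where P_r is the number of parameters to which data set r is sensitive and P is the number of parameters in the combined fit; because only the parameters actually constrained by each data set enter the count, insensitive data points no longer dilute a genuine disagreement.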

  6. Does Formal Research Training Lead to Academic Success in Plastic Surgery? A Comprehensive Analysis of U.S. Academic Plastic Surgeons.

    PubMed

    Lopez, Joseph; Ameri, Afshin; Susarla, Srinivas M; Reddy, Sashank; Soni, Ashwin; Tong, J W; Amini, Neda; Ahmed, Rizwan; May, James W; Lee, W P Andrew; Dorafshar, Amir

    2016-01-01

    It is currently unknown whether formal research training has an influence on academic advancement in plastic surgery. The purpose of this study was to determine whether formal research training was associated with higher research productivity, academic rank, and procurement of extramural National Institutes of Health (NIH) funding in plastic surgery, comparing academic surgeons who completed said research training with those without. This was a cross-sectional study of full-time academic plastic surgeons in the United States. The main predictor variable was formal research training, defined as completion of a postdoctoral research fellowship or attainment of a Doctor of Philosophy (PhD). The primary outcome was scientific productivity measured by the Hirsch index (h-index: the number of publications h that have at least h citations each). The secondary outcomes were academic rank and NIH funding. Descriptive, bivariate, and multiple regression statistics were computed. A total of 607 academic surgeons were identified from 94 Accreditation Council for Graduate Medical Education-accredited plastic surgery training programs. In all, 179 (29.5%) surgeons completed formal research training. The mean h-index was 11.7 ± 9.9, and 58 (9.6%) surgeons successfully procured NIH funding. The distribution of academic rank was the following: endowed professor (5.4%), professor (23.9%), associate professor (23.4%), assistant professor (46.0%), and instructor (1.3%). In a multiple regression analysis, completion of formal research training was significantly predictive of a higher h-index and successful procurement of NIH funding. Current evidence demonstrates that formal research training is associated with higher scientific productivity and increased likelihood of future NIH funding. Copyright © 2016 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
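
    Since the h-index is the productivity measure used throughout, a tiny function makes its definition concrete; the citation counts in the usage line are invented for illustration.

        def h_index(citations):
            """h-index: the largest h such that h publications have at least h citations each."""
            ranked = sorted(citations, reverse=True)
            return sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)

        print(h_index([10, 8, 5, 4, 3, 0]))   # -> 4: four papers each have at least 4 citations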

  7. An investigation of the effects of interventions on problem-solving strategies and abilities

    NASA Astrophysics Data System (ADS)

    Cox, Charles Terrence, Jr.

    Problem-solving has been described as being the "heart" of the chemistry classroom, and students' development of problem-solving skills is essential for their success in chemistry. Despite the importance of problem-solving, there has been little research within the chemistry domain, largely because of the lack of tools to collect data for large populations. Problem-solving was assessed using a software package known as IMMEX (for Interactive Multimedia Exercises) which has an HTML tracking feature that allows for collection of problem-solving data in the background as students work the problems. The primary goal of this research was to develop methods (known as interventions) that could promote improvements in students' problem-solving and most notably aid in their transition from the novice to the competent level. Three intervention techniques were incorporated within the chemistry curricula: collaborative grouping (face-to-face and distance), concept mapping, and peer-led team learning. The face-to-face collaborative grouping intervention was designed to probe the factors affecting the quality of the group interaction. Students' logical reasoning abilities were measured using the Group Assessment of Logical Thinking (GALT) test, which classifies students as formal, transitional, or concrete. These classifications essentially provide a basis for identifying scientific aptitude. These designations were used as the basis for forming collaborative groups of two students. The six possibilities (formal-formal, formal-transitional, etc.) were formed to determine how the group composition influences the gains in student abilities observed from collaborative grouping interventions. Students were given three assignments (an individual pre-collaborative, an individual post-collaborative, and a collaborative assignment), each requiring them to work an IMMEX problem set. Similar performance gains of about 10% were observed for each group, with two exceptions. The transitional students who were paired with concrete students had a 15% gain, and the concrete students paired with other concrete students had only a marginal gain. In fact, there was no statistical difference in the pre-collaborative and post-collaborative student abilities for concrete-concrete groups. The distance collaborative intervention was completed using a new interface for the IMMEX software designed to mimic face-to-face collaboration. A stereochemistry problem set which had a solved rate of 28% prior to collaboration was chosen for incorporation into this distance collaboration study. (Abstract shortened by UMI.)

  8. Forecasts of non-Gaussian parameter spaces using Box-Cox transformations

    NASA Astrophysics Data System (ADS)

    Joachimi, B.; Taylor, A. N.

    2011-09-01

    Forecasts of statistical constraints on model parameters using the Fisher matrix abound in many fields of astrophysics. The Fisher matrix formalism involves the assumption of Gaussianity in parameter space and hence fails to predict complex features of posterior probability distributions. Combining the standard Fisher matrix with Box-Cox transformations, we propose a novel method that accurately predicts arbitrary posterior shapes. The Box-Cox transformations are applied to parameter space to render it approximately multivariate Gaussian, performing the Fisher matrix calculation on the transformed parameters. We demonstrate that, after the Box-Cox parameters have been determined from an initial likelihood evaluation, the method correctly predicts changes in the posterior when varying various parameters of the experimental setup and the data analysis, with marginally higher computational cost than a standard Fisher matrix calculation. We apply the Box-Cox-Fisher formalism to forecast cosmological parameter constraints by future weak gravitational lensing surveys. The characteristic non-linear degeneracy between matter density parameter and normalization of matter density fluctuations is reproduced for several cases, and the capabilities of breaking this degeneracy by weak-lensing three-point statistics is investigated. Possible applications of Box-Cox transformations of posterior distributions are discussed, including the prospects for performing statistical data analysis steps in the transformed Gaussianized parameter space.
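
    The Gaussianization step at the heart of the method can be illustrated numerically. The sketch below is only an illustration of the transformation itself, using mock skewed samples of a positive parameter: the paper works with Fisher matrices of forecast likelihoods rather than with samples, and the numbers here are invented.

        import numpy as np
        from scipy import stats, special

        # Mock skewed "posterior" samples of a positive parameter (values invented for illustration)
        rng = np.random.default_rng(3)
        samples = rng.lognormal(mean=0.0, sigma=0.6, size=20000)

        # Gaussianize with a Box-Cox transformation, then treat the transformed space as Gaussian
        y, lam = stats.boxcox(samples)
        print("skewness before / after:", round(float(stats.skew(samples)), 2),
              round(float(stats.skew(y)), 2))

        # A symmetric +/- 1 sigma interval in the transformed space maps back
        # to an asymmetric interval for the original parameter
        mu, sig = y.mean(), y.std()
        lo, hi = special.inv_boxcox(mu - sig, lam), special.inv_boxcox(mu + sig, lam)
        print(f"68% interval in the original parameter: [{lo:.2f}, {hi:.2f}]")

    Because a Box-Cox parameter λ close to zero effectively takes a logarithm, the symmetric interval in the transformed space becomes asymmetric in the original parameter, which is exactly the kind of non-Gaussian posterior feature a plain Fisher forecast cannot capture.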

  9. The Evolution of Organization Analysis in ASQ, 1959-1979.

    ERIC Educational Resources Information Center

    Daft, Richard L.

    1980-01-01

    During the period 1959-1979, a sharp trend toward low-variety statistical languages has taken place, which may represent an organizational mapping phase in which simple, quantifiable relationships have been formally defined and measured. A broader scope of research languages will be needed in the future. (Author/IRT)

  10. Beginning Teacher Induction: A Report on Beginning Teacher Effectiveness and Retention.

    ERIC Educational Resources Information Center

    Serpell, Zewelanji; Bozeman, Leslie A.

    National statistics show a rise in the number of beginning teachers undergoing formal induction in their first year of teaching. This report discusses the effectiveness of induction programs and resulting outcomes for beginning teacher retention, beginning teacher effectiveness, and mentor participation. The various components of induction…

  11. Statistical Knowledge and Learning in Phonology

    ERIC Educational Resources Information Center

    Dunbar, Ewan Michael

    2013-01-01

    This dissertation deals with the theory of the phonetic component of grammar in a formal probabilistic inference framework: (1) it has been recognized since the beginning of generative phonology that some language-specific phonetic implementation is actually context-dependent, and thus it can be said that there are gradient "phonetic…

  12. Mathematical Literacy--It's Become Fundamental

    ERIC Educational Resources Information Center

    McCrone, Sharon Soucy; Dossey, John A.

    2007-01-01

    The rising tide of numbers and statistics in daily life signals a need for a fundamental broadening of the concept of literacy: mathematical literacy assuming a coequal role in the curriculum alongside language-based literacy. Mathematical literacy is not about studying higher levels of formal mathematics, but about making math relevant and…

  13. Developing Sensitivity to Subword Combinatorial Orthographic Regularity (SCORe): A Two-Process Framework

    ERIC Educational Resources Information Center

    Mano, Quintino R.

    2016-01-01

    Accumulating evidence suggests that literacy acquisition involves developing sensitivity to the statistical regularities of the textual environment. To organize accumulating evidence and help guide future inquiry, this article integrates data from disparate fields of study and formalizes a new two-process framework for developing sensitivity to…

  14. Prison Clinicians' Perceptions of Antisocial Personality Disorder as a Formal Diagnosis.

    ERIC Educational Resources Information Center

    Stevens, Gail Flint

    1994-01-01

    Surveyed and interviewed 53 clinicians who work with prison inmates. Results indicated that clinicians used diagnosis of antisocial personality disorder liberally among inmates and felt majority of inmates could be so diagnosed. Large minority of clinicians went beyond Diagnostic and Statistical Manual of Mental Disorders criteria and reported…

  15. Ethical Reasoning Instruction in Non-Ethics Business Courses: A Non-Intrusive Approach

    ERIC Educational Resources Information Center

    Wilhelm, William J.

    2010-01-01

    This article discusses four confirmatory studies designed to corroborate findings from prior developmental research which yielded statistically significant improvements in student moral reasoning when specific instructional strategies and content materials were utilized in non-ethics business courses by instructors not formally trained in business…

  16. The Lay Concept of Childhood Mental Disorder

    ERIC Educational Resources Information Center

    Giummarra, Melita J.; Haslam, Nick

    2005-01-01

    The structure of lay people's concepts of childhood mental disorder was investigated in a questionnaire study and examined for convergence with the Diagnostic and Statistical Manual (DSM-IV). Eighty-four undergraduates who had no formal education in abnormal psychology rated 54 conditions--36 DSM-IV childhood disorders and 18 non-disorders--on…

  17. From Mere Coincidences to Meaningful Discoveries

    ERIC Educational Resources Information Center

    Griffiths, Thomas L.; Tenenbaum, Joshua B.

    2007-01-01

    People's reactions to coincidences are often cited as an illustration of the irrationality of human reasoning about chance. We argue that coincidences may be better understood in terms of rational statistical inference, based on their functional role in processes of causal discovery and theory revision. We present a formal definition of…

  18. Structured Statistical Models of Inductive Reasoning

    ERIC Educational Resources Information Center

    Kemp, Charles; Tenenbaum, Joshua B.

    2009-01-01

    Everyday inductive inferences are often guided by rich background knowledge. Formal models of induction should aim to incorporate this knowledge and should explain how different kinds of knowledge lead to the distinctive patterns of reasoning found in different inductive contexts. This article presents a Bayesian framework that attempts to meet…

  19. Evaluating Teachers and Schools Using Student Growth Models

    ERIC Educational Resources Information Center

    Schafer, William D.; Lissitz, Robert W.; Zhu, Xiaoshu; Zhang, Yuan; Hou, Xiaodong; Li, Ying

    2012-01-01

    Interest in Student Growth Modeling (SGM) and Value Added Modeling (VAM) arises from educators concerned with measuring the effectiveness of teaching and other school activities through changes in student performance as a companion and perhaps even an alternative to status. Several formal statistical models have been proposed for year-to-year…

  20. A Formal Valuation Framework for Emotions and Their Control.

    PubMed

    Huys, Quentin J M; Renz, Daniel

    2017-09-15

    Computational psychiatry aims to apply mathematical and computational techniques to help improve psychiatric care. To achieve this, the phenomena under scrutiny should be within the scope of formal methods. As emotions play an important role across many psychiatric disorders, such computational methods must encompass emotions. Here, we consider formal valuation accounts of emotions. We focus on the fact that the flexibility of emotional responses and the nature of appraisals suggest the need for a model-based valuation framework for emotions. However, resource limitations make plain model-based valuation impossible and require metareasoning strategies to apportion cognitive resources adaptively. We argue that emotions may implement such metareasoning approximations by restricting the range of behaviors and states considered. We consider the processes that guide the deployment of the approximations, discerning between innate, model-free, heuristic, and model-based controllers. A formal valuation and metareasoning framework may thus provide a principled approach to examining emotions. Copyright © 2017 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.

  1. (Finite) statistical size effects on compressive strength.

    PubMed

    Weiss, Jérôme; Girard, Lucas; Gimbert, Florent; Amitrano, David; Vandembroucq, Damien

    2014-04-29

    The larger structures are, the lower their mechanical strength. Already discussed by Leonardo da Vinci and Edmé Mariotte several centuries ago, size effects on strength remain of crucial importance in modern engineering for the elaboration of safety regulations in structural design or the extrapolation of laboratory results to geophysical field scales. Under tensile loading, statistical size effects are traditionally modeled with a weakest-link approach. One of its prominent results is a prediction of vanishing strength at large scales that can be quantified in the framework of extreme value statistics. Despite a frequent use outside its range of validity, this approach remains the dominant tool in the field of statistical size effects. Here we focus on compressive failure, which concerns a wide range of geophysical and geotechnical situations. We show on historical and recent experimental data that weakest-link predictions are not obeyed. In particular, the mechanical strength saturates at a nonzero value toward large scales. Accounting explicitly for the elastic interactions between defects during the damage process, we build a formal analogy of compressive failure with the depinning transition of an elastic manifold. This critical transition interpretation naturally entails finite-size scaling laws for the mean strength and its associated variability. Theoretical predictions are in remarkable agreement with measurements reported for various materials such as rocks, ice, coal, or concrete. This formalism, which can also be extended to the flowing instability of granular media under multiaxial compression, has important practical consequences for future design rules.
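
    As a point of reference for the weakest-link prediction the paper tests, the toy sketch below draws element strengths from a Weibull distribution and takes the minimum over N elements; the mean sample strength decays with size, which is the classical behaviour that the paper shows compressive strength does not follow. Parameters are arbitrary.

```python
# Illustrative weakest-link scaling: if element strengths are Weibull distributed,
# the strength of a sample of N elements is the minimum over the elements, so the
# mean strength decreases with size. This is the reference prediction, not the
# paper's result for compressive failure.
import numpy as np

rng = np.random.default_rng(2)
shape, scale = 5.0, 100.0        # Weibull modulus and characteristic strength (arbitrary units)
n_replicates = 1000

for n_elements in (10, 100, 1000, 10000):
    element_strengths = scale * rng.weibull(shape, size=(n_replicates, n_elements))
    sample_strength = element_strengths.min(axis=1)     # the weakest element controls failure
    print(f"N = {n_elements:6d}   mean strength = {sample_strength.mean():6.1f}")
```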

  2. Verifying the interactive convergence clock synchronization algorithm using the Boyer-Moore theorem prover

    NASA Technical Reports Server (NTRS)

    Young, William D.

    1992-01-01

    The application of formal methods to the analysis of computing systems promises to provide higher and higher levels of assurance as the sophistication of our tools and techniques increases. Improvements in tools and techniques come about as we pit the current state of the art against new and challenging problems. A promising area for the application of formal methods is in real-time and distributed computing. Some of the algorithms in this area are both subtle and important. In response to this challenge and as part of an ongoing attempt to verify an implementation of the Interactive Convergence Clock Synchronization Algorithm (ICCSA), we decided to undertake a proof of the correctness of the algorithm using the Boyer-Moore theorem prover. This paper describes our approach to proving the ICCSA using the Boyer-Moore prover.

  3. External beam techniques to boost cervical cancer when brachytherapy is not an option—theories and applications

    PubMed Central

    Kilic, Sarah; Khan, Atif J.; Beriwal, Sushil; Small, William

    2017-01-01

    The management of locally advanced cervical cancer relies on brachytherapy (BT) as an integral part of the radiotherapy delivery armamentarium. Occasionally, intracavitary BT is neither possible nor available. In these circumstances, post-external beam radiotherapy (EBRT) interstitial brachytherapy and/or hysterectomy may represent viable options that must be adequately executed in a timely manner. However, if these options are not applicable due to patient related or facility related reasons, a formal contingency plan should be in place. Innovative EBRT techniques such as intensity modulated and stereotactic radiotherapy may be considered for patients unable to undergo brachytherapy. Relying on provocative arguments and recent data, this review explores the rationale for and limitations of non-brachytherapy substitutes in that setting aiming to establish a formal process for the optimal execution of this alternative plan. PMID:28603722

  4. The evolution of optics education at the U.S. National Optical Astronomy Observatory

    NASA Astrophysics Data System (ADS)

    Pompea, Stephen M.; Walker, Constance E.; Sparks, Robert T.

    2014-07-01

    The last decade of optics education at the U.S. National Optical Astronomy Observatory will be described in terms of program planning, assessment of community needs, identification of networks and strategic partners, the establishment of specific program goals and objectives, and program metrics and evaluation. A number of NOAO's optics education programs for formal and informal audiences will be described, including our Hands-On Optics program, illumination engineering/dark skies energy education programs, afterschool programs, adaptive optics education program, student outreach, and Galileoscope program. Particular emphasis will be placed on techniques for funding and sustaining high-quality programs. The use of educational gap analysis to identify the key needs of the formal and informal educational systems will be emphasized as a technique that has helped us to maximize our educational program effectiveness locally, regionally, nationally, and in Chile.

  5. Introduction of formal debate into a postgraduate specialty track education programme in periodontics in Japan.

    PubMed

    Saito, A; Fujinami, K

    2011-02-01

    To evaluate the formal debate as an active learning strategy within a postgraduate specialty track education programme in periodontics. A formal debate was implemented as an active learning strategy in the programme. The participants were full-time faculty, residents and dentists attending special courses at a teaching hospital in Japan. They were grouped into two evenly matched opposing teams, judges and an audience. In preparation for the debate, the participants attended a lecture on critical thinking. At the time of the debate, each team provided a theme report with a list of references. The performance and content of the debate were evaluated by the course instructors and audience. Pre- and post-debate testing was used to assess the participants' objective knowledge of clinical periodontology. Evaluation of the debate by the participants revealed that scores for criteria such as presentation performance, response with logic and rebuttal effectiveness were relatively low. Thirty-eight per cent of the participants demonstrated higher test scores after the debate, although there was no statistically significant difference in the mean scores between pre- and post-tests. At the end of the debate, the vast majority of participants recognised the significance and importance of the formal debate in the programme. It was suggested that the incorporation of the formal debate could serve as an educational tool for the postgraduate specialty track programme. © 2011 John Wiley & Sons A/S.

  6. Application of Lightweight Formal Methods to Software Security

    NASA Technical Reports Server (NTRS)

    Gilliam, David P.; Powell, John D.; Bishop, Matt

    2005-01-01

    Formal specification and verification of security has proven a challenging task. There is no single method that has proven feasible. Instead, an integrated approach which combines several formal techniques can increase the confidence in the verification of software security properties. Such an approach, which specifies security properties in a library that can be reused by two instruments and their methodologies developed for the National Aeronautics and Space Administration (NASA) at the Jet Propulsion Laboratory (JPL), is described herein. The Flexible Modeling Framework (FMF) is a model-based verification instrument that uses Promela and the SPIN model checker. The Property Based Tester (PBT) uses TASPEC and a Text Execution Monitor (TEM). They are used to reduce vulnerabilities and unwanted exposures in software during the development and maintenance life cycles.

  7. Breaking the theoretical scaling limit for predicting quasiparticle energies: the stochastic GW approach.

    PubMed

    Neuhauser, Daniel; Gao, Yi; Arntsen, Christopher; Karshenas, Cyrus; Rabani, Eran; Baer, Roi

    2014-08-15

    We develop a formalism to calculate the quasiparticle energy within the GW many-body perturbation correction to the density functional theory. The occupied and virtual orbitals of the Kohn-Sham Hamiltonian are replaced by stochastic orbitals used to evaluate the Green function G, the polarization potential W, and, thereby, the GW self-energy. The stochastic GW (sGW) formalism relies on novel theoretical concepts such as stochastic time-dependent Hartree propagation, stochastic matrix compression, and spatial or temporal stochastic decoupling techniques. Beyond the theoretical interest, the formalism enables linear scaling GW calculations breaking the theoretical scaling limit for GW as well as circumventing the need for energy cutoff approximations. We illustrate the method for silicon nanocrystals of varying sizes with N_{e}>3000 electrons.
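
    The sketch below does not reproduce the sGW implementation; it only illustrates the generic stochastic-sampling idea of replacing a deterministic sum over states by an average over random vectors, here in the simplest setting of Hutchinson trace estimation for a symmetric matrix.

```python
# Toy illustration of "stochastic orbitals" in the simplest possible form:
# estimate the trace of a symmetric matrix with Hutchinson's estimator, i.e.
# average chi^T H chi over random sign vectors chi. The sGW method applies far
# more elaborate versions of this idea; nothing here is the actual sGW code.
import numpy as np

rng = np.random.default_rng(3)
n = 400
a = rng.normal(size=(n, n))
h = (a + a.T) / 2                                        # symmetric "Hamiltonian-like" matrix

n_stochastic = 200
chi = rng.choice([-1.0, 1.0], size=(n, n_stochastic))    # random sign ("stochastic") vectors
estimate = np.einsum('ik,ik->', chi, h @ chi) / n_stochastic

print(f"exact trace      = {np.trace(h):10.3f}")
print(f"stochastic trace = {estimate:10.3f}")
```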

  8. Formal Foundations for Hierarchical Safety Cases

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Pai, Ganesh; Whiteside, Iain

    2015-01-01

    Safety cases are increasingly being required in many safety-critical domains to assure, using structured argumentation and evidence, that a system is acceptably safe. However, comprehensive system-wide safety arguments present appreciable challenges to develop, understand, evaluate, and manage, partly due to the volume of information that they aggregate, such as the results of hazard analysis, requirements analysis, testing, formal verification, and other engineering activities. Previously, we have proposed hierarchical safety cases, hicases, to aid the comprehension of safety case argument structures. In this paper, we build on a formal notion of safety case to formalise the use of hierarchy as a structuring technique, and show that hicases satisfy several desirable properties. Our aim is to provide a formal, theoretical foundation for safety cases. In particular, we believe that tools for high assurance systems should be granted similar assurance to the systems to which they are applied. To this end, we formally specify and prove the correctness of key operations for constructing and managing hicases, which gives the specification for implementing hicases in AdvoCATE, our toolset for safety case automation. We motivate and explain the theory with the help of a simple running example, extracted from a real safety case and developed using AdvoCATE.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vrugt, Jasper A; Robinson, Bruce A; Ter Braak, Cajo J F

    In recent years, a strong debate has emerged in the hydrologic literature regarding what constitutes an appropriate framework for uncertainty estimation. Particularly, there is strong disagreement whether an uncertainty framework should have its roots within a proper statistical (Bayesian) context, or whether such a framework should be based on a different philosophy and implement informal measures and weaker inference to summarize parameter and predictive distributions. In this paper, we compare a formal Bayesian approach using Markov Chain Monte Carlo (MCMC) with generalized likelihood uncertainty estimation (GLUE) for assessing uncertainty in conceptual watershed modeling. Our formal Bayesian approach is implemented using the recently developed differential evolution adaptive metropolis (DREAM) MCMC scheme with a likelihood function that explicitly considers model structural, input and parameter uncertainty. Our results demonstrate that DREAM and GLUE can generate very similar estimates of total streamflow uncertainty. This suggests that formal and informal Bayesian approaches have more common ground than the hydrologic literature and ongoing debate might suggest. The main advantage of formal approaches is, however, that they attempt to disentangle the effect of forcing, parameter and model structural error on total predictive uncertainty. This is key to improving hydrologic theory and to better understand and predict the flow of water through catchments.
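
    As a minimal, hedged illustration of the formal-Bayesian side of this comparison, the sketch below runs a plain random-walk Metropolis sampler with an explicit Gaussian likelihood on a toy one-parameter model. DREAM uses adaptive differential-evolution proposals and a richer error model; the model, data, and tuning constants here are all illustrative.

```python
# Minimal formal-Bayesian sketch: random-walk Metropolis with an explicit Gaussian
# likelihood for a toy one-parameter linear model (flat prior implied). Everything
# here (model, data, proposal width) is illustrative, not the DREAM scheme.
import numpy as np

rng = np.random.default_rng(4)
x = np.linspace(0.0, 1.0, 50)
true_k, sigma = 2.0, 0.3
y_obs = true_k * x + rng.normal(0.0, sigma, size=x.size)   # synthetic "observations"

def log_likelihood(k):
    resid = y_obs - k * x
    return -0.5 * np.sum((resid / sigma) ** 2)

chain, k_current = [], 0.5
ll_current = log_likelihood(k_current)
for _ in range(20000):
    k_prop = k_current + rng.normal(0.0, 0.1)              # random-walk proposal
    ll_prop = log_likelihood(k_prop)
    if np.log(rng.uniform()) < ll_prop - ll_current:       # Metropolis acceptance rule
        k_current, ll_current = k_prop, ll_prop
    chain.append(k_current)

posterior = np.array(chain[5000:])                         # discard burn-in
print(f"posterior mean = {posterior.mean():.3f}, 95% interval = "
      f"({np.percentile(posterior, 2.5):.3f}, {np.percentile(posterior, 97.5):.3f})")
```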

  10. Preferences for and Barriers to Formal and Informal Athletic Training Continuing Education Activities

    PubMed Central

    Armstrong, Kirk J.; Weidner, Thomas G.

    2011-01-01

    Context: Our previous research determined the frequency of participation and perceived effect of formal and informal continuing education (CE) activities. However, actual preferences for and barriers to CE must be characterized. Objective: To determine the types of formal and informal CE activities preferred by athletic trainers (ATs) and barriers to their participation in these activities. Design: Cross-sectional study. Setting: Athletic training practice settings. Patients or Other Participants: Of a geographically stratified random sample of 1000 ATs, 427 ATs (42.7%) completed the survey. Main Outcome Measure(s): As part of a larger study, the Survey of Formal and Informal Athletic Training Continuing Education Activities (FIATCEA) was developed and administered electronically. The FIATCEA consists of demographic characteristics and Likert scale items (1 = strongly disagree, 5 = strongly agree) about preferred CE activities and barriers to these activities. Internal consistency of survey items, as determined by Cronbach α, was 0.638 for preferred CE activities and 0.860 for barriers to these activities. Descriptive statistics were computed for all items. Differences between respondent demographic characteristics and preferred CE activities and barriers to these activities were determined via analysis of variance and dependent t tests. The α level was set at .05. Results: Hands-on clinical workshops and professional networking were the preferred formal and informal CE activities, respectively. The most frequently reported barriers to formal CE were the cost of attending and travel distance, whereas the most frequently reported barriers to informal CE were personal and job-specific factors. Differences were noted between both the cost of CE and travel distance to CE and all other barriers to CE participation (F1,411 = 233.54, P < .001). Conclusions: Overall, ATs preferred formal CE activities. The same barriers (eg, cost, travel distance) to formal CE appeared to be universal to all ATs. Informal CE was highly valued by ATs because it could be individualized. PMID:22488195

  11. Formal and Informal Continuing Education Activities and Athletic Training Professional Practice

    PubMed Central

    Armstrong, Kirk J.; Weidner, Thomas G.

    2010-01-01

    Abstract Context: Continuing education (CE) is intended to promote professional growth and, ultimately, to enhance professional practice. Objective: To determine certified athletic trainers' participation in formal (ie, approved for CE credit) and informal (ie, not approved for CE credit) CE activities and the perceived effect these activities have on professional practice with regard to improving knowledge, clinical skills and abilities, attitudes toward patient care, and patient care itself. Design: Cross-sectional study. Setting: Athletic training practice settings. Patients or Other Participants: Of a geographic, stratified random sample of 1000 athletic trainers, 427 (42.7%) completed the survey. Main Outcome Measure(s): The Survey of Formal and Informal Athletic Training Continuing Education Activities was developed and administered electronically. The survey consisted of demographic characteristics and Likert-scale items regarding CE participation and perceived effect of CE on professional practice. Internal consistency of survey items was determined using the Cronbach α (α  =  0.945). Descriptive statistics were computed for all items. An analysis of variance and dependent t tests were calculated to determine differences among respondents' demographic characteristics and their participation in, and perceived effect of, CE activities. The α level was set at .05. Results: Respondents completed more informal CE activities than formal CE activities. Participation in informal CE activities included reading athletic training journals (75.4%), whereas formal CE activities included attending a Board of Certification–approved workshop, seminar, or professional conference not conducted by the National Athletic Trainers' Association or affiliates or committees (75.6%). Informal CE activities were perceived to improve clinical skills or abilities and attitudes toward patient care. Formal CE activities were perceived to enhance knowledge. Conclusions: More respondents completed informal CE activities than formal CE activities. Both formal and informal CE activities were perceived to enhance athletic training professional practice. Informal CE activities should be explored and considered for CE credit. PMID:20446842

  12. "PowerPoint[R] Engagement" Techniques to Foster Deep Learning

    ERIC Educational Resources Information Center

    Berk, Ronald A.

    2011-01-01

    The purpose of this article is to describe a bunch of strategies with which teachers may already be familiar and, perhaps, use regularly, but not always in the context of a formal PowerPoint[R] presentation. Here are the author's top 10 engagement techniques that fit neatly within any version of PowerPoint[R]. Some of these may also be used with…

  13. Dyads versus Groups: Using Different Social Structures in Peer Review to Enhance Online Collaborative Learning Processes

    ERIC Educational Resources Information Center

    Pozzi, Francesca; Ceregini, Andrea; Ferlino, Lucia; Persico, Donatella

    2016-01-01

    The Peer Review (PR) is a very popular technique to support socio-constructivist and connectivist learning processes, online or face-to-face, at all educational levels, in both formal and informal contexts. The idea behind this technique is that sharing views and opinions with others by discussing with peers and receiving and providing formative…

  14. Standardized reporting of resection technique during nephron-sparing surgery: the surface-intermediate-base margin score.

    PubMed

    Minervini, Andrea; Carini, Marco; Uzzo, Robert G; Campi, Riccardo; Smaldone, Marc C; Kutikov, Alexander

    2014-11-01

    A standardized reporting system of nephron-sparing surgery resection techniques is lacking. The surface-intermediate-base scoring system represents a formal reporting instrument to assist in interpretation of reported data and to facilitate comparisons in the urologic literature. Copyright © 2014 European Association of Urology. Published by Elsevier B.V. All rights reserved.

  15. A review of geographic variation and Geographic Information Systems (GIS) applications in prescription drug use research.

    PubMed

    Wangia, Victoria; Shireman, Theresa I

    2013-01-01

    While understanding geography's role in healthcare has been an area of research for over 40 years, the application of geography-based analyses to prescription medication use is limited. The body of literature was reviewed to assess the current state of such studies to demonstrate the scale and scope of projects in order to highlight potential research opportunities. To review systematically how researchers have applied geography-based analyses to medication use data. Empiric, English language research articles were identified through PubMed and bibliographies. Original research articles were independently reviewed as to the medications or classes studied, data sources, measures of medication exposure, geographic units of analysis, geospatial measures, and statistical approaches. From 145 publications matching key search terms, forty publications met the inclusion criteria. Cardiovascular and psychotropic classes accounted for the largest proportion of studies. Prescription drug claims were the primary source, and medication exposure was frequently captured as period prevalence. Medication exposure was documented across a variety of geopolitical units such as countries, provinces, regions, states, and postal codes. Most results were descriptive and formal statistical modeling capitalizing on geospatial techniques was rare. Despite the extensive research on small area variation analysis in healthcare, there are a limited number of studies that have examined geographic variation in medication use. Clearly, there is opportunity to collaborate with geographers and GIS professionals to harness the power of GIS technologies and to strengthen future medication studies by applying more robust geospatial statistical methods. Copyright © 2013 Elsevier Inc. All rights reserved.

  16. Segmentation of fluorescence microscopy images for quantitative analysis of cell nuclear architecture.

    PubMed

    Russell, Richard A; Adams, Niall M; Stephens, David A; Batty, Elizabeth; Jensen, Kirsten; Freemont, Paul S

    2009-04-22

    Considerable advances in microscopy, biophysics, and cell biology have provided a wealth of imaging data describing the functional organization of the cell nucleus. Until recently, cell nuclear architecture has largely been assessed by subjective visual inspection of fluorescently labeled components imaged by the optical microscope. This approach is inadequate to fully quantify spatial associations, especially when the patterns are indistinct, irregular, or highly punctate. Accurate image processing techniques as well as statistical and computational tools are thus necessary to interpret this data if meaningful spatial-function relationships are to be established. Here, we have developed a thresholding algorithm, stable count thresholding (SCT), to segment nuclear compartments in confocal laser scanning microscopy image stacks to facilitate objective and quantitative analysis of the three-dimensional organization of these objects using formal statistical methods. We validate the efficacy and performance of the SCT algorithm using real images of immunofluorescently stained nuclear compartments and fluorescent beads as well as simulated images. In all three cases, the SCT algorithm delivers a segmentation that is far better than standard thresholding methods, and more importantly, is comparable to manual thresholding results. By applying the SCT algorithm and statistical analysis, we quantify the spatial configuration of promyelocytic leukemia nuclear bodies with respect to irregular-shaped SC35 domains. We show that the compartments are closer than expected under a null model for their spatial point distribution, and furthermore that their spatial association varies according to cell state. The methods reported are general and can readily be applied to quantify the spatial interactions of other nuclear compartments.
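
    The sketch below is a hypothetical reading of the "stable count" idea as summarised here, not the published SCT algorithm: candidate thresholds are swept, connected objects are counted at each, and a threshold is chosen from the flattest stretch of the count curve. The image is synthetic.

```python
# Hypothetical "stable count" thresholding sketch: sweep thresholds, count
# connected objects at each, and pick a threshold from the plateau where the
# count is most stable. The published SCT algorithm may differ in detail.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(5)
image = rng.normal(0.0, 1.0, size=(128, 128))           # background noise
yy, xx = np.ogrid[:128, :128]
for cy, cx in rng.integers(10, 118, size=(15, 2)):      # add 15 bright Gaussian blobs
    image += 6.0 * np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * 2.0 ** 2))

thresholds = np.linspace(1.0, 5.0, 41)
counts = np.array([ndimage.label(image > t)[1] for t in thresholds])

# choose the threshold at the centre of the flattest (most "stable") stretch of counts
window = 5
variation = np.array([np.ptp(counts[i:i + window]) for i in range(len(counts) - window + 1)])
best = thresholds[variation.argmin() + window // 2]
print(f"selected threshold = {best:.2f}, objects counted = {ndimage.label(image > best)[1]}")
```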

  17. Segmentation of Fluorescence Microscopy Images for Quantitative Analysis of Cell Nuclear Architecture

    PubMed Central

    Russell, Richard A.; Adams, Niall M.; Stephens, David A.; Batty, Elizabeth; Jensen, Kirsten; Freemont, Paul S.

    2009-01-01

    Abstract Considerable advances in microscopy, biophysics, and cell biology have provided a wealth of imaging data describing the functional organization of the cell nucleus. Until recently, cell nuclear architecture has largely been assessed by subjective visual inspection of fluorescently labeled components imaged by the optical microscope. This approach is inadequate to fully quantify spatial associations, especially when the patterns are indistinct, irregular, or highly punctate. Accurate image processing techniques as well as statistical and computational tools are thus necessary to interpret this data if meaningful spatial-function relationships are to be established. Here, we have developed a thresholding algorithm, stable count thresholding (SCT), to segment nuclear compartments in confocal laser scanning microscopy image stacks to facilitate objective and quantitative analysis of the three-dimensional organization of these objects using formal statistical methods. We validate the efficacy and performance of the SCT algorithm using real images of immunofluorescently stained nuclear compartments and fluorescent beads as well as simulated images. In all three cases, the SCT algorithm delivers a segmentation that is far better than standard thresholding methods, and more importantly, is comparable to manual thresholding results. By applying the SCT algorithm and statistical analysis, we quantify the spatial configuration of promyelocytic leukemia nuclear bodies with respect to irregular-shaped SC35 domains. We show that the compartments are closer than expected under a null model for their spatial point distribution, and furthermore that their spatial association varies according to cell state. The methods reported are general and can readily be applied to quantify the spatial interactions of other nuclear compartments. PMID:19383481

  18. Minimally invasive surgical technique for tethered surgical drains

    PubMed Central

    Hess, Shane R; Satpathy, Jibanananda; Waligora, Andrew C; Ugwu-Oju, Obinna

    2017-01-01

    A feared complication of temporary surgical drain placement is from the technical error of accidentally suturing the surgical drain into the wound. Postoperative discovery of a tethered drain can frequently necessitate return to the operating room if it cannot be successfully removed with nonoperative techniques. Formal wound exploration increases anesthesia and infection risk as well as cost and is best avoided if possible. We present a minimally invasive surgical technique that can avoid the morbidity associated with a full surgical wound exploration to remove a tethered drain when other nonoperative techniques fail. PMID:28400669

  19. Formal Verification for a Next-Generation Space Shuttle

    NASA Technical Reports Server (NTRS)

    Nelson, Stacy D.; Pecheur, Charles; Koga, Dennis (Technical Monitor)

    2002-01-01

    This paper discusses the verification and validation (V&V) of advanced software used for integrated vehicle health monitoring (IVHM), in the context of NASA's next-generation space shuttle. We survey the current V&V practice and standards used in selected NASA projects, review applicable formal verification techniques, and discuss their integration into existing development practice and standards. We also describe two verification tools, JMPL2SMV and Livingstone PathFinder, that can be used to thoroughly verify diagnosis applications that use model-based reasoning, such as the Livingstone system.

  20. Development of a Software Safety Process and a Case Study of Its Use

    NASA Technical Reports Server (NTRS)

    Knight, J. C.

    1996-01-01

    Research in the year covered by this reporting period has been primarily directed toward: continued development of mock-ups of computer screens for operator of a digital reactor control system; development of a reactor simulation to permit testing of various elements of the control system; formal specification of user interfaces; fault-tree analysis including software; evaluation of formal verification techniques; and continued development of a software documentation system. Technical results relating to this grant and the remainder of the principal investigator's research program are contained in various reports and papers.

  1. A formal approach to validation and verification for knowledge-based control systems

    NASA Technical Reports Server (NTRS)

    Castore, Glen

    1987-01-01

    As control systems become more complex in response to desires for greater system flexibility, performance and reliability, the promise is held out that artificial intelligence might provide the means for building such systems. An obstacle to the use of symbolic processing constructs in this domain is the need for verification and validation (V and V) of the systems. Techniques currently in use do not seem appropriate for knowledge-based software. An outline of a formal approach to V and V for knowledge-based control systems is presented.

  2. Automated Verification of Specifications with Typestates and Access Permissions

    NASA Technical Reports Server (NTRS)

    Siminiceanu, Radu I.; Catano, Nestor

    2011-01-01

    We propose an approach to formally verify Plural specifications based on access permissions and typestates, by model-checking automatically generated abstract state-machines. Our exhaustive approach captures all the possible behaviors of abstract concurrent programs implementing the specification. We describe the formal methodology employed by our technique and provide an example as proof of concept for the state-machine construction rules. The implementation of a fully automated algorithm to generate and verify models, currently underway, provides model checking support for the Plural tool, which currently supports only program verification via data flow analysis (DFA).

  3. Results of a Formal Methods Demonstration Project

    NASA Technical Reports Server (NTRS)

    Kelly, J.; Covington, R.; Hamilton, D.

    1994-01-01

    This paper describes the results of a cooperative study conducted by a team of researchers in formal methods at three NASA Centers to demonstrate FM techniques and to tailor them to critical NASA software systems. This pilot project applied FM to an existing critical software subsystem, the Shuttle's Jet Select subsystem (Phase I of an ongoing study). The present study shows that FM can be used successfully to uncover hidden issues in a highly critical and mature Functional Subsystem Software Requirements (FSSR) specification which are very difficult to discover by traditional means.

  4. Formal verification of automated teller machine systems using SPIN

    NASA Astrophysics Data System (ADS)

    Iqbal, Ikhwan Mohammad; Adzkiya, Dieky; Mukhlash, Imam

    2017-08-01

    Formal verification is a technique for ensuring the correctness of systems. This work focuses on verifying a model of an Automated Teller Machine (ATM) system against a set of specifications. We construct the model as a state transition diagram that is suitable for verification. The specifications are expressed as Linear Temporal Logic (LTL) formulas. We use the Simple Promela Interpreter (SPIN) model checker to check whether the model satisfies the formulas. This model checker accepts models written in the Process Meta Language (PROMELA) and specifications given as LTL formulas.
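
    As a toy, language-shifted illustration of what SPIN automates for PROMELA models, the sketch below exhaustively explores a small, invented ATM state machine in Python and checks a simple safety property; it is a bounded reachability check, not a full LTL model checker.

```python
# Toy illustration of explicit-state exploration, the technique SPIN automates:
# enumerate bounded paths of a small ATM state machine and check a safety property
# ("cash is never dispensed unless the PIN was just verified"). The states and
# transitions are invented for the example.
from collections import deque

TRANSITIONS = {
    "idle":          ["card_inserted"],
    "card_inserted": ["pin_ok", "idle"],        # correct PIN entered, or card returned
    "pin_ok":        ["dispensing", "idle"],
    "dispensing":    ["idle"],
}

def violates_safety(path):
    # property: "dispensing" must be immediately preceded by "pin_ok"
    return any(s == "dispensing" and path[i - 1] != "pin_ok"
               for i, s in enumerate(path) if i > 0)

queue, violations = deque([["idle"]]), []
while queue:                                    # breadth-first search over bounded paths
    path = queue.popleft()
    if violates_safety(path):
        violations.append(path)
    if len(path) < 8:
        queue.extend(path + [nxt] for nxt in TRANSITIONS[path[-1]])

print("safety property holds" if not violations else f"counterexample: {violations[0]}")
```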

  5. Medication dispensing errors in Palestinian community pharmacy practice: a formal consensus using the Delphi technique.

    PubMed

    Shawahna, Ramzi; Haddad, Aseel; Khawaja, Baraa; Raie, Rand; Zaneen, Sireen; Edais, Tasneem

    2016-10-01

    Background Medication dispensing errors (MDEs) are frequent in community pharmacy practice. A definition of MDEs and scenarios representing MDE situations in Palestinian community pharmacy practice were not previously approached using formal consensus techniques. Objective This study was conducted to achieve consensus on a definition of MDEs and a wide range of scenarios that should or should not be considered as MDEs in Palestinian community pharmacy practice by a panel of community pharmacists. Setting Community pharmacy practice in Palestine. Method This was a descriptive study using the Delphi technique. A panel of fifty community pharmacists was recruited from different geographical locations of the West Bank of Palestine. A three round Delphi technique was followed to achieve consensus on a proposed definition of MDEs and 83 different scenarios representing potential MDEs using a nine-point scale. Main outcome measure Agreement or disagreement of a panel of community pharmacists on a proposed definition of MDEs and a series of scenarios representing potential MDEs. Results In the first Delphi round, views of key contact community pharmacists on MDEs were explored and situations representing potential MDEs were collected. In the second Delphi round, consensus was achieved to accept the proposed definition and to include 49 (59 %) of the 83 proposed scenarios as MDEs. In the third Delphi round, consensus was achieved to include further 13 (15.7 %) scenarios as MDEs, exclude 9 (10.8 %) scenarios and the rest of 12 (14.5 %) scenarios were considered equivocal based on the opinions of the panelists. Conclusion Consensus on a definition of MDEs and scenarios representing MDE situations in Palestinian community pharmacy practice was achieved using a formal consensus technique. The use of consensual definitions and scenarios representing MDE situations in community pharmacy practice might minimize methodological variations and their significant effects on the number and rate of MDEs reported in different studies.
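
    The abstract does not state the exact consensus rules, so the sketch below assumes a common nine-point-scale convention (median 7-9 with limited spread to include, median 1-3 to exclude, otherwise equivocal) purely for illustration; the panel ratings are invented.

```python
# Illustrative scoring of nine-point Delphi ratings. The study's exact consensus
# rules are not given in the abstract; a common convention is assumed here:
# median 7-9 -> include as an MDE, median 1-3 -> exclude, otherwise (or with wide
# disagreement) -> equivocal. All panel ratings below are invented.
import numpy as np

def classify(ratings, spread_limit=4):
    ratings = np.asarray(ratings)
    median = np.median(ratings)
    disagreement = np.percentile(ratings, 75) - np.percentile(ratings, 25)
    if disagreement > spread_limit:
        return "equivocal"
    if median >= 7:
        return "include as MDE"
    if median <= 3:
        return "exclude"
    return "equivocal"

scenarios = {
    "dispensed wrong strength":       [9, 8, 9, 7, 8, 9, 9, 8],
    "generic substitution, MD aware": [2, 3, 1, 2, 4, 2, 3, 2],
    "label printed without warnings": [3, 8, 5, 9, 2, 7, 4, 6],
}
for name, ratings in scenarios.items():
    print(f"{name:35s} -> {classify(ratings)}")
```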

  6. Single-Vector Calibration of Wind-Tunnel Force Balances

    NASA Technical Reports Server (NTRS)

    Parker, P. A.; DeLoach, R.

    2003-01-01

    An improved method of calibrating a wind-tunnel force balance involves the use of a unique load application system integrated with formal experimental design methodology. The Single-Vector Force Balance Calibration System (SVS) overcomes the productivity and accuracy limitations of prior calibration methods. A force balance is a complex structural spring element instrumented with strain gauges for measuring three orthogonal components of aerodynamic force (normal, axial, and side force) and three orthogonal components of aerodynamic torque (rolling, pitching, and yawing moments). Force balances remain the state-of-the-art instruments that provide these measurements on a scale model of an aircraft during wind tunnel testing. Ideally, each electrical channel of the balance would respond only to its respective component of load, and it would have no response to other components of load. This is not entirely possible even though balance designs are optimized to minimize these undesirable interaction effects. Ultimately, a calibration experiment is performed to obtain the necessary data to generate a mathematical model and determine the force measurement accuracy. In order to set the independent variables of applied load for the calibration experiment, a high-precision mechanical system is required. Manual deadweight systems have been in use at Langley Research Center (LaRC) since the 1940s. These simple methodologies produce high-confidence results, but the process is mechanically complex and labor-intensive, requiring three to four weeks to complete. Over the past decade, automated balance calibration systems have been developed. In general, these systems were designed to automate the tedious manual calibration process, resulting in an even more complex system that degrades load application quality. The current calibration approach relies on a one-factor-at-a-time (OFAT) methodology, where each independent variable is incremented individually throughout its full-scale range, while all other variables are held at a constant magnitude. This OFAT approach has been widely accepted because of its inherent simplicity and intuitive appeal to the balance engineer. LaRC has been conducting research in a "modern design of experiments" (MDOE) approach to force balance calibration. Formal experimental design techniques provide an integrated view of the entire calibration process, covering all three major aspects of an experiment: the design of the experiment, its execution, and the statistical analysis of the data. In order to overcome the weaknesses in the available mechanical systems and to apply formal experimental techniques, a new mechanical system was required. The SVS enables the complete calibration of a six-component force balance with a series of single force vectors.
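
    As a simplified, hedged sketch of the data-reduction step in such a calibration, the code below recovers a 6x6 sensitivity matrix from synthetic applied-load vectors and bridge outputs by least squares; real balance calibrations add higher-order interaction terms and formally designed, randomized load schedules.

```python
# Simplified balance-calibration data reduction: recover a 6x6 sensitivity matrix
# from applied six-component load vectors and measured bridge outputs by least
# squares. Real calibrations include higher-order interaction terms and designed
# load schedules; all numbers here are synthetic.
import numpy as np

rng = np.random.default_rng(6)
true_sensitivity = np.eye(6) + 0.02 * rng.normal(size=(6, 6))   # small interaction terms

applied_loads = rng.uniform(-1.0, 1.0, size=(60, 6))            # 60 calibration load points
outputs = applied_loads @ true_sensitivity.T + 1e-3 * rng.normal(size=(60, 6))

estimated, *_ = np.linalg.lstsq(applied_loads, outputs, rcond=None)
estimated_sensitivity = estimated.T

max_err = np.abs(estimated_sensitivity - true_sensitivity).max()
print(f"largest coefficient error = {max_err:.2e}")
```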

  7. Assessing atrophy measurement techniques in dementia: Results from the MIRIAD atrophy challenge.

    PubMed

    Cash, David M; Frost, Chris; Iheme, Leonardo O; Ünay, Devrim; Kandemir, Melek; Fripp, Jurgen; Salvado, Olivier; Bourgeat, Pierrick; Reuter, Martin; Fischl, Bruce; Lorenzi, Marco; Frisoni, Giovanni B; Pennec, Xavier; Pierson, Ronald K; Gunter, Jeffrey L; Senjem, Matthew L; Jack, Clifford R; Guizard, Nicolas; Fonov, Vladimir S; Collins, D Louis; Modat, Marc; Cardoso, M Jorge; Leung, Kelvin K; Wang, Hongzhi; Das, Sandhitsu R; Yushkevich, Paul A; Malone, Ian B; Fox, Nick C; Schott, Jonathan M; Ourselin, Sebastien

    2015-12-01

    Structural MRI is widely used for investigating brain atrophy in many neurodegenerative disorders, with several research groups developing and publishing techniques to provide quantitative assessments of this longitudinal change. Often techniques are compared through computation of required sample size estimates for future clinical trials. However interpretation of such comparisons is rendered complex because, despite using the same publicly available cohorts, the various techniques have been assessed with different data exclusions and different statistical analysis models. We created the MIRIAD atrophy challenge in order to test various capabilities of atrophy measurement techniques. The data consisted of 69 subjects (46 Alzheimer's disease, 23 control) who were scanned multiple (up to twelve) times at nine visits over a follow-up period of one to two years, resulting in 708 total image sets. Nine participating groups from 6 countries completed the challenge by providing volumetric measurements of key structures (whole brain, lateral ventricle, left and right hippocampi) for each dataset and atrophy measurements of these structures for each time point pair (both forward and backward) of a given subject. From these results, we formally compared techniques using exactly the same dataset. First, we assessed the repeatability of each technique using rates obtained from short intervals where no measurable atrophy is expected. For those measures that provided direct measures of atrophy between pairs of images, we also assessed symmetry and transitivity. Then, we performed a statistical analysis in a consistent manner using linear mixed effect models. The models, one for repeated measures of volume made at multiple time-points and a second for repeated "direct" measures of change in brain volume, appropriately allowed for the correlation between measures made on the same subject and were shown to fit the data well. From these models, we obtained estimates of the distribution of atrophy rates in the Alzheimer's disease (AD) and control groups and of required sample sizes to detect a 25% treatment effect, in relation to healthy ageing, with 95% significance and 80% power over follow-up periods of 6, 12, and 24months. Uncertainty in these estimates, and head-to-head comparisons between techniques, were carried out using the bootstrap. The lateral ventricles provided the most stable measurements, followed by the brain. The hippocampi had much more variability across participants, likely because of differences in segmentation protocol and less distinct boundaries. Most methods showed no indication of bias based on the short-term interval results, and direct measures provided good consistency in terms of symmetry and transitivity. The resulting annualized rates of change derived from the model ranged from, for whole brain: -1.4% to -2.2% (AD) and -0.35% to -0.67% (control), for ventricles: 4.6% to 10.2% (AD) and 1.2% to 3.4% (control), and for hippocampi: -1.5% to -7.0% (AD) and -0.4% to -1.4% (control). There were large and statistically significant differences in the sample size requirements between many of the techniques. The lowest sample sizes for each of these structures, for a trial with a 12month follow-up period, were 242 (95% CI: 154 to 422) for whole brain, 168 (95% CI: 112 to 282) for ventricles, 190 (95% CI: 146 to 268) for left hippocampi, and 158 (95% CI: 116 to 228) for right hippocampi. 
This analysis represents one of the most extensive statistical comparisons of a large number of different atrophy measurement techniques from around the globe. The challenge data will remain online and publicly available so that other groups can assess their methods. Copyright © 2015. Published by Elsevier Inc.
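
    The sketch below shows the standard two-arm sample-size formula that underlies estimates of this kind, for detecting a 25% reduction of the excess (AD minus control) atrophy rate at 80% power and two-sided 5% significance; the rate and standard-deviation values are illustrative placeholders, not the paper's fitted estimates.

```python
# Standard two-arm sample-size formula behind estimates like those quoted above:
# detect a 25% reduction of the excess (AD minus control) atrophy rate with 80%
# power at two-sided 5% significance. Rate and SD values are illustrative only.
from scipy import stats

rate_ad, rate_control = -2.0, -0.5      # %/year whole-brain atrophy (illustrative)
sd_rate = 1.2                           # between-subject SD of measured rate (illustrative)

effect = 0.25 * abs(rate_ad - rate_control)      # treatment effect to detect
z_alpha = stats.norm.ppf(1 - 0.05 / 2)           # two-sided 5% significance
z_beta = stats.norm.ppf(0.80)                    # 80% power

n_per_arm = 2 * ((z_alpha + z_beta) * sd_rate / effect) ** 2
print(f"required subjects per arm = {n_per_arm:.0f}")
```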

  8. Assessing atrophy measurement techniques in dementia: Results from the MIRIAD atrophy challenge

    PubMed Central

    Cash, David M.; Frost, Chris; Iheme, Leonardo O.; Ünay, Devrim; Kandemir, Melek; Fripp, Jurgen; Salvado, Olivier; Bourgeat, Pierrick; Reuter, Martin; Fischl, Bruce; Lorenzi, Marco; Frisoni, Giovanni B.; Pennec, Xavier; Pierson, Ronald K.; Gunter, Jeffrey L.; Senjem, Matthew L.; Jack, Clifford R.; Guizard, Nicolas; Fonov, Vladimir S.; Collins, D. Louis; Modat, Marc; Cardoso, M. Jorge; Leung, Kelvin K.; Wang, Hongzhi; Das, Sandhitsu R.; Yushkevich, Paul A.; Malone, Ian B.; Fox, Nick C.; Schott, Jonathan M.; Ourselin, Sebastien

    2015-01-01

    Structural MRI is widely used for investigating brain atrophy in many neurodegenerative disorders, with several research groups developing and publishing techniques to provide quantitative assessments of this longitudinal change. Often techniques are compared through computation of required sample size estimates for future clinical trials. However interpretation of such comparisons is rendered complex because, despite using the same publicly available cohorts, the various techniques have been assessed with different data exclusions and different statistical analysis models. We created the MIRIAD atrophy challenge in order to test various capabilities of atrophy measurement techniques. The data consisted of 69 subjects (46 Alzheimer's disease, 23 control) who were scanned multiple (up to twelve) times at nine visits over a follow-up period of one to two years, resulting in 708 total image sets. Nine participating groups from 6 countries completed the challenge by providing volumetric measurements of key structures (whole brain, lateral ventricle, left and right hippocampi) for each dataset and atrophy measurements of these structures for each time point pair (both forward and backward) of a given subject. From these results, we formally compared techniques using exactly the same dataset. First, we assessed the repeatability of each technique using rates obtained from short intervals where no measurable atrophy is expected. For those measures that provided direct measures of atrophy between pairs of images, we also assessed symmetry and transitivity. Then, we performed a statistical analysis in a consistent manner using linear mixed effect models. The models, one for repeated measures of volume made at multiple time-points and a second for repeated “direct” measures of change in brain volume, appropriately allowed for the correlation between measures made on the same subject and were shown to fit the data well. From these models, we obtained estimates of the distribution of atrophy rates in the Alzheimer's disease (AD) and control groups and of required sample sizes to detect a 25% treatment effect, in relation to healthy ageing, with 95% significance and 80% power over follow-up periods of 6, 12, and 24 months. Uncertainty in these estimates, and head-to-head comparisons between techniques, were carried out using the bootstrap. The lateral ventricles provided the most stable measurements, followed by the brain. The hippocampi had much more variability across participants, likely because of differences in segmentation protocol and less distinct boundaries. Most methods showed no indication of bias based on the short-term interval results, and direct measures provided good consistency in terms of symmetry and transitivity. The resulting annualized rates of change derived from the model ranged from, for whole brain: − 1.4% to − 2.2% (AD) and − 0.35% to − 0.67% (control), for ventricles: 4.6% to 10.2% (AD) and 1.2% to 3.4% (control), and for hippocampi: − 1.5% to − 7.0% (AD) and − 0.4% to − 1.4% (control). There were large and statistically significant differences in the sample size requirements between many of the techniques. The lowest sample sizes for each of these structures, for a trial with a 12 month follow-up period, were 242 (95% CI: 154 to 422) for whole brain, 168 (95% CI: 112 to 282) for ventricles, 190 (95% CI: 146 to 268) for left hippocampi, and 158 (95% CI: 116 to 228) for right hippocampi. 
This analysis represents one of the most extensive statistical comparisons of a large number of different atrophy measurement techniques from around the globe. The challenge data will remain online and publicly available so that other groups can assess their methods. PMID:26275383

  9. Path Integrals for Electronic Densities, Reactivity Indices, and Localization Functions in Quantum Systems

    PubMed Central

    Putz, Mihai V.

    2009-01-01

    The density matrix theory, the ancestor of density functional theory, provides the immediate framework for Path Integral (PI) development, allowing the canonical density be extended for the many-electronic systems through the density functional closure relationship. Yet, the use of path integral formalism for electronic density prescription presents several advantages: assures the inner quantum mechanical description of the system by parameterized paths; averages the quantum fluctuations; behaves as the propagator for time-space evolution of quantum information; resembles Schrödinger equation; allows quantum statistical description of the system through partition function computing. In this framework, four levels of path integral formalism were presented: the Feynman quantum mechanical, the semiclassical, the Feynman-Kleinert effective classical, and the Fokker-Planck non-equilibrium ones. In each case the density matrix or/and the canonical density were rigorously defined and presented. The practical specializations for quantum free and harmonic motions, for statistical high and low temperature limits, the smearing justification for the Bohr’s quantum stability postulate with the paradigmatic Hydrogen atomic excursion, along the quantum chemical calculation of semiclassical electronegativity and hardness, of chemical action and Mulliken electronegativity, as well as by the Markovian generalizations of Becke-Edgecombe electronic focalization functions – all advocate for the reliability of assuming PI formalism of quantum mechanics as a versatile one, suited for analytically and/or computationally modeling of a variety of fundamental physical and chemical reactivity concepts characterizing the (density driving) many-electronic systems. PMID:20087467

  10. Path integrals for electronic densities, reactivity indices, and localization functions in quantum systems.

    PubMed

    Putz, Mihai V

    2009-11-10

    The density matrix theory, the ancestor of density functional theory, provides the immediate framework for Path Integral (PI) development, allowing the canonical density be extended for the many-electronic systems through the density functional closure relationship. Yet, the use of path integral formalism for electronic density prescription presents several advantages: assures the inner quantum mechanical description of the system by parameterized paths; averages the quantum fluctuations; behaves as the propagator for time-space evolution of quantum information; resembles Schrödinger equation; allows quantum statistical description of the system through partition function computing. In this framework, four levels of path integral formalism were presented: the Feynman quantum mechanical, the semiclassical, the Feynman-Kleinert effective classical, and the Fokker-Planck non-equilibrium ones. In each case the density matrix or/and the canonical density were rigorously defined and presented. The practical specializations for quantum free and harmonic motions, for statistical high and low temperature limits, the smearing justification for the Bohr's quantum stability postulate with the paradigmatic Hydrogen atomic excursion, along the quantum chemical calculation of semiclassical electronegativity and hardness, of chemical action and Mulliken electronegativity, as well as by the Markovian generalizations of Becke-Edgecombe electronic focalization functions - all advocate for the reliability of assuming PI formalism of quantum mechanics as a versatile one, suited for analytically and/or computationally modeling of a variety of fundamental physical and chemical reactivity concepts characterizing the (density driving) many-electronic systems.

  11. Mapping of polycrystalline films of biological fluids utilizing the Jones-matrix formalism

    NASA Astrophysics Data System (ADS)

    Ushenko, Vladimir A.; Dubolazov, Alexander V.; Pidkamin, Leonid Y.; Sakchnovsky, Michael Yu; Bodnar, Anna B.; Ushenko, Yuriy A.; Ushenko, Alexander G.; Bykov, Alexander; Meglinski, Igor

    2018-02-01

    Utilizing a polarized light approach, we reconstruct the spatial distribution of birefringence and optical activity in polycrystalline films of biological fluids. The Jones-matrix formalism is used for an accessible quantitative description of these types of optical anisotropy. We demonstrate that differentiation of polycrystalline films of biological fluids can be performed based on a statistical analysis of the distribution of rotation angles and phase shifts associated with the optical activity and birefringence, respectively. Finally, practical operational characteristics, such as sensitivity, specificity and accuracy of the Jones-matrix reconstruction of optical anisotropy, were identified with special emphasis on biomedical application, specifically for differentiation of bile films taken from healthy donors and from patients with cholelithiasis.
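
    A minimal Jones-calculus sketch follows: it builds Jones matrices for an optical rotator and a linear retarder and applies them to an input polarization state, showing how the two anisotropy parameters mapped in the paper (rotation angle and phase shift) enter the output field. The element values are illustrative.

```python
# Minimal Jones-calculus sketch: Jones matrices for an optical rotator and a
# linear retarder acting on an input polarization state. The rotation angle and
# phase shift are the kinds of quantities whose maps the paper reconstructs;
# the values used here are illustrative.
import numpy as np

def rotator(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def linear_retarder(delta, axis_angle=0.0):
    # retarder with phase shift delta and fast axis at axis_angle
    r = rotator(axis_angle)
    core = np.array([[np.exp(-1j * delta / 2), 0], [0, np.exp(1j * delta / 2)]])
    return r @ core @ r.T

e_in = np.array([1.0, 0.0])                              # horizontal linear polarization
jones = rotator(np.deg2rad(12)) @ linear_retarder(np.deg2rad(40), np.deg2rad(30))
e_out = jones @ e_in

intensity = np.abs(e_out) ** 2
print(f"output Jones vector = {e_out}")
print(f"intensity in (x, y) components = {intensity.round(4)}")
```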

  12. Training the Sales Neophyte

    ERIC Educational Resources Information Center

    Harris, Clyde E., Jr.

    1975-01-01

    The article reappraises initial sales training and presents a program emphasizing objectives, responsibility for training, program content, and teaching techniques. Formal Initial Responsive Sales Training System (FIRSTS) is the name of the program explored and evaluated. (Author/MW)

  13. Weaving a Formal Methods Education with Problem-Based Learning

    NASA Astrophysics Data System (ADS)

    Gibson, J. Paul

    The idea of weaving formal methods through computing (or software engineering) degrees is not a new one. However, there has been little success in developing and implementing such a curriculum. Formal methods continue to be taught as stand-alone modules and students, in general, fail to see how fundamental these methods are to the engineering of software. A major problem is one of motivation: how can the students be expected to enthusiastically embrace a challenging subject when the learning benefits, beyond passing an exam and achieving curriculum credits, are not clear? Problem-based learning has gradually moved from being an innovative pedagogical technique, commonly used to better motivate students, to being widely adopted in the teaching of many different disciplines, including computer science and software engineering. Our experience shows that a good problem can be re-used throughout a student's academic life. In fact, the best computing problems can be used with children (young and old), undergraduates and postgraduates. In this paper we present a process for weaving formal methods through a University curriculum that is founded on the application of problem-based learning and a library of good software engineering problems, where students learn about formal methods without sitting a traditional formal methods module. The process of constructing good problems and integrating them into the curriculum is shown to be analogous to the process of engineering software. This approach is not intended to replace more traditional formal methods modules: it will better prepare students for such specialised modules and ensure that all students have an understanding and appreciation for formal methods even if they do not go on to specialise in them.

  14. Non-Markovian closure models for large eddy simulations using the Mori-Zwanzig formalism

    NASA Astrophysics Data System (ADS)

    Parish, Eric J.; Duraisamy, Karthik

    2017-01-01

    This work uses the Mori-Zwanzig (M-Z) formalism, a concept originating from nonequilibrium statistical mechanics, as a basis for the development of coarse-grained models of turbulence. The mechanics of the generalized Langevin equation (GLE) are considered, and insight gained from the orthogonal dynamics equation is used as a starting point for model development. A class of subgrid models is considered which represent nonlocal behavior via a finite memory approximation [Stinis, arXiv:1211.4285 (2012)], the length of which is determined using a heuristic that is related to the spectral radius of the Jacobian of the resolved variables. The resulting models are intimately tied to the underlying numerical resolution and are capable of approximating non-Markovian effects. Numerical experiments on the Burgers equation demonstrate that the M-Z-based models can accurately predict the temporal evolution of the total kinetic energy and the total dissipation rate at varying mesh resolutions. The trajectory of each resolved mode in phase space is accurately predicted for cases where the coarse graining is moderate. Large eddy simulations (LESs) of homogeneous isotropic turbulence and the Taylor-Green Vortex show that the M-Z-based models are able to provide excellent predictions, accurately capturing the subgrid contribution to energy transfer. Last, LESs of fully developed channel flow demonstrate the applicability of M-Z-based models to nondecaying problems. It is notable that the form of the closure is not imposed by the modeler, but is rather derived from the mathematics of the coarse graining, highlighting the potential of M-Z-based techniques to define LES closures.
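
    To make the finite-memory idea concrete, the toy sketch below integrates a scalar equation whose right-hand side adds a convolution over a finite history window to a Markovian term; the right-hand side, memory kernel, and memory length are all invented for illustration and are not the M-Z closure derived in the paper.

    ```python
    import numpy as np

    # Toy scalar "generalized Langevin" integration with a finite memory approximation:
    #   du/dt = R(u(t)) + int_0^{tau_mem} K(s) u(t - s) ds
    # R, K and tau_mem are illustrative choices, not the closure of the record.

    dt, n_steps = 0.01, 2000
    tau_mem = 0.5                                  # finite memory length
    n_mem = int(tau_mem / dt)

    R = lambda u: -u                               # resolved (Markovian) part
    K = lambda s: -0.5 * np.exp(-s / 0.2)          # illustrative memory kernel

    kernel = K(dt * np.arange(n_mem))

    u = np.zeros(n_steps)
    u[0] = 1.0
    for n in range(1, n_steps):
        hist = u[max(0, n - n_mem):n][::-1]        # u(t - s) for s = dt, 2*dt, ...
        memory = np.sum(kernel[:len(hist)] * hist) * dt
        u[n] = u[n - 1] + dt * (R(u[n - 1]) + memory)

    print("u(T) with the finite-memory closure:", u[-1])
    ```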

  15. Efficient and Scalable Cross-Matching of (Very) Large Catalogs

    NASA Astrophysics Data System (ADS)

    Pineau, F.-X.; Boch, T.; Derriere, S.

    2011-07-01

    Whether it be for building multi-wavelength datasets from independent surveys, studying changes in object luminosities, or detecting moving objects (stellar proper motions, asteroids), cross-catalog matching is a technique widely used in astronomy. The need for efficient, reliable and scalable cross-catalog matching is becoming even more pressing with forthcoming projects which will produce huge catalogs in which astronomers will dig for rare objects, perform statistical analysis and classification, or carry out real-time transient detection. We have developed a formalism and the corresponding technical framework to address the challenge of fast cross-catalog matching. Our formalism supports more than simple nearest-neighbor search, and handles elliptical positional errors. Scalability is improved by partitioning the sky using the HEALPix scheme, and processing each sky cell independently. The use of multi-threaded two-dimensional kd-trees adapted to managing equatorial coordinates enables efficient neighbor search. The whole process can run on a single computer, but could also use clusters of machines to cross-match future very large surveys such as GAIA or LSST in reasonable times. We already achieve performance such that the 2MASS (˜470M sources) and SDSS DR7 (˜350M sources) catalogs can be matched on a single machine in less than 10 minutes. We aim to provide astronomers with a catalog cross-matching service, available on-line and leveraging the catalogs present in the VizieR database. This service will allow users both to access pre-computed cross-matches across some very large catalogs, and to run customized cross-matching operations. It will also support VO protocols for synchronous or asynchronous queries.
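
    A stripped-down version of the kd-tree positional matching mentioned here can be sketched in a few lines: positions are mapped to unit vectors and matched on chord distance. This is a common shortcut, not the authors' HEALPix-partitioned, multi-threaded, error-ellipse-aware framework, and the catalogs and 2-arcsecond radius below are invented for the example.

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    def radec_to_xyz(ra_deg, dec_deg):
        """Convert equatorial coordinates (degrees) to unit vectors on the sphere."""
        ra, dec = np.radians(ra_deg), np.radians(dec_deg)
        return np.column_stack((np.cos(dec) * np.cos(ra),
                                np.cos(dec) * np.sin(ra),
                                np.sin(dec)))

    rng = np.random.default_rng(0)
    cat1 = rng.uniform([0.0, -30.0], [10.0, 30.0], size=(10000, 2))  # toy catalog: (RA, Dec) in degrees
    cat2 = cat1 + rng.normal(scale=2e-4, size=cat1.shape)            # perturbed copy, plays the second survey

    radius_arcsec = 2.0
    chord = 2.0 * np.sin(np.radians(radius_arcsec / 3600.0) / 2.0)   # chord length for the search radius

    tree = cKDTree(radec_to_xyz(*cat2.T))
    dist, idx = tree.query(radec_to_xyz(*cat1.T), k=1, distance_upper_bound=chord)
    matched = np.isfinite(dist)                                      # unmatched entries get dist = inf
    print("matched pairs:", int(matched.sum()), "of", len(cat1))
    ```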

  16. Statistical manifestation of quantum correlations via disequilibrium

    NASA Astrophysics Data System (ADS)

    Pennini, F.; Plastino, A.

    2017-12-01

    The statistical notion of disequilibrium (D) was introduced by López-Ruiz, Mancini, and Calbet (LMC) (1995) [1] more than 20 years ago. D measures the amount of "correlational structure" of a system. We wish to use D to analyze one of the simplest types of quantum correlations, those present in gaseous systems due to symmetry considerations. To this end we extend the LMC formalism to the grand canonical environment and show that D displays distinctive behaviors for simple gases, that allow for interesting insights into their structural properties.
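
    For reference, the discrete textbook form of the LMC disequilibrium is a one-liner; the snippet below uses that definition only (the record itself works in a grand canonical quantum setting, which this toy does not attempt to reproduce).

    ```python
    import numpy as np

    def disequilibrium(p):
        """LMC disequilibrium D = sum_i (p_i - 1/N)^2 for a discrete distribution p."""
        p = np.asarray(p, dtype=float)
        return float(np.sum((p - 1.0 / p.size) ** 2))

    # The uniform (equilibrium) distribution gives D = 0; any "correlational structure" raises D.
    print(disequilibrium([0.25, 0.25, 0.25, 0.25]))   # 0.0
    print(disequilibrium([0.70, 0.10, 0.10, 0.10]))   # > 0
    ```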

  17. Bayesian Decision Support

    NASA Astrophysics Data System (ADS)

    Berliner, M.

    2017-12-01

    Bayesian statistical decision theory offers a natural framework for decision-policy making in the presence of uncertainty. Key advantages of the approach include efficient incorporation of information and observations. However, in complicated settings it is very difficult, perhaps essentially impossible, to formalize the mathematical inputs needed in the approach. Nevertheless, using the approach as a template is useful for decision support; that is, for organizing and communicating our analyses. Bayesian hierarchical modeling is valuable in quantifying and managing uncertainty in such cases. I review some aspects of the idea, emphasizing statistical model development and use in the context of sea-level rise.

  18. Small sample estimation of the reliability function for technical products

    NASA Astrophysics Data System (ADS)

    Lyamets, L. L.; Yakimenko, I. V.; Kanishchev, O. A.; Bliznyuk, O. A.

    2017-12-01

    It is demonstrated that, in the absence of large statistical samples obtained as a result of testing complex technical products for failure, statistical estimation of the reliability function of initial elements can be made by the moments method. A formal description of the moments method is given and its advantages in the analysis of small censored samples are discussed. A modified algorithm is proposed for the implementation of the moments method using only the moments at which failures of initial elements occur.
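
    As a baseline for the idea, the snippet below applies the plain (uncensored) method of moments to a small sample of failure times under an exponential model, matching the first moment to obtain the failure rate and hence the reliability function; the data are hypothetical and the paper's modified algorithm for censored samples is not reproduced.

    ```python
    import numpy as np

    # Plain method-of-moments fit of an exponential reliability model R(t) = exp(-lambda * t)
    # to a small, uncensored sample of failure times (illustrative only; the record's modified
    # algorithm additionally handles censored observations).

    failure_times = np.array([120.0, 340.0, 95.0, 410.0, 230.0])   # hypothetical hours to failure

    lam = 1.0 / failure_times.mean()          # first-moment match: E[T] = 1/lambda
    reliability = lambda t: np.exp(-lam * t)

    for t in (50, 100, 200, 400):
        print(f"R({t} h) = {reliability(t):.3f}")
    ```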

  19. The inner mass power spectrum of galaxies using strong gravitational lensing: beyond linear approximation

    NASA Astrophysics Data System (ADS)

    Chatterjee, Saikat; Koopmans, Léon V. E.

    2018-02-01

    In the last decade, the detection of individual massive dark matter sub-haloes has been possible using potential correction formalism in strong gravitational lens imaging. Here, we propose a statistical formalism to relate strong gravitational lens surface brightness anomalies to the lens potential fluctuations arising from dark matter distribution in the lens galaxy. We consider these fluctuations as a Gaussian random field in addition to the unperturbed smooth lens model. This is very similar to weak lensing formalism and we show that in this way we can measure the power spectrum of these perturbations to the potential. We test the method by applying it to simulated mock lenses of different geometries and by performing an MCMC analysis of the theoretical power spectra. This method can measure density fluctuations in early type galaxies on scales of 1-10 kpc at typical rms levels of a per cent, using a single lens system observed with the Hubble Space Telescope with typical signal-to-noise ratios obtained in a single orbit.

  20. Statistical nature of infrared dynamics on de Sitter background

    NASA Astrophysics Data System (ADS)

    Tokuda, Junsei; Tanaka, Takahiro

    2018-02-01

    In this study, we formulate a systematic way of deriving an effective equation of motion (EoM) for the long-wavelength modes of a massless scalar field with a general potential V(φ) on a de Sitter background, and investigate whether or not the effective EoM can be described as a classical stochastic process. Our formulation extends the usual stochastic formalism to include the sub-leading secular growth coming from the nonlinearity of the short-wavelength modes. Applying our formalism to λφ⁴ theory, we explicitly derive an effective EoM which correctly recovers the next-to-leading secularly growing part at late times, and show that this effective EoM can be seen as a classical stochastic process. Our extended stochastic formalism can describe all secularly growing terms which appear in all correlation functions with a specific operator ordering. The restriction of the operator ordering will not be a big drawback because the commutator of a light scalar field becomes negligible at large scales owing to the squeezing.
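
    For orientation, the leading-order stochastic ("Starobinsky") equation that this work extends is usually written in e-folds N as below; this is the standard textbook form, not the next-to-leading effective EoM derived in the record.

    ```latex
    % Leading-order stochastic equation for the coarse-grained field \bar{\phi} in e-folds N
    % (standard form; the record adds sub-leading secular corrections beyond this):
    \frac{d\bar{\phi}}{dN} = -\frac{V'(\bar{\phi})}{3H^{2}} + \frac{H}{2\pi}\,\xi(N),
    \qquad
    \langle \xi(N)\,\xi(N') \rangle = \delta(N - N').
    ```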

  1. Evaluating the Effectiveness of a Large-Scale Professional Development Programme

    ERIC Educational Resources Information Center

    Main, Katherine; Pendergast, Donna

    2017-01-01

    An evaluation of the effectiveness of a large-scale professional development (PD) programme delivered to 258 schools in Queensland, Australia is presented. Formal evaluations were conducted at two stages during the programme using a tool developed from Desimone's five core features of effective PD. Descriptive statistics of 38 questions and…

  2. Making Heads or Tails of Probability: An Experiment with Random Generators

    ERIC Educational Resources Information Center

    Morsanyi, Kinga; Handley, Simon J.; Serpell, Sylvie

    2013-01-01

    Background: The equiprobability bias is a tendency for individuals to think of probabilistic events as "equiprobable" by nature, and to judge outcomes that occur with different probabilities as equally likely. The equiprobability bias has been repeatedly found to be related to formal education in statistics, and it is claimed to be based…

  3. Emergent Feature Structures: Harmony Systems in Exemplar Models of Phonology

    ERIC Educational Resources Information Center

    Cole, Jennifer

    2009-01-01

    In exemplar models of phonology, phonotactic constraints are modeled as emergent from patterns of high activation between units that co-occur with statistical regularity, or as patterns of low activation or inhibition between units that co-occur less frequently or not at all. Exemplar models posit no a priori formal or representational…

  4. On Testability of Missing Data Mechanisms in Incomplete Data Sets

    ERIC Educational Resources Information Center

    Raykov, Tenko

    2011-01-01

    This article is concerned with the question of whether the missing data mechanism routinely referred to as missing completely at random (MCAR) is statistically examinable via a test for lack of distributional differences between groups with observed and missing data, and related consequences. A discussion is initially provided, from a formal logic…

  5. Integrated Postsecondary Education Data System Data Quality Study. Methodology Report. NCES 2005-175

    ERIC Educational Resources Information Center

    Jackson, Kenneth W.; Peecksen, Scott; Jang, Donsig; Sukasih, Amang

    2005-01-01

    The Integrated Postsecondary Education Data System (IPEDS) of the National Center for Education Statistics (NCES) was initiated in 1986 to collect data about all identified institutions whose primary purpose is to provide postsecondary education. Postsecondary education is defined within IPEDS as "the provision of a formal instructional…

  6. The SACE Review Panel's Final Report: Significant Flaws in the Analysis of Statistical Data

    ERIC Educational Resources Information Center

    Gregory, Kelvin

    2006-01-01

    The South Australian Certificate of Education (SACE) is a credential and formal qualification within the Australian Qualifications Framework. A recent review of the SACE outlined a number of recommendations for significant changes to this certificate. These recommendations were the result of a process that began with the review panel…

  7. Bootstrapping in a Language of Thought: A Formal Model of Numerical Concept Learning

    ERIC Educational Resources Information Center

    Piantadosi, Steven T.; Tenenbaum, Joshua B.; Goodman, Noah D.

    2012-01-01

    In acquiring number words, children exhibit a qualitative leap in which they transition from understanding a few number words, to possessing a rich system of interrelated numerical concepts. We present a computational framework for understanding this inductive leap as the consequence of statistical inference over a sufficiently powerful…

  8. Students and Courses 2002: At a Glance. Australian Vocational Education and Training Statistics.

    ERIC Educational Resources Information Center

    National Centre for Vocational Education Research, Leabrook (Australia).

    The public vocational education and training (VET) system in Australia encompasses formal learning activities intended to develop knowledge and skills that are relevant in the workplace for those past the age of compulsory schooling, but excludes bachelor and post-graduate courses and learning for leisure, recreation or personal enrichment. Some…

  9. Mechanics of Brittle Materials. Part 1. Preliminary Mechanical Properties and Statistical Representations

    DTIC Science & Technology

    1973-10-01

    intensity computation are shown in Figure 17. Using the same formal procedure outlined by Winne & Wundt, a notch geometry can be chosen to induce...Nitride at Elevated Temperatures. Winne, D.H. and Wundt, B.M., "Application of the Griffith-Irwin Theory of Crack Propagation to the Bursting Behavior

  10. Beyond Literacy: Non-Formal Education Programmes for Adults in Mozambique

    ERIC Educational Resources Information Center

    van der Linden, Josje; Manuel, Alzira Munguambe

    2011-01-01

    Thirty-five years after independence the Mozambican illiteracy rate has been reduced from 93% to just over 50% according to official statistics. Although this indicates an enormous achievement in the area of education, the challenge of today still is to design appropriate adult basic education programmes including literacy, numeracy and life…

  11. Introduction of Digital Storytelling in Preschool Education: A Case Study from Croatia

    ERIC Educational Resources Information Center

    Preradovic, Nives Mikelic; Lesin, Gordana; Boras, Damir

    2016-01-01

    Our case study from Croatia showed the benefits of digital storytelling in a preschool as a basis for the formal ICT education. The statistical analysis revealed significant differences between children aged 6-7 who learned mathematics by traditional storytelling compared to those learning through digital storytelling. The experimental group that…

  12. 77 FR 72715 - Informal Entry Limit and Removal of a Formal Entry Requirement

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-06

    ... required certifications, enforcement information, and statistical data. An agency may not conduct or..., 1623, 1624, 3314. * * * * * Sec. 10.1 [Amended] 0 2. In Sec. 10.1: 0 a. Paragraph (a) introductory text... revising``19------'' to read ``20---- --''; 0 c. Paragraph (a)(2) introductory text is amended in the last...

  13. The Schrödinger–Langevin equation with and without thermal fluctuations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katz, R., E-mail: roland.katz@subatech.in2p3.fr; Gossiaux, P.B., E-mail: Pol-Bernard.Gossiaux@subatech.in2p3.fr

    2016-05-15

    The Schrödinger–Langevin equation (SLE) is considered as an effective open quantum system formalism suitable for phenomenological applications involving a quantum subsystem interacting with a thermal bath. We focus on two open issues relative to its solutions: the stationarity of the excited states of the non-interacting subsystem when one considers the dissipation only, and the thermal relaxation toward asymptotic distributions with the additional stochastic term. We first show that a proper application of the Madelung/polar transformation of the wave function leads to a non-zero damping of the excited states of the quantum subsystem. We then study analytically and numerically the ability of the SLE to bring a quantum subsystem to the thermal equilibrium of statistical mechanics. To do so, concepts about statistical mixed states and quantum noises are discussed and a detailed analysis is carried out with two kinds of noise and potential. We show that within our assumptions the use of the SLE as an effective open quantum system formalism is possible, and we discuss some of its limitations.

  14. Random dopant fluctuations and statistical variability in n-channel junctionless FETs

    NASA Astrophysics Data System (ADS)

    Akhavan, N. D.; Umana-Membreno, G. A.; Gu, R.; Antoszewski, J.; Faraone, L.

    2018-01-01

    The influence of random dopant fluctuations on the statistical variability of the electrical characteristics of n-channel silicon junctionless nanowire transistor (JNT) has been studied using three dimensional quantum simulations based on the non-equilibrium Green’s function (NEGF) formalism. Average randomly distributed body doping densities of 2 × 10¹⁹, 6 × 10¹⁹ and 1 × 10²⁰ cm⁻³ have been considered employing an atomistic model for JNTs with gate lengths of 5, 10 and 15 nm. We demonstrate that by properly adjusting the doping density in the JNT, a near ideal statistical variability and electrical performance can be achieved, which can pave the way for the continuation of scaling in silicon CMOS technology.

  15. Mid-Pliocene Planktic Foraminifer Census Data and Alkenone Unsaturation Indices from Ocean Drilling Program Hole 677A

    USGS Publications Warehouse

    Robinson, Marci; Caballero, Rocio; Pohlman, Emily; Herbert, Timothy; Peck, Victoria; Dowsett, Harry

    2008-01-01

    The U.S. Geological Survey is conducting a long-term study of mid-Pliocene climatic and oceanographic conditions. One of the key elements of the study involves the use of quantitative composition of planktic foraminifer assemblages in conjunction with other proxies to constrain estimates of sea-surface temperature (SST) and to identify major oceanographic boundaries and water masses. Raw census data are made available as soon as possible after analysis through a series of reports that provide the basic data for future work. In this report we present raw census data (table 1) for planktic foraminifer assemblages in 14 samples from Ocean Drilling Program (ODP) Hole 677A. We also present alkenone unsaturation index (UK'37) analyses for 89 samples from ODP Hole 677A (table 2). ODP Hole 677A is located in the Panama basin, due west of Ecuador at 1°12.138'N., 83°44.220'W., in 3461.2 meters of water (fig. 1). A variety of statistical methods have been developed to transform foraminiferal census data in Pliocene sequences into quantitative estimates of Pliocene SST. Details of statistical techniques, taxonomic groupings, and oceanographic interpretations are presented in more formal publications (Dowsett and Poore, 1990, 1991; Dowsett, 1991, 2007a,b; Dowsett and Robinson, 1998, 2007; Dowsett and others, 1996, 1999).

  16. Thermostatistical description of gas mixtures from space partitions

    NASA Astrophysics Data System (ADS)

    Rohrmann, R. D.; Zorec, J.

    2006-10-01

    The new mathematical framework based on the free energy of pure classical fluids presented by Rohrmann [Physica A 347, 221 (2005)] is extended to multicomponent systems to determine thermodynamic and structural properties of chemically complex fluids. Presently, the theory focuses on D-dimensional mixtures in the low-density limit (packing factor η < 0.01). The formalism combines the free-energy minimization technique with space partitions that assign an available volume v to each particle. v is related to the closeness of the nearest neighbor and provides a useful tool to evaluate the perturbations experienced by particles in a fluid. The theory shows a close relationship between statistical geometry and statistical mechanics. New, unconventional thermodynamic variables and mathematical identities are derived as a result of the space division. Thermodynamic potentials μ_il, the variables conjugate to the populations N_il of particles of class i whose nearest neighbors are of class l, are defined and their relationships with the usual chemical potentials μ_i are established. Systems of hard spheres are treated as illustrative examples and their thermodynamic functions are derived analytically. The low-density expressions obtained agree nicely with those of scaled-particle theory and the Percus-Yevick approximation. Several pair distribution functions are introduced and evaluated. Analytical expressions are also presented for hard spheres with attractive forces due to Kac-tails and square-well potentials. Finally, we derive general chemical equilibrium conditions.

  17. Formalization of treatment guidelines using Fuzzy Cognitive Maps and semantic web tools.

    PubMed

    Papageorgiou, Elpiniki I; Roo, Jos De; Huszka, Csaba; Colaert, Dirk

    2012-02-01

    Therapy decision making and support in medicine deals with uncertainty and needs to take into account the patient's clinical parameters, the context of illness, and the medical knowledge of the physician and guidelines to recommend a treatment therapy. This research study is focused on the formalization of medical knowledge using a cognitive process, called Fuzzy Cognitive Maps (FCMs), and a semantic web approach. The FCM technique is capable of dealing with situations that include uncertain descriptions, using a procedure similar to human reasoning. Thus, it was selected for the case of modeling and knowledge integration of clinical practice guidelines. The semantic web tools were established to implement the FCM approach. The knowledge base was constructed from the clinical guidelines in the form of if-then fuzzy rules. These fuzzy rules were transferred to the FCM modeling technique and, through the semantic web tools, the whole formalization was accomplished. The problem of urinary tract infection (UTI) in the adult community was examined for the proposed approach. Forty-seven clinical concepts and eight therapy concepts were identified for the antibiotic treatment therapy problem of UTIs. A preliminary pilot-evaluation study with 55 patient cases showed interesting findings; 91% of the antibiotic treatments proposed by the implemented approach were in full agreement with the guidelines and physicians' opinions. The results have shown that the suggested approach formalizes medical knowledge efficiently and provides a front-end decision on antibiotic suggestions for cystitis. In conclusion, modeling medical knowledge and therapeutic guidelines using cognitive methods and semantic web tools is both reliable and useful. Copyright © 2011 Elsevier Inc. All rights reserved.
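
    To show the kind of computation an FCM performs once guideline rules are encoded as weighted causal links, here is a minimal activation-update loop; the concepts, weight matrix, and sigmoid steepness are hypothetical stand-ins, not the 47-concept UTI model described in the record.

    ```python
    import numpy as np

    def fcm_step(A, W, lam=1.0):
        """One FCM update: A_i(t+1) = f( A_i(t) + sum_j W[j, i] * A_j(t) ), f = sigmoid."""
        x = A + W.T @ A
        return 1.0 / (1.0 + np.exp(-lam * x))

    # Toy map: three clinical concepts influence one therapy concept (W[j, i] = influence of j on i).
    W = np.array([[0.0, 0.0, 0.0,  0.7],
                  [0.0, 0.0, 0.0,  0.4],
                  [0.0, 0.0, 0.0, -0.6],
                  [0.0, 0.0, 0.0,  0.0]])

    A = np.array([1.0, 0.5, 0.0, 0.0])      # initial activations taken from (hypothetical) patient data
    for _ in range(20):                     # iterate toward an (approximate) steady state
        A = fcm_step(A, W)
    print("steady-state activations:", np.round(A, 3))
    ```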

  18. Research on Synthesis of Concurrent Computing Systems.

    DTIC Science & Technology

    1982-09-01

    1.5.1 An Informal Description of the Techniques; 1.5.2 Formal Definitions of Aggregation and Virtualisation. ...sparsely interconnected networks. We have also developed techniques to create Kung’s systolic array parallel structure from a specification of matrix... results of the computation of that element. For example, if Aij is computed using a single enumeration, then virtualisation would produce a three

  19. The Development of Program for Enhancing Learning Management Competency of Teachers in Non-Formal and Informal Education Centers

    ERIC Educational Resources Information Center

    Jutasong, Chanokpon; Sirisuthi, Chaiyut; Phusri-on, Songsak

    2016-01-01

    The objectives of this research are: 1) to study factors and indicators, 2) to study current situations, desirable situations and techniques, 3) to develop the Program, and 4) to study the effect of Program. It comprised 4 phases: (1) studying the factors and indicators; (2) studying the current situations, desirable situations and techniques; (3)…

  20. Partnering

    DTIC Science & Technology

    1991-12-01

    This pamphlet is one in a series of pamphlets describing applications of Alternative Dispute Resolution (ADR). The pamphlet is part of a Corps program to...stages, or settle them prior to formal litigation. ADR is a new field, and additional techniques are being developed all the time. These pamphlets are a...means of providing Corps managers with examples of how other managers have employed ADR techniques. The information in this pamphlet is designed to

  1. Thermal Conductivities in Solids from First Principles: Accurate Computations and Rapid Estimates

    NASA Astrophysics Data System (ADS)

    Carbogno, Christian; Scheffler, Matthias

    In spite of significant research efforts, a first-principles determination of the thermal conductivity κ at high temperatures has remained elusive. Boltzmann transport techniques that account for anharmonicity perturbatively become inaccurate under such conditions. Ab initio molecular dynamics (MD) techniques using the Green-Kubo (GK) formalism capture the full anharmonicity, but can become prohibitively costly to converge in time and size. We developed a formalism that accelerates such GK simulations by several orders of magnitude and that thus enables its application within the limited time and length scales accessible in ab initio MD. For this purpose, we determine the effective harmonic potential occurring during the MD, the associated temperature-dependent phonon properties and lifetimes. Interpolation in reciprocal and frequency space then allows extrapolation to the macroscopic scale. For both force-field and ab initio MD, we validate this approach by computing κ for Si and ZrO2, two materials known for their particularly harmonic and anharmonic character. Eventually, we demonstrate how these techniques facilitate reasonable estimates of κ from existing MD calculations at virtually no additional computational cost.
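
    The Green-Kubo relation that such MD approaches start from can be stated as below; this uses one common convention (heat flux per unit volume) and prefactors vary between texts, so it is given for orientation rather than as the exact working expression of the record.

    ```latex
    % Green--Kubo relation for the thermal conductivity tensor (one common convention,
    % with J the heat flux per unit volume; prefactors differ between references):
    \kappa_{\alpha\beta} = \frac{V}{k_{\mathrm{B}} T^{2}}
      \int_{0}^{\infty} \big\langle J_{\alpha}(t)\, J_{\beta}(0) \big\rangle \, dt .
    ```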

  2. Coaching Family Caregivers to Become Better Problem Solvers When Caring for Persons with Advanced Cancer.

    PubMed

    Dionne-Odom, J Nicholas; Lyons, Kathleen D; Akyar, Imatullah; Bakitas, Marie A

    2016-01-01

    Family caregivers of persons with advanced cancer often take on responsibilities that present daunting and complex problems. Serious problems that go unresolved may be burdensome and result in negative outcomes for caregivers' psychological and physical health and affect the quality of care delivered to the care recipients with cancer, especially at the end of life. Formal problem-solving training approaches have been developed over the past several decades to assist individuals with managing problems faced in daily life. Several of these problem-solving principles and techniques were incorporated into ENABLE (Educate, Nurture, Advise, Before Life End), an "early" palliative care telehealth intervention for individuals diagnosed with advanced cancer and their family caregivers. A hypothetical case resembling the situations of actual caregiver participants in ENABLE that exemplifies the complex problems that caregivers face is presented, followed by presentation of an overview of ENABLE's problem-solving key principles, techniques, and steps in problem-solving support. Though more research is needed to formally test the use of problem-solving support in social work practice, social workers can easily incorporate these techniques into everyday practice.

  3. Portable Wireless LAN Device and Two-Way Radio Threat Assessment for Aircraft VHF Communication Radio Band

    NASA Technical Reports Server (NTRS)

    Nguyen, Truong X.; Koppen, Sandra V.; Ely, Jay J.; Williams, Reuben A.; Smith, Laura J.; Salud, Maria Theresa P.

    2004-01-01

    This document summarizes the safety analysis performed on a Flight Guidance System (FGS) requirements model. In particular, the safety properties desired of the FGS model are identified and the presence of the safety properties in the model is formally verified. Chapter 1 provides an introduction to the entire project, while Chapter 2 gives a brief overview of the problem domain, the nature of accidents, model based development, and the four-variable model. Chapter 3 outlines the approach. Chapter 4 presents the results of the traditional safety analysis techniques and illustrates how the hazardous conditions associated with the system trace into specific safety properties. Chapter 5 presents the results of the formal methods analysis technique model checking that was used to verify the presence of the safety properties in the requirements model. Finally, Chapter 6 summarizes the main conclusions of the study, first and foremost that model checking is a very effective verification technique to use on discrete models with reasonable state spaces. Additional supporting details are provided in the appendices.

  4. Noise Suppression Methods for Robust Speech Processing

    DTIC Science & Technology

    1981-04-01

    Techniques available for voice processor modification to account for noise contamination are being developed [4]. Preprocessor noise reduction...analysis window function. Principles governing discrete implementation of the transform pair are discussed, and relationships are formalized which specify

  5. Tools reference manual for a Requirements Specification Language (RSL), version 2.0

    NASA Technical Reports Server (NTRS)

    Fisher, Gene L.; Cohen, Gerald C.

    1993-01-01

    This report describes a general-purpose Requirements Specification Language, RSL. The purpose of RSL is to specify precisely the external structure of a mechanized system and to define requirements that the system must meet. A system can be comprised of a mixture of hardware, software, and human processing elements. RSL is a hybrid of features found in several popular requirements specification languages, such as SADT (Structured Analysis and Design Technique), PSL (Problem Statement Language), and RMF (Requirements Modeling Framework). While languages such as these have useful features for structuring a specification, they generally lack formality. To overcome the deficiencies of informal requirements languages, RSL has constructs for formal mathematical specification. These constructs are similar to those found in formal specification languages such as EHDM (Enhanced Hierarchical Development Methodology), Larch, and OBJ3.

  6. Systematic errors in transport calculations of shear viscosity using the Green-Kubo formalism

    NASA Astrophysics Data System (ADS)

    Rose, J. B.; Torres-Rincon, J. M.; Oliinychenko, D.; Schäfer, A.; Petersen, H.

    2018-05-01

    The purpose of this study is to provide a reproducible framework for the use of the Green-Kubo formalism to extract transport coefficients. More specifically, in the case of shear viscosity, we investigate the limitations and technical details of fitting the auto-correlation function to a decaying exponential. This fitting procedure is found to be applicable for systems interacting both through constant and energy-dependent cross-sections, although this is only true for sufficiently dilute systems in the latter case. We find that the optimal fit technique consists of simultaneously fixing the intercept of the correlation function and using a fitting interval constrained by the relative error on the correlation function. The formalism is then applied to the full hadron gas, for which we obtain the shear viscosity to entropy ratio.
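
    The fitting step discussed here can be sketched as follows: the autocorrelation function is fitted to a decaying exponential with its intercept fixed to the measured value, so the Green-Kubo time integral reduces to C(0)·τ. The data below are synthetic and the prefactor linking this integral to the viscosity is omitted.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Fit a (synthetic) autocorrelation function to C(0) * exp(-t / tau) with the intercept
    # C(0) fixed, as discussed in the record; the Green-Kubo integral is then C(0) * tau.

    t = np.linspace(0.0, 5.0, 200)
    C0_true, tau_true = 2.0, 0.8
    acf = C0_true * np.exp(-t / tau_true) + np.random.default_rng(1).normal(0.0, 0.02, t.size)

    C0 = acf[0]                                          # fixed intercept
    model = lambda time, tau: C0 * np.exp(-time / tau)   # only tau is a free parameter
    (tau_fit,), _ = curve_fit(model, t, acf, p0=[1.0])

    integral = C0 * tau_fit                              # integral of the fitted exponential
    print(f"tau = {tau_fit:.3f}, Green-Kubo integral C0*tau = {integral:.3f}")
    ```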

  7. An intermediate level of abstraction for computational systems chemistry.

    PubMed

    Andersen, Jakob L; Flamm, Christoph; Merkle, Daniel; Stadler, Peter F

    2017-12-28

    Computational techniques are required for narrowing down the vast space of possibilities to plausible prebiotic scenarios, because precise information on the molecular composition, the dominant reaction chemistry and the conditions for that era is scarce. The exploration of large chemical reaction networks is a central aspect in this endeavour. While quantum chemical methods can accurately predict the structures and reactivities of small molecules, they are not efficient enough to cope with large-scale reaction systems. The formalization of chemical reactions as graph grammars provides a generative system, well grounded in category theory, at the right level of abstraction for the analysis of large and complex reaction networks. An extension of the basic formalism into the realm of integer hyperflows allows for the identification of complex reaction patterns, such as autocatalysis, in large reaction networks using optimization techniques. This article is part of the themed issue 'Reconceptualizing the origins of life'. © 2017 The Author(s).

  8. Usability engineering: domain analysis activities for augmented-reality systems

    NASA Astrophysics Data System (ADS)

    Gabbard, Joseph; Swan, J. E., II; Hix, Deborah; Lanzagorta, Marco O.; Livingston, Mark; Brown, Dennis B.; Julier, Simon J.

    2002-05-01

    This paper discusses our usability engineering process for the Battlefield Augmented Reality System (BARS). Usability engineering is a structured, iterative, stepwise development process. Like the related disciplines of software and systems engineering, usability engineering is a combination of management principles and techniques, formal and semi-formal evaluation techniques, and computerized tools. BARS is an outdoor augmented reality system that displays heads-up battlefield intelligence information to a dismounted warrior. The paper discusses our general usability engineering process. We originally developed the process in the context of virtual reality applications, but in this work we are adapting the procedures to an augmented reality system. The focus of this paper is our work on domain analysis, the first activity of the usability engineering process. We describe our plans for and our progress to date on our domain analysis for BARS. We give results in terms of a specific urban battlefield use case we have designed.

  9. Control by quality: proposition of a typology.

    PubMed

    Pujo, P; Pillet, M

    The application of Quality tools and methods in industrial management has always had a fundamental impact on the control of production. It influences the behavior of the actors concerned, while introducing the necessary notions and formalizations, especially for production systems with little or no automation, which constitute a large part of the industrial activity. Several quality approaches are applied in the workshop and are implemented at the level of the control. In this paper, the authors present a typology of the various approaches that have successively influenced control, such as statistical process control, quality assurance, and continuous improvement. First the authors present a parallel between production control and quality organizational structure. They note the duality between control, which is aimed at increasing productivity, and quality, which aims to satisfy the needs of the customer. They also note the hierarchical organizational structure of these two systems of management with, at each level, the notion of a feedback loop. This notion is fundamental to any kind of decision making. The paper is organized around the operational, tactical, and strategic levels, by describing for each level the main methods and tools for control by quality. The overview of these tools and methods starts at the operational level, with the Statistical Process Control, the Taguchi technique, and the "six sigma" approach. On the tactical level, we find a quality system approach, with a documented description of the procedures introduced in the firm. The management system can refer here to Quality Assurance, Total Productive Maintenance, or Management by Total Quality. The formalization through procedures of the rules of decision governing the process control enhances the validity of these rules. This leads to the enhancement of their reliability and to their consolidation. All this counterbalances the human, intrinsically fluctuating, behavior of the control operators. Strategic control by quality is then detailed, and the two main approaches, the continuous improvement approach and the proactive improvement approach, are introduced. Finally, the authors observe that at each of the three levels, the continuous process improvement, which is a component of Total Quality, becomes an essential preoccupation for the control. Ultimately, the recursive utilization of the Deming cycle remains the best practice for the control by quality.
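
    As a concrete instance of the operational-level tools named in this record (Statistical Process Control in particular), the sketch below computes Shewhart X-bar control limits for synthetic subgroup data; it uses a simplified sigma estimate rather than the usual c4/d2 chart constants, and nothing in it is specific to the typology proposed by the authors.

    ```python
    import numpy as np

    # Minimal Shewhart X-bar chart: subgroup means are compared against +/- 3-sigma limits.
    # Measurements are synthetic; the sigma estimate is simplified (textbook charts use
    # s-bar/c4 or R-bar/d2 constants instead of a plain averaged standard deviation).

    rng = np.random.default_rng(42)
    subgroups = rng.normal(loc=10.0, scale=0.2, size=(25, 5))   # 25 subgroups of 5 measurements

    xbar = subgroups.mean(axis=1)                               # subgroup means
    grand_mean = xbar.mean()
    sigma_xbar = subgroups.std(axis=1, ddof=1).mean() / np.sqrt(subgroups.shape[1])

    ucl = grand_mean + 3.0 * sigma_xbar                         # upper control limit
    lcl = grand_mean - 3.0 * sigma_xbar                         # lower control limit

    out_of_control = np.where((xbar > ucl) | (xbar < lcl))[0]
    print(f"UCL = {ucl:.3f}, LCL = {lcl:.3f}, out-of-control subgroups: {out_of_control}")
    ```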

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nunes, Rafael C.; Abreu, Everton M.C.; Neto, Jorge Ananias

    Based on the relationship between thermodynamics and gravity we propose, with the aid of Verlinde's formalism, an alternative interpretation of the dynamical evolution of the Friedmann-Robertson-Walker Universe. This description takes into account the entropy and temperature intrinsic to the horizon of the universe due to the information holographically stored there, through the non-gaussian statistical theories proposed by Tsallis and Kaniadakis. The effect of these non-gaussian statistics in the cosmological context is to change the strength of the gravitational constant. In this paper, we consider the wCDM model modified by the non-gaussian statistics and investigate the compatibility of these non-gaussian modifications with the cosmological observations. In order to analyze to what extent the cosmological data constrain these non-extensive statistics, we use type Ia supernovae, baryon acoustic oscillations, the Hubble expansion rate function and the linear growth of matter density perturbations data. We show that Tsallis' statistics is favored at the 1σ confidence level.
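
    For reference, the Tsallis entropy underlying the non-gaussian statistics invoked here has the standard discrete form below; the Kaniadakis κ-entropy also mentioned in the record has an analogous but different expression, and the resulting modification of the gravitational coupling is not reproduced.

    ```latex
    % Standard Tsallis non-additive entropy (discrete form); q -> 1 recovers Boltzmann-Gibbs:
    S_q = k_{\mathrm{B}}\,\frac{1 - \sum_{i} p_i^{\,q}}{q - 1},
    \qquad
    \lim_{q \to 1} S_q = -k_{\mathrm{B}} \sum_{i} p_i \ln p_i .
    ```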

  11. Analytical study of the effects of soft tissue artefacts on functional techniques to define axes of rotation.

    PubMed

    De Rosario, Helios; Page, Álvaro; Besa, Antonio

    2017-09-06

    The accurate location of the main axes of rotation (AoR) is a crucial step in many applications of human movement analysis. There are different formal methods to determine the direction and position of the AoR, whose performance varies across studies, depending on the pose and the source of errors. Most methods are based on minimizing squared differences between observed and modelled marker positions or rigid motion parameters, implicitly assuming independent and uncorrelated errors, but the largest error usually results from soft tissue artefacts (STA), which do not have such statistical properties and are not effectively cancelled out by such methods. However, with adequate methods it is possible to assume that STA only account for a small fraction of the observed motion and to obtain explicit formulas through differential analysis that relate STA components to the resulting errors in AoR parameters. In this paper such formulas are derived for three different functional calibration techniques (Geometric Fitting, mean Finite Helical Axis, and SARA), to explain why each technique behaves differently from the others, and to propose strategies to compensate for those errors. These techniques were tested with published data from a sit-to-stand activity, where the true axis was defined using bi-planar fluoroscopy. All the methods were able to estimate the direction of the AoR with an error of less than 5°, whereas there were errors in the location of the axis of 30-40mm. Such location errors could be reduced to less than 17mm by the methods based on equations that use rigid motion parameters (mean Finite Helical Axis, SARA) when the translation component was calculated using the three markers nearest to the axis. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Data-based Non-Markovian Model Inference

    NASA Astrophysics Data System (ADS)

    Ghil, Michael

    2015-04-01

    This talk concentrates on obtaining stable and efficient data-based models for simulation and prediction in the geosciences and life sciences. The proposed model derivation relies on using a multivariate time series of partial observations from a large-dimensional system, and the resulting low-order models are compared with the optimal closures predicted by the non-Markovian Mori-Zwanzig formalism of statistical physics. Multilayer stochastic models (MSMs) are introduced as both a very broad generalization and a time-continuous limit of existing multilevel, regression-based approaches to data-based closure, in particular of empirical model reduction (EMR). We show that the multilayer structure of MSMs can provide a natural Markov approximation to the generalized Langevin equation (GLE) of the Mori-Zwanzig formalism. A simple correlation-based stopping criterion for an EMR-MSM model is derived to assess how well it approximates the GLE solution. Sufficient conditions are given for the nonlinear cross-interactions between the constitutive layers of a given MSM to guarantee the existence of a global random attractor. This existence ensures that no blow-up can occur for a very broad class of MSM applications. The EMR-MSM methodology is first applied to a conceptual, nonlinear, stochastic climate model of coupled slow and fast variables, in which only slow variables are observed. The resulting reduced model with energy-conserving nonlinearities captures the main statistical features of the slow variables, even when there is no formal scale separation and the fast variables are quite energetic. Second, an MSM is shown to successfully reproduce the statistics of a partially observed, generalized Lotka-Volterra model of population dynamics in its chaotic regime. The positivity constraint on the solutions' components replaces here the quadratic-energy-preserving constraint of fluid-flow problems and it successfully prevents blow-up. This work is based on a close collaboration with M.D. Chekroun, D. Kondrashov, S. Kravtsov and A.W. Robertson.

  13. Uncertainty and inference in the world of paleoecological data

    NASA Astrophysics Data System (ADS)

    McLachlan, J. S.; Dawson, A.; Dietze, M.; Finley, M.; Hooten, M.; Itter, M.; Jackson, S. T.; Marlon, J. R.; Raiho, A.; Tipton, J.; Williams, J.

    2017-12-01

    Proxy data in paleoecology and paleoclimatology share a common set of biases and uncertainties: spatiotemporal error associated with the taphonomic processes of deposition, preservation, and dating; calibration error between proxy data and the ecosystem states of interest; and error in the interpolation of calibrated estimates across space and time. Researchers often account for this daunting suite of challenges by applying qualitative expert judgment: inferring the past states of ecosystems and assessing the level of uncertainty in those states subjectively. The effectiveness of this approach can be seen by the extent to which future observations confirm previous assertions. Hierarchical Bayesian (HB) statistical approaches allow an alternative approach to accounting for multiple uncertainties in paleo data. HB estimates of ecosystem state formally account for each of the common uncertainties listed above. HB approaches can readily incorporate additional data, and data of different types, into estimates of ecosystem state. And HB estimates of ecosystem state, with associated uncertainty, can be used to constrain forecasts of ecosystem dynamics based on mechanistic ecosystem models using data assimilation. Decisions about how to structure an HB model are also subjective, which creates a parallel framework for deciding how to interpret data from the deep past. Our group, the Paleoecological Observatory Network (PalEON), has applied hierarchical Bayesian statistics to formally account for uncertainties in proxy-based estimates of past climate, fire, primary productivity, biomass, and vegetation composition. Our estimates often reveal new patterns of past ecosystem change, which is an unambiguously good thing, but we also often estimate a level of uncertainty that is uncomfortably high for many researchers. High levels of uncertainty are due to several features of the HB approach: spatiotemporal smoothing, the formal aggregation of multiple types of uncertainty, and a coarseness in statistical models of taphonomic processes. Each of these features provides useful opportunities for statisticians and data-generating researchers to assess what we know about the signal and the noise in paleo data and to improve inference about past changes in ecosystem state.

  14. Electron transport in ultra-thin films and ballistic electron emission microscopy

    NASA Astrophysics Data System (ADS)

    Claveau, Y.; Di Matteo, S.; de Andres, P. L.; Flores, F.

    2017-03-01

    We have developed a calculation scheme for the elastic electron current in ultra-thin epitaxial heterostructures. Our model uses Keldysh’s non-equilibrium Green’s function formalism and a layer-by-layer construction of the epitaxial film. Such an approach is appropriate to describe the current in a ballistic electron emission microscope (BEEM) where the metal base layer is ultra-thin, and it generalizes a previous one based on a decimation technique appropriate for thick slabs. This formalism allows a full quantum mechanical description of the transmission across the epitaxial heterostructure interface, including multiple scattering via the Dyson equation, which is deemed a crucial ingredient to describe interfaces of ultra-thin layers properly in the future. We introduce the theoretical formulation needed for ultra-thin layers and we compare with results obtained for thick Au(1 1 1) metal layers. An interesting effect takes place for a width of about ten layers: a BEEM current can propagate via the center of the reciprocal space (Γ̄) along the Au(1 1 1) direction. We associate this current with a coherent interference finite-width effect that cannot be found using a decimation technique. Finally, we have tested the validity of the handy semiclassical formalism to describe the BEEM current.

  15. Simple approach to sediment provenance tracing using element analysis and fundamental principles

    NASA Astrophysics Data System (ADS)

    Matys Grygar, Tomas; Elznicova, Jitka; Popelka, Jan

    2016-04-01

    Common sediment fingerprinting techniques use either (1) extensive analytical datasets, sometimes nearly complete with respect to accessible characterization techniques, which are processed by multidimensional statistics based on certain assumptions about the distribution functions of the analytical results and the conservativeness/additivity of some components, or (2) analytically demanding characteristics, such as isotope ratios, assumed to be unequivocal "labels" of the parent material unaltered by any catchment process. The inherent problem of approach (1) is that the interpretation of statistical components ("sources") is done ex post and remains purely formal. The problem of approach (2) is that catchment processes (weathering, transport, deposition) can modify most geochemical parameters of soils and sediments; in other words, the idea that some geochemical parameters are "conservative" may be idealistic. Grain-size effects and sediment provenance have a joint influence on the chemical composition of fluvial sediments that is not easy to disentangle. Attempts to separate these two main components using statistics alone seem risky and equivocal, because the grain-size dependence of element composition is nearly individual for each element and reflects sediment maturity and catchment-specific formation and transport processes. We suppose that the use of less extensive datasets of analytical results, interpreted with respect for fundamental principles, should be more robust than statistical tools alone applied to overwhelming datasets. We examined sediment composition, both published by other researchers and gathered by us, and we found some general principles which are, in our opinion, relevant for fingerprinting: (1) concentrations of all elements are grain-size sensitive, i.e. there are no "conservative" elements in the conventional sense of tracing provenance or transport pathways; (2) fractionation by catchment processes and fluvial transport changes element ratios in solids slightly but systematically; (3) the geochemistry and fates of the finest particles, neoformed by weathering and reactive during transport and storage in the fluvial system, differ from those of the parent material and its less mature coarse weathering products; and (4) most inter-element ratios and some grain-size effects are non-linear, which endangers the assumption of additivity of properties when mixing components. We are aware that we offer only a conceptual model and not a novel algorithm for the quantification of sediment sources that could be tested in practical studies. On the other hand, we consider element fractionation by exogenic processes fascinating, as it is poorly described but relevant not only for provenance tracing but also for general environmental geochemistry.

  16. Hunting high and low: disentangling primordial and late-time non-Gaussianity with cosmic densities in spheres

    NASA Astrophysics Data System (ADS)

    Uhlemann, C.; Pajer, E.; Pichon, C.; Nishimichi, T.; Codis, S.; Bernardeau, F.

    2018-03-01

    Non-Gaussianities of dynamical origin are disentangled from primordial ones using the formalism of large deviation statistics with spherical collapse dynamics. This is achieved by relying on accurate analytical predictions for the one-point probability distribution function and the two-point clustering of spherically averaged cosmic densities (sphere bias). Sphere bias extends the idea of halo bias to intermediate density environments and voids as underdense regions. In the presence of primordial non-Gaussianity, sphere bias displays a strong scale dependence relevant for both high- and low-density regions, which is predicted analytically. The statistics of densities in spheres are built to model primordial non-Gaussianity via an initial skewness with a scale dependence that depends on the bispectrum of the underlying model. The analytical formulas with the measured non-linear dark matter variance as input are successfully tested against numerical simulations. For local non-Gaussianity with a range from fNL = -100 to +100, they are found to agree within 2 per cent or better for densities ρ ∈ [0.5, 3] in spheres of radius 15 Mpc h⁻¹ down to z = 0.35. The validity of the large deviation statistics formalism is thereby established for all observationally relevant local-type departures from perfectly Gaussian initial conditions. The corresponding estimators for the amplitude of the non-linear variance σ8 and primordial skewness fNL are validated using a fiducial joint maximum likelihood experiment. The influence of observational effects and the prospects for a future detection of primordial non-Gaussianity from joint one- and two-point densities-in-spheres statistics are discussed.

  17. Survey of statistical techniques used in validation studies of air pollution prediction models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bornstein, R D; Anderson, S F

    1979-03-01

    Statistical techniques used by meteorologists to validate predictions made by air pollution models are surveyed. Techniques are divided into the following three groups: graphical, tabular, and summary statistics. Some of the practical problems associated with verification are also discussed. Characteristics desired in any validation program are listed and a suggested combination of techniques that possesses many of these characteristics is presented.
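
    As a minimal example of the "summary statistics" group surveyed here, the snippet below computes a few quantities commonly used when comparing model predictions with observations (mean bias, RMSE, and Pearson correlation); the numbers are invented, and the particular statistics covered by the survey may differ.

    ```python
    import numpy as np

    # Generic validation summary statistics for paired model predictions and observations.
    observed  = np.array([41.0, 55.0, 38.0, 62.0, 47.0, 70.0])   # hypothetical concentrations
    predicted = np.array([44.0, 51.0, 42.0, 58.0, 50.0, 66.0])

    bias = np.mean(predicted - observed)                         # mean bias
    rmse = np.sqrt(np.mean((predicted - observed) ** 2))         # root-mean-square error
    corr = np.corrcoef(predicted, observed)[0, 1]                # Pearson correlation

    print(f"mean bias = {bias:.2f}, RMSE = {rmse:.2f}, r = {corr:.3f}")
    ```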

  18. Statistical mechanics of shell models for two-dimensional turbulence

    NASA Astrophysics Data System (ADS)

    Aurell, E.; Boffetta, G.; Crisanti, A.; Frick, P.; Paladin, G.; Vulpiani, A.

    1994-12-01

    We study shell models that conserve the analogs of energy and enstrophy and hence are designed to mimic fluid turbulence in two dimensions (2D). The main result is that the observed state is well described as a formal statistical equilibrium, closely analogous to the approach to two-dimensional ideal hydrodynamics of Onsager [Nuovo Cimento Suppl. 6, 279 (1949)], Hopf [J. Rat. Mech. Anal. 1, 87 (1952)], and Lee [Q. Appl. Math. 10, 69 (1952)]. In the presence of forcing and dissipation we observe a forward flux of enstrophy and a backward flux of energy. These fluxes can be understood as mean diffusive drifts from a source to two sinks in a system which is close to local equilibrium with Lagrange multipliers ("shell temperatures") changing slowly with scale. This is clear evidence that the simplest shell models are not adequate to reproduce the main features of two-dimensional turbulence. The dimensional predictions on the power spectra from a supposed forward cascade of enstrophy and from one branch of the formal statistical equilibrium coincide in these shell models, in contrast to the corresponding predictions for the Navier-Stokes and Euler equations in 2D. This coincidence has previously led to the mistaken conclusion that shell models exhibit a forward cascade of enstrophy. We also study the dynamical properties of the models and the growth of perturbations.
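
    The two quadratic invariants referred to in this record take, in GOY-type shell models, the generic forms below; the exponent in the enstrophy-like invariant is fixed by the model parameters, so this is orientation-level notation rather than the exact invariants of the models studied.

    ```latex
    % Generic quadratic invariants of a 2D-like (GOY-type) shell model with shell wavenumbers
    % k_n = k_0 \lambda^n; the exponent alpha is set by the model parameters (alpha = 2 mimics enstrophy):
    E = \sum_{n} |u_n|^{2},
    \qquad
    Z = \sum_{n} k_n^{\alpha}\, |u_n|^{2}, \qquad k_n = k_0 \lambda^{n}.
    ```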

  19. Dietary diversity of formal and informal residents in Johannesburg, South Africa

    PubMed Central

    2013-01-01

    Background This paper considers the question of dietary diversity as a proxy for nutrition insecurity in communities living in the inner city and the urban informal periphery in Johannesburg. It argues that the issue of nutrition insecurity demands urgent and immediate attention by policy makers. Methods A cross-sectional survey was undertaken for households from urban informal (n = 195) and urban formal (n = 292) areas in Johannesburg, South Africa. Foods consumed by the respondents the previous day were used to calculate a Dietary Diversity Score; a score < 4 was considered low. Results Statistical comparisons of means between groups revealed that respondents from informal settlements consumed mostly cereals and meat/poultry/fish, while respondents in formal settlements consumed a more varied diet. Significantly more respondents living in informal settlements consumed a diet of low diversity (68.1%) versus those in formal settlements (15.4%). When grouped in quintiles, two-thirds of respondents from informal settlements fell in the lowest two, versus 15.4% living in formal settlements. Households who experienced periods of food shortages during the previous 12 months had a lower mean DDS than those from food secure households (4.00 ± 1.6 versus 4.36 ± 1.7; p = 0.026). Conclusions Respondents in the informal settlements were more nutritionally vulnerable. Achieving nutrition security requires policies, strategies and plans to include specific nutrition considerations. PMID:24088249

  20. What Sensing Tells Us: Towards a Formal Theory of Testing for Dynamical Systems

    NASA Technical Reports Server (NTRS)

    McIlraith, Sheila; Scherl, Richard

    2005-01-01

    Just as actions can have indirect effects on the state of the world, so too can sensing actions have indirect effects on an agent's state of knowledge. In this paper, we investigate "what sensing actions tell us", i.e., what an agent comes to know indirectly from the outcome of a sensing action, given knowledge of its actions and state constraints that hold in the world. To this end, we propose a formalization of the notion of testing within a dialect of the situation calculus that includes knowledge and sensing actions. Realizing this formalization requires addressing the ramification problem for sensing actions. We formalize simple tests as sensing actions. Complex tests are expressed in the logic programming language Golog. We examine what it means to perform a test, and how the outcome of a test affects an agent's state of knowledge. Finally, we propose automated reasoning techniques for test generation and complex-test verification, under certain restrictions. The work presented in this paper is relevant to a number of application domains including diagnostic problem solving, natural language understanding, plan recognition, and active vision.

  1. High-order cyclo-difference techniques: An alternative to finite differences

    NASA Technical Reports Server (NTRS)

    Carpenter, Mark H.; Otto, John C.

    1993-01-01

    The summation-by-parts energy norm is used to establish a new class of high-order finite-difference techniques referred to here as 'cyclo-difference' techniques. These techniques are constructed cyclically from stable subelements, and require no numerical boundary conditions; when coupled with the simultaneous approximation term (SAT) boundary treatment, they are time asymptotically stable for an arbitrary hyperbolic system. These techniques are similar to spectral element techniques and are ideally suited for parallel implementation, but do not require special collocation points or orthogonal basis functions. The principal focus is on methods of sixth-order formal accuracy or less; however, these methods could be extended in principle to any arbitrary order of accuracy.
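
    As a concrete illustration of the summation-by-parts building block (not of the cyclo-difference construction itself), the following sketch assembles the standard second-order SBP first-derivative operator on a uniform grid and checks the SBP property numerically; the grid size and spacing are arbitrary choices.

```python
import numpy as np

def sbp_d1(n, h):
    """Standard 2nd-order summation-by-parts first-derivative operator on n points."""
    H = h * np.eye(n)
    H[0, 0] = H[-1, -1] = 0.5 * h                    # diagonal norm (quadrature) matrix
    D = np.zeros((n, n))
    D[0, 0], D[0, 1] = -1.0 / h, 1.0 / h             # one-sided at left boundary
    D[-1, -2], D[-1, -1] = -1.0 / h, 1.0 / h         # one-sided at right boundary
    for i in range(1, n - 1):
        D[i, i - 1], D[i, i + 1] = -0.5 / h, 0.5 / h  # central difference in the interior
    return H, D

n, h = 21, 0.05
H, D = sbp_d1(n, h)
Q = H @ D
B = np.zeros((n, n))
B[0, 0], B[-1, -1] = -1.0, 1.0
# SBP property: Q + Q^T = diag(-1, 0, ..., 0, 1), which yields the discrete energy estimate
print(np.allclose(Q + Q.T, B))   # True
```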

  2. A Microworld Approach to the Formalization of Musical Knowledge.

    ERIC Educational Resources Information Center

    Honing, Henkjan

    1993-01-01

    Discusses the importance of applying computational modeling and artificial intelligence techniques to music cognition and computer music research. Recommends three uses of microworlds to trim computational theories to their bare minimum, allowing for better and easier comparison. (CFR)

  3. International NMR-based Environmental Metabolomics Intercomparison Exercise

    EPA Science Inventory

    Several fundamental requirements must be met so that NMR-based metabolomics and the related technique of metabonomics can be formally adopted into environmental monitoring and chemical risk assessment. Here we report an intercomparison exercise which has evaluated the effectivene...

  4. 48 CFR 48.101 - General.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... ENGINEERING Policies and Procedures 48.101 General. (a) Value engineering is the formal technique by which... performing more economically. Value engineering attempts to eliminate, without impairing essential functions... two value engineering approaches: (1) The first is an incentive approach in which contractor...

  5. On acquisition of programming knowledge

    NASA Technical Reports Server (NTRS)

    Amin, Ashok T.

    1987-01-01

    For the evolving discipline of programming, acquisition of programming knowledge is a difficult issue. Common knowledge results from the acceptance of proven techniques based on the results of formal inquiries into the nature of the programming process, which is a rather slow process. In addition, the vast body of common knowledge needs to be explicated at a low enough level of detail to be represented in machine-processable form; this is an impediment to the progress of automatic programming. The importance of formal approaches cannot be overstated, since their contributions lead to quantum leaps in the state of the art.

  6. Working on the Boundaries: Philosophies and Practices of the Design Process

    NASA Technical Reports Server (NTRS)

    Ryan, R.; Blair, J.; Townsend, J.; Verderaime, V.

    1996-01-01

    While the systems engineering process is a formal, contractually binding program management technique, the design process is the informal practice of achieving the design project requirements throughout all design phases of the systems engineering process. The design process and organization are system and component dependent. Informal reviews include technical information meetings and concurrent engineering sessions, while formal technical discipline reviews are conducted through the systems engineering process. This paper discusses and references major philosophical principles of the design process, identifies its role in interacting systems and discipline analyses and integrations, and illustrates the application of the process in experienced aerostructural designs.

  7. Formal specification and verification of Ada software

    NASA Technical Reports Server (NTRS)

    Hird, Geoffrey R.

    1991-01-01

    The use of formal methods in software development achieves levels of quality assurance unobtainable by other means. The Larch approach to specification is described, and the specification of avionics software designed to implement the logic of a flight control system is given as an example. Penelope, an Ada verification environment, is then described: the Penelope user inputs mathematical definitions, Larch-style specifications and Ada code and performs machine-assisted proofs that the code obeys its specifications. As an example, the verification of a binary search function is considered. Emphasis is given to techniques that assist the reuse of a verification effort on modified code.

  8. Perturbation theory of nuclear matter with a microscopic effective interaction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benhar, Omar; Lovato, Alessandro

    Here, an updated and improved version of the effective interaction based on the Argonne-Urbana nuclear Hamiltonian, derived using the formalism of correlated basis functions and the cluster expansion technique, is employed to obtain a number of properties of cold nuclear matter at arbitrary neutron excess within the formalism of many-body perturbation theory. The numerical results, including the ground-state energy per nucleon, the symmetry energy, the pressure, the compressibility, and the single-particle spectrum, are discussed in the context of the available empirical information, obtained from measured nuclear properties and heavy-ion collisions.

  9. Perturbation theory of nuclear matter with a microscopic effective interaction

    DOE PAGES

    Benhar, Omar; Lovato, Alessandro

    2017-11-01

    Here, an updated and improved version of the effective interaction based on the Argonne-Urbana nuclear Hamiltonian, derived using the formalism of correlated basis functions and the cluster expansion technique, is employed to obtain a number of properties of cold nuclear matter at arbitrary neutron excess within the formalism of many-body perturbation theory. The numerical results, including the ground-state energy per nucleon, the symmetry energy, the pressure, the compressibility, and the single-particle spectrum, are discussed in the context of the available empirical information, obtained from measured nuclear properties and heavy-ion collisions.

  10. Exploring High School Students Beginning Reasoning about Significance Tests with Technology

    ERIC Educational Resources Information Center

    García, Víctor N.; Sánchez, Ernesto

    2017-01-01

    In the present study we analyze how students reason about or make inferences given a particular hypothesis testing problem (without having studied formal methods of statistical inference) when using Fathom. They use Fathom to create an empirical sampling distribution through computer simulation. It is found that most students' reasoning relies on…

  11. An Introduction to Distributions Using Weighted Dice

    ERIC Educational Resources Information Center

    Holland, Bart K.

    2011-01-01

    Distributions are the basis for an enormous amount of theoretical and applied work in statistics. While there are formal definitions of distributions and many formulas to characterize them, it is important that students at first get a clear introduction to this basic concept. For many of them, neither words nor formulas can match the power of a…

  12. Uncertainty in eddy covariance measurements and its application to physiological models

    Treesearch

    D.Y. Hollinger; A.D. Richardson; A.D. Richardson

    2005-01-01

    Flux data are noisy, and this uncertainty is largely due to random measurement error. Knowledge of uncertainty is essential for the statistical evaluation of modeled and measured fluxes, for comparison of parameters derived by fitting models to measured fluxes, and in formal data-assimilation efforts. We used the difference between simultaneous measurements from two...

  13. How Can We Enhance Enjoyment of Secondary School? The Student View

    ERIC Educational Resources Information Center

    Gorard, Stephen; See, Beng Huat

    2011-01-01

    This paper considers enjoyment of formal education for young people aged 14 to 16, largely from their own perspective, based on the view of around 3000 students in England. The data include documentary analysis, official statistics, interviews and surveys with staff and students. Enjoyment of school tends to be promoted by factors such as…

  14. Empirical and Genealogical Analysis of Non-Vocational Adult Education in Europe

    ERIC Educational Resources Information Center

    Manninen, Jyri

    2017-01-01

    Non-formal, non-vocational adult education (NFNVAE) is a low-cost, low-threshold learning activity that generates many benefits for individuals and society, and it should play a more central role in educational policy. NFNVAE's challenge is that it lacks clear concepts and definitions and is, therefore, less systematically covered in statistics,…

  15. A Statistical Ontology-Based Approach to Ranking for Multiword Search

    ERIC Educational Resources Information Center

    Kim, Jinwoo

    2013-01-01

    Keyword search is a prominent data retrieval method for the Web, largely because the simple and efficient nature of keyword processing allows a large amount of information to be searched with fast response. However, keyword search approaches do not formally capture the clear meaning of a keyword query and fail to address the semantic relationships…

  16. Reliability Considerations for the Operation of Large Accelerator User Facilities

    DOE PAGES

    Willeke, F. J.

    2016-01-29

    This article provides an overview of considerations relevant to achieving highly reliable operation of accelerator-based user facilities. It starts with an overview of the statistical reliability formalism, which is followed by high-reliability design considerations with examples. Finally, the article closes with operational aspects of high reliability such as preventive maintenance and spares inventory.

  17. A Formal Derivation of the Gibbs Entropy for Classical Systems Following the Schrodinger Quantum Mechanical Approach

    ERIC Educational Resources Information Center

    Santillan, M.; Zeron, E. S.; Del Rio-Correa, J. L.

    2008-01-01

    In the traditional statistical mechanics textbooks, the entropy concept is first introduced for the microcanonical ensemble and then extended to the canonical and grand-canonical cases. However, in the authors' experience, this procedure makes it difficult for the student to see the bigger picture and, although quite ingenuous, the subtleness of…

  18. Peer Coaching as an Institutionalised Tool for Professional Development: The Perceptions of Tutors in a Nigerian College

    ERIC Educational Resources Information Center

    Aderibigbe, Semiyu Adejare; Ajasa, Folorunso Adekemi

    2013-01-01

    Purpose: The purpose of this paper is to explore the perceptions of college tutors on peer coaching as a tool for professional development to determine its formal institutionalisation. Design/methodology/approach: A survey questionnaire was used for data collection, while analysis of data was done using descriptive statistics. Findings: The…

  19. Open Educational Resources: A Faculty Author's Perspective

    ERIC Educational Resources Information Center

    Illowsky, Barbara

    2012-01-01

    As the coauthor (with Susan Dean) of a formerly for-profit and now open (i.e., free on the web) textbook, "Collaborative Statistics," this author has received many questions about open educational resources (OER), which can be summarized as follows: (1) What are OER?; (2) Why do you support, actively promote, and speak about OER?; (3) If a book is…

  20. 76 FR 30306 - New England Fishery Management Council; Public Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-25

    ...The New England Fishery Management Council (Council) is scheduling a public meeting of its Scientific and Statistical Committee on June 14-15, 2011 to consider actions affecting New England fisheries in the exclusive economic zone (EEZ). Recommendations from this group will be brought to the full Council for formal consideration and action, if appropriate.

  1. 76 FR 43266 - New England Fishery Management Council; Public Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-20

    ...The New England Fishery Management Council (Council) is scheduling a public meeting of its Scientific and Statistical Committee, on August 9-10, 2011, to consider actions affecting New England fisheries in the exclusive economic zone (EEZ). Recommendations from this group will be brought to the full Council for formal consideration and action, if appropriate.

  2. Standing by Their Principles: Two Librarians Who Faced Challenges

    ERIC Educational Resources Information Center

    Adams, Helen; Leu, DaNae; Venuto, Dee Ann

    2015-01-01

    What do school librarians fear most? Hands down, their biggest fear is a formal challenge to a resource in the school library. There are no accurate statistics about the number of challenges to school library resources. The staff of ALA's Office for Intellectual Freedom estimates that only about 20 percent are reported to ALA annually. For the…

  3. Exploring the Implementation, Effectiveness and Costs of the Reading Partners Program

    ERIC Educational Resources Information Center

    Jacob, Robin; Elson, Dean; Bowden, Brooks; Armstrong, Catherine

    2015-01-01

    Reading skills are the key building blocks of a child's formal education. Yet, the national statistics on literacy attainment are profoundly distressing: two out of three American fourth graders are reading below grade level and almost one third of children nationwide lack even basic reading skills. This study reports on an evaluation of the…

  4. 76 FR 66875 - Informal Entry Limit and Removal of a Formal Entry Requirement

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-28

    ... to properly assess duties on the merchandise and collect accurate statistics with respect to the.... In Sec. 10.1: a. Introductory paragraph (a) is amended by removing the word ``shall'' and adding in... removing the word ``shall'' and adding in its place the word ``must''; m. Introductory paragraph (h)(4) is...

  5. Integrating reasoning and clinical archetypes using OWL ontologies and SWRL rules.

    PubMed

    Lezcano, Leonardo; Sicilia, Miguel-Angel; Rodríguez-Solano, Carlos

    2011-04-01

    Semantic interoperability is essential to facilitate the computerized support for alerts, workflow management and evidence-based healthcare across heterogeneous electronic health record (EHR) systems. Clinical archetypes, which are formal definitions of specific clinical concepts defined as specializations of a generic reference (information) model, provide a mechanism to express data structures in a shared and interoperable way. However, currently available archetype languages do not provide direct support for mapping to formal ontologies and then exploiting reasoning on clinical knowledge, which are key ingredients of full semantic interoperability, as stated in the SemanticHEALTH report [1]. This paper reports on an approach to translate definitions expressed in the openEHR Archetype Definition Language (ADL) to a formal representation expressed using the Ontology Web Language (OWL). The formal representations are then integrated with rules expressed with Semantic Web Rule Language (SWRL) expressions, providing an approach to apply the SWRL rules to concrete instances of clinical data. Sharing the knowledge expressed in the form of rules is consistent with the philosophy of open sharing, encouraged by archetypes. Our approach also allows the reuse of formal knowledge, expressed through ontologies, and extends reuse to propositions of declarative knowledge, such as those encoded in clinical guidelines. This paper describes the ADL-to-OWL translation approach, describes the techniques to map archetypes to formal ontologies, and demonstrates how rules can be applied to the resulting representation. We provide examples taken from a patient safety alerting system to illustrate our approach. Copyright © 2010 Elsevier Inc. All rights reserved.

  6. Stochastic Game Analysis and Latency Awareness for Self-Adaptation

    DTIC Science & Technology

    2014-01-01

    In this paper, we introduce a formal analysis technique based on model checking of stochastic multiplayer games (SMGs) that enables us to quantify the... The contribution of this paper is twofold: (1) a novel analysis technique based on model checking of stochastic multiplayer games (SMGs) that enables us to... Keywords: proactive adaptation, stochastic multiplayer games, latency.

  7. Pinch technique and the Batalin-Vilkovisky formalism

    NASA Astrophysics Data System (ADS)

    Binosi, Daniele; Papavassiliou, Joannis

    2002-07-01

    In this paper we take the first step towards a nondiagrammatic formulation of the pinch technique. In particular we proceed into a systematic identification of the parts of the one-loop and two-loop Feynman diagrams that are exchanged during the pinching process in terms of unphysical ghost Green's functions; the latter appear in the standard Slavnov-Taylor identity satisfied by the tree-level and one-loop three-gluon vertex. This identification allows for the consistent generalization of the intrinsic pinch technique to two loops, through the collective treatment of entire sets of diagrams, instead of the laborious algebraic manipulation of individual graphs, and sets up the stage for the generalization of the method to all orders. We show that the task of comparing the effective Green's functions obtained by the pinch technique with those computed in the background field method Feynman gauge is significantly facilitated when employing the powerful quantization framework of Batalin and Vilkovisky. This formalism allows for the derivation of a set of useful nonlinear identities, which express the background field method Green's functions in terms of the conventional (quantum) ones and auxiliary Green's functions involving the background source and the gluonic antifield; these latter Green's functions are subsequently related by means of a Schwinger-Dyson type of equation to the ghost Green's functions appearing in the aforementioned Slavnov-Taylor identity.

  8. Quantum-Like Models for Decision Making in Psychology and Cognitive Science

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei.

    2009-02-01

    We show that (in contrast to rather common opinion) the domain of applications of the mathematical formalism of quantum mechanics is not restricted to physics. This formalism can be applied to the description of various quantum-like (QL) information processing. In particular, the calculus of quantum (and more general QL) probabilities can be used to explain some paradoxical statistical data which was collected in psychology and cognitive science. The main lesson of our study is that one should sharply distinguish the mathematical apparatus of QM from QM as a physical theory. The domain of application of the mathematical apparatus is essentially wider than quantum physics. Keywords: quantum-like representation algorithm, formula of total probability, interference of probabilities, psychology, cognition, decision making.

  9. Formal faculty observation and assessment of bedside skills for 3rd-year neurology clerks

    PubMed Central

    Mooney, Christopher; Wexler, Erika; Mink, Jonathan; Post, Jennifer; Jozefowicz, Ralph F.

    2016-01-01

    Objective: To evaluate the feasibility and utility of instituting a formalized bedside skills evaluation (BSE) for 3rd-year medical students on the neurology clerkship. Methods: A neurologic BSE was developed for 3rd-year neurology clerks at the University of Rochester for the 2012–2014 academic years. Faculty directly observed 189 students completing a full history and neurologic examination on real inpatients. Mock grades were calculated utilizing the BSE in the final grade, and number of students with a grade difference was determined when compared to true grade. Correlation was explored between the BSE and clinical scores, National Board of Medical Examiners (NBME) scores, case complexity, and true final grades. A survey was administered to students to assess their clinical skills exposure and the usefulness of the BSE. Results: Faculty completed and submitted a BSE form for 88.3% of students. There was a mock final grade change for 13.2% of students. Correlation coefficients between BSE score and clinical score/NBME score were 0.36 and 0.35, respectively. A statistically significant effect of BSE was found on final clerkship grade (F2,186 = 31.9, p < 0.0001). There was no statistical difference between BSE score and differing case complexities. Conclusions: Incorporating a formal faculty-observed BSE into the 3rd year neurology clerkship was feasible. Low correlation between BSE score and other evaluations indicated a unique measurement to contribute to student grade. Using real patients with differing case complexity did not alter the grade. PMID:27770072

  10. Insight into structural phase transitions from the decoupled anharmonic mode approximation

    NASA Astrophysics Data System (ADS)

    Adams, Donat J.; Passerone, Daniele

    2016-08-01

    We develop a formalism (decoupled anharmonic mode approximation, DAMA) that allows calculation of the vibrational free energy using density functional theory even for materials which exhibit negative curvature of the potential energy surface with respect to atomic displacements. We investigate vibrational modes beyond the harmonic approximation and approximate the potential energy surface with the superposition of the accurate potential along each normal mode. We show that the free energy can stabilize crystal structures at finite temperatures which appear dynamically unstable at T  =  0. The DAMA formalism is computationally fast because it avoids statistical sampling through molecular dynamics calculations, and is in principle completely ab initio. It is free of statistical uncertainties and independent of model parameters, but can give insight into the mechanism of a structural phase transition. We apply the formalism to the perovskite cryolite, and investigate the temperature-driven phase transition from the P21/n to the Immm space group. We calculate a phase transition temperature between 710 and 950 K, in fair agreement with the experimental value of 885 K. This can be related to the underestimation of the interaction of the vibrational states. We also calculate the main axes of the thermal ellipsoid and can explain the experimentally observed increase of its volume for the fluorine by 200-300% throughout the phase transition. Our calculations suggest the appearance of tunneling states in the high temperature phase. The convergence of the vibrational DOS and of the critical temperature with respect of reciprocal space sampling is investigated using the polarizable-ion model.

  11. Insight into structural phase transitions from the decoupled anharmonic mode approximation.

    PubMed

    Adams, Donat J; Passerone, Daniele

    2016-08-03

    We develop a formalism (decoupled anharmonic mode approximation, DAMA) that allows calculation of the vibrational free energy using density functional theory even for materials which exhibit negative curvature of the potential energy surface with respect to atomic displacements. We investigate vibrational modes beyond the harmonic approximation and approximate the potential energy surface with the superposition of the accurate potential along each normal mode. We show that the free energy can stabilize crystal structures at finite temperatures which appear dynamically unstable at T  =  0. The DAMA formalism is computationally fast because it avoids statistical sampling through molecular dynamics calculations, and is in principle completely ab initio. It is free of statistical uncertainties and independent of model parameters, but can give insight into the mechanism of a structural phase transition. We apply the formalism to the perovskite cryolite, and investigate the temperature-driven phase transition from the P21/n to the Immm space group. We calculate a phase transition temperature between 710 and 950 K, in fair agreement with the experimental value of 885 K. This can be related to the underestimation of the interaction of the vibrational states. We also calculate the main axes of the thermal ellipsoid and can explain the experimentally observed increase of its volume for the fluorine by 200-300% throughout the phase transition. Our calculations suggest the appearance of tunneling states in the high temperature phase. The convergence of the vibrational DOS and of the critical temperature with respect of reciprocal space sampling is investigated using the polarizable-ion model.

  12. Formal faculty observation and assessment of bedside skills for 3rd-year neurology clerks.

    PubMed

    Thompson Stone, Robert; Mooney, Christopher; Wexler, Erika; Mink, Jonathan; Post, Jennifer; Jozefowicz, Ralph F

    2016-11-22

    To evaluate the feasibility and utility of instituting a formalized bedside skills evaluation (BSE) for 3rd-year medical students on the neurology clerkship. A neurologic BSE was developed for 3rd-year neurology clerks at the University of Rochester for the 2012-2014 academic years. Faculty directly observed 189 students completing a full history and neurologic examination on real inpatients. Mock grades were calculated utilizing the BSE in the final grade, and number of students with a grade difference was determined when compared to true grade. Correlation was explored between the BSE and clinical scores, National Board of Medical Examiners (NBME) scores, case complexity, and true final grades. A survey was administered to students to assess their clinical skills exposure and the usefulness of the BSE. Faculty completed and submitted a BSE form for 88.3% of students. There was a mock final grade change for 13.2% of students. Correlation coefficients between BSE score and clinical score/NBME score were 0.36 and 0.35, respectively. A statistically significant effect of BSE was found on final clerkship grade (F(2,186) = 31.9, p < 0.0001). There was no statistical difference between BSE score and differing case complexities. Incorporating a formal faculty-observed BSE into the 3rd-year neurology clerkship was feasible. Low correlation between BSE score and other evaluations indicated a unique measurement to contribute to student grade. Using real patients with differing case complexity did not alter the grade. © 2016 American Academy of Neurology.

  13. Quantum Behavior of an Autonomous Maxwell Demon

    NASA Astrophysics Data System (ADS)

    Chapman, Adrian; Miyake, Akimasa

    2015-03-01

    A Maxwell Demon is an agent that can exploit knowledge of a system's microstate to perform useful work. The second law of thermodynamics is only recovered upon taking into account the work required to irreversibly update the demon's memory, bringing information theoretic concepts into a thermodynamic framework. Recently, there has been interest in modeling a classical Maxwell demon as an autonomous physical system to study this information-work tradeoff explicitly. Motivated by the idea that states with non-local entanglement structure can be used as a computational resource, we ask whether these states have thermodynamic resource quality as well by generalizing a particular classical autonomous Maxwell demon to the quantum regime. We treat the full quantum description using a matrix product operator formalism, which allows us to handle quantum and classical correlations in a unified framework. Applying this, together with techniques from statistical mechanics, we are able to approximate nonlocal quantities such as the erasure performed on the demon's memory register when correlations are present. Finally, we examine how the demon may use these correlations as a resource to outperform its classical counterpart.

  14. Rich analysis and rational models: Inferring individual behavior from infant looking data

    PubMed Central

    Piantadosi, Steven T.; Kidd, Celeste; Aslin, Richard

    2013-01-01

    Studies of infant looking times over the past 50 years have provided profound insights about cognitive development, but their dependent measures and analytic techniques are quite limited. In the context of infants' attention to discrete sequential events, we show how a Bayesian data analysis approach can be combined with a rational cognitive model to create a rich data analysis framework for infant looking times. We formalize (i) a statistical learning model (ii) a parametric linking between the learning model's beliefs and infants' looking behavior, and (iii) a data analysis model that infers parameters of the cognitive model and linking function for groups and individuals. Using this approach, we show that recent findings from Kidd, Piantadosi, and Aslin (2012) of a U-shaped relationship between look-away probability and stimulus complexity even holds within infants and is not due to averaging subjects with different types of behavior. Our results indicate that individual infants prefer stimuli of intermediate complexity, reserving attention for events that are moderately predictable given their probabilistic expectations about the world. PMID:24750256

  15. Rich analysis and rational models: inferring individual behavior from infant looking data.

    PubMed

    Piantadosi, Steven T; Kidd, Celeste; Aslin, Richard

    2014-05-01

    Studies of infant looking times over the past 50 years have provided profound insights about cognitive development, but their dependent measures and analytic techniques are quite limited. In the context of infants' attention to discrete sequential events, we show how a Bayesian data analysis approach can be combined with a rational cognitive model to create a rich data analysis framework for infant looking times. We formalize (i) a statistical learning model, (ii) a parametric linking between the learning model's beliefs and infants' looking behavior, and (iii) a data analysis approach and model that infers parameters of the cognitive model and linking function for groups and individuals. Using this approach, we show that recent findings from Kidd, Piantadosi and Aslin (2012) of a U-shaped relationship between look-away probability and stimulus complexity even holds within infants and is not due to averaging subjects with different types of behavior. Our results indicate that individual infants prefer stimuli of intermediate complexity, reserving attention for events that are moderately predictable given their probabilistic expectations about the world. © 2014 John Wiley & Sons Ltd.

  16. Wiener filtering of the COBE Differential Microwave Radiometer data

    NASA Technical Reports Server (NTRS)

    Bunn, Emory F.; Fisher, Karl B.; Hoffman, Yehuda; Lahav, Ofer; Silk, Joseph; Zaroubi, Saleem

    1994-01-01

    We derive an optimal linear filter to suppress the noise from the cosmic background explorer satellite (COBE) Differential Microwave Radiometer (DMR) sky maps for a given power spectrum. We then apply the filter to the first-year DMR data, after removing pixels within 20 deg of the Galactic plane from the data. We are able to identify particular hot and cold spots in the filtered maps at a level 2 to 3 times the noise level. We use the formalism of constrained realizations of Gaussian random fields to assess the uncertainty in the filtered sky maps. In addition to improving the signal-to-noise ratio of the map as a whole, these techniques allow us to recover some information about the cosmic microwave background anisotropy in the missing Galactic plane region. From these maps we are able to determine which hot and cold spots in the data are statistically significant, and which may have been produced by noise. In addition, the filtered maps can be used for comparison with other experiments on similar angular scales.
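
    For reference, the optimal linear filter referred to here is the Wiener filter, which in its standard form (given signal and noise covariances S and N, or per-mode power spectra C_ℓ and N_ℓ) reads

```latex
\[
  \hat{s} \;=\; S\,(S+N)^{-1}\,d,
  \qquad
  W_\ell \;=\; \frac{C_\ell}{C_\ell + N_\ell},
\]
```

    where d is the data map; well-measured modes are passed nearly unchanged while noise-dominated modes are suppressed.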

  17. Solution influence on biomolecular equilibria - Nucleic acid base associations

    NASA Technical Reports Server (NTRS)

    Pohorille, A.; Pratt, L. R.; Burt, S. K.; Macelroy, R. D.

    1984-01-01

    Various attempts to construct an understanding of the influence of solution environment on biomolecular equilibria at the molecular level using computer simulation are discussed. First, the application of the formal statistical thermodynamic program for investigating biomolecular equilibria in solution is presented, addressing modeling and conceptual simplifications such as perturbative methods, long-range interaction approximations, surface thermodynamics, and hydration shell. Then, Monte Carlo calculations on the associations of nucleic acid bases in both polar and nonpolar solvents such as water and carbon tetrachloride are carried out. The solvent contribution to the enthalpy of base association is positive (destabilizing) in both polar and nonpolar solvents while negative enthalpies for stacked complexes are obtained only when the solute-solute in vacuo energy is added to the total energy. The release upon association of solvent molecules from the first hydration layer around a solute to the bulk is accompanied by an increase in solute-solvent energy and decrease in solvent-solvent energy. The techniques presented are expected to displace less molecular and more heuristic modeling of biomolecular equilibria in solution.
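
    The Monte Carlo calculations mentioned are of the Metropolis type; the following is a generic, hedged sketch of a Metropolis acceptance step with a toy harmonic potential, not the base-association energy functions used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def metropolis_step(x, energy, beta, step_size=0.1):
    """One Metropolis move: propose a random displacement, accept with min(1, exp(-beta*dE))."""
    x_new = x + rng.normal(scale=step_size, size=x.shape)
    dE = energy(x_new) - energy(x)
    if dE <= 0 or rng.random() < np.exp(-beta * dE):
        return x_new, True
    return x, False

# Toy example: a single particle in a harmonic well (illustrative only)
energy = lambda x: 0.5 * np.dot(x, x)
x = np.zeros(3)
accepted = 0
for _ in range(10_000):
    x, ok = metropolis_step(x, energy, beta=1.0)
    accepted += ok
print("acceptance rate:", accepted / 10_000)
```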

  18. Characterizing the information content of cloud thermodynamic phase retrievals from the notional PACE OCI shortwave reflectance measurements

    NASA Astrophysics Data System (ADS)

    Coddington, O. M.; Vukicevic, T.; Schmidt, K. S.; Platnick, S.

    2017-08-01

    We rigorously quantify the probability of liquid or ice thermodynamic phase using only shortwave spectral channels specific to the National Aeronautics and Space Administration's Moderate Resolution Imaging Spectroradiometer, Visible Infrared Imaging Radiometer Suite, and the notional future Plankton, Aerosol, Cloud, ocean Ecosystem imager. The results show that two shortwave-infrared channels (2135 and 2250 nm) provide more information on cloud thermodynamic phase than either channel alone; in one case, the probability of ice phase retrieval increases from 65 to 82% by combining 2135 and 2250 nm channels. The analysis is performed with a nonlinear statistical estimation approach, the GEneralized Nonlinear Retrieval Analysis (GENRA). The GENRA technique has previously been used to quantify the retrieval of cloud optical properties from passive shortwave observations, for an assumed thermodynamic phase. Here we present the methodology needed to extend the utility of GENRA to a binary thermodynamic phase space (i.e., liquid or ice). We apply formal information content metrics to quantify our results; two of these (mutual and conditional information) have not previously been used in the field of cloud studies.
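
    The information content metrics named here (mutual and conditional information) have standard definitions; for the discrete case they can be written as

```latex
\[
  I(X;Y) \;=\; \sum_{x,y} p(x,y)\,\log\frac{p(x,y)}{p(x)\,p(y)},
  \qquad
  I(X;Y\mid Z) \;=\; \sum_{x,y,z} p(x,y,z)\,\log\frac{p(x,y\mid z)}{p(x\mid z)\,p(y\mid z)}.
\]
```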

  19. Topics in Computational Bayesian Statistics With Applications to Hierarchical Models in Astronomy and Sociology

    NASA Astrophysics Data System (ADS)

    Sahai, Swupnil

    This thesis includes three parts. The overarching theme is how to analyze structured hierarchical data, with applications to astronomy and sociology. The first part discusses how expectation propagation can be used to parallelize the computation when fitting big hierarchical Bayesian models. This methodology is then used to fit a novel, nonlinear mixture model to ultraviolet radiation from various regions of the observable universe. The second part discusses how the Stan probabilistic programming language can be used to numerically integrate terms in a hierarchical Bayesian model. This technique is demonstrated on supernovae data to significantly speed up convergence to the posterior distribution compared to a previous study that used a Gibbs-type sampler. The third part builds a formal latent kernel representation for aggregate relational data as a way to more robustly estimate the mixing characteristics of agents in a network. In particular, the framework is applied to sociology surveys to estimate, as a function of ego age, the age and sex composition of the personal networks of individuals in the United States.

  20. Integration of expert knowledge and uncertainty in natural risk assessment

    NASA Astrophysics Data System (ADS)

    Baruffini, Mirko; Jaboyedoff, Michel

    2010-05-01

    Natural hazards occurring in alpine regions during recent decades have clearly shown that interruptions of the Swiss railway power supply and closures of the Gotthard highway due to those events have increased awareness of infrastructure vulnerability in Switzerland and illustrate the potential impacts of failures on the performance of infrastructure systems. This calls for a high level of surveillance and preservation along the transalpine lines. Traditional simulation models are only partially capable of predicting the behaviour of complex systems, and the protection strategies subsequently designed and implemented are not able to mitigate the full spectrum of risk consequences. They are costly, and maximal protection is most probably not economically feasible. In addition, quantitative risk assessment approaches such as fault tree analysis, event tree analysis and equivalent annual fatality analysis rely heavily on statistical information. Collecting sufficient data on which to base a statistical probability of risk is costly and, in many situations, such data do not exist; thus, expert knowledge and experience or engineering judgment can be exploited to estimate risk qualitatively. To overcome this lack of statistics, we used models based on expert knowledge to make qualitative predictions from linguistic appreciations, which are more expressive and natural in risk assessment. Fuzzy reasoning (FR) provides a mechanism of computing with words (Zadeh, 1965) for modelling qualitative human thought processes in analyzing complex systems and decisions. Uncertainty in predicting risk levels arises in such situations because no fully formalized knowledge is available. Another possibility is to use probabilities based on a triangular probability density function (T-PDF), which can follow the same flow chart as FR. We implemented the Swiss natural hazard recommendations with FR and with T-PDF-based probability in order to obtain hazard zoning and the associated uncertainties. We followed the same approach for each term of the risk, i.e. hazard, vulnerability, element at risk and exposure. This risk approach can be achieved through the comprehensive use of several artificial intelligence (AI) technologies, for example: (1) GIS techniques; (2) FR or T-PDF for qualitatively predicting risks for possible review of results; and (3) multi-criteria evaluation for analyzing weak points. The main advantages of FR and T-PDF are the ability to express not-fully-formalized knowledge, easy knowledge representation and acquisition, and self-updatability. The results show that such an approach points out quite a wide zone of uncertainty. REFERENCES: Zadeh, L.A. 1965: Fuzzy sets. Information and Control, 8:338-353.
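
    A minimal sketch of the triangular functions underlying both approaches mentioned above (fuzzy membership functions and the triangular PDF); the linguistic classes and break points below are assumptions for illustration, not values from the Swiss recommendations.

```python
def triangular(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b.
    Dividing by the area (c - a) / 2 turns it into a triangular PDF (T-PDF)."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Illustrative linguistic hazard classes (break points are assumptions, not official values)
memberships = {
    "low":    lambda x: triangular(x, 0.00, 0.20, 0.45),
    "medium": lambda x: triangular(x, 0.30, 0.50, 0.70),
    "high":   lambda x: triangular(x, 0.55, 0.80, 1.00),
}

x = 0.6  # a normalized hazard intensity elicited from an expert
print({name: round(mu(x), 2) for name, mu in memberships.items()})
```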

  1. The quality management journey: the progress of health facilities in Australia.

    PubMed

    Carr, B J

    1994-12-01

    Many facilities in Australia have taken the Total Quality Management (TQM) step. The objective of this study was to examine the progress of formal quality systems adopted in health care. Sixty per cent of the organizations surveyed have adopted formal systems; of these, Deming adherents are the most common, followed by eclectic choices. Only 35% considered the quality transition reasonably easy. No relationship between accreditation and formal quality systems was identified. The most common improvement techniques were flow charts, histograms, and cause-and-effect diagrams. Quality practitioners are happy to use several tools exceptionally well rather than have many tools at their disposal. The greatest impediment to the adoption of quality was a lack of top management support. This study did not support the view that clinicians fail to actively support quality initiatives. Total Quality Management is not a mature concept; however, Chief Executive Officers are assured that rewards will be realized over time.

  2. Formal Methods for Automated Diagnosis of Autosub 6000

    NASA Technical Reports Server (NTRS)

    Ernits, Juhan; Dearden, Richard; Pebody, Miles

    2009-01-01

    This is a progress report on applying formal methods in the context of building an automated diagnosis and recovery system for Autosub 6000, an Autonomous Underwater Vehicle (AUV). The diagnosis task involves building abstract models of the control system of the AUV. The diagnosis engine is based on Livingstone 2, a model-based diagnoser originally built for aerospace applications. Large parts of the diagnosis model can be built without concrete knowledge about each mission, but actual mission scripts and configuration parameters that carry important information for diagnosis are changed for every mission. Thus we use formal methods for generating the mission control part of the diagnosis model automatically from the mission script and perform a number of invariant checks to validate the configuration. After the diagnosis model is augmented with the generated mission control component model, it needs to be validated using verification techniques.

  3. Petri Nets - A Mathematical Formalism to Analyze Chemical Reaction Networks.

    PubMed

    Koch, Ina

    2010-12-17

    In this review we introduce and discuss Petri nets - a mathematical formalism to describe and analyze chemical reaction networks. Petri nets were developed to describe concurrency in general systems. Most applications are found in technical and financial systems, but for about twenty years they have also been used in systems biology to model biochemical systems. This review aims to give a short informal introduction to the basic formalism, illustrated by a chemical example, and to discuss possible applications to the analysis of chemical reaction networks, including cheminformatics. We give a short overview of qualitative as well as quantitative Petri net modeling techniques useful in systems biology, summarizing the state of the art in that field and providing the main literature references. Finally, we discuss the advantages and limitations of Petri nets and give an outlook on further development. Copyright © 2010 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
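
    A minimal place/transition sketch of the formalism, assuming a single reaction A + B -> C; the place names, token counts and firing loop are illustrative only.

```python
# Minimal place/transition Petri net: places hold tokens, transitions consume and produce them.
marking = {"A": 2, "B": 1, "C": 0}

# One transition modelling the reaction A + B -> C (arc weights = stoichiometric coefficients)
transition = {"consume": {"A": 1, "B": 1}, "produce": {"C": 1}}

def enabled(marking, t):
    """A transition is enabled when every input place holds at least the required tokens."""
    return all(marking[p] >= w for p, w in t["consume"].items())

def fire(marking, t):
    """Firing removes tokens from input places and adds tokens to output places."""
    if not enabled(marking, t):
        raise ValueError("transition not enabled")
    new = dict(marking)
    for p, w in t["consume"].items():
        new[p] -= w
    for p, w in t["produce"].items():
        new[p] = new.get(p, 0) + w
    return new

while enabled(marking, transition):
    marking = fire(marking, transition)
print(marking)   # {'A': 1, 'B': 0, 'C': 1}
```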

  4. Perceptions of registered nurses in four state health institutions on continuing formal education.

    PubMed

    Richards, L; Potgieter, E

    2010-06-01

    This study investigated the perceptions of registered nurses in four selected state health institutions with regard to continuing formal education. The relevance of continuing formal education is being emphasised globally by the increasing quest for quality assurance and quality management systems within an ethos of continuous improvement. According to Tlholoe (2006:5), it is important to be committed to continual learning, as people's knowledge becomes less relevant because skills gained early in a career are insufficient to avoid costly mistakes made through ignorance. Continuing formal education in nursing is a key element in the maintenance of quality in health care delivery. The study described registered nurses' views on continuing formal education and their perceived barriers to continuing formal education. A quantitative descriptive survey design was chosen using a questionnaire for data collection. The sample consisted of 40 registered nurses working at four state health institutions in the Western Cape Province, South Africa. Convenience sampling was selected to include registered nurses who were on duty on the days during which the researcher visited the health institutions to distribute the questionnaires. The questionnaire contained mainly closed-ended and a few open-ended questions. Content validity of the instrument was ensured by doing a thorough literature review before construction of items and a pretest. Reliability was established by the pretest and providing the same information to all respondents before completion of the questionnaires. The ethical considerations of informed consent, anonymity and confidentiality were adhered to and consent to conduct the study was obtained from relevant authorities. Descriptive statistics, based on calculations using the Microsoft (MS) Excel (for Windows 2000) programme, were used to summarise and describe the research results. The research results indicated that most registered nurses perceive continuing formal education as beneficial to their personal and professional growth and that it could lead towards improving the quality of patient/client care, but barriers exist which prevent or deter them from undertaking continuing formal education programmes. The main structural barriers included a lack of funding and a lack of coherent staff development planning, and the physical barriers included job and family responsibilities.

  5. Mathematics and Information Retrieval.

    ERIC Educational Resources Information Center

    Salton, Gerald

    1979-01-01

    Examines the main mathematical approaches to information retrieval, including both algebraic and probabilistic models, and describes difficulties which impede formalization of information retrieval processes. A number of developments are covered where new theoretical understandings have directly led to improved retrieval techniques and operations.…

  6. 48 CFR 48.101 - General.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... engineering effort is applied to areas of the contract that offer opportunities for considerable savings... ENGINEERING Policies and Procedures 48.101 General. (a) Value engineering is the formal technique by which... performing more economically. Value engineering attempts to eliminate, without impairing essential functions...

  7. 48 CFR 48.101 - General.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... engineering effort is applied to areas of the contract that offer opportunities for considerable savings... ENGINEERING Policies and Procedures 48.101 General. (a) Value engineering is the formal technique by which... performing more economically. Value engineering attempts to eliminate, without impairing essential functions...

  8. 48 CFR 48.101 - General.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... engineering effort is applied to areas of the contract that offer opportunities for considerable savings... ENGINEERING Policies and Procedures 48.101 General. (a) Value engineering is the formal technique by which... performing more economically. Value engineering attempts to eliminate, without impairing essential functions...

  9. Prevalence of peritonitis and mortality in patients treated with continuous ambulatory peritoneal dialysis (CAPD) in Africa: a protocol for a systematic review and meta-analysis.

    PubMed

    Moloi, Mothusi Walter; Kajawo, Shepherd; Noubiap, Jean Jacques; Mbah, Ikechukwu O; Ekrikpo, Udeme; Kengne, Andre Pascal; Bello, Aminu K; Okpechi, Ikechi G

    2018-05-24

    Continuous ambulatory peritoneal dialysis (CAPD) is the ideal modality for renal replacement therapy in most African settings given that it is relatively cheaper than haemodialysis (HD) and does not require in-centre care. CAPD is, however, not readily utilised as it is often complicated by peritonitis leading to high rates of technique failure. The objective of this study is to assess the prevalence of CAPD-related peritonitis and all-cause mortality in patients treated with CAPD in Africa. We will search PubMed, EMBASE, SCOPUS, Africa Journal Online and Google Scholar for studies conducted in Africa from 1 January 1980 to 30 June 2017 with no language restrictions. Eligible studies will include cross-sectional, prospective observational and cohort studies of patients treated with CAPD. Two authors will independently screen, select studies, extract data and conduct risk of bias assessment. Data consistently reported across studies will be pooled using random-effects meta-analysis. Heterogeneity will be evaluated using Cochran's Q statistic and quantified using the I² statistic. Graphical and formal statistical tests will be used to assess for publication bias. Ethical approval will not be needed for this study as data used will be extracted from already published studies. Results of this review will be published in a peer-reviewed journal and presented at conferences. The Preferred Reporting Items for Systematic reviews and Meta-Analyses for Protocols 2015 (PRISMA-P 2015) framework guided the development of this protocol. CRD42017072966. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
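
    For reference, the heterogeneity measures named in the protocol are conventionally defined, for k studies with effect estimates θ̂_i and inverse-variance weights w_i, as

```latex
\[
  Q \;=\; \sum_{i=1}^{k} w_i\,\bigl(\hat{\theta}_i - \hat{\theta}\bigr)^2,
  \qquad
  I^2 \;=\; \max\!\left(0,\; \frac{Q-(k-1)}{Q}\right)\times 100\%,
\]
```

    where \hat{\theta} is the inverse-variance weighted mean effect.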

  10. Evaluation of statistical treatments of left-censored environmental data using coincident uncensored data sets: I. Summary statistics

    USGS Publications Warehouse

    Antweiler, Ronald C.; Taylor, Howard E.

    2008-01-01

    The main classes of statistical treatment of below-detection limit (left-censored) environmental data for the determination of basic statistics that have been used in the literature are substitution methods, maximum likelihood, regression on order statistics (ROS), and nonparametric techniques. These treatments, along with using all instrument-generated data (even those below detection), were evaluated by examining data sets in which the true values of the censored data were known. It was found that for data sets with less than 70% censored data, the best technique overall for determination of summary statistics was the nonparametric Kaplan-Meier technique. ROS and the two substitution methods of assigning one-half the detection limit value to censored data or assigning a random number between zero and the detection limit to censored data were adequate alternatives. The use of these two substitution methods, however, requires a thorough understanding of how the laboratory censored the data. The technique of employing all instrument-generated data - including numbers below the detection limit - was found to be less adequate than the above techniques. At high degrees of censoring (greater than 70% censored data), no technique provided good estimates of summary statistics. Maximum likelihood techniques were found to be far inferior to all other treatments except substituting zero or the detection limit value to censored data.
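
    A hedged sketch of the two substitution treatments compared above (one-half the detection limit, and a random value between zero and the detection limit); the data, detection limit and censoring pattern are made up, and the nonparametric Kaplan-Meier treatment favoured by the authors is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(42)

# Measured concentrations; np.nan marks values censored at the detection limit below.
values = np.array([0.8, np.nan, 1.5, np.nan, 2.3, 0.9, np.nan, 3.1])
detection_limit = 0.5
censored = np.isnan(values)

def summarize(x):
    return {"mean": float(np.mean(x)), "median": float(np.median(x)), "std": float(np.std(x, ddof=1))}

# Substitution method 1: replace censored values with one-half the detection limit
half_dl = np.where(censored, 0.5 * detection_limit, values)

# Substitution method 2: replace censored values with random draws between zero and the detection limit
random_sub = np.where(censored, rng.uniform(0.0, detection_limit, size=values.size), values)

print("DL/2 substitution:  ", summarize(half_dl))
print("random substitution:", summarize(random_sub))
```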

  11. Negotiation techniques for health care professionals.

    PubMed

    Berlin, Jonathan W; Lexa, Frank J

    2007-07-01

    Negotiation is an essential part of health care practice and is not formally taught during medical training. This article aims to improve the negotiation skills of readers by explaining the essential components of preparation before a negotiation and reviewing common techniques for optimizing negotiated agreements. The terms reservation point, target value, and best alternative to a negotiated agreement are defined, and their importance in negotiation preparation is explained. The concept of anchoring, or making the first offer, in a negotiation is reviewed, and important techniques for team negotiation are provided.

  12. CAPSAS: Computer Assisted Program for the Selection of Appropriate Statistics.

    ERIC Educational Resources Information Center

    Shermis, Mark D.; Albert, Susan L.

    A computer-assisted program has been developed for the selection of statistics or statistical techniques by both students and researchers. Based on Andrews, Klem, Davidson, O'Malley and Rodgers "A Guide for Selecting Statistical Techniques for Analyzing Social Science Data," this FORTRAN-compiled interactive computer program was…

  13. Nonparametric Residue Analysis of Dynamic PET Data With Application to Cerebral FDG Studies in Normals.

    PubMed

    O'Sullivan, Finbarr; Muzi, Mark; Spence, Alexander M; Mankoff, David M; O'Sullivan, Janet N; Fitzgerald, Niall; Newman, George C; Krohn, Kenneth A

    2009-06-01

    Kinetic analysis is used to extract metabolic information from dynamic positron emission tomography (PET) uptake data. The theory of indicator dilutions, developed in the seminal work of Meier and Zierler (1954), provides a probabilistic framework for representation of PET tracer uptake data in terms of a convolution between an arterial input function and a tissue residue. The residue is a scaled survival function associated with tracer residence in the tissue. Nonparametric inference for the residue, a deconvolution problem, provides a novel approach to kinetic analysis-critically one that is not reliant on specific compartmental modeling assumptions. A practical computational technique based on regularized cubic B-spline approximation of the residence time distribution is proposed. Nonparametric residue analysis allows formal statistical evaluation of specific parametric models to be considered. This analysis needs to properly account for the increased flexibility of the nonparametric estimator. The methodology is illustrated using data from a series of cerebral studies with PET and fluorodeoxyglucose (FDG) in normal subjects. Comparisons are made between key functionals of the residue, tracer flux, flow, etc., resulting from a parametric (the standard two-compartment of Phelps et al. 1979) and a nonparametric analysis. Strong statistical evidence against the compartment model is found. Primarily these differences relate to the representation of the early temporal structure of the tracer residence-largely a function of the vascular supply network. There are convincing physiological arguments against the representations implied by the compartmental approach but this is the first time that a rigorous statistical confirmation using PET data has been reported. The compartmental analysis produces suspect values for flow but, notably, the impact on the metabolic flux, though statistically significant, is limited to deviations on the order of 3%-4%. The general advantage of the nonparametric residue analysis is the ability to provide a valid kinetic quantitation in the context of studies where there may be heterogeneity or other uncertainty about the accuracy of a compartmental model approximation of the tissue residue.
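
    A simplified, hedged sketch of residue estimation by deconvolution: the residue is represented here as piecewise constant and recovered by non-negative least squares, rather than by the regularized cubic B-spline estimator of the paper; the input function, residue and noise level are synthetic.

```python
import numpy as np
from scipy.optimize import nnls

# Discretized model: tissue activity = (arterial input) * (residue), sampled on a uniform grid.
dt = 1.0                                    # frame length (arbitrary time units)
t = np.arange(0, 60, dt)
input_fn = t * np.exp(-t / 5.0)             # synthetic arterial input function
true_residue = np.exp(-t / 20.0)            # synthetic residue (a survival-type function)

def conv_matrix(u, dt):
    """Lower-triangular matrix C such that C @ r approximates the convolution (u * r)(t)."""
    n = len(u)
    C = np.zeros((n, n))
    for i in range(n):
        C[i, : i + 1] = u[i::-1] * dt
    return C

C = conv_matrix(input_fn, dt)
tissue = C @ true_residue
tissue_noisy = tissue + np.random.default_rng(1).normal(scale=0.05 * tissue.max(), size=tissue.shape)

# Non-negative least squares keeps the estimated residue physically admissible (>= 0);
# without regularization the deconvolution is ill-conditioned and the estimate can be noisy.
residue_hat, _ = nnls(C, tissue_noisy)
print("max abs error:", np.max(np.abs(residue_hat - true_residue)))
```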

  14. Many-body formalism for fermions: The partition function

    NASA Astrophysics Data System (ADS)

    Watson, D. K.

    2017-09-01

    The partition function, a fundamental tenet in statistical thermodynamics, contains in principle all thermodynamic information about a system. It encapsulates both microscopic information through the quantum energy levels and statistical information from the partitioning of the particles among the available energy levels. For identical particles, this statistical accounting is complicated by the symmetry requirements of the allowed quantum states. In particular, for Fermi systems, the enforcement of the Pauli principle is typically a numerically demanding task, responsible for much of the cost of the calculations. The interplay of these three elements—the structure of the many-body spectrum, the statistical partitioning of the N particles among the available levels, and the enforcement of the Pauli principle—drives the behavior of mesoscopic and macroscopic Fermi systems. In this paper, we develop an approach for the determination of the partition function, a numerically difficult task, for systems of strongly interacting identical fermions and apply it to a model system of harmonically confined, harmonically interacting fermions. This approach uses a recently introduced many-body method that is an extension of the symmetry-invariant perturbation method (SPT) originally developed for bosons. It uses group theory and graphical techniques to avoid the heavy computational demands of conventional many-body methods which typically scale exponentially with the number of particles. The SPT application of the Pauli principle is trivial to implement since it is done "on paper" by imposing restrictions on the normal-mode quantum numbers at first order in the perturbation. The method is applied through first order and represents an extension of the SPT method to excited states. Our method of determining the partition function and various thermodynamic quantities is accurate and efficient and has the potential to yield interesting insight into the role played by the Pauli principle and the influence of large degeneracies on the emergence of the thermodynamic behavior of large-N systems.
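
    For context, the canonical partition function and the thermodynamic quantities derived from it take the standard form

```latex
\[
  Z(\beta) \;=\; \sum_i g_i\, e^{-\beta E_i},
  \qquad \beta = \frac{1}{k_B T},
  \qquad
  F = -k_B T \ln Z,
  \qquad
  \langle E \rangle = -\frac{\partial \ln Z}{\partial \beta},
\]
```

    where the sum runs over the allowed (symmetry-respecting) many-body levels E_i with degeneracies g_i.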

  15. Data analysis techniques

    NASA Technical Reports Server (NTRS)

    Park, Steve

    1990-01-01

    A large and diverse number of computational techniques are routinely used to process and analyze remotely sensed data. These techniques include: univariate statistics; multivariate statistics; principal component analysis; pattern recognition and classification; other multivariate techniques; geometric correction; registration and resampling; radiometric correction; enhancement; restoration; Fourier analysis; and filtering. Each of these techniques will be considered, in order.
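    A brief sketch of one technique from this list, principal component analysis, applied to a hypothetical multiband image reshaped to a (pixels, bands) matrix; the data here are random and stand in for a real scene.

        import numpy as np

        rng = np.random.default_rng(0)
        image = rng.random((128, 128, 6))              # hypothetical 6-band scene

        pixels = image.reshape(-1, image.shape[-1])    # rows = pixels, columns = bands
        pixels = pixels - pixels.mean(axis=0)          # center each band
        cov = np.cov(pixels, rowvar=False)             # band-to-band covariance matrix
        eigvals, eigvecs = np.linalg.eigh(cov)         # eigendecomposition (ascending)
        order = np.argsort(eigvals)[::-1]
        components = eigvecs[:, order]                 # principal axes, strongest first
        pc_image = (pixels @ components).reshape(image.shape)  # principal-component bands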

  16. Medication administration errors from a nursing viewpoint: a formal consensus of definition and scenarios using a Delphi technique.

    PubMed

    Shawahna, Ramzi; Masri, Dina; Al-Gharabeh, Rawan; Deek, Rawan; Al-Thayba, Lama; Halaweh, Masa

    2016-02-01

    To develop and achieve formal consensus on a definition of medication administration errors and on scenarios that should or should not be considered medication administration errors in hospitalised patient settings. Medication administration errors occur frequently in hospitalised patient settings. Currently, there is no formal consensus on a definition of medication administration errors or on scenarios that should or should not be considered medication administration errors. This was a descriptive study using the Delphi technique. A panel of experts (n = 50) recruited from major hospitals, nursing schools and universities in Palestine took part in the study. Three Delphi rounds were followed to achieve consensus on a proposed definition of medication administration errors and on a series of 61 scenarios representing potential medication administration error situations, formulated into a questionnaire. In the first Delphi round, key contact nurses' views on medication administration errors were explored. In the second Delphi round, consensus was achieved to accept the proposed definition of medication administration errors and to include 36 (59%) scenarios and exclude 1 (1·6%) as medication administration errors. In the third Delphi round, consensus was achieved to include a further 14 (23%) scenarios and exclude 2 (3·3%) as medication administration errors, while the remaining eight (13·1%) were considered equivocal. Of the 61 scenarios included in the Delphi process, the experts decided to include 50 scenarios as medication administration errors, exclude three scenarios, and include or exclude eight scenarios depending on the individual clinical situation. Consensus on a definition and on scenarios representing medication administration errors can be achieved using formal consensus techniques. Researchers should be aware that using different definitions of medication administration errors, or different inclusions or exclusions of medication administration error situations, could significantly affect the rate of medication administration errors reported in their studies. Consensual definitions and medication administration error situations can be used in future epidemiology studies investigating medication administration errors in hospitalised patient settings, which may permit and promote direct comparisons of different studies. © 2015 John Wiley & Sons Ltd.

  17. Formal Techniques for Organization Analysis: Task and Resource Management

    DTIC Science & Technology

    1984-06-01

    typical approach has been to base new entities on stereotypical structures and make changes as problems are recognized. Clearly, this is not an...human resources; and provide the means to change and track all these parameters as they interact with each other and respond to...functioning under internal and external change. 3. Data gathering techniques to allow one to efficiently collect reliable modeling parameters from

  18. Visualizing Matrix Multiplication

    ERIC Educational Resources Information Center

    Daugulis, Peteris; Sondore, Anita

    2018-01-01

    Efficient visualizations of computational algorithms are important tools for students, educators, and researchers. In this article, we point out an innovative visualization technique for matrix multiplication. This method differs from the standard, formal approach by using block matrices to make computations more visual. We find this method a…
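    A tiny illustrative sketch of the block-matrix view of matrix multiplication that the article advocates: the product of 4x4 matrices computed from 2x2 blocks, C_ij = sum_k A_ik B_kj, and checked against the ordinary product. The matrices here are arbitrary examples.

        import numpy as np

        rng = np.random.default_rng(1)
        A = rng.integers(0, 5, size=(4, 4))
        B = rng.integers(0, 5, size=(4, 4))

        def blocks(M):
            """Split a 4x4 matrix into four 2x2 blocks."""
            return [[M[:2, :2], M[:2, 2:]], [M[2:, :2], M[2:, 2:]]]

        Ab, Bb = blocks(A), blocks(B)
        Cb = [[sum(Ab[i][k] @ Bb[k][j] for k in range(2)) for j in range(2)]
              for i in range(2)]
        C_block = np.block(Cb)

        assert np.array_equal(C_block, A @ B)   # block product equals the ordinary product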

  19. Fundamental concepts, current regulatory design and interpretation

    EPA Science Inventory

    Developmental toxicology became a formalized field about 50 years ago. Over this time, it has evolved from a largely observational science to one that is highly mechanistic in nature. Our increasing knowledge of mechanism of action, coupled with techniques that facilitate the gen...

  20. LIBERAL JOURNALISM AND AMERICAN EDUCATION, 1914-1941.

    ERIC Educational Resources Information Center

    Wallace, James M.

    The relationship between two liberal journals and the institutions and personnel of formal education was studied. "The Nation" and "New Republic" were selected as being influentially representative of intellectual American liberalism during the 20th century. Standard techniques of historical research were employed. Relevant…

  1. Trimming and procrastination as inversion techniques

    NASA Astrophysics Data System (ADS)

    Backus, George E.

    1996-12-01

    By examining the processes of truncating and approximating the model space (trimming it), and by committing to neither the objectivist nor the subjectivist interpretation of probability (procrastinating), we construct a formal scheme for solving linear and non-linear geophysical inverse problems. The necessary prior information about the correct model xE can be either a collection of inequalities or a probability measure describing where xE was likely to be in the model space X before the data vector y0 was measured. The results of the inversion are (1) a vector z0 that estimates some numerical properties zE of xE; (2) an estimate of the error δz = z0 - zE. As y0 is finite dimensional, so is z0, and hence in principle inversion cannot describe all of xE. The error δz is studied under successively more specialized assumptions about the inverse problem, culminating in a complete analysis of the linear inverse problem with a prior quadratic bound on xE. Our formalism appears to encompass and provide error estimates for many of the inversion schemes current in geomagnetism, and would be equally applicable in geodesy and seismology if adequate prior information were available there. As an idealized example we study the magnetic field at the core-mantle boundary, using satellite measurements of field elements at sites assumed to be almost uniformly distributed on a single spherical surface. Magnetospheric currents are neglected and the crustal field is idealized as a random process with rotationally invariant statistics. We find that an appropriate data compression diagonalizes the variance matrix of the crustal signal and permits an analytic trimming of the idealized problem.
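    A minimal numerical sketch, far simpler than the formalism above: a linear inverse problem y = Gx + noise with a quadratic prior constraint handled by damped least squares, where the estimate z0 of a linear property z = w·x and its error are computed directly. The operator, damping value and property vector are all assumed.

        import numpy as np

        rng = np.random.default_rng(2)
        G = rng.normal(size=(20, 50))             # forward operator (data << model dimensions)
        x_true = rng.normal(size=50)
        y0 = G @ x_true + 0.05 * rng.normal(size=20)

        mu = 0.1                                  # damping, loosely standing in for the prior bound
        x_est = np.linalg.solve(G.T @ G + mu * np.eye(50), G.T @ y0)

        w = np.zeros(50); w[0] = 1.0              # property of interest: first model coefficient
        z0, zE = w @ x_est, w @ x_true
        print("estimate z0 =", z0, " error dz =", z0 - zE)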

  2. Longitudinal Effects on Early Adolescent Language: A Twin Study

    PubMed Central

    DeThorne, Laura Segebart; Smith, Jamie Mahurin; Betancourt, Mariana Aparicio; Petrill, Stephen A.

    2016-01-01

    Purpose We evaluated genetic and environmental contributions to individual differences in language skills during early adolescence, measured by both language sampling and standardized tests, and examined the extent to which these genetic and environmental effects are stable across time. Method We used structural equation modeling on latent factors to estimate additive genetic, shared environmental, and nonshared environmental effects on variance in standardized language skills (i.e., Formal Language) and productive language-sample measures (i.e., Productive Language) in a sample of 527 twins across 3 time points (mean ages 10–12 years). Results Individual differences in the Formal Language factor were influenced primarily by genetic factors at each age, whereas individual differences in the Productive Language factor were primarily due to nonshared environmental influences. For the Formal Language factor, the stability of genetic effects was high across all 3 time points. For the Productive Language factor, nonshared environmental effects showed low but statistically significant stability across adjacent time points. Conclusions The etiology of language outcomes may differ substantially depending on assessment context. In addition, the potential mechanisms for nonshared environmental influences on language development warrant further investigation. PMID:27732720
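    For readers unfamiliar with twin designs, a much simplified illustration of how genetic and environmental variance shares can be read off twin correlations (Falconer-style estimates, not the latent-factor structural equation models used in the study); the correlations below are hypothetical.

        r_mz, r_dz = 0.70, 0.40                    # hypothetical MZ and DZ twin correlations
        a2 = 2 * (r_mz - r_dz)                     # additive genetic variance share (A)
        c2 = 2 * r_dz - r_mz                       # shared environmental share (C)
        e2 = 1 - r_mz                              # nonshared environment plus error (E)
        print(f"A = {a2:.2f}, C = {c2:.2f}, E = {e2:.2f}")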

  3. Participation Trends and Patterns in Adult Education: 1991-1999. Statistical Analysis Report.

    ERIC Educational Resources Information Center

    Creighton, Sean; Hudson, Lisa

    Participation of U.S. adults in formal learning activities during the 1990s was examined by analyzing data from the 1991, 1995, and 1999 Adult Education Surveys that were part of the National Household Education Surveys Program. Overall, participation in adult education between 1991 and 1999 increased among all but one age group (35-44 years), all…

  4. A Short Biography of Paul A. M. Dirac and Historical Development of Dirac Delta Function

    ERIC Educational Resources Information Center

    Debnath, Lokenath

    2013-01-01

    This paper deals with a short biography of Paul Dirac, his first celebrated work on quantum mechanics, his first formal systematic use of the Dirac delta function and his famous work on quantum electrodynamics and quantum statistics. Included are his first discovery of the Dirac relativistic wave equation, existence of positron and the intrinsic…

  5. Metamodels for Computer-Based Engineering Design: Survey and Recommendations

    NASA Technical Reports Server (NTRS)

    Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.

    1997-01-01

    The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today's engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper we review several of these techniques, including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We survey their existing application in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations and for how common pitfalls can be avoided.
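    A hedged sketch of one metamodeling technique from the survey: a quadratic response surface fitted by least squares to a handful of runs of a stand-in "expensive" analysis function, then used as a cheap surrogate. The test function, sample size and design are assumptions for illustration only.

        import numpy as np

        def expensive_analysis(x1, x2):            # stand-in for the costly simulation
            return (1 - x1) ** 2 + 100 * (x2 - x1 ** 2) ** 2

        rng = np.random.default_rng(3)
        X = rng.uniform(-2, 2, size=(30, 2))       # small "design of experiments"
        y = np.array([expensive_analysis(*x) for x in X])

        def basis(X):                              # quadratic basis: 1, x1, x2, x1^2, x2^2, x1*x2
            x1, x2 = X[:, 0], X[:, 1]
            return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])

        coef, *_ = np.linalg.lstsq(basis(X), y, rcond=None)
        surrogate = lambda Xnew: basis(np.atleast_2d(Xnew)) @ coef

        print(surrogate(np.array([1.0, 1.0])), expensive_analysis(1.0, 1.0))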

  6. Coaching Family Caregivers to become Better Problem Solvers when Caring for Persons with Advanced Cancer

    PubMed Central

    Dionne-Odom, J. Nicholas; Lyons, Kathleen D.; Akyar, Imatullah; Bakitas, Marie

    2016-01-01

    Family caregivers of persons with advanced cancer often take on responsibilities that present daunting and complex problems. Serious problems that go unresolved may be burdensome and result in negative outcomes for caregivers' psychological and physical health, and may affect the quality of care delivered to the care recipients with cancer, especially at the end of life. Formal problem-solving training approaches have been developed over the past several decades to assist individuals with managing problems faced in daily life. Several of these problem-solving principles and techniques were incorporated into ENABLE (Educate, Nurture, Advise, Before Life End), an 'early' palliative care telehealth intervention for individuals diagnosed with advanced cancer and their family caregivers. A hypothetical case, resembling the situations of actual caregiver participants in ENABLE and exemplifying the complex problems that caregivers face, is presented, followed by an overview of ENABLE's key problem-solving principles, techniques and steps in problem-solving support. Though more research is needed to formally test the use of problem-solving support in social work practice, social workers can easily incorporate these techniques into everyday practice. PMID:27143574

  7. Improving the efficiency of single and multiple teleportation protocols based on the direct use of partially entangled states

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fortes, Raphael; Rigolin, Gustavo, E-mail: rigolin@ifi.unicamp.br

    We push the limits of the direct use of partially entangled pure states to perform quantum teleportation by presenting several protocols in many different scenarios that achieve the optimal efficiency possible. We review and put in a single formalism the three major strategies known to date that allow one to use partially entangled states for direct quantum teleportation (no distillation strategies permitted) and compare their efficiencies in real world implementations. We show how one can improve the efficiency of many direct teleportation protocols by combining these techniques. We then develop new teleportation protocols employing multipartite partially entangled states. The three techniques are also used here in order to achieve the highest efficiency possible. Finally, we prove the upper bound for the optimal success rate for protocols based on partially entangled Bell states and show that some of the protocols developed here achieve such a bound. -- Highlights: •Optimal direct teleportation protocols using partially entangled states directly. •We put in a single formalism all strategies of direct teleportation. •We extend these techniques to multipartite partially entangled states. •We give upper bounds for the optimal efficiency of these protocols.

  8. Validation and Verification of LADEE Models and Software

    NASA Technical Reports Server (NTRS)

    Gundy-Burlet, Karen

    2013-01-01

    The Lunar Atmosphere Dust Environment Explorer (LADEE) mission will orbit the moon in order to measure the density, composition and time variability of the lunar dust environment. The ground-side and onboard flight software for the mission is being developed using a Model-Based Software methodology. In this technique, models of the spacecraft and flight software are developed in a graphical dynamics modeling package. Flight Software requirements are prototyped and refined using the simulated models. After the model is shown to work as desired in this simulation framework, C-code software is automatically generated from the models. The generated software is then tested in real time Processor-in-the-Loop and Hardware-in-the-Loop test beds. Travelling Road Show test beds were used for early integration tests with payloads and other subsystems. Traditional techniques for verifying computational sciences models are used to characterize the spacecraft simulation. A lightweight set of formal methods analysis, static analysis, formal inspection and code coverage analyses are utilized to further reduce defects in the onboard flight software artifacts. These techniques are applied early and often in the development process, iteratively increasing the capabilities of the software and the fidelity of the vehicle models and test beds.

  9. Assessment of long-term impact of formal certified cardiopulmonary resuscitation training program among nurses.

    PubMed

    Saramma, P P; Raj, L Suja; Dash, P K; Sarma, P S

    2016-04-01

    Cardiopulmonary resuscitation (CPR) and emergency cardiovascular care guidelines are periodically renewed and published by the American Heart Association. Formal training programs are conducted based on these guidelines. Despite widespread training, CPR is often poorly performed. Hospital educators spend a significant amount of time and money in training health professionals and maintaining basic life support (BLS) and advanced cardiac life support (ACLS) skills among them. However, very little data are available in the literature highlighting the long-term impact of such training. To evaluate the impact of a formal certified CPR training program on the knowledge and skill of CPR among nurses, and to identify self-reported outcomes of attempted CPR and the training needs of nurses. Tertiary care hospital; prospective, repeated-measures design. A series of certified BLS and ACLS training programs was conducted during 2010 and 2011. Written and practical performance tests were done. Final testing was undertaken 3-4 years after training. The sample included all available, willing CPR-certified nurses and experience-matched CPR-noncertified nurses. SPSS for Windows version 21.0 was used for analysis. The majority of the 206 nurses (93 CPR certified and 113 noncertified) were female. There was a statistically significant increase in mean knowledge level and overall performance from before to after the formal certified CPR training program (P = 0.000). However, the mean knowledge scores were equivalent among the CPR-certified and noncertified nurses, although the certified nurses scored a higher mean score (P = 0.140). A formal certified CPR training program increases CPR knowledge and skill. However, significant long-term effects could not be found. There is a need for regular and periodic recertification.

  10. E-assessment of prior learning: a pilot study of interactive assessment of staff with no formal education who are working in Swedish elderly care

    PubMed Central

    2014-01-01

    Background The current paper presents a pilot study of interactive assessment using information and communication technology (ICT) to evaluate the knowledge, skills and abilities of staff with no formal education who are working in Swedish elderly care. Methods Theoretical and practical assessment methods were developed and used with simulated patients and computer-based tests to identify strengths and areas for personal development among staff with no formal education. Results Of the 157 staff with no formal education, 87 began the practical and/or theoretical assessments, and 63 completed both assessments. Several of the staff passed the practical assessments, except the morning hygiene assessment, where several failed. Other areas for staff development, i.e. where several failed (>50%), were the theoretical assessment of the learning objectives: Health, Oral care, Ergonomics, hygiene, esthetic, environmental, Rehabilitation, Assistive technology, Basic healthcare and Laws and organization. None of the staff passed all assessments. Number of years working in elderly care and staff age were not statistically significantly related to the total score of grades on the various learning objectives. Conclusion The interactive assessments were useful in assessing staff members’ practical and theoretical knowledge, skills, and abilities and in identifying areas in need of development. It is important that personnel who lack formal qualifications be clearly identified and given a chance to develop their competence through training, both theoretical and practical. The interactive e-assessment approach analyzed in the present pilot study could serve as a starting point. PMID:24742168

  11. Elicitation of quantitative data from a heterogeneous expert panel: formal process and application in animal health.

    PubMed

    Van der Fels-Klerx, Ine H J; Goossens, Louis H J; Saatkamp, Helmut W; Horst, Suzan H S

    2002-02-01

    This paper presents a protocol for a formal expert judgment process using a heterogeneous expert panel aimed at the quantification of continuous variables. The emphasis is on the process's requirements related to the nature of expertise within the panel, in particular the heterogeneity of both substantive and normative expertise. The process provides the opportunity for interaction among the experts so that they fully understand and agree upon the problem at hand, including qualitative aspects relevant to the variables of interest, prior to the actual quantification task. Individual experts' assessments on the variables of interest, cast in the form of subjective probability density functions, are elicited with a minimal demand for normative expertise. The individual experts' assessments are aggregated into a single probability density function per variable, thereby weighting the experts according to their expertise. Elicitation techniques proposed include the Delphi technique for the qualitative assessment task and the ELI method for the actual quantitative assessment task. Appropriately, the Classical model was used to weight the experts' assessments in order to construct a single distribution per variable. Applying this model, the experts' quality typically was based on their performance on seed variables. An application of the proposed protocol in the broad and multidisciplinary field of animal health is presented. Results of this expert judgment process showed that the proposed protocol in combination with the proposed elicitation and analysis techniques resulted in valid data on the (continuous) variables of interest. In conclusion, the proposed protocol for a formal expert judgment process aimed at the elicitation of quantitative data from a heterogeneous expert panel provided satisfactory results. Hence, this protocol might be useful for expert judgment studies in other broad and/or multidisciplinary fields of interest.
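    A hedged sketch of the aggregation step only: experts' subjective densities for one variable combined into a single density by a performance-weighted linear pool. The expert densities and weights below are assumed; in the protocol above the weights would come from calibration on seed variables.

        import numpy as np
        from scipy.stats import norm

        x = np.linspace(0, 100, 1001)
        dx = x[1] - x[0]

        # Three hypothetical experts, each giving a normal density for the variable
        expert_densities = [norm(loc=40, scale=10).pdf(x),
                            norm(loc=55, scale=5).pdf(x),
                            norm(loc=50, scale=15).pdf(x)]

        weights = np.array([0.2, 0.5, 0.3])        # assumed performance-based weights
        combined = sum(w * d for w, d in zip(weights, expert_densities))
        combined /= combined.sum() * dx            # renormalize the pooled density

        cdf = np.cumsum(combined) * dx
        median = x[np.searchsorted(cdf, 0.5)]
        print("pooled median approx.", median)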

  12. Basic statistics (the fundamental concepts).

    PubMed

    Lim, Eric

    2014-12-01

    An appreciation and understanding of statistics is important to all practising clinicians, not simply researchers. This is because mathematics is the fundamental basis on which we base clinical decisions, usually with reference to benefit in relation to risk. Unless a clinician has a basic understanding of statistics, he or she will never be in a position to question healthcare management decisions that have been handed down from generation to generation, will not be able to conduct research effectively, and will not be able to evaluate the validity of published evidence (usually making the assumption that most published work is either all good or all bad). This article provides a brief introduction to basic statistical methods and illustrates their use in common clinical scenarios. In addition, pitfalls of incorrect usage have been highlighted. However, it is not meant to be a substitute for formal training or for consultation with a qualified and experienced medical statistician prior to starting any research project.

  13. An astronomer's guide to period searching

    NASA Astrophysics Data System (ADS)

    Schwarzenberg-Czerny, A.

    2003-03-01

    We concentrate on the analysis of unevenly sampled time series, interrupted by periodic gaps, as often encountered in astronomy. While some of our conclusions may appear surprising, all are based on the classical statistical principles of Fisher and his successors. Except for the discussion of resolution issues, it is best for the reader to temporarily forget about Fourier transforms and to concentrate on the problem of fitting a time series with a model curve. According to their statistical content we divide the issues into several sections, consisting of: (ii) statistical and numerical aspects of model fitting, (iii) evaluation of fitted models as hypothesis testing, (iv) the role of orthogonal models in signal detection, (v) conditions for equivalence of periodograms, and (vi) rating sensitivity by test power. An experienced observer working with individual objects would benefit little from a formalized statistical approach. However, we demonstrate the usefulness of this approach in evaluating the performance of periodograms and in the quantitative design of large variability surveys.
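    A short illustration of period searching in unevenly sampled data using the Lomb-Scargle periodogram as implemented in astropy; the observation times, signal and noise level below are simulated, not drawn from the paper.

        import numpy as np
        from astropy.timeseries import LombScargle

        rng = np.random.default_rng(4)
        t = np.sort(rng.uniform(0, 100, 300))          # uneven sampling times (days)
        true_period = 3.7
        y = 0.5 * np.sin(2 * np.pi * t / true_period) + 0.1 * rng.normal(size=t.size)

        frequency, power = LombScargle(t, y).autopower(maximum_frequency=2.0)
        best_period = 1.0 / frequency[np.argmax(power)]
        print(f"recovered period: {best_period:.2f} d (true {true_period} d)")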

  14. Condensate statistics and thermodynamics of weakly interacting Bose gas: Recursion relation approach

    NASA Astrophysics Data System (ADS)

    Dorfman, K. E.; Kim, M.; Svidzinsky, A. A.

    2011-03-01

    We study condensate statistics and thermodynamics of a weakly interacting Bose gas with a fixed total number N of particles in a cubic box. We find the exact recursion relation for the canonical ensemble partition function. Using this relation, we calculate the distribution function of condensate particles for N=200. We also calculate the distribution function based on a multinomial expansion of the characteristic function. Similar to the ideal gas, both approaches give exact statistical moments for all temperatures in the framework of the Bogoliubov model. We compare them with the results of the unconstrained canonical ensemble quasiparticle formalism and the hybrid master equation approach. The present recursion relation can be used for any external potential and boundary conditions. We investigate the temperature dependence of the first few statistical moments of condensate fluctuations, as well as thermodynamic potentials and heat capacity, analytically and numerically in the whole temperature range.
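    To illustrate the flavor of a recursion-relation approach (using the well-known exact recursion for the ideal Bose gas canonical partition function, Z_N(beta) = (1/N) sum_{k=1..N} z1(k*beta) Z_{N-k}(beta), rather than the interacting-gas relation derived in the paper), a minimal sketch with particle-in-a-box levels in assumed units:

        import math

        def z1(beta, n_max=10):
            """Single-particle partition function for a cubic box, e ~ nx^2 + ny^2 + nz^2."""
            return sum(math.exp(-beta * (nx**2 + ny**2 + nz**2))
                       for nx in range(1, n_max + 1)
                       for ny in range(1, n_max + 1)
                       for nz in range(1, n_max + 1))

        def canonical_Z(N, beta):
            Z = [1.0]                                   # Z_0 = 1
            for n in range(1, N + 1):
                Z.append(sum(z1(k * beta) * Z[n - k] for k in range(1, n + 1)) / n)
            return Z[N]

        print(canonical_Z(10, beta=0.5))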

  15. C code generation from Petri-net-based logic controller specification

    NASA Astrophysics Data System (ADS)

    Grobelny, Michał; Grobelna, Iwona; Karatkevich, Andrei

    2017-08-01

    The article focuses on the programming of logic controllers. It is important that the program code of a logic controller executes flawlessly according to the primary specification. In the presented approach we generate C code for an AVR microcontroller from a rule-based logical model of a control process derived from a control interpreted Petri net. The same logical model is also used for formal verification of the specification by means of the model checking technique. The proposed rule-based logical model and formal transformation rules ensure that the obtained implementation is consistent with the already verified specification. The approach is validated by practical experiments.
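    A toy sketch, unrelated to the authors' toolchain, of the firing rule that underlies any Petri-net-based controller model: a transition fires when all of its input places are marked, consuming those tokens and marking its output places. Place and transition names are invented.

        marking = {"start": 1, "valve_open": 0, "done": 0}
        transitions = {
            "open_valve": {"in": ["start"],      "out": ["valve_open"]},
            "finish":     {"in": ["valve_open"], "out": ["done"]},
        }

        def enabled(t):
            return all(marking[p] > 0 for p in transitions[t]["in"])

        def fire(t):
            assert enabled(t), f"transition {t} is not enabled"
            for p in transitions[t]["in"]:
                marking[p] -= 1
            for p in transitions[t]["out"]:
                marking[p] += 1

        fire("open_valve")
        fire("finish")
        print(marking)          # {'start': 0, 'valve_open': 0, 'done': 1}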

  16. Projection-operator calculations of the lowest e(-)-He resonance

    NASA Technical Reports Server (NTRS)

    Berk, A.; Bhatia, A. K.; Junker, B. R.; Temkin, A.

    1986-01-01

    The 1s(2s)² ²S Schulz resonance of He(-) is investigated theoretically, applying the full projection-operator formalism developed by Temkin and Bhatia (1985) in a Rayleigh-Ritz variational calculation. The technique is described in detail, and results for five different approximations of the He target state are presented in a table. Good convergence is obtained, but it is found that even the best calculated value of the resonance is about 130 meV higher than the experimentally measured value of 19.367 ± 0.007 eV (Brunt et al., 1977), a discrepancy attributed to the contribution of the shift in the Feshbach formalism.
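    For orientation, the standard Feshbach projection-operator relations behind the shift mentioned above, written in generic notation (not necessarily that of the paper):

        P + Q = 1, \qquad QHQ\,\Phi_n = \varepsilon_n \Phi_n,
        \qquad E_{\mathrm{res}} \approx \varepsilon_n + \Delta_n,
        \qquad \Delta_n = \Big\langle \Phi_n \Big|\, QHP\,\frac{\mathcal{P}}{E - PHP}\,PHQ \,\Big| \Phi_n \Big\rangle ,

    where the principal-value term Δ_n is the shift of the resonance position relative to the QHQ eigenvalue.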

  17. A Formalisation of Adaptable Pervasive Flows

    NASA Astrophysics Data System (ADS)

    Bucchiarone, Antonio; Lafuente, Alberto Lluch; Marconi, Annapaola; Pistore, Marco

    Adaptable Pervasive Flows is a novel workflow-based paradigm for the design and execution of pervasive applications, where dynamic workflows situated in the real world are able to modify their execution in order to adapt to changes in their environment. In this paper, we study a formalisation of such flows by means of a formal flow language. More precisely, we define APFoL (Adaptable Pervasive Flow Language) and formalise its textual notation by encoding it in Blite, a formalisation of WS-BPEL. The encoding in Blite equips the language with a formal semantics and enables the use of automated verification techniques. We illustrate the approach with an example of a Warehouse Case Study.

  18. Assurance Cases for Proofs as Evidence

    NASA Technical Reports Server (NTRS)

    Chaki, Sagar; Gurfinkel, Arie; Wallnau, Kurt; Weinstock, Charles

    2009-01-01

    Proof-carrying code (PCC) provides a 'gold standard' for establishing formal and objective confidence in program behavior. However, in order to extend the benefits of PCC - and other formal certification techniques - to realistic systems, we must establish the correspondence between a mathematical proof of a program's semantics and its actual behavior. In this paper, we argue that assurance cases are an effective means of establishing such a correspondence. To this end, we present an assurance case pattern for arguing that a proof is free from various proof hazards. We also instantiate this pattern for a proof-based mechanism to provide evidence about generic medical device software.

  19. Thermodynamics of adaptive molecular resolution

    NASA Astrophysics Data System (ADS)

    Delgado-Buscalioni, R.

    2016-11-01

    A relatively general thermodynamic formalism for adaptive molecular resolution (AMR) is presented. The description is based on the approximation of local thermodynamic equilibrium and considers the alchemic parameter λ as the conjugate variable of the potential energy difference between the atomistic and coarse-grained model Φ=U(1)-U(0). The thermodynamic formalism recovers the relations obtained from statistical mechanics of H-AdResS (Español et al., J. Chem. Phys. 142, 064115, 2015 (doi:10.1063/1.4907006)) and provides relations between the free energy compensation and thermodynamic potentials. Inspired by this thermodynamic analogy, several generalizations of AMR are proposed, such as the exploration of new Maxwell relations and how to treat λ and Φ as `real' thermodynamic variables. This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'.
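    A minimal sketch of the conjugate-variable relation described above, assuming a linear interpolation U(λ) = U(0) + λΦ with Φ = U(1) − U(0) and generic notation that may differ from the paper's:

        \left(\frac{\partial F}{\partial \lambda}\right)_{T,V} = \langle \Phi \rangle_{\lambda},
        \qquad
        dF = -S\,dT - p\,dV + \langle \Phi \rangle_{\lambda}\, d\lambda ,

    so that λ enters the fundamental relation as an ordinary thermodynamic control variable whose conjugate is the average atomistic/coarse-grained energy difference.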

  20. An Approach to Goal-Statement Evaluation

    ERIC Educational Resources Information Center

    Reiner, John R.; Robinson, Donald W.

    1969-01-01

    "The results of this study support the proposition that the application of environmental assessment techniques based on CUES items provides information which can help evaluate the formal goals of an institution in terms of the degree to which the institutional environment is facilitative of those goals. (Author)
