ERIC Educational Resources Information Center
Jacob, Bridgette L.
2013-01-01
The difficulties introductory statistics students have with formal statistical inference are well known in the field of statistics education. "Informal" statistical inference has been studied as a means to introduce inferential reasoning well before and without the formalities of formal statistical inference. This mixed methods study…
NASA Astrophysics Data System (ADS)
Luzzi, R.; Vasconcellos, A. R.; Ramos, J. G.; Rodrigues, C. G.
2018-01-01
We describe the formalism of statistical irreversible thermodynamics constructed based on Zubarev's nonequilibrium statistical operator (NSO) method, which is a powerful and universal tool for investigating the most varied physical phenomena. We present brief overviews of the statistical ensemble formalism and statistical irreversible thermodynamics. The first can be constructed either based on a heuristic approach or in the framework of information theory in the Jeffreys-Jaynes scheme of scientific inference; Zubarev and his school used both approaches in formulating the NSO method. We describe the main characteristics of statistical irreversible thermodynamics and discuss some particular considerations of several authors. We briefly describe how Rosenfeld, Bohr, and Prigogine proposed to derive a thermodynamic uncertainty principle.
NASA Astrophysics Data System (ADS)
Matsubara, Takahiko
2003-02-01
We formulate a general method for perturbative evaluations of statistics of smoothed cosmic fields and provide useful formulae for application of the perturbation theory to various statistics. This formalism is an extensive generalization of the method used by Matsubara, who derived a weakly nonlinear formula of the genus statistic in a three-dimensional density field. After describing the general method, we apply the formalism to a series of statistics, including genus statistics, level-crossing statistics, Minkowski functionals, and a density extrema statistic, regardless of the dimensions in which each statistic is defined. The relation between the Minkowski functionals and other geometrical statistics is clarified. These statistics can be applied to several cosmic fields, including three-dimensional density field, three-dimensional velocity field, two-dimensional projected density field, and so forth. The results are detailed for second-order theory of the formalism. The effect of the bias is discussed. The statistics of smoothed cosmic fields as functions of rescaled threshold by volume fraction are discussed in the framework of second-order perturbation theory. In CDM-like models, their functional deviations from linear predictions plotted against the rescaled threshold are generally much smaller than that plotted against the direct threshold. There is still a slight meatball shift against rescaled threshold, which is characterized by asymmetry in depths of troughs in the genus curve. A theory-motivated asymmetry factor in the genus curve is proposed.
NASA Technical Reports Server (NTRS)
Staubert, R.
1985-01-01
Methods for calculating the statistical significance of excess events and the interpretation of the formally derived values are discussed. It is argued that a simple formula for a conservative estimate should generally be used in order to provide a common understanding of quoted values.
A Linguistic Truth-Valued Temporal Reasoning Formalism and Its Implementation
NASA Astrophysics Data System (ADS)
Lu, Zhirui; Liu, Jun; Augusto, Juan C.; Wang, Hui
Temporality and uncertainty are important features of many real world systems. Solving problems in such systems requires the use of formal mechanism such as logic systems, statistical methods or other reasoning and decision-making methods. In this paper, we propose a linguistic truth-valued temporal reasoning formalism to enable the management of both features concurrently using a linguistic truth valued logic and a temporal logic. We also provide a backward reasoning algorithm which allows the answering of user queries. A simple but realistic scenario in a smart home application is used to illustrate our work.
Landau's statistical mechanics for quasi-particle models
NASA Astrophysics Data System (ADS)
Bannur, Vishnu M.
2014-04-01
Landau's formalism of statistical mechanics [following L. D. Landau and E. M. Lifshitz, Statistical Physics (Pergamon Press, Oxford, 1980)] is applied to the quasi-particle model of quark-gluon plasma. Here, one starts from the expression for pressure and develop all thermodynamics. It is a general formalism and consistent with our earlier studies [V. M. Bannur, Phys. Lett. B647, 271 (2007)] based on Pathria's formalism [following R. K. Pathria, Statistical Mechanics (Butterworth-Heinemann, Oxford, 1977)]. In Pathria's formalism, one starts from the expression for energy density and develop thermodynamics. Both the formalisms are consistent with thermodynamics and statistical mechanics. Under certain conditions, which are wrongly called thermodynamic consistent relation, we recover other formalism of quasi-particle system, like in M. I. Gorenstein and S. N. Yang, Phys. Rev. D52, 5206 (1995), widely studied in quark-gluon plasma.
Formative Use of Intuitive Analysis of Variance
ERIC Educational Resources Information Center
Trumpower, David L.
2013-01-01
Students' informal inferential reasoning (IIR) is often inconsistent with the normative logic underlying formal statistical methods such as Analysis of Variance (ANOVA), even after instruction. In two experiments reported here, student's IIR was assessed using an intuitive ANOVA task at the beginning and end of a statistics course. In both…
Integrating Formal Methods and Testing 2002
NASA Technical Reports Server (NTRS)
Cukic, Bojan
2002-01-01
Traditionally, qualitative program verification methodologies and program testing are studied in separate research communities. None of them alone is powerful and practical enough to provide sufficient confidence in ultra-high reliability assessment when used exclusively. Significant advances can be made by accounting not only tho formal verification and program testing. but also the impact of many other standard V&V techniques, in a unified software reliability assessment framework. The first year of this research resulted in the statistical framework that, given the assumptions on the success of the qualitative V&V and QA procedures, significantly reduces the amount of testing needed to confidently assess reliability at so-called high and ultra-high levels (10-4 or higher). The coming years shall address the methodologies to realistically estimate the impacts of various V&V techniques to system reliability and include the impact of operational risk to reliability assessment. Combine formal correctness verification, process and product metrics, and other standard qualitative software assurance methods with statistical testing with the aim of gaining higher confidence in software reliability assessment for high-assurance applications. B) Quantify the impact of these methods on software reliability. C) Demonstrate that accounting for the effectiveness of these methods reduces the number of tests needed to attain certain confidence level. D) Quantify and justify the reliability estimate for systems developed using various methods.
Sub-grid scale models for discontinuous Galerkin methods based on the Mori-Zwanzig formalism
NASA Astrophysics Data System (ADS)
Parish, Eric; Duraisamy, Karthk
2017-11-01
The optimal prediction framework of Chorin et al., which is a reformulation of the Mori-Zwanzig (M-Z) formalism of non-equilibrium statistical mechanics, provides a framework for the development of mathematically-derived closure models. The M-Z formalism provides a methodology to reformulate a high-dimensional Markovian dynamical system as a lower-dimensional, non-Markovian (non-local) system. In this lower-dimensional system, the effects of the unresolved scales on the resolved scales are non-local and appear as a convolution integral. The non-Markovian system is an exact statement of the original dynamics and is used as a starting point for model development. In this work, we investigate the development of M-Z-based closures model within the context of the Variational Multiscale Method (VMS). The method relies on a decomposition of the solution space into two orthogonal subspaces. The impact of the unresolved subspace on the resolved subspace is shown to be non-local in time and is modeled through the M-Z-formalism. The models are applied to hierarchical discontinuous Galerkin discretizations. Commonalities between the M-Z closures and conventional flux schemes are explored. This work was supported in part by AFOSR under the project ''LES Modeling of Non-local effects using Statistical Coarse-graining'' with Dr. Jean-Luc Cambier as the technical monitor.
Two Paradoxes in Linear Regression Analysis.
Feng, Ge; Peng, Jing; Tu, Dongke; Zheng, Julia Z; Feng, Changyong
2016-12-25
Regression is one of the favorite tools in applied statistics. However, misuse and misinterpretation of results from regression analysis are common in biomedical research. In this paper we use statistical theory and simulation studies to clarify some paradoxes around this popular statistical method. In particular, we show that a widely used model selection procedure employed in many publications in top medical journals is wrong. Formal procedures based on solid statistical theory should be used in model selection.
Two Paradoxes in Linear Regression Analysis
FENG, Ge; PENG, Jing; TU, Dongke; ZHENG, Julia Z.; FENG, Changyong
2016-01-01
Summary Regression is one of the favorite tools in applied statistics. However, misuse and misinterpretation of results from regression analysis are common in biomedical research. In this paper we use statistical theory and simulation studies to clarify some paradoxes around this popular statistical method. In particular, we show that a widely used model selection procedure employed in many publications in top medical journals is wrong. Formal procedures based on solid statistical theory should be used in model selection. PMID:28638214
1983-06-16
has been advocated by Gnanadesikan and ilk (1969), and others in the literature. This suggests that, if we use the formal signficance test type...American Statistical Asso., 62, 1159-1178. Gnanadesikan , R., and Wilk, M..B. (1969). Data Analytic Methods in Multi- variate Statistical Analysis. In
NASA Astrophysics Data System (ADS)
Bommier, Véronique
2017-11-01
Context. In previous papers of this series, we presented a formalism able to account for both statistical equilibrium of a multilevel atom and coherent and incoherent scatterings (partial redistribution). Aims: This paper provides theoretical expressions of the redistribution function for the two-term atom. This redistribution function includes both coherent (RII) and incoherent (RIII) scattering contributions with their branching ratios. Methods: The expressions were derived by applying the formalism outlined above. The statistical equilibrium equation for the atomic density matrix is first formally solved in the case of the two-term atom with unpolarized and infinitely sharp lower levels. Then the redistribution function is derived by substituting this solution for the expression of the emissivity. Results: Expressions are provided for both magnetic and non-magnetic cases. Atomic fine structure is taken into account. Expressions are also separately provided under zero and non-zero hyperfine structure. Conclusions: Redistribution functions are widely used in radiative transfer codes. In our formulation, collisional transitions between Zeeman sublevels within an atomic level (depolarizing collisions effect) are taken into account when possible (I.e., in the non-magnetic case). However, the need for a formal solution of the statistical equilibrium as a preliminary step prevents us from taking into account collisional transfers between the levels of the upper term. Accounting for these collisional transfers could be done via a numerical solution of the statistical equilibrium equation system.
NASA Astrophysics Data System (ADS)
Bianchi, Eugenio; Haggard, Hal M.; Rovelli, Carlo
2017-08-01
We show that in Oeckl's boundary formalism the boundary vectors that do not have a tensor form represent, in a precise sense, statistical states. Therefore the formalism incorporates quantum statistical mechanics naturally. We formulate general-covariant quantum statistical mechanics in this language. We illustrate the formalism by showing how it accounts for the Unruh effect. We observe that the distinction between pure and mixed states weakens in the general covariant context, suggesting that local gravitational processes are naturally statistical without a sharp quantal versus probabilistic distinction.
Testing the statistical compatibility of independent data sets
NASA Astrophysics Data System (ADS)
Maltoni, M.; Schwetz, T.
2003-08-01
We discuss a goodness-of-fit method which tests the compatibility between statistically independent data sets. The method gives sensible results even in cases where the χ2 minima of the individual data sets are very low or when several parameters are fitted to a large number of data points. In particular, it avoids the problem that a possible disagreement between data sets becomes diluted by data points which are insensitive to the crucial parameters. A formal derivation of the probability distribution function for the proposed test statistics is given, based on standard theorems of statistics. The application of the method is illustrated on data from neutrino oscillation experiments, and its complementarity to the standard goodness-of-fit is discussed.
Small sample estimation of the reliability function for technical products
NASA Astrophysics Data System (ADS)
Lyamets, L. L.; Yakimenko, I. V.; Kanishchev, O. A.; Bliznyuk, O. A.
2017-12-01
It is demonstrated that, in the absence of big statistic samples obtained as a result of testing complex technical products for failure, statistic estimation of the reliability function of initial elements can be made by the moments method. A formal description of the moments method is given and its advantages in the analysis of small censored samples are discussed. A modified algorithm is proposed for the implementation of the moments method with the use of only the moments at which the failures of initial elements occur.
Cost model validation: a technical and cultural approach
NASA Technical Reports Server (NTRS)
Hihn, J.; Rosenberg, L.; Roust, K.; Warfield, K.
2001-01-01
This paper summarizes how JPL's parametric mission cost model (PMCM) has been validated using both formal statistical methods and a variety of peer and management reviews in order to establish organizational acceptance of the cost model estimates.
Formal Operations and Learning Style Predict Success in Statistics and Computer Science Courses.
ERIC Educational Resources Information Center
Hudak, Mary A.; Anderson, David E.
1990-01-01
Studies 94 undergraduate students in introductory statistics and computer science courses. Applies Formal Operations Reasoning Test (FORT) and Kolb's Learning Style Inventory (LSI). Finds that substantial numbers of students have not achieved the formal operation level of cognitive maturity. Emphasizes need to examine students learning style and…
The changing landscape of astrostatistics and astroinformatics
NASA Astrophysics Data System (ADS)
Feigelson, Eric D.
2017-06-01
The history and current status of the cross-disciplinary fields of astrostatistics and astroinformatics are reviewed. Astronomers need a wide range of statistical methods for both data reduction and science analysis. With the proliferation of high-throughput telescopes, efficient large scale computational methods are also becoming essential. However, astronomers receive only weak training in these fields during their formal education. Interest in the fields is rapidly growing with conferences organized by scholarly societies, textbooks and tutorial workshops, and research studies pushing the frontiers of methodology. R, the premier language of statistical computing, can provide an important software environment for the incorporation of advanced statistical and computational methodology into the astronomical community.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Drzymala, R. E., E-mail: drzymala@wustl.edu; Alvarez, P. E.; Bednarz, G.
2015-11-15
Purpose: Absorbed dose calibration for gamma stereotactic radiosurgery is challenging due to the unique geometric conditions, dosimetry characteristics, and nonstandard field size of these devices. Members of the American Association of Physicists in Medicine (AAPM) Task Group 178 on Gamma Stereotactic Radiosurgery Dosimetry and Quality Assurance have participated in a round-robin exchange of calibrated measurement instrumentation and phantoms exploring two approved and two proposed calibration protocols or formalisms on ten gamma radiosurgery units. The objectives of this study were to benchmark and compare new formalisms to existing calibration methods, while maintaining traceability to U.S. primary dosimetry calibration laboratory standards. Methods:more » Nine institutions made measurements using ten gamma stereotactic radiosurgery units in three different 160 mm diameter spherical phantoms [acrylonitrile butadiene styrene (ABS) plastic, Solid Water, and liquid water] and in air using a positioning jig. Two calibrated miniature ionization chambers and one calibrated electrometer were circulated for all measurements. Reference dose-rates at the phantom center were determined using the well-established AAPM TG-21 or TG-51 dose calibration protocols and using two proposed dose calibration protocols/formalisms: an in-air protocol and a formalism proposed by the International Atomic Energy Agency (IAEA) working group for small and nonstandard radiation fields. Each institution’s results were normalized to the dose-rate determined at that institution using the TG-21 protocol in the ABS phantom. Results: Percentages of dose-rates within 1.5% of the reference dose-rate (TG-21 + ABS phantom) for the eight chamber-protocol-phantom combinations were the following: 88% for TG-21, 70% for TG-51, 93% for the new IAEA nonstandard-field formalism, and 65% for the new in-air protocol. Averages and standard deviations for dose-rates over all measurements relative to the TG-21 + ABS dose-rate were 0.999 ± 0.009 (TG-21), 0.991 ± 0.013 (TG-51), 1.000 ± 0.009 (IAEA), and 1.009 ± 0.012 (in-air). There were no statistically significant differences (i.e., p > 0.05) between the two ionization chambers for the TG-21 protocol applied to all dosimetry phantoms. The mean results using the TG-51 protocol were notably lower than those for the other dosimetry protocols, with a standard deviation 2–3 times larger. The in-air protocol was not statistically different from TG-21 for the A16 chamber in the liquid water or ABS phantoms (p = 0.300 and p = 0.135) but was statistically different from TG-21 for the PTW chamber in all phantoms (p = 0.006 for Solid Water, 0.014 for liquid water, and 0.020 for ABS). Results of IAEA formalism were statistically different from TG-21 results only for the combination of the A16 chamber with the liquid water phantom (p = 0.017). In the latter case, dose-rates measured with the two protocols differed by only 0.4%. For other phantom-ionization-chamber combinations, the new IAEA formalism was not statistically different from TG-21. Conclusions: Although further investigation is needed to validate the new protocols for other ionization chambers, these results can serve as a reference to quantitatively compare different calibration protocols and ionization chambers if a particular method is chosen by a professional society to serve as a standardized calibration protocol.« less
Statistical assessment of the learning curves of health technologies.
Ramsay, C R; Grant, A M; Wallace, S A; Garthwaite, P H; Monk, A F; Russell, I T
2001-01-01
(1) To describe systematically studies that directly assessed the learning curve effect of health technologies. (2) Systematically to identify 'novel' statistical techniques applied to learning curve data in other fields, such as psychology and manufacturing. (3) To test these statistical techniques in data sets from studies of varying designs to assess health technologies in which learning curve effects are known to exist. METHODS - STUDY SELECTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): For a study to be included, it had to include a formal analysis of the learning curve of a health technology using a graphical, tabular or statistical technique. METHODS - STUDY SELECTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): For a study to be included, it had to include a formal assessment of a learning curve using a statistical technique that had not been identified in the previous search. METHODS - DATA SOURCES: Six clinical and 16 non-clinical biomedical databases were searched. A limited amount of handsearching and scanning of reference lists was also undertaken. METHODS - DATA EXTRACTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): A number of study characteristics were abstracted from the papers such as study design, study size, number of operators and the statistical method used. METHODS - DATA EXTRACTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): The new statistical techniques identified were categorised into four subgroups of increasing complexity: exploratory data analysis; simple series data analysis; complex data structure analysis, generic techniques. METHODS - TESTING OF STATISTICAL METHODS: Some of the statistical methods identified in the systematic searches for single (simple) operator series data and for multiple (complex) operator series data were illustrated and explored using three data sets. The first was a case series of 190 consecutive laparoscopic fundoplication procedures performed by a single surgeon; the second was a case series of consecutive laparoscopic cholecystectomy procedures performed by ten surgeons; the third was randomised trial data derived from the laparoscopic procedure arm of a multicentre trial of groin hernia repair, supplemented by data from non-randomised operations performed during the trial. RESULTS - HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW: Of 4571 abstracts identified, 272 (6%) were later included in the study after review of the full paper. Some 51% of studies assessed a surgical minimal access technique and 95% were case series. The statistical method used most often (60%) was splitting the data into consecutive parts (such as halves or thirds), with only 14% attempting a more formal statistical analysis. The reporting of the studies was poor, with 31% giving no details of data collection methods. RESULTS - NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH: Of 9431 abstracts assessed, 115 (1%) were deemed appropriate for further investigation and, of these, 18 were included in the study. All of the methods for complex data sets were identified in the non-clinical literature. These were discriminant analysis, two-stage estimation of learning rates, generalised estimating equations, multilevel models, latent curve models, time series models and stochastic parameter models. In addition, eight new shapes of learning curves were identified. RESULTS - TESTING OF STATISTICAL METHODS: No one particular shape of learning curve performed significantly better than another. The performance of 'operation time' as a proxy for learning differed between the three procedures. Multilevel modelling using the laparoscopic cholecystectomy data demonstrated and measured surgeon-specific and confounding effects. The inclusion of non-randomised cases, despite the possible limitations of the method, enhanced the interpretation of learning effects. CONCLUSIONS - HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW: The statistical methods used for assessing learning effects in health technology assessment have been crude and the reporting of studies poor. CONCLUSIONS - NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH: A number of statistical methods for assessing learning effects were identified that had not hitherto been used in health technology assessment. There was a hierarchy of methods for the identification and measurement of learning, and the more sophisticated methods for both have had little if any use in health technology assessment. This demonstrated the value of considering fields outside clinical research when addressing methodological issues in health technology assessment. CONCLUSIONS - TESTING OF STATISTICAL METHODS: It has been demonstrated that the portfolio of techniques identified can enhance investigations of learning curve effects. (ABSTRACT TRUNCATED)
Statistical inference for tumor growth inhibition T/C ratio.
Wu, Jianrong
2010-09-01
The tumor growth inhibition T/C ratio is commonly used to quantify treatment effects in drug screening tumor xenograft experiments. The T/C ratio is converted to an antitumor activity rating using an arbitrary cutoff point and often without any formal statistical inference. Here, we applied a nonparametric bootstrap method and a small sample likelihood ratio statistic to make a statistical inference of the T/C ratio, including both hypothesis testing and a confidence interval estimate. Furthermore, sample size and power are also discussed for statistical design of tumor xenograft experiments. Tumor xenograft data from an actual experiment were analyzed to illustrate the application.
On the simulation of indistinguishable fermions in the many-body Wigner formalism
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sellier, J.M., E-mail: jeanmichel.sellier@gmail.com; Dimov, I.
2015-01-01
The simulation of quantum systems consisting of interacting, indistinguishable fermions is an incredible mathematical problem which poses formidable numerical challenges. Many sophisticated methods addressing this problem are available which are based on the many-body Schrödinger formalism. Recently a Monte Carlo technique for the resolution of the many-body Wigner equation has been introduced and successfully applied to the simulation of distinguishable, spinless particles. This numerical approach presents several advantages over other methods. Indeed, it is based on an intuitive formalism in which quantum systems are described in terms of a quasi-distribution function, and highly scalable due to its Monte Carlo nature.more » In this work, we extend the many-body Wigner Monte Carlo method to the simulation of indistinguishable fermions. To this end, we first show how fermions are incorporated into the Wigner formalism. Then we demonstrate that the Pauli exclusion principle is intrinsic to the formalism. As a matter of fact, a numerical simulation of two strongly interacting fermions (electrons) is performed which clearly shows the appearance of a Fermi (or exchange–correlation) hole in the phase-space, a clear signature of the presence of the Pauli principle. To conclude, we simulate 4, 8 and 16 non-interacting fermions, isolated in a closed box, and show that, as the number of fermions increases, we gradually recover the Fermi–Dirac statistics, a clear proof of the reliability of our proposed method for the treatment of indistinguishable particles.« less
Confirmatory and Competitive Evaluation of Alternative Gene-Environment Interaction Hypotheses
ERIC Educational Resources Information Center
Belsky, Jay; Pluess, Michael; Widaman, Keith F.
2013-01-01
Background: Most gene-environment interaction (GXE) research, though based on clear, vulnerability-oriented hypotheses, is carried out using exploratory rather than hypothesis-informed statistical tests, limiting power and making formal evaluation of competing GXE propositions difficult. Method: We present and illustrate a new regression technique…
NASA Astrophysics Data System (ADS)
Pavlis, Nikolaos K.
Geomatics is a trendy term that has been used in recent years to describe academic departments that teach and research theories, methods, algorithms, and practices used in processing and analyzing data related to the Earth and other planets. Naming trends aside, geomatics could be considered as the mathematical and statistical “toolbox” that allows Earth scientists to extract information about physically relevant parameters from the available data and accompany such information with some measure of its reliability. This book is an attempt to present the mathematical-statistical methods used in data analysis within various disciplines—geodesy, geophysics, photogrammetry and remote sensing—from a unifying perspective that inverse problem formalism permits. At the same time, it allows us to stretch the relevance of statistical methods in achieving an optimal solution.
Wu, Baolin
2006-02-15
Differential gene expression detection and sample classification using microarray data have received much research interest recently. Owing to the large number of genes p and small number of samples n (p > n), microarray data analysis poses big challenges for statistical analysis. An obvious problem owing to the 'large p small n' is over-fitting. Just by chance, we are likely to find some non-differentially expressed genes that can classify the samples very well. The idea of shrinkage is to regularize the model parameters to reduce the effects of noise and produce reliable inferences. Shrinkage has been successfully applied in the microarray data analysis. The SAM statistics proposed by Tusher et al. and the 'nearest shrunken centroid' proposed by Tibshirani et al. are ad hoc shrinkage methods. Both methods are simple, intuitive and prove to be useful in empirical studies. Recently Wu proposed the penalized t/F-statistics with shrinkage by formally using the (1) penalized linear regression models for two-class microarray data, showing good performance. In this paper we systematically discussed the use of penalized regression models for analyzing microarray data. We generalize the two-class penalized t/F-statistics proposed by Wu to multi-class microarray data. We formally derive the ad hoc shrunken centroid used by Tibshirani et al. using the (1) penalized regression models. And we show that the penalized linear regression models provide a rigorous and unified statistical framework for sample classification and differential gene expression detection.
NASA Astrophysics Data System (ADS)
Chatterjee, Saikat; Koopmans, Léon V. E.
2018-02-01
In the last decade, the detection of individual massive dark matter sub-haloes has been possible using potential correction formalism in strong gravitational lens imaging. Here, we propose a statistical formalism to relate strong gravitational lens surface brightness anomalies to the lens potential fluctuations arising from dark matter distribution in the lens galaxy. We consider these fluctuations as a Gaussian random field in addition to the unperturbed smooth lens model. This is very similar to weak lensing formalism and we show that in this way we can measure the power spectrum of these perturbations to the potential. We test the method by applying it to simulated mock lenses of different geometries and by performing an MCMC analysis of the theoretical power spectra. This method can measure density fluctuations in early type galaxies on scales of 1-10 kpc at typical rms levels of a per cent, using a single lens system observed with the Hubble Space Telescope with typical signal-to-noise ratios obtained in a single orbit.
A new statistical method for characterizing the atmospheres of extrasolar planets
NASA Astrophysics Data System (ADS)
Henderson, Cassandra S.; Skemer, Andrew J.; Morley, Caroline V.; Fortney, Jonathan J.
2017-10-01
By detecting light from extrasolar planets, we can measure their compositions and bulk physical properties. The technologies used to make these measurements are still in their infancy, and a lack of self-consistency suggests that previous observations have underestimated their systemic errors. We demonstrate a statistical method, newly applied to exoplanet characterization, which uses a Bayesian formalism to account for underestimated errorbars. We use this method to compare photometry of a substellar companion, GJ 758b, with custom atmospheric models. Our method produces a probability distribution of atmospheric model parameters including temperature, gravity, cloud model (fsed) and chemical abundance for GJ 758b. This distribution is less sensitive to highly variant data and appropriately reflects a greater uncertainty on parameter fits.
Forecasts of non-Gaussian parameter spaces using Box-Cox transformations
NASA Astrophysics Data System (ADS)
Joachimi, B.; Taylor, A. N.
2011-09-01
Forecasts of statistical constraints on model parameters using the Fisher matrix abound in many fields of astrophysics. The Fisher matrix formalism involves the assumption of Gaussianity in parameter space and hence fails to predict complex features of posterior probability distributions. Combining the standard Fisher matrix with Box-Cox transformations, we propose a novel method that accurately predicts arbitrary posterior shapes. The Box-Cox transformations are applied to parameter space to render it approximately multivariate Gaussian, performing the Fisher matrix calculation on the transformed parameters. We demonstrate that, after the Box-Cox parameters have been determined from an initial likelihood evaluation, the method correctly predicts changes in the posterior when varying various parameters of the experimental setup and the data analysis, with marginally higher computational cost than a standard Fisher matrix calculation. We apply the Box-Cox-Fisher formalism to forecast cosmological parameter constraints by future weak gravitational lensing surveys. The characteristic non-linear degeneracy between matter density parameter and normalization of matter density fluctuations is reproduced for several cases, and the capabilities of breaking this degeneracy by weak-lensing three-point statistics is investigated. Possible applications of Box-Cox transformations of posterior distributions are discussed, including the prospects for performing statistical data analysis steps in the transformed Gaussianized parameter space.
Experimental design and statistical methods for improved hit detection in high-throughput screening.
Malo, Nathalie; Hanley, James A; Carlile, Graeme; Liu, Jing; Pelletier, Jerry; Thomas, David; Nadon, Robert
2010-09-01
Identification of active compounds in high-throughput screening (HTS) contexts can be substantially improved by applying classical experimental design and statistical inference principles to all phases of HTS studies. The authors present both experimental and simulated data to illustrate how true-positive rates can be maximized without increasing false-positive rates by the following analytical process. First, the use of robust data preprocessing methods reduces unwanted variation by removing row, column, and plate biases. Second, replicate measurements allow estimation of the magnitude of the remaining random error and the use of formal statistical models to benchmark putative hits relative to what is expected by chance. Receiver Operating Characteristic (ROC) analyses revealed superior power for data preprocessed by a trimmed-mean polish method combined with the RVM t-test, particularly for small- to moderate-sized biological hits.
Shrestha, R; Shakya, R M; Khan A, A
2016-01-01
Background Renal colic is a common emergency department presentation. Hydronephrosis is indirect sign of urinary obstruction which may be due to obstructing ureteric calculus and can be detected easily by bedside ultrasound with minimal training. Objective To compare the accuracy of detection of hydronephrosis performed by the emergency physician with that of radiologist's in suspected renal colic cases. Method This was a prospective observational study performed over a period of 6 months. Patients >8 years with provisional diagnosis of renal colic with both the bedside ultrasound and the formal ultrasound performed were included. Presence of hydronephrosis in both ultrasounds and size and location of ureteric stone if present in formal ultrasound was recorded. The accuracy of the emergency physician detection of hydronephrosis was determined using the scan reported by the radiologists as the "gold standard" as computed tomography was unavailable. Statistical analysis was executed using SPSS 17.0. Result Among the 111 included patients, 56.7% had ureteric stone detected in formal ultrasound. The overall sensitivity, specificity, positive predictive value and negative predictive value of bedside ultrasound performed by emergency physician for detection of hydronephrosis with that of formal ultrasound performed by radiologist was 90.8%., 78.3%, 85.5% and 85.7% respectively. Bedside ultrasound and formal ultrasound both detected hydronephrosis more often in patients with larger stones and the difference was statistically significant (p=.000). Conclusion Bedside ultrasound can be potentially used as an important tool in detecting clinically significant hydronephrosis in emergency to evaluate suspected ureteric colic. Focused training in ultrasound could greatly improve the emergency management of these patients.
Using Informal Inferential Reasoning to Develop Formal Concepts: Analyzing an Activity
ERIC Educational Resources Information Center
Weinberg, Aaron; Wiesner, Emilie; Pfaff, Thomas J.
2010-01-01
Inferential reasoning is a central component of statistics. Researchers have suggested that students should develop an informal understanding of the ideas that underlie inference before learning the concepts formally. This paper presents a hands-on activity that is designed to help students in an introductory statistics course draw informal…
NASA Astrophysics Data System (ADS)
Nearing, G. S.
2014-12-01
Statistical models consistently out-perform conceptual models in the short term, however to account for a nonstationary future (or an unobserved past) scientists prefer to base predictions on unchanging and commutable properties of the universe - i.e., physics. The problem with physically-based hydrology models is, of course, that they aren't really based on physics - they are based on statistical approximations of physical interactions, and we almost uniformly lack an understanding of the entropy associated with these approximations. Thermodynamics is successful precisely because entropy statistics are computable for homogeneous (well-mixed) systems, and ergodic arguments explain the success of Newton's laws to describe systems that are fundamentally quantum in nature. Unfortunately, similar arguments do not hold for systems like watersheds that are heterogeneous at a wide range of scales. Ray Solomonoff formalized the situation in 1968 by showing that given infinite evidence, simultaneously minimizing model complexity and entropy in predictions always leads to the best possible model. The open question in hydrology is about what happens when we don't have infinite evidence - for example, when the future will not look like the past, or when one watershed does not behave like another. How do we isolate stationary and commutable components of watershed behavior? I propose that one possible answer to this dilemma lies in a formal combination of physics and statistics. In this talk I outline my recent analogue (Solomonoff's theorem was digital) of Solomonoff's idea that allows us to quantify the complexity/entropy tradeoff in a way that is intuitive to physical scientists. I show how to formally combine "physical" and statistical methods for model development in a way that allows us to derive the theoretically best possible model given any given physics approximation(s) and available observations. Finally, I apply an analogue of Solomonoff's theorem to evaluate the tradeoff between model complexity and prediction power.
1990-12-01
stated these problems -- racism , sexism , drug and alcohol abuse -- were a result of the poor leadership ability in Navymiddle management [Ref. 3... The HRM program instituted a formal course of instruction to teach leadership theories . The leadership training of the Human Resource Management...management practices based on the guidelines developed by W. E. Demming [Ref. 14]. The TQL practice involves integrating management and statistical methods to
Exploring High School Students Beginning Reasoning about Significance Tests with Technology
ERIC Educational Resources Information Center
García, Víctor N.; Sánchez, Ernesto
2017-01-01
In the present study we analyze how students reason about or make inferences given a particular hypothesis testing problem (without having studied formal methods of statistical inference) when using Fathom. They use Fathom to create an empirical sampling distribution through computer simulation. It is found that most student´s reasoning rely on…
A Statistical Ontology-Based Approach to Ranking for Multiword Search
ERIC Educational Resources Information Center
Kim, Jinwoo
2013-01-01
Keyword search is a prominent data retrieval method for the Web, largely because the simple and efficient nature of keyword processing allows a large amount of information to be searched with fast response. However, keyword search approaches do not formally capture the clear meaning of a keyword query and fail to address the semantic relationships…
ERIC Educational Resources Information Center
Arena, Dylan A.; Schwartz, Daniel L.
2014-01-01
Well-designed digital games can deliver powerful experiences that are difficult to provide through traditional instruction, while traditional instruction can deliver formal explanations that are not a natural fit for gameplay. Combined, they can accomplish more than either can alone. An experiment tested this claim using the topic of statistics,…
Statistical science: a grammar for research.
Cox, David R
2017-06-01
I greatly appreciate the invitation to give this lecture with its century long history. The title is a warning that the lecture is rather discursive and not highly focused and technical. The theme is simple. That statistical thinking provides a unifying set of general ideas and specific methods relevant whenever appreciable natural variation is present. To be most fruitful these ideas should merge seamlessly with subject-matter considerations. By contrast, there is sometimes a temptation to regard formal statistical analysis as a ritual to be added after the serious work has been done, a ritual to satisfy convention, referees, and regulatory agencies. I want implicitly to refute that idea.
Data Acquisition and Preprocessing in Studies on Humans: What Is Not Taught in Statistics Classes?
Zhu, Yeyi; Hernandez, Ladia M; Mueller, Peter; Dong, Yongquan; Forman, Michele R
2013-01-01
The aim of this paper is to address issues in research that may be missing from statistics classes and important for (bio-)statistics students. In the context of a case study, we discuss data acquisition and preprocessing steps that fill the gap between research questions posed by subject matter scientists and statistical methodology for formal inference. Issues include participant recruitment, data collection training and standardization, variable coding, data review and verification, data cleaning and editing, and documentation. Despite the critical importance of these details in research, most of these issues are rarely discussed in an applied statistics program. One reason for the lack of more formal training is the difficulty in addressing the many challenges that can possibly arise in the course of a study in a systematic way. This article can help to bridge this gap between research questions and formal statistical inference by using an illustrative case study for a discussion. We hope that reading and discussing this paper and practicing data preprocessing exercises will sensitize statistics students to these important issues and achieve optimal conduct, quality control, analysis, and interpretation of a study.
Saramma, P. P.; Raj, L. Suja; Dash, P. K.; Sarma, P. S.
2016-01-01
Context: Cardiopulmonary resuscitation (CPR) and emergency cardiovascular care guidelines are periodically renewed and published by the American Heart Association. Formal training programs are conducted based on these guidelines. Despite widespread training CPR is often poorly performed. Hospital educators spend a significant amount of time and money in training health professionals and maintaining basic life support (BLS) and advanced cardiac life support (ACLS) skills among them. However, very little data are available in the literature highlighting the long-term impact of these training. Aims: To evaluate the impact of formal certified CPR training program on the knowledge and skill of CPR among nurses, to identify self-reported outcomes of attempted CPR and training needs of nurses. Setting and Design: Tertiary care hospital, Prospective, repeated-measures design. Subjects and Methods: A series of certified BLS and ACLS training programs were conducted during 2010 and 2011. Written and practical performance tests were done. Final testing was undertaken 3–4 years after training. The sample included all available, willing CPR certified nurses and experience matched CPR noncertified nurses. Statistical Analysis Used: SPSS for Windows version 21.0. Results: The majority of the 206 nurses (93 CPR certified and 113 noncertified) were females. There was a statistically significant increase in mean knowledge level and overall performance before and after the formal certified CPR training program (P = 0.000). However, the mean knowledge scores were equivalent among the CPR certified and noncertified nurses, although the certified nurses scored a higher mean score (P = 0.140). Conclusions: Formal certified CPR training program increases CPR knowledge and skill. However, significant long-term effects could not be found. There is a need for regular and periodic recertification. PMID:27303137
NASA Technical Reports Server (NTRS)
Barth, Timothy J.
2014-01-01
This workshop presentation discusses the design and implementation of numerical methods for the quantification of statistical uncertainty, including a-posteriori error bounds, for output quantities computed using CFD methods. Hydrodynamic realizations often contain numerical error arising from finite-dimensional approximation (e.g. numerical methods using grids, basis functions, particles) and statistical uncertainty arising from incomplete information and/or statistical characterization of model parameters and random fields. The first task at hand is to derive formal error bounds for statistics given realizations containing finite-dimensional numerical error [1]. The error in computed output statistics contains contributions from both realization error and the error resulting from the calculation of statistics integrals using a numerical method. A second task is to devise computable a-posteriori error bounds by numerically approximating all terms arising in the error bound estimates. For the same reason that CFD calculations including error bounds but omitting uncertainty modeling are only of limited value, CFD calculations including uncertainty modeling but omitting error bounds are only of limited value. To gain maximum value from CFD calculations, a general software package for uncertainty quantification with quantified error bounds has been developed at NASA. The package provides implementations for a suite of numerical methods used in uncertainty quantification: Dense tensorization basis methods [3] and a subscale recovery variant [1] for non-smooth data, Sparse tensorization methods[2] utilizing node-nested hierarchies, Sampling methods[4] for high-dimensional random variable spaces.
Granular statistical mechanics - Building on the legacy of Sir Sam Edwards
NASA Astrophysics Data System (ADS)
Blumenfeld, Raphael
When Sir Sam Edwards laid down the foundations for the statistical mechanics of jammed granular materials he opened a new field in soft condensed matter and many followed. In this presentation we review briefly the Edwards formalism and some of its less discussed consequences. We point out that the formalism is useful for other classes of systems - cellular and porous materials. A certain shortcoming of the original formalism is then discussed and a modification to overcome it is proposed. Finally, a derivation of an equation of state with the new formalism is presented; the equation of state is analogous to the PVT relation for thermal gases, relating the volume, the boundary stress and measures of the structural and stress fluctuations. NUDT, Changsha, China, Imperial College London, UK, Cambridge University, UK.
ERIC Educational Resources Information Center
Ampofo, S. Y.; Bizimana, B.; Ndayambaje, I.; Karongo, V.; Lawrence, K. Lyn; Orodho, J. A.
2015-01-01
This study examined the social and spill-over benefits as motivating factors to investment in formal education in selected countries in Africa. The paper had three objectives, namely) to profile the key statistics of formal schooling; ii) examine the formal education and iii) link national goals of education with expectations in Ghana, Kenya and…
Effectiveness of groundwater governance structures and institutions in Tanzania
NASA Astrophysics Data System (ADS)
Gudaga, J. L.; Kabote, S. J.; Tarimo, A. K. P. R.; Mosha, D. B.; Kashaigili, J. J.
2018-05-01
This paper examines effectiveness of groundwater governance structures and institutions in Mbarali District, Mbeya Region. The paper adopts exploratory sequential research design to collect quantitative and qualitative data. A random sample of 90 groundwater users with 50% women was involved in the survey. Descriptive statistics, Kruskal-Wallis H test and Mann-Whitney U test were used to compare the differences in responses between groups, while qualitative data were subjected to content analysis. The results show that the Village Councils and Community Water Supply Organizations (COWSOs) were effective in governing groundwater. The results also show statistical significant difference on the overall extent of effectiveness of the Village Councils in governing groundwater between villages ( P = 0.0001), yet there was no significant difference ( P > 0.05) between male and female responses on the effectiveness of Village Councils, village water committees and COWSOs. The Mann-Whitney U test showed statistical significant difference between male and female responses on effectiveness of formal and informal institutions ( P = 0.0001), such that informal institutions were effective relative to formal institutions. The Kruskal-Wallis H test also showed statistical significant difference ( P ≤ 0.05) on the extent of effectiveness of formal institutions, norms and values between low, medium and high categories. The paper concludes that COWSOs were more effective in governing groundwater than other groundwater governance structures. Similarly, norms and values were more effective than formal institutions. The paper recommends sensitization and awareness creation on formal institutions so that they can influence water users' behaviour to govern groundwater.
Mending the Gap, An Effort to Aid the Transfer of Formal Methods Technology
NASA Technical Reports Server (NTRS)
Hayhurst, Kelly
2009-01-01
Formal methods can be applied to many of the development and verification activities required for civil avionics software. RTCA/DO-178B, Software Considerations in Airborne Systems and Equipment Certification, gives a brief description of using formal methods as an alternate method of compliance with the objectives of that standard. Despite this, the avionics industry at large has been hesitant to adopt formal methods, with few developers have actually used formal methods for certification credit. Why is this so, given the volume of evidence of the benefits of formal methods? This presentation will explore some of the challenges to using formal methods in a certification context and describe the effort by the Formal Methods Subgroup of RTCA SC-205/EUROCAE WG-71 to develop guidance to make the use of formal methods a recognized approach.
Great Computational Intelligence in the Formal Sciences via Analogical Reasoning
2017-05-08
computational harnessing of traditional mathematical statistics (as e.g. covered in Hogg, Craig & McKean 2005) is used to power statistical learning techniques...AFRL-AFOSR-VA-TR-2017-0099 Great Computational Intelligence in the Formal Sciences via Analogical Reasoning Selmer Bringsjord RENSSELAER POLYTECHNIC...08-05-2017 2. REPORT TYPE Final Performance 3. DATES COVERED (From - To) 15 Oct 2011 to 31 Dec 2016 4. TITLE AND SUBTITLE Great Computational
Studies in Non-Equilibrium Statistical Mechanics.
1982-09-01
in the formalism, and this is used to simulate the effects of rotational states and collisions. At each stochastic step the energy changes in the...uses of this method. 10. A Scaling Theoretical Analysis of Vibrational Relaxation Experiments: Rotational Effects and Long-Range Collisions 0...in- elude rotational effects through the rotational energy gaps and the rotational distributions. The variables in this theory are a fundamental set
Formal methods technology transfer: Some lessons learned
NASA Technical Reports Server (NTRS)
Hamilton, David
1992-01-01
IBM has a long history in the application of formal methods to software development and verification. There have been many successes in the development of methods, tools and training to support formal methods. And formal methods have been very successful on several projects. However, the use of formal methods has not been as widespread as hoped. This presentation summarizes several approaches that have been taken to encourage more widespread use of formal methods, and discusses the results so far. The basic problem is one of technology transfer, which is a very difficult problem. It is even more difficult for formal methods. General problems of technology transfer, especially the transfer of formal methods technology, are also discussed. Finally, some prospects for the future are mentioned.
Wingate, Peter H; Thornton, George C; McIntyre, Kelly S; Frame, Jennifer H
2003-02-01
The present study examined relationships between reduction-in-force (RIF) personnel practices, presentation of statistical evidence, and litigation outcomes. Policy capturing methods were utilized to analyze the components of 115 federal district court opinions involving age discrimination disparate treatment allegations and organizational downsizing. Univariate analyses revealed meaningful links between RIF personnel practices, use of statistical evidence, and judicial verdict. The defendant organization was awarded summary judgment in 73% of the claims included in the study. Judicial decisions in favor of the defendant organization were found to be significantly related to such variables as formal performance appraisal systems, termination decision review within the organization, methods of employee assessment and selection for termination, and the presence of a concrete layoff policy. The use of statistical evidence in ADEA disparate treatment litigation was investigated and found to be a potentially persuasive type of indirect evidence. Legal, personnel, and evidentiary ramifications are reviewed, and a framework of downsizing mechanics emphasizing legal defensibility is presented.
Statistics of Macroturbulence from Flow Equations
NASA Astrophysics Data System (ADS)
Marston, Brad; Iadecola, Thomas; Qi, Wanming
2012-02-01
Probability distribution functions of stochastically-driven and frictionally-damped fluids are governed by a linear framework that resembles quantum many-body theory. Besides the Fokker-Planck approach, there is a closely related Hopf functional methodfootnotetextOokie Ma and J. B. Marston, J. Stat. Phys. Th. Exp. P10007 (2005).; in both formalisms, zero modes of linear operators describe the stationary non-equilibrium statistics. To access the statistics, we generalize the flow equation approachfootnotetextF. Wegner, Ann. Phys. 3, 77 (1994). (also known as the method of continuous unitary transformationsfootnotetextS. D. Glazek and K. G. Wilson, Phys. Rev. D 48, 5863 (1993); Phys. Rev. D 49, 4214 (1994).) to find the zero mode. We test the approach using a prototypical model of geophysical and astrophysical flows on a rotating sphere that spontaneously organizes into a coherent jet. Good agreement is found with low-order equal-time statistics accumulated by direct numerical simulation, the traditional method. Different choices for the generators of the continuous transformations, and for closure approximations of the operator algebra, are discussed.
Memory Effects and Nonequilibrium Correlations in the Dynamics of Open Quantum Systems
NASA Astrophysics Data System (ADS)
Morozov, V. G.
2018-01-01
We propose a systematic approach to the dynamics of open quantum systems in the framework of Zubarev's nonequilibrium statistical operator method. The approach is based on the relation between ensemble means of the Hubbard operators and the matrix elements of the reduced statistical operator of an open quantum system. This key relation allows deriving master equations for open systems following a scheme conceptually identical to the scheme used to derive kinetic equations for distribution functions. The advantage of the proposed formalism is that some relevant dynamical correlations between an open system and its environment can be taken into account. To illustrate the method, we derive a non-Markovian master equation containing the contribution of nonequilibrium correlations associated with energy conservation.
The potential for increased power from combining P-values testing the same hypothesis.
Ganju, Jitendra; Julie Ma, Guoguang
2017-02-01
The conventional approach to hypothesis testing for formal inference is to prespecify a single test statistic thought to be optimal. However, we usually have more than one test statistic in mind for testing the null hypothesis of no treatment effect but we do not know which one is the most powerful. Rather than relying on a single p-value, combining p-values from prespecified multiple test statistics can be used for inference. Combining functions include Fisher's combination test and the minimum p-value. Using randomization-based tests, the increase in power can be remarkable when compared with a single test and Simes's method. The versatility of the method is that it also applies when the number of covariates exceeds the number of observations. The increase in power is large enough to prefer combined p-values over a single p-value. The limitation is that the method does not provide an unbiased estimator of the treatment effect and does not apply to situations when the model includes treatment by covariate interaction.
Granular statistical mechanics - a personal perspective
NASA Astrophysics Data System (ADS)
Blumenfeld, R.; Edwards, S. F.
2014-10-01
The science of granular matter has expanded from an activity for specialised engineering applications to a fundamental field in its own right. This has been accompanied by an explosion of research and literature, which cannot be reviewed in one paper. A key to progress in this field is the formulation of a statistical mechanical formalism that could help develop equations of state and constitutive relations. This paper aims at reviewing some milestones in this direction. An essential basic step toward the development of any static and quasi-static theory of granular matter is a systematic and useful method to quantify the grain-scale structure and we start with a review of such a method. We then review and discuss the ongoing attempt to construct a statistical mechanical theory of granular systems. Along the way, we will clarify a number of misconceptions in the field, as well as highlight several outstanding problems.
Statistical evidence for common ancestry: Application to primates.
Baum, David A; Ané, Cécile; Larget, Bret; Solís-Lemus, Claudia; Ho, Lam Si Tung; Boone, Peggy; Drummond, Chloe P; Bontrager, Martin; Hunter, Steven J; Saucier, William
2016-06-01
Since Darwin, biologists have come to recognize that the theory of descent from common ancestry (CA) is very well supported by diverse lines of evidence. However, while the qualitative evidence is overwhelming, we also need formal methods for quantifying the evidential support for CA over the alternative hypothesis of separate ancestry (SA). In this article, we explore a diversity of statistical methods using data from the primates. We focus on two alternatives to CA, species SA (the separate origin of each named species) and family SA (the separate origin of each family). We implemented statistical tests based on morphological, molecular, and biogeographic data and developed two new methods: one that tests for phylogenetic autocorrelation while correcting for variation due to confounding ecological traits and a method for examining whether fossil taxa have fewer derived differences than living taxa. We overwhelmingly rejected both species and family SA with infinitesimal P values. We compare these results with those from two companion papers, which also found tremendously strong support for the CA of all primates, and discuss future directions and general philosophical issues that pertain to statistical testing of historical hypotheses such as CA. © 2016 The Author(s). Evolution © 2016 The Society for the Study of Evolution.
Redshift data and statistical inference
NASA Technical Reports Server (NTRS)
Newman, William I.; Haynes, Martha P.; Terzian, Yervant
1994-01-01
Frequency histograms and the 'power spectrum analysis' (PSA) method, the latter developed by Yu & Peebles (1969), have been widely employed as techniques for establishing the existence of periodicities. We provide a formal analysis of these two classes of methods, including controlled numerical experiments, to better understand their proper use and application. In particular, we note that typical published applications of frequency histograms commonly employ far greater numbers of class intervals or bins than is advisable by statistical theory sometimes giving rise to the appearance of spurious patterns. The PSA method generates a sequence of random numbers from observational data which, it is claimed, is exponentially distributed with unit mean and variance, essentially independent of the distribution of the original data. We show that the derived random processes is nonstationary and produces a small but systematic bias in the usual estimate of the mean and variance. Although the derived variable may be reasonably described by an exponential distribution, the tail of the distribution is far removed from that of an exponential, thereby rendering statistical inference and confidence testing based on the tail of the distribution completely unreliable. Finally, we examine a number of astronomical examples wherein these methods have been used giving rise to widespread acceptance of statistically unconfirmed conclusions.
2014-01-01
Background The current paper presents a pilot study of interactive assessment using information and communication technology (ICT) to evaluate the knowledge, skills and abilities of staff with no formal education who are working in Swedish elderly care. Methods Theoretical and practical assessment methods were developed and used with simulated patients and computer-based tests to identify strengths and areas for personal development among staff with no formal education. Results Of the 157 staff with no formal education, 87 began the practical and/or theoretical assessments, and 63 completed both assessments. Several of the staff passed the practical assessments, except the morning hygiene assessment, where several failed. Other areas for staff development, i.e. where several failed (>50%), were the theoretical assessment of the learning objectives: Health, Oral care, Ergonomics, hygiene, esthetic, environmental, Rehabilitation, Assistive technology, Basic healthcare and Laws and organization. None of the staff passed all assessments. Number of years working in elderly care and staff age were not statistically significantly related to the total score of grades on the various learning objectives. Conclusion The interactive assessments were useful in assessing staff members’ practical and theoretical knowledge, skills, and abilities and in identifying areas in need of development. It is important that personnel who lack formal qualifications be clearly identified and given a chance to develop their competence through training, both theoretical and practical. The interactive e-assessment approach analyzed in the present pilot study could serve as a starting point. PMID:24742168
On the statistical equivalence of restrained-ensemble simulations with the maximum entropy method
Roux, Benoît; Weare, Jonathan
2013-01-01
An issue of general interest in computer simulations is to incorporate information from experiments into a structural model. An important caveat in pursuing this goal is to avoid corrupting the resulting model with spurious and arbitrary biases. While the problem of biasing thermodynamic ensembles can be formulated rigorously using the maximum entropy method introduced by Jaynes, the approach can be cumbersome in practical applications with the need to determine multiple unknown coefficients iteratively. A popular alternative strategy to incorporate the information from experiments is to rely on restrained-ensemble molecular dynamics simulations. However, the fundamental validity of this computational strategy remains in question. Here, it is demonstrated that the statistical distribution produced by restrained-ensemble simulations is formally consistent with the maximum entropy method of Jaynes. This clarifies the underlying conditions under which restrained-ensemble simulations will yield results that are consistent with the maximum entropy method. PMID:23464140
Formal Methods Tool Qualification
NASA Technical Reports Server (NTRS)
Wagner, Lucas G.; Cofer, Darren; Slind, Konrad; Tinelli, Cesare; Mebsout, Alain
2017-01-01
Formal methods tools have been shown to be effective at finding defects in safety-critical digital systems including avionics systems. The publication of DO-178C and the accompanying formal methods supplement DO-333 allows applicants to obtain certification credit for the use of formal methods without providing justification for them as an alternative method. This project conducted an extensive study of existing formal methods tools, identifying obstacles to their qualification and proposing mitigations for those obstacles. Further, it interprets the qualification guidance for existing formal methods tools and provides case study examples for open source tools. This project also investigates the feasibility of verifying formal methods tools by generating proof certificates which capture proof of the formal methods tool's claim, which can be checked by an independent, proof certificate checking tool. Finally, the project investigates the feasibility of qualifying this proof certificate checker, in the DO-330 framework, in lieu of qualifying the model checker itself.
NASA Astrophysics Data System (ADS)
Kreinovich, Vladik; Longpre, Luc; Koshelev, Misha
1998-09-01
Most practical applications of statistical methods are based on the implicit assumption that if an event has a very small probability, then it cannot occur. For example, the probability that a kettle placed on a cold stove would start boiling by itself is not 0, it is positive, but it is so small, that physicists conclude that such an event is simply impossible. This assumption is difficult to formalize in traditional probability theory, because this theory only describes measures on sets and does not allow us to divide functions into 'random' and non-random ones. This distinction was made possible by the idea of algorithmic randomness, introduce by Kolmogorov and his student Martin- Loef in the 1960s. We show that this idea can also be used for inverse problems. In particular, we prove that for every probability measure, the corresponding set of random functions is compact, and, therefore, the corresponding restricted inverse problem is well-defined. The resulting techniques turns out to be interestingly related with the qualitative esthetic measure introduced by G. Birkhoff as order/complexity.
Relative mass distributions of neutron-rich thermally fissile nuclei within a statistical model
NASA Astrophysics Data System (ADS)
Kumar, Bharat; Kannan, M. T. Senthil; Balasubramaniam, M.; Agrawal, B. K.; Patra, S. K.
2017-09-01
We study the binary mass distribution for the recently predicted thermally fissile neutron-rich uranium and thorium nuclei using a statistical model. The level density parameters needed for the study are evaluated from the excitation energies of the temperature-dependent relativistic mean field formalism. The excitation energy and the level density parameter for a given temperature are employed in the convolution integral method to obtain the probability of the particular fragmentation. As representative cases, we present the results for the binary yields of 250U and 254Th. The relative yields are presented for three different temperatures: T =1 , 2, and 3 MeV.
Guidance for Using Formal Methods in a Certification Context
NASA Technical Reports Server (NTRS)
Brown, Duncan; Delseny, Herve; Hayhurst, Kelly; Wiels, Virginie
2010-01-01
This paper discusses some of the challenges to using formal methods in a certification context and describes the effort by the Formal Methods Subgroup of RTCA SC-205/EUROCAE WG-71 to propose guidance to make the use of formal methods a recognized approach. This guidance, expected to take the form of a Formal Methods Technical Supplement to DO-178C/ED-12C, is described, including the activities that are needed when using formal methods, new or modified objectives with respect to the core DO-178C/ED-12C document, and evidence needed for meeting those objectives.
Yu, Alexander C; Cimino, James J
2011-04-01
Most existing controlled terminologies can be characterized as collections of terms, wherein the terms are arranged in a simple list or organized in a hierarchy. These kinds of terminologies are considered useful for standardizing terms and encoding data and are currently used in many existing information systems. However, they suffer from a number of limitations that make data reuse difficult. Relatively recently, it has been proposed that formal ontological methods can be applied to some of the problems of terminological design. Biomedical ontologies organize concepts (embodiments of knowledge about biomedical reality) whereas terminologies organize terms (what is used to code patient data at a certain point in time, based on the particular terminology version). However, the application of these methods to existing terminologies is not straightforward. The use of these terminologies is firmly entrenched in many systems, and what might seem to be a simple option of replacing these terminologies is not possible. Moreover, these terminologies evolve over time in order to suit the needs of users. Any methodology must therefore take these constraints into consideration, hence the need for formal methods of managing changes. Along these lines, we have developed a formal representation of the concept-term relation, around which we have also developed a methodology for management of terminology changes. The objective of this study was to determine whether our methodology would result in improved retrieval of data. Comparison of two methods for retrieving data encoded with terms from the International Classification of Diseases (ICD-9-CM), based on their recall when retrieving data for ICD-9-CM terms whose codes had changed but which had retained their original meaning (code change). Recall and interclass correlation coefficient. Statistically significant differences were detected (p<0.05) with the McNemar test for two terms whose codes had changed. Furthermore, when all the cases are combined in an overall category, our method also performs statistically significantly better (p<0.05). Our study shows that an ontology-based ICD-9-CM data retrieval method that takes into account the effects of terminology changes performs better on recall than one that does not in the retrieval of data for terms whose codes had changed but which retained their original meaning. Copyright © 2011 Elsevier Inc. All rights reserved.
Yu, Alexander C.; Cimino, James J.
2012-01-01
Objective Most existing controlled terminologies can be characterized as collections of terms, wherein the terms are arranged in a simple list or organized in a hierarchy. These kinds of terminologies are considered useful for standardizing terms and encoding data and are currently used in many existing information systems. However, they suffer from a number of limitations that make data reuse difficult. Relatively recently, it has been proposed that formal ontological methods can be applied to some of the problems of terminological design. Biomedical ontologies organize concepts (embodiments of knowledge about biomedical reality) whereas terminologies organize terms (what is used to code patient data at a certain point in time, based on the particular terminology version). However, the application of these methods to existing terminologies is not straightforward. The use of these terminologies is firmly entrenched in many systems, and what might seem to be a simple option of replacing these terminologies is not possible. Moreover, these terminologies evolve over time in order to suit the needs of users. Any methodology must therefore take these constraints into consideration, hence the need for formal methods of managing changes. Along these lines, we have developed a formal representation of the concept-term relation, around which we have also developed a methodology for management of terminology changes. The objective of this study was to determine whether our methodology would result in improved retrieval of data. Design Comparison of two methods for retrieving data encoded with terms from the International Classification of Diseases (ICD-9-CM), based on their recall when retrieving data for ICD-9-CM terms whose codes had changed but which had retained their original meaning (code change). Measurements Recall and interclass correlation coefficient. Results Statistically significant differences were detected (p<0.05) with the McNemar test for two terms whose codes had changed. Furthermore, when all the cases are combined in an overall category, our method also performs statistically significantly better (p < 0.05). Conclusion Our study shows that an ontology-based ICD-9-CM data retrieval method that takes into account the effects of terminology changes performs better on recall than one that does not in the retrieval of data for terms whose codes had changed but which retained their original meaning. PMID:21262390
Inference of missing data and chemical model parameters using experimental statistics
NASA Astrophysics Data System (ADS)
Casey, Tiernan; Najm, Habib
2017-11-01
A method for determining the joint parameter density of Arrhenius rate expressions through the inference of missing experimental data is presented. This approach proposes noisy hypothetical data sets from target experiments and accepts those which agree with the reported statistics, in the form of nominal parameter values and their associated uncertainties. The data exploration procedure is formalized using Bayesian inference, employing maximum entropy and approximate Bayesian computation methods to arrive at a joint density on data and parameters. The method is demonstrated in the context of reactions in the H2-O2 system for predictive modeling of combustion systems of interest. Work supported by the US DOE BES CSGB. Sandia National Labs is a multimission lab managed and operated by Nat. Technology and Eng'g Solutions of Sandia, LLC., a wholly owned subsidiary of Honeywell Intl, for the US DOE NCSA under contract DE-NA-0003525.
Robust inference for group sequential trials.
Ganju, Jitendra; Lin, Yunzhi; Zhou, Kefei
2017-03-01
For ethical reasons, group sequential trials were introduced to allow trials to stop early in the event of extreme results. Endpoints in such trials are usually mortality or irreversible morbidity. For a given endpoint, the norm is to use a single test statistic and to use that same statistic for each analysis. This approach is risky because the test statistic has to be specified before the study is unblinded, and there is loss in power if the assumptions that ensure optimality for each analysis are not met. To minimize the risk of moderate to substantial loss in power due to a suboptimal choice of a statistic, a robust method was developed for nonsequential trials. The concept is analogous to diversification of financial investments to minimize risk. The method is based on combining P values from multiple test statistics for formal inference while controlling the type I error rate at its designated value.This article evaluates the performance of 2 P value combining methods for group sequential trials. The emphasis is on time to event trials although results from less complex trials are also included. The gain or loss in power with the combination method relative to a single statistic is asymmetric in its favor. Depending on the power of each individual test, the combination method can give more power than any single test or give power that is closer to the test with the most power. The versatility of the method is that it can combine P values from different test statistics for analysis at different times. The robustness of results suggests that inference from group sequential trials can be strengthened with the use of combined tests. Copyright © 2017 John Wiley & Sons, Ltd.
Meta-analysis of diagnostic test data: a bivariate Bayesian modeling approach.
Verde, Pablo E
2010-12-30
In the last decades, the amount of published results on clinical diagnostic tests has expanded very rapidly. The counterpart to this development has been the formal evaluation and synthesis of diagnostic results. However, published results present substantial heterogeneity and they can be regarded as so far removed from the classical domain of meta-analysis, that they can provide a rather severe test of classical statistical methods. Recently, bivariate random effects meta-analytic methods, which model the pairs of sensitivities and specificities, have been presented from the classical point of view. In this work a bivariate Bayesian modeling approach is presented. This approach substantially extends the scope of classical bivariate methods by allowing the structural distribution of the random effects to depend on multiple sources of variability. Meta-analysis is summarized by the predictive posterior distributions for sensitivity and specificity. This new approach allows, also, to perform substantial model checking, model diagnostic and model selection. Statistical computations are implemented in the public domain statistical software (WinBUGS and R) and illustrated with real data examples. Copyright © 2010 John Wiley & Sons, Ltd.
Basic statistics (the fundamental concepts).
Lim, Eric
2014-12-01
An appreciation and understanding of statistics is import to all practising clinicians, not simply researchers. This is because mathematics is the fundamental basis to which we base clinical decisions, usually with reference to the benefit in relation to risk. Unless a clinician has a basic understanding of statistics, he or she will never be in a position to question healthcare management decisions that have been handed down from generation to generation, will not be able to conduct research effectively nor evaluate the validity of published evidence (usually making an assumption that most published work is either all good or all bad). This article provides a brief introduction to basic statistical methods and illustrates its use in common clinical scenarios. In addition, pitfalls of incorrect usage have been highlighted. However, it is not meant to be a substitute for formal training or consultation with a qualified and experienced medical statistician prior to starting any research project.
Influence of Culture on Secondary School Students' Understanding of Statistics: A Fijian Perspective
ERIC Educational Resources Information Center
Sharma, Sashi
2014-01-01
Although we use statistical notions daily in making decisions, research in statistics education has focused mostly on formal statistics. Further, everyday culture may influence informal ideas of statistics. Yet, there appears to be minimal literature that deals with the educational implications of the role of culture. This paper will discuss the…
Flow Equation Approach to the Statistics of Nonlinear Dynamical Systems
NASA Astrophysics Data System (ADS)
Marston, J. B.; Hastings, M. B.
2005-03-01
The probability distribution function of non-linear dynamical systems is governed by a linear framework that resembles quantum many-body theory, in which stochastic forcing and/or averaging over initial conditions play the role of non-zero . Besides the well-known Fokker-Planck approach, there is a related Hopf functional methodootnotetextUriel Frisch, Turbulence: The Legacy of A. N. Kolmogorov (Cambridge University Press, 1995) chapter 9.5.; in both formalisms, zero modes of linear operators describe the stationary non-equilibrium statistics. To access the statistics, we investigate the method of continuous unitary transformationsootnotetextS. D. Glazek and K. G. Wilson, Phys. Rev. D 48, 5863 (1993); Phys. Rev. D 49, 4214 (1994). (also known as the flow equation approachootnotetextF. Wegner, Ann. Phys. 3, 77 (1994).), suitably generalized to the diagonalization of non-Hermitian matrices. Comparison to the more traditional cumulant expansion method is illustrated with low-dimensional attractors. The treatment of high-dimensional dynamical systems is also discussed.
Effects of Mobile Learning in Medical Education: A Counterfactual Evaluation.
Briz-Ponce, Laura; Juanes-Méndez, Juan Antonio; García-Peñalvo, Francisco José; Pereira, Anabela
2016-06-01
The aim of this research is to contribute to the general system education providing new insights and resources. This study performs a quasi-experimental study at University of Salamanca with 30 students to compare results between using an anatomic app for learning and the formal traditional method conducted by a teacher. The findings of the investigation suggest that the performance of learners using mobile apps is statistical better than the students using the traditional method. However, mobile devices should be considered as an additional tool to complement the teachers' explanation and it is necessary to overcome different barriers and challenges to adopt these pedagogical methods at University.
University of California Conference on Statistical Mechanics (4th) Held March 26-28, 1990
1990-03-28
and S. Lago, Chem. Phys., Z, 5750 (1983) Shear Viscosity Calculation via Equilibrium Molecular Dynamics: Einstenian vs. Green - Kubo Formalism by Adel A...through the application of the Green - Kubo approach. Although the theoretical equivalence between both formalisms was demonstrated by Helfand [3], their...like equations and of different expressions based on the Green - Kubo formalism. In contrast to Hoheisel and Vogelsang’s conclusions [2], we find that
Fundamental Statistical Descriptions of Plasma Turbulence in Magnetic Fields
DOE Office of Scientific and Technical Information (OSTI.GOV)
John A. Krommes
2001-02-16
A pedagogical review of the historical development and current status (as of early 2000) of systematic statistical theories of plasma turbulence is undertaken. Emphasis is on conceptual foundations and methodology, not practical applications. Particular attention is paid to equations and formalism appropriate to strongly magnetized, fully ionized plasmas. Extensive reference to the literature on neutral-fluid turbulence is made, but the unique properties and problems of plasmas are emphasized throughout. Discussions are given of quasilinear theory, weak-turbulence theory, resonance-broadening theory, and the clump algorithm. Those are developed independently, then shown to be special cases of the direct-interaction approximation (DIA), which providesmore » a central focus for the article. Various methods of renormalized perturbation theory are described, then unified with the aid of the generating-functional formalism of Martin, Siggia, and Rose. A general expression for the renormalized dielectric function is deduced and discussed in detail. Modern approaches such as decimation and PDF methods are described. Derivations of DIA-based Markovian closures are discussed. The eddy-damped quasinormal Markovian closure is shown to be nonrealizable in the presence of waves, and a new realizable Markovian closure is presented. The test-field model and a realizable modification thereof are also summarized. Numerical solutions of various closures for some plasma-physics paradigms are reviewed. The variational approach to bounds on transport is developed. Miscellaneous topics include Onsager symmetries for turbulence, the interpretation of entropy balances for both kinetic and fluid descriptions, self-organized criticality, statistical interactions between disparate scales, and the roles of both mean and random shear. Appendices are provided on Fourier transform conventions, dimensional and scaling analysis, the derivations of nonlinear gyrokinetic and gyrofluid equations, stochasticity criteria for quasilinear theory, formal aspects of resonance-broadening theory, Novikov's theorem, the treatment of weak inhomogeneity, the derivation of the Vlasov weak-turbulence wave kinetic equation from a fully renormalized description, some features of a code for solving the direct-interaction approximation and related Markovian closures, the details of the solution of the EDQNM closure for a solvable three-wave model, and the notation used in the article.« less
Statistical mechanics of few-particle systems: exact results for two useful models
NASA Astrophysics Data System (ADS)
Miranda, Enrique N.
2017-11-01
The statistical mechanics of small clusters (n ˜ 10-50 elements) of harmonic oscillators and two-level systems is studied exactly, following the microcanonical, canonical and grand canonical formalisms. For clusters with several hundred particles, the results from the three formalisms coincide with those found in the thermodynamic limit. However, for clusters formed by a few tens of elements, the three ensembles yield different results. For a cluster with a few tens of harmonic oscillators, when the heat capacity per oscillator is evaluated within the canonical formalism, it reaches a limit value equal to k B , as in the thermodynamic case, while within the microcanonical formalism the limit value is k B (1-1/n). This difference could be measured experimentally. For a cluster with a few tens of two-level systems, the heat capacity evaluated within the canonical and microcanonical ensembles also presents differences that could be detected experimentally. Both the microcanonical and grand canonical formalism show that the entropy is non-additive for systems this small, while the canonical ensemble reaches the opposite conclusion. These results suggest that the microcanonical ensemble is the most appropriate for dealing with systems with tens of particles.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kaurov, Alexander A., E-mail: kaurov@uchicago.edu
The methods for studying the epoch of cosmic reionization vary from full radiative transfer simulations to purely analytical models. While numerical approaches are computationally expensive and are not suitable for generating many mock catalogs, analytical methods are based on assumptions and approximations. We explore the interconnection between both methods. First, we ask how the analytical framework of excursion set formalism can be used for statistical analysis of numerical simulations and visual representation of the morphology of ionization fronts. Second, we explore the methods of training the analytical model on a given numerical simulation. We present a new code which emergedmore » from this study. Its main application is to match the analytical model with a numerical simulation. Then, it allows one to generate mock reionization catalogs with volumes exceeding the original simulation quickly and computationally inexpensively, meanwhile reproducing large-scale statistical properties. These mock catalogs are particularly useful for cosmic microwave background polarization and 21 cm experiments, where large volumes are required to simulate the observed signal.« less
Discrete approach to stochastic parametrization and dimension reduction in nonlinear dynamics.
Chorin, Alexandre J; Lu, Fei
2015-08-11
Many physical systems are described by nonlinear differential equations that are too complicated to solve in full. A natural way to proceed is to divide the variables into those that are of direct interest and those that are not, formulate solvable approximate equations for the variables of greater interest, and use data and statistical methods to account for the impact of the other variables. In the present paper we consider time-dependent problems and introduce a fully discrete solution method, which simplifies both the analysis of the data and the numerical algorithms. The resulting time series are identified by a NARMAX (nonlinear autoregression moving average with exogenous input) representation familiar from engineering practice. The connections with the Mori-Zwanzig formalism of statistical physics are discussed, as well as an application to the Lorenz 96 system.
A Geometrical-Statistical Approach to Outlier Removal for TDOA Measurements
NASA Astrophysics Data System (ADS)
Compagnoni, Marco; Pini, Alessia; Canclini, Antonio; Bestagini, Paolo; Antonacci, Fabio; Tubaro, Stefano; Sarti, Augusto
2017-08-01
The curse of outlier measurements in estimation problems is a well known issue in a variety of fields. Therefore, outlier removal procedures, which enables the identification of spurious measurements within a set, have been developed for many different scenarios and applications. In this paper, we propose a statistically motivated outlier removal algorithm for time differences of arrival (TDOAs), or equivalently range differences (RD), acquired at sensor arrays. The method exploits the TDOA-space formalism and works by only knowing relative sensor positions. As the proposed method is completely independent from the application for which measurements are used, it can be reliably used to identify outliers within a set of TDOA/RD measurements in different fields (e.g. acoustic source localization, sensor synchronization, radar, remote sensing, etc.). The proposed outlier removal algorithm is validated by means of synthetic simulations and real experiments.
Data-Driven Learning of Q-Matrix
Liu, Jingchen; Xu, Gongjun; Ying, Zhiliang
2013-01-01
The recent surge of interests in cognitive assessment has led to developments of novel statistical models for diagnostic classification. Central to many such models is the well-known Q-matrix, which specifies the item–attribute relationships. This article proposes a data-driven approach to identification of the Q-matrix and estimation of related model parameters. A key ingredient is a flexible T-matrix that relates the Q-matrix to response patterns. The flexibility of the T-matrix allows the construction of a natural criterion function as well as a computationally amenable algorithm. Simulations results are presented to demonstrate usefulness and applicability of the proposed method. Extension to handling of the Q-matrix with partial information is presented. The proposed method also provides a platform on which important statistical issues, such as hypothesis testing and model selection, may be formally addressed. PMID:23926363
A New Statistic for Evaluating Item Response Theory Models for Ordinal Data. CRESST Report 839
ERIC Educational Resources Information Center
Cai, Li; Monroe, Scott
2014-01-01
We propose a new limited-information goodness of fit test statistic C[subscript 2] for ordinal IRT models. The construction of the new statistic lies formally between the M[subscript 2] statistic of Maydeu-Olivares and Joe (2006), which utilizes first and second order marginal probabilities, and the M*[subscript 2] statistic of Cai and Hansen…
Dynamic principle for ensemble control tools.
Samoletov, A; Vasiev, B
2017-11-28
Dynamical equations describing physical systems in contact with a thermal bath are commonly extended by mathematical tools called "thermostats." These tools are designed for sampling ensembles in statistical mechanics. Here we propose a dynamic principle underlying a range of thermostats which is derived using fundamental laws of statistical physics and ensures invariance of the canonical measure. The principle covers both stochastic and deterministic thermostat schemes. Our method has a clear advantage over a range of proposed and widely used thermostat schemes that are based on formal mathematical reasoning. Following the derivation of the proposed principle, we show its generality and illustrate its applications including design of temperature control tools that differ from the Nosé-Hoover-Langevin scheme.
Longitudinal Effects on Early Adolescent Language: A Twin Study
DeThorne, Laura Segebart; Smith, Jamie Mahurin; Betancourt, Mariana Aparicio; Petrill, Stephen A.
2016-01-01
Purpose We evaluated genetic and environmental contributions to individual differences in language skills during early adolescence, measured by both language sampling and standardized tests, and examined the extent to which these genetic and environmental effects are stable across time. Method We used structural equation modeling on latent factors to estimate additive genetic, shared environmental, and nonshared environmental effects on variance in standardized language skills (i.e., Formal Language) and productive language-sample measures (i.e., Productive Language) in a sample of 527 twins across 3 time points (mean ages 10–12 years). Results Individual differences in the Formal Language factor were influenced primarily by genetic factors at each age, whereas individual differences in the Productive Language factor were primarily due to nonshared environmental influences. For the Formal Language factor, the stability of genetic effects was high across all 3 time points. For the Productive Language factor, nonshared environmental effects showed low but statistically significant stability across adjacent time points. Conclusions The etiology of language outcomes may differ substantially depending on assessment context. In addition, the potential mechanisms for nonshared environmental influences on language development warrant further investigation. PMID:27732720
Marino, Ricardo; Majumdar, Satya N; Schehr, Grégory; Vivo, Pierpaolo
2016-09-01
Let P_{β}^{(V)}(N_{I}) be the probability that a N×Nβ-ensemble of random matrices with confining potential V(x) has N_{I} eigenvalues inside an interval I=[a,b] on the real line. We introduce a general formalism, based on the Coulomb gas technique and the resolvent method, to compute analytically P_{β}^{(V)}(N_{I}) for large N. We show that this probability scales for large N as P_{β}^{(V)}(N_{I})≈exp[-βN^{2}ψ^{(V)}(N_{I}/N)], where β is the Dyson index of the ensemble. The rate function ψ^{(V)}(k_{I}), independent of β, is computed in terms of single integrals that can be easily evaluated numerically. The general formalism is then applied to the classical β-Gaussian (I=[-L,L]), β-Wishart (I=[1,L]), and β-Cauchy (I=[-L,L]) ensembles. Expanding the rate function around its minimum, we find that generically the number variance var(N_{I}) exhibits a nonmonotonic behavior as a function of the size of the interval, with a maximum that can be precisely characterized. These analytical results, corroborated by numerical simulations, provide the full counting statistics of many systems where random matrix models apply. In particular, we present results for the full counting statistics of zero-temperature one-dimensional spinless fermions in a harmonic trap.
Recent trends related to the use of formal methods in software engineering
NASA Technical Reports Server (NTRS)
Prehn, Soren
1986-01-01
An account is given of some recent developments and trends related to the development and use of formal methods in software engineering. Ongoing activities in Europe are focussed on, since there seems to be a notable difference in attitude towards industrial usage of formal methods in Europe and in the U.S. A more detailed account is given of the currently most widespread formal method in Europe: the Vienna Development Method. Finally, the use of Ada is discussed in relation to the application of formal methods, and the potential for constructing Ada-specific tools based on that method is considered.
Formal Methods for Life-Critical Software
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Johnson, Sally C.
1993-01-01
The use of computer software in life-critical applications, such as for civil air transports, demands the use of rigorous formal mathematical verification procedures. This paper demonstrates how to apply formal methods to the development and verification of software by leading the reader step-by-step through requirements analysis, design, implementation, and verification of an electronic phone book application. The current maturity and limitations of formal methods tools and techniques are then discussed, and a number of examples of the successful use of formal methods by industry are cited.
Nuclear deformation in the laboratory frame
NASA Astrophysics Data System (ADS)
Gilbreth, C. N.; Alhassid, Y.; Bertsch, G. F.
2018-01-01
We develop a formalism for calculating the distribution of the axial quadrupole operator in the laboratory frame within the rotationally invariant framework of the configuration-interaction shell model. The calculation is carried out using a finite-temperature auxiliary-field quantum Monte Carlo method. We apply this formalism to isotope chains of even-mass samarium and neodymium nuclei and show that the quadrupole distribution provides a model-independent signature of nuclear deformation. Two technical advances are described that greatly facilitate the calculations. The first is to exploit the rotational invariance of the underlying Hamiltonian to reduce the statistical fluctuations in the Monte Carlo calculations. The second is to determine quadruple invariants from the distribution of the axial quadrupole operator in the laboratory frame. This allows us to extract effective values of the intrinsic quadrupole shape parameters without invoking an intrinsic frame or a mean-field approximation.
NASA Astrophysics Data System (ADS)
Welden, Alicia Rae; Rusakov, Alexander A.; Zgid, Dominika
2016-11-01
Including finite-temperature effects from the electronic degrees of freedom in electronic structure calculations of semiconductors and metals is desired; however, in practice it remains exceedingly difficult when using zero-temperature methods, since these methods require an explicit evaluation of multiple excited states in order to account for any finite-temperature effects. Using a Matsubara Green's function formalism remains a viable alternative, since in this formalism it is easier to include thermal effects and to connect the dynamic quantities such as the self-energy with static thermodynamic quantities such as the Helmholtz energy, entropy, and internal energy. However, despite the promising properties of this formalism, little is known about the multiple solutions of the non-linear equations present in the self-consistent Matsubara formalism and only a few cases involving a full Coulomb Hamiltonian were investigated in the past. Here, to shed some light onto the iterative nature of the Green's function solutions, we self-consistently evaluate the thermodynamic quantities for a one-dimensional (1D) hydrogen solid at various interatomic separations and temperatures using the self-energy approximated to second-order (GF2). At many points in the phase diagram of this system, multiple phases such as a metal and an insulator exist, and we are able to determine the most stable phase from the analysis of Helmholtz energies. Additionally, we show the evolution of the spectrum of 1D boron nitride to demonstrate that GF2 is capable of qualitatively describing the temperature effects influencing the size of the band gap.
NASA Technical Reports Server (NTRS)
1995-01-01
This guidebook, the second of a two-volume series, is intended to facilitate the transfer of formal methods to the avionics and aerospace community. The 1st volume concentrates on administrative and planning issues [NASA-95a], and the second volume focuses on the technical issues involved in applying formal methods to avionics and aerospace software systems. Hereafter, the term "guidebook" refers exclusively to the second volume of the series. The title of this second volume, A Practitioner's Companion, conveys its intent. The guidebook is written primarily for the nonexpert and requires little or no prior experience with formal methods techniques and tools. However, it does attempt to distill some of the more subtle ingredients in the productive application of formal methods. To the extent that it succeeds, those conversant with formal methods will also nd the guidebook useful. The discussion is illustrated through the development of a realistic example, relevant fragments of which appear in each chapter. The guidebook focuses primarily on the use of formal methods for analysis of requirements and high-level design, the stages at which formal methods have been most productively applied. Although much of the discussion applies to low-level design and implementation, the guidebook does not discuss issues involved in the later life cycle application of formal methods.
Multivariate assessment of event-related potentials with the t-CWT method.
Bostanov, Vladimir
2015-11-05
Event-related brain potentials (ERPs) are usually assessed with univariate statistical tests although they are essentially multivariate objects. Brain-computer interface applications are a notable exception to this practice, because they are based on multivariate classification of single-trial ERPs. Multivariate ERP assessment can be facilitated by feature extraction methods. One such method is t-CWT, a mathematical-statistical algorithm based on the continuous wavelet transform (CWT) and Student's t-test. This article begins with a geometric primer on some basic concepts of multivariate statistics as applied to ERP assessment in general and to the t-CWT method in particular. Further, it presents for the first time a detailed, step-by-step, formal mathematical description of the t-CWT algorithm. A new multivariate outlier rejection procedure based on principal component analysis in the frequency domain is presented as an important pre-processing step. The MATLAB and GNU Octave implementation of t-CWT is also made publicly available for the first time as free and open source code. The method is demonstrated on some example ERP data obtained in a passive oddball paradigm. Finally, some conceptually novel applications of the multivariate approach in general and of the t-CWT method in particular are suggested and discussed. Hopefully, the publication of both the t-CWT source code and its underlying mathematical algorithm along with a didactic geometric introduction to some basic concepts of multivariate statistics would make t-CWT more accessible to both users and developers in the field of neuroscience research.
Nonlinear estimation of parameters in biphasic Arrhenius plots.
Puterman, M L; Hrboticky, N; Innis, S M
1988-05-01
This paper presents a formal procedure for the statistical analysis of data on the thermotropic behavior of membrane-bound enzymes generated using the Arrhenius equation and compares the analysis to several alternatives. Data is modeled by a bent hyperbola. Nonlinear regression is used to obtain estimates and standard errors of the intersection of line segments, defined as the transition temperature, and slopes, defined as energies of activation of the enzyme reaction. The methodology allows formal tests of the adequacy of a biphasic model rather than either a single straight line or a curvilinear model. Examples on data concerning the thermotropic behavior of pig brain synaptosomal acetylcholinesterase are given. The data support the biphasic temperature dependence of this enzyme. The methodology represents a formal procedure for statistical validation of any biphasic data and allows for calculation of all line parameters with estimates of precision.
The space of ultrametric phylogenetic trees.
Gavryushkin, Alex; Drummond, Alexei J
2016-08-21
The reliability of a phylogenetic inference method from genomic sequence data is ensured by its statistical consistency. Bayesian inference methods produce a sample of phylogenetic trees from the posterior distribution given sequence data. Hence the question of statistical consistency of such methods is equivalent to the consistency of the summary of the sample. More generally, statistical consistency is ensured by the tree space used to analyse the sample. In this paper, we consider two standard parameterisations of phylogenetic time-trees used in evolutionary models: inter-coalescent interval lengths and absolute times of divergence events. For each of these parameterisations we introduce a natural metric space on ultrametric phylogenetic trees. We compare the introduced spaces with existing models of tree space and formulate several formal requirements that a metric space on phylogenetic trees must possess in order to be a satisfactory space for statistical analysis, and justify them. We show that only a few known constructions of the space of phylogenetic trees satisfy these requirements. However, our results suggest that these basic requirements are not enough to distinguish between the two metric spaces we introduce and that the choice between metric spaces requires additional properties to be considered. Particularly, that the summary tree minimising the square distance to the trees from the sample might be different for different parameterisations. This suggests that further fundamental insight is needed into the problem of statistical consistency of phylogenetic inference methods. Copyright © 2016 The Authors. Published by Elsevier Ltd.. All rights reserved.
Dietary diversity of formal and informal residents in Johannesburg, South Africa
2013-01-01
Background This paper considers the question of dietary diversity as a proxy for nutrition insecurity in communities living in the inner city and the urban informal periphery in Johannesburg. It argues that the issue of nutrition insecurity demands urgent and immediate attention by policy makers. Methods A cross-sectional survey was undertaken for households from urban informal (n = 195) and urban formal (n = 292) areas in Johannesburg, South Africa. Foods consumed by the respondents the previous day were used to calculate a Dietary Diversity Score; a score < 4 was considered low. Results Statistical comparisons of means between groups revealed that respondents from informal settlements consumed mostly cereals and meat/poultry/fish, while respondents in formal settlements consumed a more varied diet. Significantly more respondents living in informal settlements consumed a diet of low diversity (68.1%) versus those in formal settlements (15.4%). When grouped in quintiles, two-thirds of respondents from informal settlements fell in the lowest two, versus 15.4% living in formal settlements. Households who experienced periods of food shortages during the previous 12 months had a lower mean DDS than those from food secure households (4.00 ± 1.6 versus 4.36 ± 1.7; p = 0.026). Conclusions Respondents in the informal settlements were more nutritionally vulnerable. Achieving nutrition security requires policies, strategies and plans to include specific nutrition considerations. PMID:24088249
Properties of a Formal Method for Prediction of Emergent Behaviors in Swarm-based Systems
NASA Technical Reports Server (NTRS)
Rouff, Christopher; Vanderbilt, Amy; Hinchey, Mike; Truszkowski, Walt; Rash, James
2004-01-01
Autonomous intelligent swarms of satellites are being proposed for NASA missions that have complex behaviors and interactions. The emergent properties of swarms make these missions powerful, but at the same time more difficult to design and assure that proper behaviors will emerge. This paper gives the results of research into formal methods techniques for verification and validation of NASA swarm-based missions. Multiple formal methods were evaluated to determine their effectiveness in modeling and assuring the behavior of swarms of spacecraft. The NASA ANTS mission was used as an example of swarm intelligence for which to apply the formal methods. This paper will give the evaluation of these formal methods and give partial specifications of the ANTS mission using four selected methods. We then give an evaluation of the methods and the needed properties of a formal method for effective specification and prediction of emergent behavior in swarm-based systems.
Third NASA Langley Formal Methods Workshop
NASA Technical Reports Server (NTRS)
Holloway, C. Michael (Compiler)
1995-01-01
This publication constitutes the proceedings of NASA Langley Research Center's third workshop on the application of formal methods to the design and verification of life-critical systems. This workshop brought together formal methods researchers, industry engineers, and academicians to discuss the potential of NASA-sponsored formal methods and to investigate new opportunities for applying these methods to industry problems. contained herein are copies of the material presented at the workshop, summaries of many of the presentations, a complete list of attendees, and a detailed summary of the Langley formal methods program. Much of this material is available electronically through the World-Wide Web via the following URL.
Formalizing Space Shuttle Software Requirements
NASA Technical Reports Server (NTRS)
Crow, Judith; DiVito, Ben L.
1996-01-01
This paper describes two case studies in which requirements for new flight-software subsystems on NASA's Space Shuttle were analyzed, one using standard formal specification techniques, the other using state exploration. These applications serve to illustrate three main theses: (1) formal methods can complement conventional requirements analysis processes effectively, (2) formal methods confer benefits regardless of how extensively they are adopted and applied, and (3) formal methods are most effective when they are judiciously tailored to the application.
The Second NASA Formal Methods Workshop 1992
NASA Technical Reports Server (NTRS)
Johnson, Sally C. (Compiler); Holloway, C. Michael (Compiler); Butler, Ricky W. (Compiler)
1992-01-01
The primary goal of the workshop was to bring together formal methods researchers and aerospace industry engineers to investigate new opportunities for applying formal methods to aerospace problems. The first part of the workshop was tutorial in nature. The second part of the workshop explored the potential of formal methods to address current aerospace design and verification problems. The third part of the workshop involved on-line demonstrations of state-of-the-art formal verification tools. Also, a detailed survey was filled in by the attendees; the results of the survey are compiled.
NASA Astrophysics Data System (ADS)
Bommier, Véronique
2016-06-01
Context. We discuss the case of lines formed by scattering, which comprises both coherent and incoherent scattering. Both processes contribute to form the line profiles in the so-called second solar spectrum, which is the spectrum of the linear polarization of such lines observed close to the solar limb. However, most of the lines cannot be simply modeled with a two-level or two-term atom model, and we present a generalized formalism for this purpose. Aims: The aim is to obtain a formalism that is able to describe scattering in line centers (resonant scattering or incoherent scattering) and in far wings (Rayleigh/Raman scattering or coherent scattering) for a multilevel and multiline atom. Methods: The method is designed to overcome the Markov approximation, which is often performed in the atom-photon interaction description. The method was already presented in the two first papers of this series, but the final equations of those papers were for a two-level atom. Results: We present here the final equations generalized for the multilevel and multiline atom. We describe the main steps of the theoretical development, and, in particular, how we performed the series development to overcome the Markov approximation. Conclusions: The statistical equilibrium equations for the atomic density matrix and the radiative transfer equation coefficients are obtained with line profiles. The Doppler redistribution is also taken into account because we show that the statistical equilibrium equations must be solved for each atomic velocity class.
NASA Astrophysics Data System (ADS)
Aouaini, Fatma; Knani, Salah; Yahia, Manel Ben; Bahloul, Neila; Ben Lamine, Abdelmottaleb; Kechaou, Nabil
2015-12-01
In this paper, we present a new investigation that allows determining the pore size distribution (PSD) in a porous medium. This PSD is achieved by using the desorption isotherms of four varieties of olive leaves. This is by the means of statistical physics formalism and Kelvin's law. The results are compared with those obtained with scanning electron microscopy. The effect of temperature on the distribution function of pores has been studied. The influence of each parameter on the PSD is interpreted. A similar function of adsorption energy distribution, AED, is deduced from the PSD.
Experience report: Using formal methods for requirements analysis of critical spacecraft software
NASA Technical Reports Server (NTRS)
Lutz, Robyn R.; Ampo, Yoko
1994-01-01
Formal specification and analysis of requirements continues to gain support as a method for producing more reliable software. However, the introduction of formal methods to a large software project is difficult, due in part to the unfamiliarity of the specification languages and the lack of graphics. This paper reports results of an investigation into the effectiveness of formal methods as an aid to the requirements analysis of critical, system-level fault-protection software on a spacecraft currently under development. Our experience indicates that formal specification and analysis can enhance the accuracy of the requirements and add assurance prior to design development in this domain. The work described here is part of a larger, NASA-funded research project whose purpose is to use formal-methods techniques to improve the quality of software in space applications. The demonstration project described here is part of the effort to evaluate experimentally the effectiveness of supplementing traditional engineering approaches to requirements specification with the more rigorous specification and analysis available with formal methods.
Pezzotti, Giuseppe; Zhu, Wenliang; Boffelli, Marco; Adachi, Tetsuya; Ichioka, Hiroaki; Yamamoto, Toshiro; Marunaka, Yoshinori; Kanamura, Narisato
2015-05-01
The Raman spectroscopic method has quantitatively been applied to the analysis of local crystallographic orientation in both single-crystal hydroxyapatite and human teeth. Raman selection rules for all the vibrational modes of the hexagonal structure were expanded into explicit functions of Euler angles in space and six Raman tensor elements (RTE). A theoretical treatment has also been put forward according to the orientation distribution function (ODF) formalism, which allows one to resolve the statistical orientation patterns of the nm-sized hydroxyapatite crystallite comprised in the Raman microprobe. Close-form solutions could be obtained for the Euler angles and their statistical distributions resolved with respect to the direction of the average texture axis. Polarized Raman spectra from single-crystalline hydroxyapatite and textured polycrystalline (teeth enamel) samples were compared, and a validation of the proposed Raman method could be obtained through confirming the agreement between RTE values obtained from different samples.
Formal methods and digital systems validation for airborne systems
NASA Technical Reports Server (NTRS)
Rushby, John
1993-01-01
This report has been prepared to supplement a forthcoming chapter on formal methods in the FAA Digital Systems Validation Handbook. Its purpose is as follows: to outline the technical basis for formal methods in computer science; to explain the use of formal methods in the specification and verification of software and hardware requirements, designs, and implementations; to identify the benefits, weaknesses, and difficulties in applying these methods to digital systems used on board aircraft; and to suggest factors for consideration when formal methods are offered in support of certification. These latter factors assume the context for software development and assurance described in RTCA document DO-178B, 'Software Considerations in Airborne Systems and Equipment Certification,' Dec. 1992.
Convolutionless Nakajima-Zwanzig equations for stochastic analysis in nonlinear dynamical systems.
Venturi, D; Karniadakis, G E
2014-06-08
Determining the statistical properties of stochastic nonlinear systems is of major interest across many disciplines. Currently, there are no general efficient methods to deal with this challenging problem that involves high dimensionality, low regularity and random frequencies. We propose a framework for stochastic analysis in nonlinear dynamical systems based on goal-oriented probability density function (PDF) methods. The key idea stems from techniques of irreversible statistical mechanics, and it relies on deriving evolution equations for the PDF of quantities of interest, e.g. functionals of the solution to systems of stochastic ordinary and partial differential equations. Such quantities could be low-dimensional objects in infinite dimensional phase spaces. We develop the goal-oriented PDF method in the context of the time-convolutionless Nakajima-Zwanzig-Mori formalism. We address the question of approximation of reduced-order density equations by multi-level coarse graining, perturbation series and operator cumulant resummation. Numerical examples are presented for stochastic resonance and stochastic advection-reaction problems.
Convolutionless Nakajima–Zwanzig equations for stochastic analysis in nonlinear dynamical systems
Venturi, D.; Karniadakis, G. E.
2014-01-01
Determining the statistical properties of stochastic nonlinear systems is of major interest across many disciplines. Currently, there are no general efficient methods to deal with this challenging problem that involves high dimensionality, low regularity and random frequencies. We propose a framework for stochastic analysis in nonlinear dynamical systems based on goal-oriented probability density function (PDF) methods. The key idea stems from techniques of irreversible statistical mechanics, and it relies on deriving evolution equations for the PDF of quantities of interest, e.g. functionals of the solution to systems of stochastic ordinary and partial differential equations. Such quantities could be low-dimensional objects in infinite dimensional phase spaces. We develop the goal-oriented PDF method in the context of the time-convolutionless Nakajima–Zwanzig–Mori formalism. We address the question of approximation of reduced-order density equations by multi-level coarse graining, perturbation series and operator cumulant resummation. Numerical examples are presented for stochastic resonance and stochastic advection–reaction problems. PMID:24910519
NASA Formal Methods Workshop, 1990
NASA Technical Reports Server (NTRS)
Butler, Ricky W. (Compiler)
1990-01-01
The workshop brought together researchers involved in the NASA formal methods research effort for detailed technical interchange and provided a mechanism for interaction with representatives from the FAA and the aerospace industry. The workshop also included speakers from industry to debrief the formal methods researchers on the current state of practice in flight critical system design, verification, and certification. The goals were: define and characterize the verification problem for ultra-reliable life critical flight control systems and the current state of practice in industry today; determine the proper role of formal methods in addressing these problems, and assess the state of the art and recent progress toward applying formal methods to this area.
Statistical Selection of Biological Models for Genome-Wide Association Analyses.
Bi, Wenjian; Kang, Guolian; Pounds, Stanley B
2018-05-24
Genome-wide association studies have discovered many biologically important associations of genes with phenotypes. Typically, genome-wide association analyses formally test the association of each genetic feature (SNP, CNV, etc) with the phenotype of interest and summarize the results with multiplicity-adjusted p-values. However, very small p-values only provide evidence against the null hypothesis of no association without indicating which biological model best explains the observed data. Correctly identifying a specific biological model may improve the scientific interpretation and can be used to more effectively select and design a follow-up validation study. Thus, statistical methodology to identify the correct biological model for a particular genotype-phenotype association can be very useful to investigators. Here, we propose a general statistical method to summarize how accurately each of five biological models (null, additive, dominant, recessive, co-dominant) represents the data observed for each variant in a GWAS study. We show that the new method stringently controls the false discovery rate and asymptotically selects the correct biological model. Simulations of two-stage discovery-validation studies show that the new method has these properties and that its validation power is similar to or exceeds that of simple methods that use the same statistical model for all SNPs. Example analyses of three data sets also highlight these advantages of the new method. An R package is freely available at www.stjuderesearch.org/site/depts/biostats/maew. Copyright © 2018. Published by Elsevier Inc.
A formal framework of scenario creation and analysis of extreme hydrological events
NASA Astrophysics Data System (ADS)
Lohmann, D.
2007-12-01
We are presenting a formal framework for a hydrological risk analysis. Different measures of risk will be introduced, such as average annual loss or occurrence exceedance probability. These are important measures for e.g. insurance companies to determine the cost of insurance. One key aspect of investigating the potential consequences of extreme hydrological events (floods and draughts) is the creation of meteorological scenarios that reflect realistic spatial and temporal patterns of precipitation that also have correct local statistics. 100,000 years of these meteorological scenarios are used in a calibrated rainfall-runoff-flood-loss-risk model to produce flood and draught events that have never been observed. The results of this hazard model are statistically analyzed and linked to socio-economic data and vulnerability functions to show the impact of severe flood events. We are showing results from the Risk Management Solutions (RMS) Europe Flood Model to introduce this formal framework.
Strategies Used by Students to Compare Two Data Sets
ERIC Educational Resources Information Center
Reaburn, Robyn
2012-01-01
One of the common tasks of inferential statistics is to compare two data sets. Long before formal statistical procedures, however, students can be encouraged to make comparisons between data sets and therefore build up intuitive statistical reasoning. Such tasks also give meaning to the data collection students may do. This study describes the…
Formal faculty observation and assessment of bedside skills for 3rd-year neurology clerks
Mooney, Christopher; Wexler, Erika; Mink, Jonathan; Post, Jennifer; Jozefowicz, Ralph F.
2016-01-01
Objective: To evaluate the feasibility and utility of instituting a formalized bedside skills evaluation (BSE) for 3rd-year medical students on the neurology clerkship. Methods: A neurologic BSE was developed for 3rd-year neurology clerks at the University of Rochester for the 2012–2014 academic years. Faculty directly observed 189 students completing a full history and neurologic examination on real inpatients. Mock grades were calculated utilizing the BSE in the final grade, and number of students with a grade difference was determined when compared to true grade. Correlation was explored between the BSE and clinical scores, National Board of Medical Examiners (NBME) scores, case complexity, and true final grades. A survey was administered to students to assess their clinical skills exposure and the usefulness of the BSE. Results: Faculty completed and submitted a BSE form for 88.3% of students. There was a mock final grade change for 13.2% of students. Correlation coefficients between BSE score and clinical score/NBME score were 0.36 and 0.35, respectively. A statistically significant effect of BSE was found on final clerkship grade (F2,186 = 31.9, p < 0.0001). There was no statistical difference between BSE score and differing case complexities. Conclusions: Incorporating a formal faculty-observed BSE into the 3rd year neurology clerkship was feasible. Low correlation between BSE score and other evaluations indicated a unique measurement to contribute to student grade. Using real patients with differing case complexity did not alter the grade. PMID:27770072
Ten Commandments of Formal Methods...Ten Years Later
NASA Technical Reports Server (NTRS)
Bowen, Jonathan P.; Hinchey, Michael G.
2006-01-01
More than a decade ago, in "Ten Commandments of Formal Methods," we offered practical guidelines for projects that sought to use formal methods. Over the years, the article, which was based on our knowledge of successful industrial projects, has been widely cited and has generated much positive feedback. However, despite this apparent enthusiasm, formal methods use has not greatly increased, and some of the same attitudes about the infeasibility of adopting them persist. Formal methodists believe that introducing greater rigor will improve the software development process and yield software with better structure, greater maintainability, and fewer errors.
For a statistical interpretation of Helmholtz' thermal displacement
NASA Astrophysics Data System (ADS)
Podio-Guidugli, Paolo
2016-11-01
On moving from the classic papers by Einstein and Langevin on Brownian motion, two consistent statistical interpretations are given for the thermal displacement, a scalar field formally introduced by Helmholtz, whose time derivative is by definition the absolute temperature.
On the statistical distribution in a deformed solid
NASA Astrophysics Data System (ADS)
Gorobei, N. N.; Luk'yanenko, A. S.
2017-09-01
A modification of the Gibbs distribution in a thermally insulated mechanically deformed solid, where its linear dimensions (shape parameters) are excluded from statistical averaging and included among the macroscopic parameters of state alongside with the temperature, is proposed. Formally, this modification is reduced to corresponding additional conditions when calculating the statistical sum. The shape parameters and the temperature themselves are found from the conditions of mechanical and thermal equilibria of a body, and their change is determined using the first law of thermodynamics. Known thermodynamic phenomena are analyzed for the simple model of a solid, i.e., an ensemble of anharmonic oscillators, within the proposed formalism with an accuracy of up to the first order by the anharmonicity constant. The distribution modification is considered for the classic and quantum temperature regions apart.
Schick, Robert S; Greenwood, Jeremy J D; Buckland, Stephen T
2017-01-01
We assess the analysis of the data resulting from a field experiment conducted by Pilling et al. (PLoS ONE. doi: 10.1371/journal.pone.0077193, 5) on the potential effects of thiamethoxam on honeybees. The experiment had low levels of replication, so Pilling et al. concluded that formal statistical analysis would be misleading. This would be true if such an analysis merely comprised tests of statistical significance and if the investigators concluded that lack of significance meant little or no effect. However, an analysis that includes estimation of the size of any effects-with confidence limits-allows one to reach conclusions that are not misleading and that produce useful insights. For the data of Pilling et al., we use straightforward statistical analysis to show that the confidence limits are generally so wide that any effects of thiamethoxam could have been large without being statistically significant. Instead of formal analysis, Pilling et al. simply inspected the data and concluded that they provided no evidence of detrimental effects and from this that thiamethoxam poses a "low risk" to bees. Conclusions derived from the inspection of the data were not just misleading in this case but also are unacceptable in principle, for if data are inadequate for a formal analysis (or only good enough to provide estimates with wide confidence intervals), then they are bound to be inadequate as a basis for reaching any sound conclusions. Given that the data in this case are largely uninformative with respect to the treatment effect, any conclusions reached from such informal approaches can do little more than reflect the prior beliefs of those involved.
Bayesian Sensitivity Analysis of Statistical Models with Missing Data
ZHU, HONGTU; IBRAHIM, JOSEPH G.; TANG, NIANSHENG
2013-01-01
Methods for handling missing data depend strongly on the mechanism that generated the missing values, such as missing completely at random (MCAR) or missing at random (MAR), as well as other distributional and modeling assumptions at various stages. It is well known that the resulting estimates and tests may be sensitive to these assumptions as well as to outlying observations. In this paper, we introduce various perturbations to modeling assumptions and individual observations, and then develop a formal sensitivity analysis to assess these perturbations in the Bayesian analysis of statistical models with missing data. We develop a geometric framework, called the Bayesian perturbation manifold, to characterize the intrinsic structure of these perturbations. We propose several intrinsic influence measures to perform sensitivity analysis and quantify the effect of various perturbations to statistical models. We use the proposed sensitivity analysis procedure to systematically investigate the tenability of the non-ignorable missing at random (NMAR) assumption. Simulation studies are conducted to evaluate our methods, and a dataset is analyzed to illustrate the use of our diagnostic measures. PMID:24753718
NASA Astrophysics Data System (ADS)
Zhou, Chenyi; Guo, Hong
2017-01-01
We report a diagrammatic method to solve the general problem of calculating configurationally averaged Green's function correlators that appear in quantum transport theory for nanostructures containing disorder. The theory treats both equilibrium and nonequilibrium quantum statistics on an equal footing. Since random impurity scattering is a problem that cannot be solved exactly in a perturbative approach, we combine our diagrammatic method with the coherent potential approximation (CPA) so that a reliable closed-form solution can be obtained. Our theory not only ensures the internal consistency of the diagrams derived at different levels of the correlators but also satisfies a set of Ward-like identities that corroborate the conserving consistency of transport calculations within the formalism. The theory is applied to calculate the quantum transport properties such as average ac conductance and transmission moments of a disordered tight-binding model, and results are numerically verified to high precision by comparing to the exact solutions obtained from enumerating all possible disorder configurations. Our formalism can be employed to predict transport properties of a wide variety of physical systems where disorder scattering is important.
Brennan, Jennifer Sousa
2010-01-01
This chapter is an introductory reference guide highlighting some of the most common statistical topics, broken down into both command-line syntax and graphical interface point-and-click commands. This chapter serves to supplement more formal statistics lessons and expedite using Stata to compute basic analyses.
FORMED: Bringing Formal Methods to the Engineering Desktop
2016-02-01
integrates formal verification into software design and development by precisely defining semantics for a restricted subset of the Unified Modeling...input-output contract satisfaction and absence of null pointer dereferences. 15. SUBJECT TERMS Formal Methods, Software Verification , Model-Based...Domain specific languages (DSLs) drive both implementation and formal verification
Undergraduate Medical Students Using Facebook as a Peer-Mentoring Platform: A Mixed-Methods Study
Gradel, Maximilian; Pander, Tanja; Fischer, Martin R; von der Borch, Philip; Dimitriadis, Konstantinos
2015-01-01
Background Peer mentoring is a powerful pedagogical approach for supporting undergraduate medical students in their learning environment. However, it remains unclear what exactly peer mentoring is and whether and how undergraduate medical students use social media for peer-mentoring activities. Objective We aimed at describing and exploring the Facebook use of undergraduate medical students during their first 2 years at a German medical school. The data should help medical educators to effectively integrate social media in formal mentoring programs for medical students. Methods We developed a coding scheme for peer mentoring and conducted a mixed-methods study in order to explore Facebook groups of undergraduate medical students from a peer-mentoring perspective. Results All major peer-mentoring categories were identified in Facebook groups of medical students. The relevance of these Facebook groups was confirmed through triangulation with focus groups and descriptive statistics. Medical students made extensive use of Facebook and wrote a total of 11,853 posts and comments in the respective Facebook groups (n=2362 total group members). Posting peaks were identified at the beginning of semesters and before exam periods, reflecting the formal curriculum milestones. Conclusions Peer mentoring is present in Facebook groups formed by undergraduate medical students who extensively use these groups to seek advice from peers on study-related issues and, in particular, exam preparation. These groups also seem to be effective in supporting responsive and large-scale peer-mentoring structures; formal mentoring programs might benefit from integrating social media into their activity portfolio. PMID:27731859
Ganju, Jitendra; Yu, Xinxin; Ma, Guoguang Julie
2013-01-01
Formal inference in randomized clinical trials is based on controlling the type I error rate associated with a single pre-specified statistic. The deficiency of using just one method of analysis is that it depends on assumptions that may not be met. For robust inference, we propose pre-specifying multiple test statistics and relying on the minimum p-value for testing the null hypothesis of no treatment effect. The null hypothesis associated with the various test statistics is that the treatment groups are indistinguishable. The critical value for hypothesis testing comes from permutation distributions. Rejection of the null hypothesis when the smallest p-value is less than the critical value controls the type I error rate at its designated value. Even if one of the candidate test statistics has low power, the adverse effect on the power of the minimum p-value statistic is not much. Its use is illustrated with examples. We conclude that it is better to rely on the minimum p-value rather than a single statistic particularly when that single statistic is the logrank test, because of the cost and complexity of many survival trials. Copyright © 2013 John Wiley & Sons, Ltd.
Experiences Using Lightweight Formal Methods for Requirements Modeling
NASA Technical Reports Server (NTRS)
Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David
1997-01-01
This paper describes three case studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, formal methods enhanced the existing verification and validation processes, by testing key properties of the evolving requirements, and helping to identify weaknesses. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.
Zhang, Kui; Wiener, Howard; Beasley, Mark; George, Varghese; Amos, Christopher I; Allison, David B
2006-08-01
Individual genome scans for quantitative trait loci (QTL) mapping often suffer from low statistical power and imprecise estimates of QTL location and effect. This lack of precision yields large confidence intervals for QTL location, which are problematic for subsequent fine mapping and positional cloning. In prioritizing areas for follow-up after an initial genome scan and in evaluating the credibility of apparent linkage signals, investigators typically examine the results of other genome scans of the same phenotype and informally update their beliefs about which linkage signals in their scan most merit confidence and follow-up via a subjective-intuitive integration approach. A method that acknowledges the wisdom of this general paradigm but formally borrows information from other scans to increase confidence in objectivity would be a benefit. We developed an empirical Bayes analytic method to integrate information from multiple genome scans. The linkage statistic obtained from a single genome scan study is updated by incorporating statistics from other genome scans as prior information. This technique does not require that all studies have an identical marker map or a common estimated QTL effect. The updated linkage statistic can then be used for the estimation of QTL location and effect. We evaluate the performance of our method by using extensive simulations based on actual marker spacing and allele frequencies from available data. Results indicate that the empirical Bayes method can account for between-study heterogeneity, estimate the QTL location and effect more precisely, and provide narrower confidence intervals than results from any single individual study. We also compared the empirical Bayes method with a method originally developed for meta-analysis (a closely related but distinct purpose). In the face of marked heterogeneity among studies, the empirical Bayes method outperforms the comparator.
Experiences Using Formal Methods for Requirements Modeling
NASA Technical Reports Server (NTRS)
Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David
1996-01-01
This paper describes three cases studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, the formal modeling provided a cost effective enhancement of the existing verification and validation processes. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.
Formal Methods Case Studies for DO-333
NASA Technical Reports Server (NTRS)
Cofer, Darren; Miller, Steven P.
2014-01-01
RTCA DO-333, Formal Methods Supplement to DO-178C and DO-278A provides guidance for software developers wishing to use formal methods in the certification of airborne systems and air traffic management systems. The supplement identifies the modifications and additions to DO-178C and DO-278A objectives, activities, and software life cycle data that should be addressed when formal methods are used as part of the software development process. This report presents three case studies describing the use of different classes of formal methods to satisfy certification objectives for a common avionics example - a dual-channel Flight Guidance System. The three case studies illustrate the use of theorem proving, model checking, and abstract interpretation. The material presented is not intended to represent a complete certification effort. Rather, the purpose is to illustrate how formal methods can be used in a realistic avionics software development project, with a focus on the evidence produced that could be used to satisfy the verification objectives found in Section 6 of DO-178C.
Exact Solution of the Two-Level System and the Einstein Solid in the Microcanonical Formalism
ERIC Educational Resources Information Center
Bertoldi, Dalia S.; Bringa, Eduardo M.; Miranda, E. N.
2011-01-01
The two-level system and the Einstein model of a crystalline solid are taught in every course of statistical mechanics and they are solved in the microcanonical formalism because the number of accessible microstates can be easily evaluated. However, their solutions are usually presented using the Stirling approximation to deal with factorials. In…
A Comparative Study of Pre-Service Education for Preschool Teachers in China and the United States
ERIC Educational Resources Information Center
Gong, Xin; Wang, Pengcheng
2017-01-01
This study provides a comparative analysis of the pre-service education system for preschool educators in China and the United States. Based on collected data and materials (literature, policy documents, and statistical data), we compare two areas of pre-service training: (1) the formal system; (2) the informal system. In the formal system, most…
Spectral theory of extreme statistics in birth-death systems
NASA Astrophysics Data System (ADS)
Meerson, Baruch
2008-03-01
Statistics of rare events, or large deviations, in chemical reactions and systems of birth-death type have attracted a great deal of interest in many areas of science including cell biochemistry, astrochemistry, epidemiology, population biology, etc. Large deviations become of vital importance when discrete (non-continuum) nature of a population of ``particles'' (molecules, bacteria, cells, animals or even humans) and stochastic character of interactions can drive the population to extinction. I will briefly review the novel spectral method [1-3] for calculating the extreme statistics of a broad class of birth-death processes and reactions involving a single species. The spectral method combines the probability generating function formalism with the Sturm-Liouville theory of linear differential operators. It involves a controlled perturbative treatment based on a natural large parameter of the problem: the average number of particles/individuals in a stationary or metastable state. For extinction (the first passage) problems the method yields accurate results for the extinction statistics and for the quasi-stationary probability distribution, including the tails, of metastable states. I will demonstrate the power of the method on the example of a branching and annihilation reaction, A ->-2.8mm2mm2A,,A ->-2.8mm2mm , representative of a rather general class of processes. *M. Assaf and B. Meerson, Phys. Rev. Lett. 97, 200602 (2006). *M. Assaf and B. Meerson, Phys. Rev. E 74, 041115 (2006). *M. Assaf and B. Meerson, Phys. Rev. E 75, 031122 (2007).
Li, Xing-Ming; Rasooly, Alon; Peng, Bo; JianWang; Xiong, Shu-Yu
2017-11-10
Our study aimed to design a tool of evaluating intersectional collaboration on Non-communicable Chronic Disease (NCD) prevention and control, and further to understand the current status of intersectional collaboration in community health service institutions of China. We surveyed 444 main officials of community health service institutions in Beijing, Tianjin, Hubei and Ningxia regions of China in 2014 by using a questionnaire. A model of collaboration measurement, including four relational dimensions of governance, shared goals and vision, formalization and internalization, was used to compare the scores of evaluation scale in NCD management procedures across community healthcare institutions and other ones. Reliability and validity of the evaluation tool on inter-organizational collaboration on NCD prevention and control were verified. The test on tool evaluating inter-organizational collaboration in community NCD management revealed a good reliability and validity (Cronbach's Alpha = 0.89,split-half reliability = 0.84, the variance contribution rate of an extracted principal component = 49.70%). The results of inter-organizational collaboration of different departments and management segments showed there were statistically significant differences in formalization dimension for physical examination (p = 0.01).There was statistically significant difference in governance dimension, formalization dimension and total score of the collaboration scale for health record sector (p = 0.01,0.00,0.00). Statistical differences were found in the formalization dimension for exercise and nutrition health education segment (p = 0.01). There were no statistically significant difference in formalization dimension of medication guidance for psychological consultation, medical referral service and rehabilitation guidance (all p > 0.05). The multi-department collaboration mechanism of NCD prevention and control has been rudimentarily established. Community management institutions and general hospitals are more active in participating in community NCD management with better collaboration score, whereas the CDC shows relatively poor collaboration in China. Xing-ming Li and Alon Rasooly have the same contribution to the paper. Xing-ming Li and Alon Rasooly listed as the same first author.
A brief overview of NASA Langley's research program in formal methods
NASA Technical Reports Server (NTRS)
1992-01-01
An overview of NASA Langley's research program in formal methods is presented. The major goal of this work is to bring formal methods technology to a sufficiently mature level for use by the United States aerospace industry. Towards this goal, work is underway to design and formally verify a fault-tolerant computing platform suitable for advanced flight control applications. Also, several direct technology transfer efforts have been initiated that apply formal methods to critical subsystems of real aerospace computer systems. The research team consists of six NASA civil servants and contractors from Boeing Military Aircraft Company, Computational Logic Inc., Odyssey Research Associates, SRI International, University of California at Davis, and Vigyan Inc.
NASA Astrophysics Data System (ADS)
Shafii, M.; Tolson, B.; Matott, L. S.
2012-04-01
Hydrologic modeling has benefited from significant developments over the past two decades. This has resulted in building of higher levels of complexity into hydrologic models, which eventually makes the model evaluation process (parameter estimation via calibration and uncertainty analysis) more challenging. In order to avoid unreasonable parameter estimates, many researchers have suggested implementation of multi-criteria calibration schemes. Furthermore, for predictive hydrologic models to be useful, proper consideration of uncertainty is essential. Consequently, recent research has emphasized comprehensive model assessment procedures in which multi-criteria parameter estimation is combined with statistically-based uncertainty analysis routines such as Bayesian inference using Markov Chain Monte Carlo (MCMC) sampling. Such a procedure relies on the use of formal likelihood functions based on statistical assumptions, and moreover, the Bayesian inference structured on MCMC samplers requires a considerably large number of simulations. Due to these issues, especially in complex non-linear hydrological models, a variety of alternative informal approaches have been proposed for uncertainty analysis in the multi-criteria context. This study aims at exploring a number of such informal uncertainty analysis techniques in multi-criteria calibration of hydrological models. The informal methods addressed in this study are (i) Pareto optimality which quantifies the parameter uncertainty using the Pareto solutions, (ii) DDS-AU which uses the weighted sum of objective functions to derive the prediction limits, and (iii) GLUE which describes the total uncertainty through identification of behavioral solutions. The main objective is to compare such methods with MCMC-based Bayesian inference with respect to factors such as computational burden, and predictive capacity, which are evaluated based on multiple comparative measures. The measures for comparison are calculated both for calibration and evaluation periods. The uncertainty analysis methodologies are applied to a simple 5-parameter rainfall-runoff model, called HYMOD.
Formal Assurance Certifiable Tooling Formal Assurance Certifiable Tooling Strategy Final Report
NASA Technical Reports Server (NTRS)
Bush, Eric; Oglesby, David; Bhatt, Devesh; Murugesan, Anitha; Engstrom, Eric; Mueller, Joe; Pelican, Michael
2017-01-01
This is the Final Report of a research project to investigate issues and provide guidance for the qualification of formal methods tools under the DO-330 qualification process. It consisted of three major subtasks spread over two years: 1) an assessment of theoretical soundness issues that may affect qualification for three categories of formal methods tools, 2) a case study simulating the DO-330 qualification of two actual tool sets, and 3) an investigation of risk mitigation strategies that might be applied to chains of such formal methods tools in order to increase confidence in their certification of airborne software.
Spatial analysis on future housing markets: economic development and housing implications.
Liu, Xin; Wang, Lizhe
2014-01-01
A coupled projection method combining formal modelling and other statistical techniques was developed to delineate the relationship between economic and social drivers for net new housing allocations. Using the example of employment growth in Tyne and Wear, UK, until 2016, the empirical analysis yields housing projections at the macro- and microspatial levels (e.g., region to subregion to elected ward levels). The results have important implications for the strategic planning of locations for housing and employment, demonstrating both intuitively and quantitatively how local economic developments affect housing demand.
A stylistic classification of Russian-language texts based on the random walk model
NASA Astrophysics Data System (ADS)
Kramarenko, A. A.; Nekrasov, K. A.; Filimonov, V. V.; Zhivoderov, A. A.; Amieva, A. A.
2017-09-01
A formal approach to text analysis is suggested that is based on the random walk model. The frequencies and reciprocal positions of the vowel letters are matched up by a process of quasi-particle migration. Statistically significant difference in the migration parameters for the texts of different functional styles is found. Thus, a possibility of classification of texts using the suggested method is demonstrated. Five groups of the texts are singled out that can be distinguished from one another by the parameters of the quasi-particle migration process.
Spatial Analysis on Future Housing Markets: Economic Development and Housing Implications
Liu, Xin; Wang, Lizhe
2014-01-01
A coupled projection method combining formal modelling and other statistical techniques was developed to delineate the relationship between economic and social drivers for net new housing allocations. Using the example of employment growth in Tyne and Wear, UK, until 2016, the empirical analysis yields housing projections at the macro- and microspatial levels (e.g., region to subregion to elected ward levels). The results have important implications for the strategic planning of locations for housing and employment, demonstrating both intuitively and quantitatively how local economic developments affect housing demand. PMID:24892097
NASA Technical Reports Server (NTRS)
Rybicki, G. B.; Hummer, D. G.
1991-01-01
A method is presented for solving multilevel transfer problems when nonoverlapping lines and background continuum are present and active continuum transfer is absent. An approximate lambda operator is employed to derive linear, 'preconditioned', statistical-equilibrium equations. A method is described for finding the diagonal elements of the 'true' numerical lambda operator, and therefore for obtaining the coefficients of the equations. Iterations of the preconditioned equations, in conjunction with the transfer equation's formal solution, are used to solve linear equations. Some multilevel problems are considered, including an eleven-level neutral helium atom. Diagonal and tridiagonal approximate lambda operators are utilized in the problems to examine the convergence properties of the method, and it is found to be effective for the line transfer problems.
ERIC Educational Resources Information Center
Watson, Jane
2007-01-01
Inference, or decision making, is seen in curriculum documents as the final step in a statistical investigation. For a formal statistical enquiry this may be associated with sophisticated tests involving probability distributions. For young students without the mathematical background to perform such tests, it is still possible to draw informal…
Combining statistical inference and decisions in ecology
Williams, Perry J.; Hooten, Mevin B.
2016-01-01
Statistical decision theory (SDT) is a sub-field of decision theory that formally incorporates statistical investigation into a decision-theoretic framework to account for uncertainties in a decision problem. SDT provides a unifying analysis of three types of information: statistical results from a data set, knowledge of the consequences of potential choices (i.e., loss), and prior beliefs about a system. SDT links the theoretical development of a large body of statistical methods including point estimation, hypothesis testing, and confidence interval estimation. The theory and application of SDT have mainly been developed and published in the fields of mathematics, statistics, operations research, and other decision sciences, but have had limited exposure in ecology. Thus, we provide an introduction to SDT for ecologists and describe its utility for linking the conventionally separate tasks of statistical investigation and decision making in a single framework. We describe the basic framework of both Bayesian and frequentist SDT, its traditional use in statistics, and discuss its application to decision problems that occur in ecology. We demonstrate SDT with two types of decisions: Bayesian point estimation, and an applied management problem of selecting a prescribed fire rotation for managing a grassland bird species. Central to SDT, and decision theory in general, are loss functions. Thus, we also provide basic guidance and references for constructing loss functions for an SDT problem.
NASA Langley Research and Technology-Transfer Program in Formal Methods
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Caldwell, James L.; Carreno, Victor A.; Holloway, C. Michael; Miner, Paul S.; DiVito, Ben L.
1995-01-01
This paper presents an overview of NASA Langley research program in formal methods. The major goals of this work are to make formal methods practical for use on life critical systems, and to orchestrate the transfer of this technology to U.S. industry through use of carefully designed demonstration projects. Several direct technology transfer efforts have been initiated that apply formal methods to critical subsystems of real aerospace computer systems. The research team consists of five NASA civil servants and contractors from Odyssey Research Associates, SRI International, and VIGYAN Inc.
ERIC Educational Resources Information Center
Trumpower, David L.
2015-01-01
Making inferences about population differences based on samples of data, that is, performing intuitive analysis of variance (IANOVA), is common in everyday life. However, the intuitive reasoning of individuals when making such inferences (even following statistics instruction), often differs from the normative logic of formal statistics. The…
NASA Astrophysics Data System (ADS)
Simonin, Olivier; Zaichik, Leonid I.; Alipchenkov, Vladimir M.; Février, Pierre
2006-12-01
The objective of the paper is to elucidate a connection between two approaches that have been separately proposed for modelling the statistical spatial properties of inertial particles in turbulent fluid flows. One of the approaches proposed recently by Février, Simonin, and Squires [J. Fluid Mech. 533, 1 (2005)] is based on the partitioning of particle turbulent velocity field into spatially correlated (mesoscopic Eulerian) and random-uncorrelated (quasi-Brownian) components. The other approach stems from a kinetic equation for the two-point probability density function of the velocity distributions of two particles [Zaichik and Alipchenkov, Phys. Fluids 15, 1776 (2003)]. Comparisons between these approaches are performed for isotropic homogeneous turbulence and demonstrate encouraging agreement.
Proceedings of the Second NASA Formal Methods Symposium
NASA Technical Reports Server (NTRS)
Munoz, Cesar (Editor)
2010-01-01
This publication contains the proceedings of the Second NASA Formal Methods Symposium sponsored by the National Aeronautics and Space Administration and held in Washington D.C. April 13-15, 2010. Topics covered include: Decision Engines for Software Analysis using Satisfiability Modulo Theories Solvers; Verification and Validation of Flight-Critical Systems; Formal Methods at Intel -- An Overview; Automatic Review of Abstract State Machines by Meta Property Verification; Hardware-independent Proofs of Numerical Programs; Slice-based Formal Specification Measures -- Mapping Coupling and Cohesion Measures to Formal Z; How Formal Methods Impels Discovery: A Short History of an Air Traffic Management Project; A Machine-Checked Proof of A State-Space Construction Algorithm; Automated Assume-Guarantee Reasoning for Omega-Regular Systems and Specifications; Modeling Regular Replacement for String Constraint Solving; Using Integer Clocks to Verify the Timing-Sync Sensor Network Protocol; Can Regulatory Bodies Expect Efficient Help from Formal Methods?; Synthesis of Greedy Algorithms Using Dominance Relations; A New Method for Incremental Testing of Finite State Machines; Verification of Faulty Message Passing Systems with Continuous State Space in PVS; Phase Two Feasibility Study for Software Safety Requirements Analysis Using Model Checking; A Prototype Embedding of Bluespec System Verilog in the PVS Theorem Prover; SimCheck: An Expressive Type System for Simulink; Coverage Metrics for Requirements-Based Testing: Evaluation of Effectiveness; Software Model Checking of ARINC-653 Flight Code with MCP; Evaluation of a Guideline by Formal Modelling of Cruise Control System in Event-B; Formal Verification of Large Software Systems; Symbolic Computation of Strongly Connected Components Using Saturation; Towards the Formal Verification of a Distributed Real-Time Automotive System; Slicing AADL Specifications for Model Checking; Model Checking with Edge-valued Decision Diagrams; and Data-flow based Model Analysis.
Undergraduate Medical Students Using Facebook as a Peer-Mentoring Platform: A Mixed-Methods Study.
Pinilla, Severin; Nicolai, Leo; Gradel, Maximilian; Pander, Tanja; Fischer, Martin R; von der Borch, Philip; Dimitriadis, Konstantinos
2015-10-27
Peer mentoring is a powerful pedagogical approach for supporting undergraduate medical students in their learning environment. However, it remains unclear what exactly peer mentoring is and whether and how undergraduate medical students use social media for peer-mentoring activities. We aimed at describing and exploring the Facebook use of undergraduate medical students during their first 2 years at a German medical school. The data should help medical educators to effectively integrate social media in formal mentoring programs for medical students. We developed a coding scheme for peer mentoring and conducted a mixed-methods study in order to explore Facebook groups of undergraduate medical students from a peer-mentoring perspective. All major peer-mentoring categories were identified in Facebook groups of medical students. The relevance of these Facebook groups was confirmed through triangulation with focus groups and descriptive statistics. Medical students made extensive use of Facebook and wrote a total of 11,853 posts and comments in the respective Facebook groups (n=2362 total group members). Posting peaks were identified at the beginning of semesters and before exam periods, reflecting the formal curriculum milestones. Peer mentoring is present in Facebook groups formed by undergraduate medical students who extensively use these groups to seek advice from peers on study-related issues and, in particular, exam preparation. These groups also seem to be effective in supporting responsive and large-scale peer-mentoring structures; formal mentoring programs might benefit from integrating social media into their activity portfolio.
A bootstrapping method for development of Treebank
NASA Astrophysics Data System (ADS)
Zarei, F.; Basirat, A.; Faili, H.; Mirain, M.
2017-01-01
Using statistical approaches beside the traditional methods of natural language processing could significantly improve both the quality and performance of several natural language processing (NLP) tasks. The effective usage of these approaches is subject to the availability of the informative, accurate and detailed corpora on which the learners are trained. This article introduces a bootstrapping method for developing annotated corpora based on a complex and rich linguistically motivated elementary structure called supertag. To this end, a hybrid method for supertagging is proposed that combines both of the generative and discriminative methods of supertagging. The method was applied on a subset of Wall Street Journal (WSJ) in order to annotate its sentences with a set of linguistically motivated elementary structures of the English XTAG grammar that is using a lexicalised tree-adjoining grammar formalism. The empirical results confirm that the bootstrapping method provides a satisfactory way for annotating the English sentences with the mentioned structures. The experiments show that the method could automatically annotate about 20% of WSJ with the accuracy of F-measure about 80% of which is particularly 12% higher than the F-measure of the XTAG Treebank automatically generated from the approach proposed by Basirat and Faili [(2013). Bridge the gap between statistical and hand-crafted grammars. Computer Speech and Language, 27, 1085-1104].
Why are Formal Methods Not Used More Widely?
NASA Technical Reports Server (NTRS)
Knight, John C.; DeJong, Colleen L.; Gibble, Matthew S.; Nakano, Luis G.
1997-01-01
Despite extensive development over many years and significant demonstrated benefits, formal methods remain poorly accepted by industrial practitioners. Many reasons have been suggested for this situation such as a claim that they extent the development cycle, that they require difficult mathematics, that inadequate tools exist, and that they are incompatible with other software packages. There is little empirical evidence that any of these reasons is valid. The research presented here addresses the question of why formal methods are not used more widely. The approach used was to develop a formal specification for a safety-critical application using several specification notations and assess the results in a comprehensive evaluation framework. The results of the experiment suggests that there remain many impediments to the routine use of formal methods.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Öztürk, Hande; Noyan, I. Cevdet
A rigorous study of sampling and intensity statistics applicable for a powder diffraction experiment as a function of crystallite size is presented. Our analysis yields approximate equations for the expected value, variance and standard deviations for both the number of diffracting grains and the corresponding diffracted intensity for a given Bragg peak. The classical formalism published in 1948 by Alexander, Klug & Kummer [J. Appl. Phys.(1948),19, 742–753] appears as a special case, limited to large crystallite sizes, here. It is observed that both the Lorentz probability expression and the statistics equations used in the classical formalism are inapplicable for nanocrystallinemore » powder samples.« less
Öztürk, Hande; Noyan, I. Cevdet
2017-08-24
A rigorous study of sampling and intensity statistics applicable for a powder diffraction experiment as a function of crystallite size is presented. Our analysis yields approximate equations for the expected value, variance and standard deviations for both the number of diffracting grains and the corresponding diffracted intensity for a given Bragg peak. The classical formalism published in 1948 by Alexander, Klug & Kummer [J. Appl. Phys.(1948),19, 742–753] appears as a special case, limited to large crystallite sizes, here. It is observed that both the Lorentz probability expression and the statistics equations used in the classical formalism are inapplicable for nanocrystallinemore » powder samples.« less
A brief history of numbers and statistics with cytometric applications.
Watson, J V
2001-02-15
A brief history of numbers and statistics traces the development of numbers from prehistory to completion of our current system of numeration with the introduction of the decimal fraction by Viete, Stevin, Burgi, and Galileo at the turn of the 16th century. This was followed by the development of what we now know as probability theory by Pascal, Fermat, and Huygens in the mid-17th century which arose in connection with questions in gambling with dice and can be regarded as the origin of statistics. The three main probability distributions on which statistics depend were introduced and/or formalized between the mid-17th and early 19th centuries: the binomial distribution by Pascal; the normal distribution by de Moivre, Gauss, and Laplace, and the Poisson distribution by Poisson. The formal discipline of statistics commenced with the works of Pearson, Yule, and Gosset at the turn of the 19th century when the first statistical tests were introduced. Elementary descriptions of the statistical tests most likely to be used in conjunction with cytometric data are given and it is shown how these can be applied to the analysis of difficult immunofluorescence distributions when there is overlap between the labeled and unlabeled cell populations. Copyright 2001 Wiley-Liss, Inc.
Statistical inference and Aristotle's Rhetoric.
Macdonald, Ranald R
2004-11-01
Formal logic operates in a closed system where all the information relevant to any conclusion is present, whereas this is not the case when one reasons about events and states of the world. Pollard and Richardson drew attention to the fact that the reasoning behind statistical tests does not lead to logically justifiable conclusions. In this paper statistical inferences are defended not by logic but by the standards of everyday reasoning. Aristotle invented formal logic, but argued that people mostly get at the truth with the aid of enthymemes--incomplete syllogisms which include arguing from examples, analogies and signs. It is proposed that statistical tests work in the same way--in that they are based on examples, invoke the analogy of a model and use the size of the effect under test as a sign that the chance hypothesis is unlikely. Of existing theories of statistical inference only a weak version of Fisher's takes this into account. Aristotle anticipated Fisher by producing an argument of the form that there were too many cases in which an outcome went in a particular direction for that direction to be plausibly attributed to chance. We can therefore conclude that Aristotle would have approved of statistical inference and there is a good reason for calling this form of statistical inference classical.
A quantitative approach to evolution of music and philosophy
NASA Astrophysics Data System (ADS)
Vieira, Vilson; Fabbri, Renato; Travieso, Gonzalo; Oliveira, Osvaldo N., Jr.; da Fontoura Costa, Luciano
2012-08-01
The development of new statistical and computational methods is increasingly making it possible to bridge the gap between hard sciences and humanities. In this study, we propose an approach based on a quantitative evaluation of attributes of objects in fields of humanities, from which concepts such as dialectics and opposition are formally defined mathematically. As case studies, we analyzed the temporal evolution of classical music and philosophy by obtaining data for 8 features characterizing the corresponding fields for 7 well-known composers and philosophers, which were treated with multivariate statistics and pattern recognition methods. A bootstrap method was applied to avoid statistical bias caused by the small sample data set, with which hundreds of artificial composers and philosophers were generated, influenced by the 7 names originally chosen. Upon defining indices for opposition, skewness and counter-dialectics, we confirmed the intuitive analysis of historians in that classical music evolved according to a master-apprentice tradition, while in philosophy changes were driven by opposition. Though these case studies were meant only to show the possibility of treating phenomena in humanities quantitatively, including a quantitative measure of concepts such as dialectics and opposition, the results are encouraging for further application of the approach presented here to many other areas, since it is entirely generic.
Weaving a Formal Methods Education with Problem-Based Learning
NASA Astrophysics Data System (ADS)
Gibson, J. Paul
The idea of weaving formal methods through computing (or software engineering) degrees is not a new one. However, there has been little success in developing and implementing such a curriculum. Formal methods continue to be taught as stand-alone modules and students, in general, fail to see how fundamental these methods are to the engineering of software. A major problem is one of motivation — how can the students be expected to enthusiastically embrace a challenging subject when the learning benefits, beyond passing an exam and achieving curriculum credits, are not clear? Problem-based learning has gradually moved from being an innovative pedagogique technique, commonly used to better-motivate students, to being widely adopted in the teaching of many different disciplines, including computer science and software engineering. Our experience shows that a good problem can be re-used throughout a student's academic life. In fact, the best computing problems can be used with children (young and old), undergraduates and postgraduates. In this paper we present a process for weaving formal methods through a University curriculum that is founded on the application of problem-based learning and a library of good software engineering problems, where students learn about formal methods without sitting a traditional formal methods module. The process of constructing good problems and integrating them into the curriculum is shown to be analagous to the process of engineering software. This approach is not intended to replace more traditional formal methods modules: it will better prepare students for such specialised modules and ensure that all students have an understanding and appreciation for formal methods even if they do not go on to specialise in them.
NASA Astrophysics Data System (ADS)
Arsenault, Louis-Francois; Neuberg, Richard; Hannah, Lauren A.; Millis, Andrew J.
We present a machine learning-based statistical regression approach to the inversion of Fredholm integrals of the first kind by studying an important example for the quantum materials community, the analytical continuation problem of quantum many-body physics. It involves reconstructing the frequency dependence of physical excitation spectra from data obtained at specific points in the complex frequency plane. The approach provides a natural regularization in cases where the inverse of the Fredholm kernel is ill-conditioned and yields robust error metrics. The stability of the forward problem permits the construction of a large database of input-output pairs. Machine learning methods applied to this database generate approximate solutions which are projected onto the subspace of functions satisfying relevant constraints. We show that for low input noise the method performs as well or better than Maximum Entropy (MaxEnt) under standard error metrics, and is substantially more robust to noise. We expect the methodology to be similarly effective for any problem involving a formally ill-conditioned inversion, provided that the forward problem can be efficiently solved. AJM was supported by the Office of Science of the U.S. Department of Energy under Subcontract No. 3F-3138 and LFA by the Columbia Univeristy IDS-ROADS project, UR009033-05 which also provided part support to RN and LH.
A Random Variable Approach to Nuclear Targeting and Survivability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Undem, Halvor A.
We demonstrate a common mathematical formalism for analyzing problems in nuclear survivability and targeting. This formalism, beginning with a random variable approach, can be used to interpret past efforts in nuclear-effects analysis, including targeting analysis. It can also be used to analyze new problems brought about by the post Cold War Era, such as the potential effects of yield degradation in a permanently untested nuclear stockpile. In particular, we illustrate the formalism through four natural case studies or illustrative problems, linking these to actual past data, modeling, and simulation, and suggesting future uses. In the first problem, we illustrate themore » case of a deterministically modeled weapon used against a deterministically responding target. Classic "Cookie Cutter" damage functions result. In the second problem, we illustrate, with actual target test data, the case of a deterministically modeled weapon used against a statistically responding target. This case matches many of the results of current nuclear targeting modeling and simulation tools, including the result of distance damage functions as complementary cumulative lognormal functions in the range variable. In the third problem, we illustrate the case of a statistically behaving weapon used against a deterministically responding target. In particular, we show the dependence of target damage on weapon yield for an untested nuclear stockpile experiencing yield degradation. Finally, and using actual unclassified weapon test data, we illustrate in the fourth problem the case of a statistically behaving weapon used against a statistically responding target.« less
Applied Statistics: From Bivariate through Multivariate Techniques [with CD-ROM
ERIC Educational Resources Information Center
Warner, Rebecca M.
2007-01-01
This book provides a clear introduction to widely used topics in bivariate and multivariate statistics, including multiple regression, discriminant analysis, MANOVA, factor analysis, and binary logistic regression. The approach is applied and does not require formal mathematics; equations are accompanied by verbal explanations. Students are asked…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hummer, G.; Garcia, A.E.; Soumpasis, D.M.
1994-10-01
To understand the functioning of living organisms on a molecular level, it is crucial to dissect the intricate interplay of the immense number of biological molecules. Most of the biochemical processes in cells occur in a liquid environment formed mainly by water and ions. This solvent environment plays an important role in biological systems. The potential-of-mean-force (PMF) formalism attempts to describe quantitatively the interactions of the solvent with biological macromolecules on the basis of an approximate statistical-mechanical representation. At its current status of development, it deals with ionic effects on the biomolecular structure and with the structural hydration of biomolecules.more » The underlying idea of the PMF formalism is to identify the dominant sources of interactions and incorporate these interactions into the theoretical formalism using PMF`s (or particle correlation functions) extracted from bulk-liquid systems. In the following, the authors shall briefly outline the statistical-mechanical foundation of the PMF formalism and introduce the PMF expansion formalism, which is intimately linked to superposition approximations for higher-order particle correlation functions. The authors shall then sketch applications, which describe the effects of the ionic environment on nucleic-acid structure. Finally, the authors shall present the more recent extension of the PMF idea to describe quantitatively the structural hydration of biomolecules. Results for the interface of ice and water and for the hydration of deoxyribonucleic acid (DNA) will be discussed.« less
NASA Technical Reports Server (NTRS)
Hinchey, Michael G. (Inventor); Rouff, Christopher A. (Inventor); Rash, James L. (Inventor); Erickson, John D. (Inventor); Gracinin, Denis (Inventor)
2010-01-01
Systems, methods and apparatus are provided through which in some embodiments an informal specification is translated without human intervention into a formal specification. In some embodiments the formal specification is a process-based specification. In some embodiments, the formal specification is translated into a high-level computer programming language which is further compiled into a set of executable computer instructions.
Ten Commandments Revisited: A Ten-Year Perspective on the Industrial Application of Formal Methods
NASA Technical Reports Server (NTRS)
Bowen, Jonathan P.; Hinchey, Michael G.
2005-01-01
Ten years ago, our 1995 paper Ten Commandments of Formal Methods suggested some guidelines to help ensure the success of a formal methods project. It proposed ten important requirements (or "commandments") for formal developers to consider and follow, based on our knowledge of several industrial application success stories, most of which have been reported in more detail in two books. The paper was surprisingly popular, is still widely referenced, and used as required reading in a number of formal methods courses. However, not all have agreed with some of our commandments, feeling that they may not be valid in the long-term. We re-examine the original commandments ten years on, and consider their validity in the light of a further decade of industrial best practice and experiences.
NASA Technical Reports Server (NTRS)
Bolton, Matthew L.; Bass, Ellen J.
2009-01-01
Both the human factors engineering (HFE) and formal methods communities are concerned with finding and eliminating problems with safety-critical systems. This work discusses a modeling effort that leveraged methods from both fields to use model checking with HFE practices to perform formal verification of a human-interactive system. Despite the use of a seemingly simple target system, a patient controlled analgesia pump, the initial model proved to be difficult for the model checker to verify in a reasonable amount of time. This resulted in a number of model revisions that affected the HFE architectural, representativeness, and understandability goals of the effort. If formal methods are to meet the needs of the HFE community, additional modeling tools and technological developments are necessary.
Statistical mechanics of complex neural systems and high dimensional data
NASA Astrophysics Data System (ADS)
Advani, Madhu; Lahiri, Subhaneil; Ganguli, Surya
2013-03-01
Recent experimental advances in neuroscience have opened new vistas into the immense complexity of neuronal networks. This proliferation of data challenges us on two parallel fronts. First, how can we form adequate theoretical frameworks for understanding how dynamical network processes cooperate across widely disparate spatiotemporal scales to solve important computational problems? Second, how can we extract meaningful models of neuronal systems from high dimensional datasets? To aid in these challenges, we give a pedagogical review of a collection of ideas and theoretical methods arising at the intersection of statistical physics, computer science and neurobiology. We introduce the interrelated replica and cavity methods, which originated in statistical physics as powerful ways to quantitatively analyze large highly heterogeneous systems of many interacting degrees of freedom. We also introduce the closely related notion of message passing in graphical models, which originated in computer science as a distributed algorithm capable of solving large inference and optimization problems involving many coupled variables. We then show how both the statistical physics and computer science perspectives can be applied in a wide diversity of contexts to problems arising in theoretical neuroscience and data analysis. Along the way we discuss spin glasses, learning theory, illusions of structure in noise, random matrices, dimensionality reduction and compressed sensing, all within the unified formalism of the replica method. Moreover, we review recent conceptual connections between message passing in graphical models, and neural computation and learning. Overall, these ideas illustrate how statistical physics and computer science might provide a lens through which we can uncover emergent computational functions buried deep within the dynamical complexities of neuronal networks.
ERIC Educational Resources Information Center
Situma, David Barasa
2015-01-01
The female population in Kenya was reported at 50.05% in 2011, according to a World Bank report published in 2012. Despite this slightly higher percentage over males, women in Kenya are not well represented in education and training compared to their male counterparts (Kenya National Bureau of Statistics, 2012). The need to empower girls and women…
Visitor spending effects: assessing and showcasing America's investment in national parks
Koontz, Lynne; Cullinane Thomas, Catherine; Ziesler, Pamela; Olson, Jeffrey; Meldrum, Bret
2017-01-01
This paper provides an overview of the evolution, future, and global applicability of the U.S. National Park Service's (NPS) visitor spending effects framework and discusses the methods used to effectively communicate the economic return on investment in America's national parks. The 417 parks represent many of America's most iconic destinations: in 2016, they received a record 331 million visits. Competing federal budgetary demands necessitate that, in addition to meeting their mission to preserve unimpaired natural and cultural resources for the enjoyment of the people, parks also assess and showcase their contributions to the economic vitality of their regions and the nation. Key approaches explained include the original Money Generation Model (MGM) from 1990, MGM2 used from 2001, and the visitor spending effects model which replaced MGM2 in 2012. Detailed discussion explains the NPS's visitor use statistics system, the formal program for collecting, compiling, and reporting visitor use data. The NPS is now establishing a formal socioeconomic monitoring (SEM) program to provide a standard visitor survey instrument and a long-term, systematic sampling design for in-park visitor surveys. The pilot SEM survey is discussed, along with the need for international standardization of research methods.
Applying formal methods and object-oriented analysis to existing flight software
NASA Technical Reports Server (NTRS)
Cheng, Betty H. C.; Auernheimer, Brent
1993-01-01
Correctness is paramount for safety-critical software control systems. Critical software failures in medical radiation treatment, communications, and defense are familiar to the public. The significant quantity of software malfunctions regularly reported to the software engineering community, the laws concerning liability, and a recent NRC Aeronautics and Space Engineering Board report additionally motivate the use of error-reducing and defect detection software development techniques. The benefits of formal methods in requirements driven software development ('forward engineering') is well documented. One advantage of rigorously engineering software is that formal notations are precise, verifiable, and facilitate automated processing. This paper describes the application of formal methods to reverse engineering, where formal specifications are developed for a portion of the shuttle on-orbit digital autopilot (DAP). Three objectives of the project were to: demonstrate the use of formal methods on a shuttle application, facilitate the incorporation and validation of new requirements for the system, and verify the safety-critical properties to be exhibited by the software.
Two-Step Formal Advertisement: An Examination.
1976-10-01
The purpose of this report is to examine the potential application of the Two-Step Formal Advertisement method of procurement. Emphasis is placed on...Step formal advertising is a method of procurement designed to take advantage of negotiation flexibility and at the same time obtain the benefits of...formal advertising . It is used where the specifications are not sufficiently definite or may be too restrictive to permit full and free competition
Test particle propagation in magnetostatic turbulence. 2: The local approximation method
NASA Technical Reports Server (NTRS)
Klimas, A. J.; Sandri, G.; Scudder, J. D.; Howell, D. R.
1976-01-01
An approximation method for statistical mechanics is presented and applied to a class of problems which contains a test particle propagation problem. All of the available basic equations used in statistical mechanics are cast in the form of a single equation which is integrodifferential in time and which is then used as the starting point for the construction of the local approximation method. Simplification of the integrodifferential equation is achieved through approximation to the Laplace transform of its kernel. The approximation is valid near the origin in the Laplace space and is based on the assumption of small Laplace variable. No other small parameter is necessary for the construction of this approximation method. The n'th level of approximation is constructed formally, and the first five levels of approximation are calculated explicitly. It is shown that each level of approximation is governed by an inhomogeneous partial differential equation in time with time independent operator coefficients. The order in time of these partial differential equations is found to increase as n does. At n = 0 the most local first order partial differential equation which governs the Markovian limit is regained.
Functional annotation of regulatory pathways.
Pandey, Jayesh; Koyutürk, Mehmet; Kim, Yohan; Szpankowski, Wojciech; Subramaniam, Shankar; Grama, Ananth
2007-07-01
Standardized annotations of biomolecules in interaction networks (e.g. Gene Ontology) provide comprehensive understanding of the function of individual molecules. Extending such annotations to pathways is a critical component of functional characterization of cellular signaling at the systems level. We propose a framework for projecting gene regulatory networks onto the space of functional attributes using multigraph models, with the objective of deriving statistically significant pathway annotations. We first demonstrate that annotations of pairwise interactions do not generalize to indirect relationships between processes. Motivated by this result, we formalize the problem of identifying statistically overrepresented pathways of functional attributes. We establish the hardness of this problem by demonstrating the non-monotonicity of common statistical significance measures. We propose a statistical model that emphasizes the modularity of a pathway, evaluating its significance based on the coupling of its building blocks. We complement the statistical model by an efficient algorithm and software, Narada, for computing significant pathways in large regulatory networks. Comprehensive results from our methods applied to the Escherichia coli transcription network demonstrate that our approach is effective in identifying known, as well as novel biological pathway annotations. Narada is implemented in Java and is available at http://www.cs.purdue.edu/homes/jpandey/narada/.
Primordial Black Holes from First Principles (Overview)
NASA Astrophysics Data System (ADS)
Lam, Casey; Bloomfield, Jolyon; Moss, Zander; Russell, Megan; Face, Stephen; Guth, Alan
2017-01-01
Given a power spectrum from inflation, our goal is to calculate, from first principles, the number density and mass spectrum of primordial black holes that form in the early universe. Previously, these have been calculated using the Press- Schechter formalism and some demonstrably dubious rules of thumb regarding predictions of black hole collapse. Instead, we use Monte Carlo integration methods to sample field configurations from a power spectrum combined with numerical relativity simulations to obtain a more accurate picture of primordial black hole formation. We demonstrate how this can be applied for both Gaussian perturbations and the more interesting (for primordial black holes) theory of hybrid inflation. One of the tools that we employ is a variant of the BBKS formalism for computing the statistics of density peaks in the early universe. We discuss the issue of overcounting due to subpeaks that can arise from this approach (the ``cloud-in-cloud'' problem). MIT UROP Office- Paul E. Gray (1954) Endowed Fund.
Formal Models of the Network Co-occurrence Underlying Mental Operations.
Bzdok, Danilo; Varoquaux, Gaël; Grisel, Olivier; Eickenberg, Michael; Poupon, Cyril; Thirion, Bertrand
2016-06-01
Systems neuroscience has identified a set of canonical large-scale networks in humans. These have predominantly been characterized by resting-state analyses of the task-unconstrained, mind-wandering brain. Their explicit relationship to defined task performance is largely unknown and remains challenging. The present work contributes a multivariate statistical learning approach that can extract the major brain networks and quantify their configuration during various psychological tasks. The method is validated in two extensive datasets (n = 500 and n = 81) by model-based generation of synthetic activity maps from recombination of shared network topographies. To study a use case, we formally revisited the poorly understood difference between neural activity underlying idling versus goal-directed behavior. We demonstrate that task-specific neural activity patterns can be explained by plausible combinations of resting-state networks. The possibility of decomposing a mental task into the relative contributions of major brain networks, the "network co-occurrence architecture" of a given task, opens an alternative access to the neural substrates of human cognition.
Formal Models of the Network Co-occurrence Underlying Mental Operations
Bzdok, Danilo; Varoquaux, Gaël; Grisel, Olivier; Eickenberg, Michael; Poupon, Cyril; Thirion, Bertrand
2016-01-01
Systems neuroscience has identified a set of canonical large-scale networks in humans. These have predominantly been characterized by resting-state analyses of the task-unconstrained, mind-wandering brain. Their explicit relationship to defined task performance is largely unknown and remains challenging. The present work contributes a multivariate statistical learning approach that can extract the major brain networks and quantify their configuration during various psychological tasks. The method is validated in two extensive datasets (n = 500 and n = 81) by model-based generation of synthetic activity maps from recombination of shared network topographies. To study a use case, we formally revisited the poorly understood difference between neural activity underlying idling versus goal-directed behavior. We demonstrate that task-specific neural activity patterns can be explained by plausible combinations of resting-state networks. The possibility of decomposing a mental task into the relative contributions of major brain networks, the "network co-occurrence architecture" of a given task, opens an alternative access to the neural substrates of human cognition. PMID:27310288
Discrete Mathematical Approaches to Graph-Based Traffic Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joslyn, Cliff A.; Cowley, Wendy E.; Hogan, Emilie A.
2014-04-01
Modern cyber defense and anlaytics requires general, formal models of cyber systems. Multi-scale network models are prime candidates for such formalisms, using discrete mathematical methods based in hierarchically-structured directed multigraphs which also include rich sets of labels. An exemplar of an application of such an approach is traffic analysis, that is, observing and analyzing connections between clients, servers, hosts, and actors within IP networks, over time, to identify characteristic or suspicious patterns. Towards that end, NetFlow (or more generically, IPFLOW) data are available from routers and servers which summarize coherent groups of IP packets flowing through the network. In thismore » paper, we consider traffic analysis of Netflow using both basic graph statistics and two new mathematical measures involving labeled degree distributions and time interval overlap measures. We do all of this over the VAST test data set of 96M synthetic Netflow graph edges, against which we can identify characteristic patterns of simulated ground-truth network attacks.« less
Bayesian inference of physiologically meaningful parameters from body sway measurements.
Tietäväinen, A; Gutmann, M U; Keski-Vakkuri, E; Corander, J; Hæggström, E
2017-06-19
The control of the human body sway by the central nervous system, muscles, and conscious brain is of interest since body sway carries information about the physiological status of a person. Several models have been proposed to describe body sway in an upright standing position, however, due to the statistical intractability of the more realistic models, no formal parameter inference has previously been conducted and the expressive power of such models for real human subjects remains unknown. Using the latest advances in Bayesian statistical inference for intractable models, we fitted a nonlinear control model to posturographic measurements, and we showed that it can accurately predict the sway characteristics of both simulated and real subjects. Our method provides a full statistical characterization of the uncertainty related to all model parameters as quantified by posterior probability density functions, which is useful for comparisons across subjects and test settings. The ability to infer intractable control models from sensor data opens new possibilities for monitoring and predicting body status in health applications.
Formal Methods for Verification and Validation of Partial Specifications: A Case Study
NASA Technical Reports Server (NTRS)
Easterbrook, Steve; Callahan, John
1997-01-01
This paper describes our work exploring the suitability of formal specification methods for independent verification and validation (IV&V) of software specifications for large, safety critical systems. An IV&V contractor often has to perform rapid analysis on incomplete specifications, with no control over how those specifications are represented. Lightweight formal methods show significant promise in this context, as they offer a way of uncovering major errors, without the burden of full proofs of correctness. We describe a case study of the use of partial formal models for V&V of the requirements for Fault Detection Isolation and Recovery on the space station. We conclude that the insights gained from formalizing a specification are valuable, and it is the process of formalization, rather than the end product that is important. It was only necessary to build enough of the formal model to test the properties in which we were interested. Maintenance of fidelity between multiple representations of the same requirements (as they evolve) is still a problem, and deserves further study.
The Standard Model in the history of the Natural Sciences, Econometrics, and the social sciences
NASA Astrophysics Data System (ADS)
Fisher, W. P., Jr.
2010-07-01
In the late 18th and early 19th centuries, scientists appropriated Newton's laws of motion as a model for the conduct of any other field of investigation that would purport to be a science. This early form of a Standard Model eventually informed the basis of analogies for the mathematical expression of phenomena previously studied qualitatively, such as cohesion, affinity, heat, light, electricity, and magnetism. James Clerk Maxwell is known for his repeated use of a formalized version of this method of analogy in lectures, teaching, and the design of experiments. Economists transferring skills learned in physics made use of the Standard Model, especially after Maxwell demonstrated the value of conceiving it in abstract mathematics instead of as a concrete and literal mechanical analogy. Haavelmo's probability approach in econometrics and R. Fisher's Statistical Methods for Research Workers brought a statistical approach to bear on the Standard Model, quietly reversing the perspective of economics and the social sciences relative to that of physics. Where physicists, and Maxwell in particular, intuited scientific method as imposing stringent demands on the quality and interrelations of data, instruments, and theory in the name of inferential and comparative stability, statistical models and methods disconnected theory from data by removing the instrument as an essential component. New possibilities for reconnecting economics and the social sciences to Maxwell's sense of the method of analogy are found in Rasch's probabilistic models for measurement.
NASA Astrophysics Data System (ADS)
Zhou, Shiqi
2004-07-01
A universal formalism, which enables calculation of solvent-mediated potential (SMP) between two equal or non-equal solute particles with any shape immersed in solvent reservior consisting of atomic particle and/or polymer chain or their mixture, is proposed by importing a density functional theory externally into OZ equation systems. Only if size asymmetry of the solvent bath components is moderate, the present formalism can calculate the SMP in any complex fluids at the present development stage of statistical mechanics, and therefore avoids all of limitations of previous approaches for SMP. Preliminary calculation indicates the reliability of the present formalism.
Formal hardware verification of digital circuits
NASA Technical Reports Server (NTRS)
Joyce, J.; Seger, C.-J.
1991-01-01
The use of formal methods to verify the correctness of digital circuits is less constrained by the growing complexity of digital circuits than conventional methods based on exhaustive simulation. This paper briefly outlines three main approaches to formal hardware verification: symbolic simulation, state machine analysis, and theorem-proving.
Chan, Y; Walmsley, R P
1997-12-01
When several treatment methods are available for the same problem, many clinicians are faced with the task of deciding which treatment to use. Many clinicians may have conducted informal "mini-experiments" on their own to determine which treatment is best suited for the problem. These results are usually not documented or reported in a formal manner because many clinicians feel that they are "statistically challenged." Another reason may be because clinicians do not feel they have controlled enough test conditions to warrant analysis. In this update, a statistic is described that does not involve complicated statistical assumptions, making it a simple and easy-to-use statistical method. This update examines the use of two statistics and does not deal with other issues that could affect clinical research such as issues affecting credibility. For readers who want a more in-depth examination of this topic, references have been provided. The Kruskal-Wallis one-way analysis-of-variance-by-ranks test (or H test) is used to determine whether three or more independent groups are the same or different on some variable of interest when an ordinal level of data or an interval or ratio level of data is available. A hypothetical example will be presented to explain when and how to use this statistic, how to interpret results using the statistic, the advantages and disadvantages of the statistic, and what to look for in a written report. This hypothetical example will involve the use of ratio data to demonstrate how to choose between using the nonparametric H test and the more powerful parametric F test.
Statistical mechanics of the Huxley-Simmons model
NASA Astrophysics Data System (ADS)
Caruel, M.; Truskinovsky, L.
2016-06-01
The chemomechanical model of Huxley and Simmons (HS) [A. F. Huxley and R. M. Simmons, Nature 233, 533 (1971), 10.1038/233533a0] provides a paradigmatic description of mechanically induced collective conformational changes relevant in a variety of biological contexts, from muscles power stroke and hair cell gating to integrin binding and hairpin unzipping. We develop a statistical mechanical perspective on the HS model by exploiting a formal analogy with a paramagnetic Ising model. We first study the equilibrium HS model with a finite number of elements and compute explicitly its mechanical and thermal properties. To model kinetics, we derive a master equation and solve it for several loading protocols. The developed formalism is applicable to a broad range of allosteric systems with mean-field interactions.
P values are only an index to evidence: 20th- vs. 21st-century statistical science.
Burnham, K P; Anderson, D R
2014-03-01
Early statistical methods focused on pre-data probability statements (i.e., data as random variables) such as P values; these are not really inferences nor are P values evidential. Statistical science clung to these principles throughout much of the 20th century as a wide variety of methods were developed for special cases. Looking back, it is clear that the underlying paradigm (i.e., testing and P values) was weak. As Kuhn (1970) suggests, new paradigms have taken the place of earlier ones: this is a goal of good science. New methods have been developed and older methods extended and these allow proper measures of strength of evidence and multimodel inference. It is time to move forward with sound theory and practice for the difficult practical problems that lie ahead. Given data the useful foundation shifts to post-data probability statements such as model probabilities (Akaike weights) or related quantities such as odds ratios and likelihood intervals. These new methods allow formal inference from multiple models in the a prior set. These quantities are properly evidential. The past century was aimed at finding the "best" model and making inferences from it. The goal in the 21st century is to base inference on all the models weighted by their model probabilities (model averaging). Estimates of precision can include model selection uncertainty leading to variances conditional on the model set. The 21st century will be about the quantification of information, proper measures of evidence, and multi-model inference. Nelder (1999:261) concludes, "The most important task before us in developing statistical science is to demolish the P-value culture, which has taken root to a frightening extent in many areas of both pure and applied science and technology".
Blended particle filters for large-dimensional chaotic dynamical systems
Majda, Andrew J.; Qi, Di; Sapsis, Themistoklis P.
2014-01-01
A major challenge in contemporary data science is the development of statistically accurate particle filters to capture non-Gaussian features in large-dimensional chaotic dynamical systems. Blended particle filters that capture non-Gaussian features in an adaptively evolving low-dimensional subspace through particles interacting with evolving Gaussian statistics on the remaining portion of phase space are introduced here. These blended particle filters are constructed in this paper through a mathematical formalism involving conditional Gaussian mixtures combined with statistically nonlinear forecast models compatible with this structure developed recently with high skill for uncertainty quantification. Stringent test cases for filtering involving the 40-dimensional Lorenz 96 model with a 5-dimensional adaptive subspace for nonlinear blended filtering in various turbulent regimes with at least nine positive Lyapunov exponents are used here. These cases demonstrate the high skill of the blended particle filter algorithms in capturing both highly non-Gaussian dynamical features as well as crucial nonlinear statistics for accurate filtering in extreme filtering regimes with sparse infrequent high-quality observations. The formalism developed here is also useful for multiscale filtering of turbulent systems and a simple application is sketched below. PMID:24825886
Frequency-resolved Monte Carlo.
López Carreño, Juan Camilo; Del Valle, Elena; Laussy, Fabrice P
2018-05-03
We adapt the Quantum Monte Carlo method to the cascaded formalism of quantum optics, allowing us to simulate the emission of photons of known energy. Statistical processing of the photon clicks thus collected agrees with the theory of frequency-resolved photon correlations, extending the range of applications based on correlations of photons of prescribed energy, in particular those of a photon-counting character. We apply the technique to autocorrelations of photon streams from a two-level system under coherent and incoherent pumping, including the Mollow triplet regime where we demonstrate the direct manifestation of leapfrog processes in producing an increased rate of two-photon emission events.
Combining statistical inference and decisions in ecology.
Williams, Perry J; Hooten, Mevin B
2016-09-01
Statistical decision theory (SDT) is a sub-field of decision theory that formally incorporates statistical investigation into a decision-theoretic framework to account for uncertainties in a decision problem. SDT provides a unifying analysis of three types of information: statistical results from a data set, knowledge of the consequences of potential choices (i.e., loss), and prior beliefs about a system. SDT links the theoretical development of a large body of statistical methods, including point estimation, hypothesis testing, and confidence interval estimation. The theory and application of SDT have mainly been developed and published in the fields of mathematics, statistics, operations research, and other decision sciences, but have had limited exposure in ecology. Thus, we provide an introduction to SDT for ecologists and describe its utility for linking the conventionally separate tasks of statistical investigation and decision making in a single framework. We describe the basic framework of both Bayesian and frequentist SDT, its traditional use in statistics, and discuss its application to decision problems that occur in ecology. We demonstrate SDT with two types of decisions: Bayesian point estimation and an applied management problem of selecting a prescribed fire rotation for managing a grassland bird species. Central to SDT, and decision theory in general, are loss functions. Thus, we also provide basic guidance and references for constructing loss functions for an SDT problem. © 2016 by the Ecological Society of America.
Verifying Hybrid Systems Modeled as Timed Automata: A Case Study
1997-03-01
Introduction Researchers have proposed many innovative formal methods for developing real - time systems [9]. Such methods can give system developers and...customers greater con dence that real - time systems satisfy their requirements, especially their crit- ical requirements. However, applying formal methods...specifying and reasoning about real - time systems that is designed to address these challenging problems. Our approach is to build formal reasoning tools
Formal functional test designs with a test representation language
NASA Technical Reports Server (NTRS)
Hops, J. M.
1993-01-01
The application of the category-partition method to the test design phase of hardware, software, or system test development is discussed. The method provides a formal framework for reducing the total number of possible test cases to a minimum logical subset for effective testing. An automatic tool and a formal language were developed to implement the method and produce the specification of test cases.
Robust functional statistics applied to Probability Density Function shape screening of sEMG data.
Boudaoud, S; Rix, H; Al Harrach, M; Marin, F
2014-01-01
Recent studies pointed out possible shape modifications of the Probability Density Function (PDF) of surface electromyographical (sEMG) data according to several contexts like fatigue and muscle force increase. Following this idea, criteria have been proposed to monitor these shape modifications mainly using High Order Statistics (HOS) parameters like skewness and kurtosis. In experimental conditions, these parameters are confronted with small sample size in the estimation process. This small sample size induces errors in the estimated HOS parameters restraining real-time and precise sEMG PDF shape monitoring. Recently, a functional formalism, the Core Shape Model (CSM), has been used to analyse shape modifications of PDF curves. In this work, taking inspiration from CSM method, robust functional statistics are proposed to emulate both skewness and kurtosis behaviors. These functional statistics combine both kernel density estimation and PDF shape distances to evaluate shape modifications even in presence of small sample size. Then, the proposed statistics are tested, using Monte Carlo simulations, on both normal and Log-normal PDFs that mimic observed sEMG PDF shape behavior during muscle contraction. According to the obtained results, the functional statistics seem to be more robust than HOS parameters to small sample size effect and more accurate in sEMG PDF shape screening applications.
From the necessary to the possible: the genesis of the spin-statistics theorem
NASA Astrophysics Data System (ADS)
Blum, Alexander
2014-12-01
The spin-statistics theorem, which relates the intrinsic angular momentum of a single particle to the type of quantum statistics obeyed by a system of many such particles, is one of the central theorems in quantum field theory and the physics of elementary particles. It was first formulated in 1939/40 by Wolfgang Pauli and his assistant Markus Fierz. This paper discusses the developments that led up to this first formulation, starting from early attempts in the late 1920s to explain why charged matter particles obey Fermi-Dirac statistics, while photons obey Bose-Einstein statistics. It is demonstrated how several important developments paved the way from such general philosophical musings to a general (and provable) theorem, most notably the use of quantum field theory, the discovery of new elementary particles, and the generalization of the notion of spin. It is also discussed how the attempts to prove a spin-statistics connection were driven by Pauli from formal to more physical arguments, culminating in Pauli's 1940 proof. This proof was a major success for the beleaguered theory of quantum field theory and the methods Pauli employed proved essential for the renaissance of quantum field theory and the development of renormalization techniques in the late 1940s.
On sufficient statistics of least-squares superposition of vector sets.
Konagurthu, Arun S; Kasarapu, Parthan; Allison, Lloyd; Collier, James H; Lesk, Arthur M
2015-06-01
The problem of superposition of two corresponding vector sets by minimizing their sum-of-squares error under orthogonal transformation is a fundamental task in many areas of science, notably structural molecular biology. This problem can be solved exactly using an algorithm whose time complexity grows linearly with the number of correspondences. This efficient solution has facilitated the widespread use of the superposition task, particularly in studies involving macromolecular structures. This article formally derives a set of sufficient statistics for the least-squares superposition problem. These statistics are additive. This permits a highly efficient (constant time) computation of superpositions (and sufficient statistics) of vector sets that are composed from its constituent vector sets under addition or deletion operation, where the sufficient statistics of the constituent sets are already known (that is, the constituent vector sets have been previously superposed). This results in a drastic improvement in the run time of the methods that commonly superpose vector sets under addition or deletion operations, where previously these operations were carried out ab initio (ignoring the sufficient statistics). We experimentally demonstrate the improvement our work offers in the context of protein structural alignment programs that assemble a reliable structural alignment from well-fitting (substructural) fragment pairs. A C++ library for this task is available online under an open-source license.
Experiences applying Formal Approaches in the Development of Swarm-Based Space Exploration Systems
NASA Technical Reports Server (NTRS)
Rouff, Christopher A.; Hinchey, Michael G.; Truszkowski, Walter F.; Rash, James L.
2006-01-01
NASA is researching advanced technologies for future exploration missions using intelligent swarms of robotic vehicles. One of these missions is the Autonomous Nan0 Technology Swarm (ANTS) mission that will explore the asteroid belt using 1,000 cooperative autonomous spacecraft. The emergent properties of intelligent swarms make it a potentially powerful concept, but at the same time more difficult to design and ensure that the proper behaviors will emerge. NASA is investigating formal methods and techniques for verification of such missions. The advantage of using formal methods is the ability to mathematically verify the behavior of a swarm, emergent or otherwise. Using the ANTS mission as a case study, we have evaluated multiple formal methods to determine their effectiveness in modeling and ensuring desired swarm behavior. This paper discusses the results of this evaluation and proposes an integrated formal method for ensuring correct behavior of future NASA intelligent swarms.
2011-01-01
Background Meta-analysis is a popular methodology in several fields of medical research, including genetic association studies. However, the methods used for meta-analysis of association studies that report haplotypes have not been studied in detail. In this work, methods for performing meta-analysis of haplotype association studies are summarized, compared and presented in a unified framework along with an empirical evaluation of the literature. Results We present multivariate methods that use summary-based data as well as methods that use binary and count data in a generalized linear mixed model framework (logistic regression, multinomial regression and Poisson regression). The methods presented here avoid the inflation of the type I error rate that could be the result of the traditional approach of comparing a haplotype against the remaining ones, whereas, they can be fitted using standard software. Moreover, formal global tests are presented for assessing the statistical significance of the overall association. Although the methods presented here assume that the haplotypes are directly observed, they can be easily extended to allow for such an uncertainty by weighting the haplotypes by their probability. Conclusions An empirical evaluation of the published literature and a comparison against the meta-analyses that use single nucleotide polymorphisms, suggests that the studies reporting meta-analysis of haplotypes contain approximately half of the included studies and produce significant results twice more often. We show that this excess of statistically significant results, stems from the sub-optimal method of analysis used and, in approximately half of the cases, the statistical significance is refuted if the data are properly re-analyzed. Illustrative examples of code are given in Stata and it is anticipated that the methods developed in this work will be widely applied in the meta-analysis of haplotype association studies. PMID:21247440
A Survey of Logic Formalisms to Support Mishap Analysis
NASA Technical Reports Server (NTRS)
Johnson, Chris; Holloway, C. M.
2003-01-01
Mishap investigations provide important information about adverse events and near miss incidents. They are intended to help avoid any recurrence of previous failures. Over time, they can also yield statistical information about incident frequencies that helps to detect patterns of failure and can validate risk assessments. However, the increasing complexity of many safety critical systems is posing new challenges for mishap analysis. Similarly, the recognition that many failures have complex, systemic causes has helped to widen the scope of many mishap investigations. These two factors have combined to pose new challenges for the analysis of adverse events. A new generation of formal and semi-formal techniques have been proposed to help investigators address these problems. We introduce the term mishap logics to collectively describe these notations that might be applied to support the analysis of mishaps. The proponents of these notations have argued that they can be used to formally prove that certain events created the necessary and sufficient causes for a mishap to occur. These proofs can be used to reduce the bias that is often perceived to effect the interpretation of adverse events. Others have argued that one cannot use logic formalisms to prove causes in the same way that one might prove propositions or theorems. Such mechanisms cannot accurately capture the wealth of inductive, deductive and statistical forms of inference that investigators must use in their analysis of adverse events. This paper provides an overview of these mishap logics. It also identifies several additional classes of logic that might also be used to support mishap analysis.
NASA Technical Reports Server (NTRS)
Jamsek, Damir A.
1993-01-01
A brief example of the use of formal methods techniques in the specification of a software system is presented. The report is part of a larger effort targeted at defining a formal methods pilot project for NASA. One possible application domain that may be used to demonstrate the effective use of formal methods techniques within the NASA environment is presented. It is not intended to provide a tutorial on either formal methods techniques or the application being addressed. It should, however, provide an indication that the application being considered is suitable for a formal methods by showing how such a task may be started. The particular system being addressed is the Structured File Services (SFS), which is a part of the Data Storage and Retrieval Subsystem (DSAR), which in turn is part of the Data Management System (DMS) onboard Spacestation Freedom. This is a software system that is currently under development for NASA. An informal mathematical development is presented. Section 3 contains the same development using Penelope (23), an Ada specification and verification system. The complete text of the English version Software Requirements Specification (SRS) is reproduced in Appendix A.
Using Formal Methods to Assist in the Requirements Analysis of the Space Shuttle GPS Change Request
NASA Technical Reports Server (NTRS)
DiVito, Ben L.; Roberts, Larry W.
1996-01-01
We describe a recent NASA-sponsored pilot project intended to gauge the effectiveness of using formal methods in Space Shuttle software requirements analysis. Several Change Requests (CR's) were selected as promising targets to demonstrate the utility of formal methods in this application domain. A CR to add new navigation capabilities to the Shuttle, based on Global Positioning System (GPS) technology, is the focus of this report. Carried out in parallel with the Shuttle program's conventional requirements analysis process was a limited form of analysis based on formalized requirements. Portions of the GPS CR were modeled using the language of SRI's Prototype Verification System (PVS). During the formal methods-based analysis, numerous requirements issues were discovered and submitted as official issues through the normal requirements inspection process. Shuttle analysts felt that many of these issues were uncovered earlier than would have occurred with conventional methods. We present a summary of these encouraging results and conclusions we have drawn from the pilot project.
A threshold method for immunological correlates of protection
2013-01-01
Background Immunological correlates of protection are biological markers such as disease-specific antibodies which correlate with protection against disease and which are measurable with immunological assays. It is common in vaccine research and in setting immunization policy to rely on threshold values for the correlate where the accepted threshold differentiates between individuals who are considered to be protected against disease and those who are susceptible. Examples where thresholds are used include development of a new generation 13-valent pneumococcal conjugate vaccine which was required in clinical trials to meet accepted thresholds for the older 7-valent vaccine, and public health decision making on vaccination policy based on long-term maintenance of protective thresholds for Hepatitis A, rubella, measles, Japanese encephalitis and others. Despite widespread use of such thresholds in vaccine policy and research, few statistical approaches have been formally developed which specifically incorporate a threshold parameter in order to estimate the value of the protective threshold from data. Methods We propose a 3-parameter statistical model called the a:b model which incorporates parameters for a threshold and constant but different infection probabilities below and above the threshold estimated using profile likelihood or least squares methods. Evaluation of the estimated threshold can be performed by a significance test for the existence of a threshold using a modified likelihood ratio test which follows a chi-squared distribution with 3 degrees of freedom, and confidence intervals for the threshold can be obtained by bootstrapping. The model also permits assessment of relative risk of infection in patients achieving the threshold or not. Goodness-of-fit of the a:b model may be assessed using the Hosmer-Lemeshow approach. The model is applied to 15 datasets from published clinical trials on pertussis, respiratory syncytial virus and varicella. Results Highly significant thresholds with p-values less than 0.01 were found for 13 of the 15 datasets. Considerable variability was seen in the widths of confidence intervals. Relative risks indicated around 70% or better protection in 11 datasets and relevance of the estimated threshold to imply strong protection. Goodness-of-fit was generally acceptable. Conclusions The a:b model offers a formal statistical method of estimation of thresholds differentiating susceptible from protected individuals which has previously depended on putative statements based on visual inspection of data. PMID:23448322
Properties of a Formal Method to Model Emergence in Swarm-Based Systems
NASA Technical Reports Server (NTRS)
Rouff, Christopher; Vanderbilt, Amy; Truszkowski, Walt; Rash, James; Hinchey, Mike
2004-01-01
Future space missions will require cooperation between multiple satellites and/or rovers. Developers are proposing intelligent autonomous swarms for these missions, but swarm-based systems are difficult or impossible to test with current techniques. This viewgraph presentation examines the use of formal methods in testing swarm-based systems. The potential usefulness of formal methods in modeling the ANTS asteroid encounter mission is also examined.
Formal Methods of V&V of Partial Specifications: An Experience Report
NASA Technical Reports Server (NTRS)
Easterbrook, Steve; Callahan, John
1997-01-01
This paper describes our work exploring the suitability of formal specification methods for independent verification and validation (IV&V) of software specifications for large, safety critical systems. An IV&V contractor often has to perform rapid analysis on incomplete specifications, with no control over how those specifications are represented. Lightweight formal methods show significant promise in this context, as they offer a way of uncovering major errors, without the burden of full proofs of correctness. We describe an experiment in the application of the method SCR. to testing for consistency properties of a partial model of requirements for Fault Detection Isolation and Recovery on the space station. We conclude that the insights gained from formalizing a specification is valuable, and it is the process of formalization, rather than the end product that is important. It was only necessary to build enough of the formal model to test the properties in which we were interested. Maintenance of fidelity between multiple representations of the same requirements (as they evolve) is still a problem, and deserves further study.
NASA Astrophysics Data System (ADS)
Nagai, Tetsuro
2017-01-01
Replica-exchange molecular dynamics (REMD) has demonstrated its efficiency by combining trajectories of a wide range of temperatures. As an extension of the method, the author formalizes the mass-manipulating replica-exchange molecular dynamics (MMREMD) method that allows for arbitrary mass scaling with respect to temperature and individual particles. The formalism enables the versatile application of mass-scaling approaches to the REMD method. The key change introduced in the novel formalism is the generalized rules for the velocity and momentum scaling after accepted replica-exchange attempts. As an application of this general formalism, the refinement of the viscosity-REMD (V-REMD) method [P. H. Nguyen,
Proceedings of the Sixth NASA Langley Formal Methods (LFM) Workshop
NASA Technical Reports Server (NTRS)
Rozier, Kristin Yvonne (Editor)
2008-01-01
Today's verification techniques are hard-pressed to scale with the ever-increasing complexity of safety critical systems. Within the field of aeronautics alone, we find the need for verification of algorithms for separation assurance, air traffic control, auto-pilot, Unmanned Aerial Vehicles (UAVs), adaptive avionics, automated decision authority, and much more. Recent advances in formal methods have made verifying more of these problems realistic. Thus we need to continually re-assess what we can solve now and identify the next barriers to overcome. Only through an exchange of ideas between theoreticians and practitioners from academia to industry can we extend formal methods for the verification of ever more challenging problem domains. This volume contains the extended abstracts of the talks presented at LFM 2008: The Sixth NASA Langley Formal Methods Workshop held on April 30 - May 2, 2008 in Newport News, Virginia, USA. The topics of interest that were listed in the call for abstracts were: advances in formal verification techniques; formal models of distributed computing; planning and scheduling; automated air traffic management; fault tolerance; hybrid systems/hybrid automata; embedded systems; safety critical applications; safety cases; accident/safety analysis.
Proceedings of the First NASA Formal Methods Symposium
NASA Technical Reports Server (NTRS)
Denney, Ewen (Editor); Giannakopoulou, Dimitra (Editor); Pasareanu, Corina S. (Editor)
2009-01-01
Topics covered include: Model Checking - My 27-Year Quest to Overcome the State Explosion Problem; Applying Formal Methods to NASA Projects: Transition from Research to Practice; TLA+: Whence, Wherefore, and Whither; Formal Methods Applications in Air Transportation; Theorem Proving in Intel Hardware Design; Building a Formal Model of a Human-Interactive System: Insights into the Integration of Formal Methods and Human Factors Engineering; Model Checking for Autonomic Systems Specified with ASSL; A Game-Theoretic Approach to Branching Time Abstract-Check-Refine Process; Software Model Checking Without Source Code; Generalized Abstract Symbolic Summaries; A Comparative Study of Randomized Constraint Solvers for Random-Symbolic Testing; Component-Oriented Behavior Extraction for Autonomic System Design; Automated Verification of Design Patterns with LePUS3; A Module Language for Typing by Contracts; From Goal-Oriented Requirements to Event-B Specifications; Introduction of Virtualization Technology to Multi-Process Model Checking; Comparing Techniques for Certified Static Analysis; Towards a Framework for Generating Tests to Satisfy Complex Code Coverage in Java Pathfinder; jFuzz: A Concolic Whitebox Fuzzer for Java; Machine-Checkable Timed CSP; Stochastic Formal Correctness of Numerical Algorithms; Deductive Verification of Cryptographic Software; Coloured Petri Net Refinement Specification and Correctness Proof with Coq; Modeling Guidelines for Code Generation in the Railway Signaling Context; Tactical Synthesis Of Efficient Global Search Algorithms; Towards Co-Engineering Communicating Autonomous Cyber-Physical Systems; and Formal Methods for Automated Diagnosis of Autosub 6000.
A Tool for Requirements-Based Programming
NASA Technical Reports Server (NTRS)
Rash, James L.; Hinchey, Michael G.; Rouff, Christopher A.; Gracanin, Denis; Erickson, John
2005-01-01
Absent a general method for mathematically sound, automated transformation of customer requirements into a formal model of the desired system, developers must resort to either manual application of formal methods or to system testing (either manual or automated). While formal methods have afforded numerous successes, they present serious issues, e.g., costs to gear up to apply them (time, expensive staff), and scalability and reproducibility when standards in the field are not settled. The testing path cannot be walked to the ultimate goal, because exhaustive testing is infeasible for all but trivial systems. So system verification remains problematic. System or requirements validation is similarly problematic. The alternatives available today depend on either having a formal model or pursuing enough testing to enable the customer to be certain that system behavior meets requirements. The testing alternative for non-trivial systems always have some system behaviors unconfirmed and therefore is not the answer. To ensure that a formal model is equivalent to the customer s requirements necessitates that the customer somehow fully understands the formal model, which is not realistic. The predominant view that provably correct system development depends on having a formal model of the system leads to a desire for a mathematically sound method to automate the transformation of customer requirements into a formal model. Such a method, an augmentation of requirements-based programming, will be briefly described in this paper, and a prototype tool to support it will be described. The method and tool enable both requirements validation and system verification for the class of systems whose behavior can be described as scenarios. An application of the tool to a prototype automated ground control system for NASA mission is presented.
ERIC Educational Resources Information Center
Hilbig, Benjamin E.
2012-01-01
Extending the well-established negativity bias in human cognition to truth judgments, it was recently shown that negatively framed statistical statements are more likely to be considered true than formally equivalent statements framed positively. However, the underlying processes responsible for this effect are insufficiently understood.…
Using a Discussion about Scientific Controversy to Teach Central Concepts in Experimental Design
ERIC Educational Resources Information Center
Bennett, Kimberley Ann
2015-01-01
Students may need explicit training in informal statistical reasoning in order to design experiments or use formal statistical tests effectively. By using scientific scandals and media misinterpretation, we can explore the need for good experimental design in an informal way. This article describes the use of a paper that reviews the measles mumps…
Statistics of primordial density perturbations from discrete seed masses
NASA Technical Reports Server (NTRS)
Scherrer, Robert J.; Bertschinger, Edmund
1991-01-01
The statistics of density perturbations for general distributions of seed masses with arbitrary matter accretion is examined. Formal expressions for the power spectrum, the N-point correlation functions, and the density distribution function are derived. These results are applied to the case of uncorrelated seed masses, and power spectra are derived for accretion of both hot and cold dark matter plus baryons. The reduced moments (cumulants) of the density distribution are computed and used to obtain a series expansion for the density distribution function. Analytic results are obtained for the density distribution function in the case of a distribution of seed masses with a spherical top-hat accretion pattern. More generally, the formalism makes it possible to give a complete characterization of the statistical properties of any random field generated from a discrete linear superposition of kernels. In particular, the results can be applied to density fields derived by smoothing a discrete set of points with a window function.
NASA Astrophysics Data System (ADS)
Arena, Dylan A.; Schwartz, Daniel L.
2014-08-01
Well-designed digital games can deliver powerful experiences that are difficult to provide through traditional instruction, while traditional instruction can deliver formal explanations that are not a natural fit for gameplay. Combined, they can accomplish more than either can alone. An experiment tested this claim using the topic of statistics, where people's everyday experiences often conflict with normative statistical theories and a videogame might provide an alternate set of experiences for students to draw upon. The research used a game called Stats Invaders!, a variant of the classic videogame Space Invaders. In Stats Invaders!, the locations of descending alien invaders follow probability distributions, and players need to infer the shape of the distributions to play well. The experiment tested whether the game developed participants' intuitions about the structure of random events and thereby prepared them for future learning from a subsequent written passage on probability distributions. Community-college students who played the game and then read the passage learned more than participants who only read the passage.
Superstatistics with different kinds of distributions in the deformed formalism
NASA Astrophysics Data System (ADS)
Sargolzaeipor, S.; Hassanabadi, H.; Chung, W. S.
2018-03-01
In this article, after first introducing superstatistics, the effective Boltzmann factor in a deformed formalism for modified Dirac delta, uniform, two-level and Gamma distributions is derived. Then we make use of the superstatistics for four important problems in physics and the thermodynamic properties of the system are calculated. All results in the limit case are reduced to ordinary statistical mechanics. Furthermore, effects of all parameters in the problems are calculated and shown graphically.
Weatherbee, Andrew; Sugita, Mitsuro; Bizheva, Kostadinka; Popov, Ivan; Vitkin, Alex
2016-06-15
The distribution of backscattered intensities as described by the probability density function (PDF) of tissue-scattered light contains information that may be useful for tissue assessment and diagnosis, including characterization of its pathology. In this Letter, we examine the PDF description of the light scattering statistics in a well characterized tissue-like particulate medium using optical coherence tomography (OCT). It is shown that for low scatterer density, the governing statistics depart considerably from a Gaussian description and follow the K distribution for both OCT amplitude and intensity. The PDF formalism is shown to be independent of the scatterer flow conditions; this is expected from theory, and suggests robustness and motion independence of the OCT amplitude (and OCT intensity) PDF metrics in the context of potential biomedical applications.
NASA Astrophysics Data System (ADS)
Paine, Gregory Harold
1982-03-01
The primary objective of the thesis is to explore the dynamical properties of small nerve networks by means of the methods of statistical mechanics. To this end, a general formalism is developed and applied to elementary groupings of model neurons which are driven by either constant (steady state) or nonconstant (nonsteady state) forces. Neuronal models described by a system of coupled, nonlinear, first-order, ordinary differential equations are considered. A linearized form of the neuronal equations is studied in detail. A Lagrange function corresponding to the linear neural network is constructed which, through a Legendre transformation, provides a constant of motion. By invoking the Maximum-Entropy Principle with the single integral of motion as a constraint, a probability distribution function for the network in a steady state can be obtained. The formalism is implemented for some simple networks driven by a constant force; accordingly, the analysis focuses on a study of fluctuations about the steady state. In particular, a network composed of N noninteracting neurons, termed Free Thinkers, is considered in detail, with a view to interpretation and numerical estimation of the Lagrange multiplier corresponding to the constant of motion. As an archetypical example of a net of interacting neurons, the classical neural oscillator, consisting of two mutually inhibitory neurons, is investigated. It is further shown that in the case of a network driven by a nonconstant force, the Maximum-Entropy Principle can be applied to determine a probability distribution functional describing the network in a nonsteady state. The above examples are reconsidered with nonconstant driving forces which produce small deviations from the steady state. Numerical studies are performed on simplified models of two physical systems: the starfish central nervous system and the mammalian olfactory bulb. Discussions are given as to how statistical neurodynamics can be used to gain a better understanding of the behavior of these systems.
Jones, Hayley E; Hickman, Matthew; Kasprzyk-Hordern, Barbara; Welton, Nicky J; Baker, David R; Ades, A E
2014-07-15
Concentrations of metabolites of illicit drugs in sewage water can be measured with great accuracy and precision, thanks to the development of sensitive and robust analytical methods. Based on assumptions about factors including the excretion profile of the parent drug, routes of administration and the number of individuals using the wastewater system, the level of consumption of a drug can be estimated from such measured concentrations. When presenting results from these 'back-calculations', the multiple sources of uncertainty are often discussed, but are not usually explicitly taken into account in the estimation process. In this paper we demonstrate how these calculations can be placed in a more formal statistical framework by assuming a distribution for each parameter involved, based on a review of the evidence underpinning it. Using a Monte Carlo simulations approach, it is then straightforward to propagate uncertainty in each parameter through the back-calculations, producing a distribution for instead of a single estimate of daily or average consumption. This can be summarised for example by a median and credible interval. To demonstrate this approach, we estimate cocaine consumption in a large urban UK population, using measured concentrations of two of its metabolites, benzoylecgonine and norbenzoylecgonine. We also demonstrate a more sophisticated analysis, implemented within a Bayesian statistical framework using Markov chain Monte Carlo simulation. Our model allows the two metabolites to simultaneously inform estimates of daily cocaine consumption and explicitly allows for variability between days. After accounting for this variability, the resulting credible interval for average daily consumption is appropriately wider, representing additional uncertainty. We discuss possibilities for extensions to the model, and whether analysis of wastewater samples has potential to contribute to a prevalence model for illicit drug use. Copyright © 2014. Published by Elsevier B.V.
Jones, Hayley E.; Hickman, Matthew; Kasprzyk-Hordern, Barbara; Welton, Nicky J.; Baker, David R.; Ades, A.E.
2014-01-01
Concentrations of metabolites of illicit drugs in sewage water can be measured with great accuracy and precision, thanks to the development of sensitive and robust analytical methods. Based on assumptions about factors including the excretion profile of the parent drug, routes of administration and the number of individuals using the wastewater system, the level of consumption of a drug can be estimated from such measured concentrations. When presenting results from these ‘back-calculations’, the multiple sources of uncertainty are often discussed, but are not usually explicitly taken into account in the estimation process. In this paper we demonstrate how these calculations can be placed in a more formal statistical framework by assuming a distribution for each parameter involved, based on a review of the evidence underpinning it. Using a Monte Carlo simulations approach, it is then straightforward to propagate uncertainty in each parameter through the back-calculations, producing a distribution for instead of a single estimate of daily or average consumption. This can be summarised for example by a median and credible interval. To demonstrate this approach, we estimate cocaine consumption in a large urban UK population, using measured concentrations of two of its metabolites, benzoylecgonine and norbenzoylecgonine. We also demonstrate a more sophisticated analysis, implemented within a Bayesian statistical framework using Markov chain Monte Carlo simulation. Our model allows the two metabolites to simultaneously inform estimates of daily cocaine consumption and explicitly allows for variability between days. After accounting for this variability, the resulting credible interval for average daily consumption is appropriately wider, representing additional uncertainty. We discuss possibilities for extensions to the model, and whether analysis of wastewater samples has potential to contribute to a prevalence model for illicit drug use. PMID:24636801
Teif, Vladimir B
2007-01-01
The transfer matrix methodology is proposed as a systematic tool for the statistical-mechanical description of DNA-protein-drug binding involved in gene regulation. We show that a genetic system of several cis-regulatory modules is calculable using this method, considering explicitly the site-overlapping, competitive, cooperative binding of regulatory proteins, their multilayer assembly and DNA looping. In the methodological section, the matrix models are solved for the basic types of short- and long-range interactions between DNA-bound proteins, drugs and nucleosomes. We apply the matrix method to gene regulation at the O(R) operator of phage lambda. The transfer matrix formalism allowed the description of the lambda-switch at a single-nucleotide resolution, taking into account the effects of a range of inter-protein distances. Our calculations confirm previously established roles of the contact CI-Cro-RNAP interactions. Concerning long-range interactions, we show that while the DNA loop between the O(R) and O(L) operators is important at the lysogenic CI concentrations, the interference between the adjacent promoters P(R) and P(RM) becomes more important at small CI concentrations. A large change in the expression pattern may arise in this regime due to anticooperative interactions between DNA-bound RNA polymerases. The applicability of the matrix method to more complex systems is discussed.
Software Formal Inspections Guidebook
NASA Technical Reports Server (NTRS)
1993-01-01
The Software Formal Inspections Guidebook is designed to support the inspection process of software developed by and for NASA. This document provides information on how to implement a recommended and proven method for conducting formal inspections of NASA software. This Guidebook is a companion document to NASA Standard 2202-93, Software Formal Inspections Standard, approved April 1993, which provides the rules, procedures, and specific requirements for conducting software formal inspections. Application of the Formal Inspections Standard is optional to NASA program or project management. In cases where program or project management decide to use the formal inspections method, this Guidebook provides additional information on how to establish and implement the process. The goal of the formal inspections process as documented in the above-mentioned Standard and this Guidebook is to provide a framework and model for an inspection process that will enable the detection and elimination of defects as early as possible in the software life cycle. An ancillary aspect of the formal inspection process incorporates the collection and analysis of inspection data to effect continual improvement in the inspection process and the quality of the software subjected to the process.
Theoretical approaches to the steady-state statistical physics of interacting dissipative units
NASA Astrophysics Data System (ADS)
Bertin, Eric
2017-02-01
The aim of this review is to provide a concise overview of some of the generic approaches that have been developed to deal with the statistical description of large systems of interacting dissipative ‘units’. The latter notion includes, e.g. inelastic grains, active or self-propelled particles, bubbles in a foam, low-dimensional dynamical systems like driven oscillators, or even spatially extended modes like Fourier modes of the velocity field in a fluid. We first review methods based on the statistical properties of a single unit, starting with elementary mean-field approximations, either static or dynamic, that describe a unit embedded in a ‘self-consistent’ environment. We then discuss how this basic mean-field approach can be extended to account for spatial dependences, in the form of space-dependent mean-field Fokker-Planck equations, for example. We also briefly review the use of kinetic theory in the framework of the Boltzmann equation, which is an appropriate description for dilute systems. We then turn to descriptions in terms of the full N-body distribution, starting from exact solutions of one-dimensional models, using a matrix-product ansatz method when correlations are present. Since exactly solvable models are scarce, we also present some approximation methods which can be used to determine the N-body distribution in a large system of dissipative units. These methods include the Edwards approach for dense granular matter and the approximate treatment of multiparticle Langevin equations with colored noise, which models systems of self-propelled particles. Throughout this review, emphasis is put on methodological aspects of the statistical modeling and on formal similarities between different physical problems, rather than on the specific behavior of a given system.
THE CAUSAL ANALYSIS / DIAGNOSIS DECISION ...
CADDIS is an on-line decision support system that helps investigators in the regions, states and tribes find, access, organize, use and share information to produce causal evaluations in aquatic systems. It is based on the US EPA's Stressor Identification process which is a formal method for identifying causes of impairments in aquatic systems. CADDIS 2007 increases access to relevant information useful for causal analysis and provides methods and tools that practitioners can use to analyze their own data. The new Candidate Cause section provides overviews of commonly encountered causes of impairments to aquatic systems: metals, sediments, nutrients, flow alteration, temperature, ionic strength, and low dissolved oxygen. CADDIS includes new Conceptual Models that illustrate the relationships from sources to stressors to biological effects. An Interactive Conceptual Model for phosphorus links the diagram with supporting literature citations. The new Analyzing Data section helps practitioners analyze their data sets and interpret and use those results as evidence within the USEPA causal assessment process. Downloadable tools include a graphical user interface statistical package (CADStat), and programs for use with the freeware R statistical package, and a Microsoft Excel template. These tools can be used to quantify associations between causes and biological impairments using innovative methods such as species-sensitivity distributions, biological inferenc
A Formal Approach to Requirements-Based Programming
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.
2005-01-01
No significant general-purpose method is currently available to mechanically transform system requirements into a provably equivalent model. The widespread use of such a method represents a necessary step toward high-dependability system engineering for numerous application domains. Current tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The "gap" unfilled by such tools and methods is that the formal models cannot be proven to be equivalent to the requirements. We offer a method for mechanically transforming requirements into a provably equivalent formal model that can be used as the basis for code generation and other transformations. This method is unique in offering full mathematical tractability while using notations and techniques that are well known and well trusted. Finally, we describe further application areas we are investigating for use of the approach.
Quantum formalism for classical statistics
NASA Astrophysics Data System (ADS)
Wetterich, C.
2018-06-01
In static classical statistical systems the problem of information transport from a boundary to the bulk finds a simple description in terms of wave functions or density matrices. While the transfer matrix formalism is a type of Heisenberg picture for this problem, we develop here the associated Schrödinger picture that keeps track of the local probabilistic information. The transport of the probabilistic information between neighboring hypersurfaces obeys a linear evolution equation, and therefore the superposition principle for the possible solutions. Operators are associated to local observables, with rules for the computation of expectation values similar to quantum mechanics. We discuss how non-commutativity naturally arises in this setting. Also other features characteristic of quantum mechanics, such as complex structure, change of basis or symmetry transformations, can be found in classical statistics once formulated in terms of wave functions or density matrices. We construct for every quantum system an equivalent classical statistical system, such that time in quantum mechanics corresponds to the location of hypersurfaces in the classical probabilistic ensemble. For suitable choices of local observables in the classical statistical system one can, in principle, compute all expectation values and correlations of observables in the quantum system from the local probabilistic information of the associated classical statistical system. Realizing a static memory material as a quantum simulator for a given quantum system is not a matter of principle, but rather of practical simplicity.
Applications of Formal Methods to Specification and Safety of Avionics Software
NASA Technical Reports Server (NTRS)
Hoover, D. N.; Guaspari, David; Humenn, Polar
1996-01-01
This report treats several topics in applications of formal methods to avionics software development. Most of these topics concern decision tables, an orderly, easy-to-understand format for formally specifying complex choices among alternative courses of action. The topics relating to decision tables include: generalizations fo decision tables that are more concise and support the use of decision tables in a refinement-based formal software development process; a formalism for systems of decision tables with behaviors; an exposition of Parnas tables for users of decision tables; and test coverage criteria and decision tables. We outline features of a revised version of ORA's decision table tool, Tablewise, which will support many of the new ideas described in this report. We also survey formal safety analysis of specifications and software.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Herbert, J.H.
This brief note describes the probabilistic structure of the Arps/Roberts (A/R) model of petroleum discovery. A model similar to the A/R model is derived from probabilistic propositions demonstrated to be similar to the E. Barouch/G.M. Kaufman (B/K) model, and also demonstrated to be similar to the Drew, Schuenemeyer, and Root (D/S/R) model. This note attempts to elucidate and to simplify some fundamental ideas contained in an unpublished paper by Barouch and Kaufman. This note and its predecessor paper does not attempt to address a wide variety of statistical approaches for estimating petroleum resource availability. Rather, an attempt is made tomore » draw attention to characteristics of certain methods that are commonly used, both formally and informally, to estimate a petroleum resource base for a basin or a nation. Some of these characteristics are statistical, but many are not, except in the broadest sense of the term.« less
Tsallis and Kaniadakis statistics from a point of view of the holographic equipartition law
NASA Astrophysics Data System (ADS)
Abreu, Everton M. C.; Ananias Neto, Jorge; Mendes, Albert C. R.; Bonilla, Alexander
2018-02-01
In this work, we have illustrated the difference between both Tsallis and Kaniadakis entropies through cosmological models obtained from the formalism proposed by Padmanabhan, which is called holographic equipartition law. Similarly to the formalism proposed by Komatsu, we have obtained an extra driving constant term in the Friedmann equation if we deform the Tsallis entropy by Kaniadakis' formalism. We have considered initially Tsallis entropy as the black-hole (BH) area entropy. This constant term may lead the universe to be in an accelerated or decelerated mode. On the other hand, if we start with the Kaniadakis entropy as the BH area entropy and then by modifying the Kappa expression by Tsallis' formalism, the same absolute value but with opposite sign is obtained. In an opposite limit, no driving inflation term of the early universe was derived from both deformations.
A new and trustworthy formalism to compute entropy in quantum systems
NASA Astrophysics Data System (ADS)
Ansari, Mohammad
Entropy is nonlinear in density matrix and as such its evaluation in open quantum system has not been fully understood. Recently a quantum formalism was proposed by Ansari and Nazarov that evaluates entropy using parallel time evolutions of multiple worlds. We can use this formalism to evaluate entropy flow in a photovoltaic cells coupled to thermal reservoirs and cavity modes. Recently we studied the full counting statistics of energy transfers in such systems. This rigorously proves a nontrivial correspondence between energy exchanges and entropy changes in quantum systems, which only in systems without entanglement can be simplified to the textbook second law of thermodynamics. We evaluate the flow of entropy using this formalism. In the presence of entanglement, however, interestingly much less information is exchanged than what we expected. This increases the upper limit capacity for information transfer and its conversion to energy for next generation devices in mesoscopic physics.
Formal methods and their role in digital systems validation for airborne systems
NASA Technical Reports Server (NTRS)
Rushby, John
1995-01-01
This report is based on one prepared as a chapter for the FAA Digital Systems Validation Handbook (a guide to assist FAA certification specialists with advanced technology issues). Its purpose is to explain the use of formal methods in the specification and verification of software and hardware requirements, designs, and implementations; to identify the benefits, weaknesses, and difficulties in applying these methods to digital systems used in critical applications; and to suggest factors for consideration when formal methods are offered in support of certification. The presentation concentrates on the rationale for formal methods and on their contribution to assurance for critical applications within a context such as that provided by DO-178B (the guidelines for software used on board civil aircraft); it is intended as an introduction for those to whom these topics are new.
Cost implications of organizing nursing home workforce in teams.
Mukamel, Dana B; Cai, Shubing; Temkin-Greener, Helena
2009-08-01
To estimate the costs associated with formal and self-managed daily practice teams in nursing homes. Medicaid cost reports for 135 nursing homes in New York State in 2006 and survey data for 6,137 direct care workers. A retrospective statistical analysis: We estimated hybrid cost functions that include team penetration variables. Inference was based on robust standard errors. Formal and self-managed team penetration (i.e., percent of staff working in a team) were calculated from survey responses. Annual variable costs, beds, case mix-adjusted days, admissions, home care visits, outpatient clinic visits, day care days, wages, and ownership were calculated from the cost reports. Formal team penetration was significantly associated with costs, while self-managed teams penetration was not. Costs declined with increasing penetration up to 13 percent of formal teams, and increased above this level. Formal teams in nursing homes in the upward sloping range of the curve were more diverse, with a larger number of participating disciplines and more likely to include physicians. Organization of workforce in formal teams may offer nursing homes a cost-saving strategy. More research is required to understand the relationship between team composition and costs.
Correcting for population structure and kinship using the linear mixed model: theory and extensions.
Hoffman, Gabriel E
2013-01-01
Population structure and kinship are widespread confounding factors in genome-wide association studies (GWAS). It has been standard practice to include principal components of the genotypes in a regression model in order to account for population structure. More recently, the linear mixed model (LMM) has emerged as a powerful method for simultaneously accounting for population structure and kinship. The statistical theory underlying the differences in empirical performance between modeling principal components as fixed versus random effects has not been thoroughly examined. We undertake an analysis to formalize the relationship between these widely used methods and elucidate the statistical properties of each. Moreover, we introduce a new statistic, effective degrees of freedom, that serves as a metric of model complexity and a novel low rank linear mixed model (LRLMM) to learn the dimensionality of the correction for population structure and kinship, and we assess its performance through simulations. A comparison of the results of LRLMM and a standard LMM analysis applied to GWAS data from the Multi-Ethnic Study of Atherosclerosis (MESA) illustrates how our theoretical results translate into empirical properties of the mixed model. Finally, the analysis demonstrates the ability of the LRLMM to substantially boost the strength of an association for HDL cholesterol in Europeans.
Load Model Verification, Validation and Calibration Framework by Statistical Analysis on Field Data
NASA Astrophysics Data System (ADS)
Jiao, Xiangqing; Liao, Yuan; Nguyen, Thai
2017-11-01
Accurate load models are critical for power system analysis and operation. A large amount of research work has been done on load modeling. Most of the existing research focuses on developing load models, while little has been done on developing formal load model verification and validation (V&V) methodologies or procedures. Most of the existing load model validation is based on qualitative rather than quantitative analysis. In addition, not all aspects of model V&V problem have been addressed by the existing approaches. To complement the existing methods, this paper proposes a novel load model verification and validation framework that can systematically and more comprehensively examine load model's effectiveness and accuracy. Statistical analysis, instead of visual check, quantifies the load model's accuracy, and provides a confidence level of the developed load model for model users. The analysis results can also be used to calibrate load models. The proposed framework can be used as a guidance to systematically examine load models for utility engineers and researchers. The proposed method is demonstrated through analysis of field measurements collected from a utility system.
Bridging the gap between formal and experience-based knowledge for context-aware laparoscopy.
Katić, Darko; Schuck, Jürgen; Wekerle, Anna-Laura; Kenngott, Hannes; Müller-Stich, Beat Peter; Dillmann, Rüdiger; Speidel, Stefanie
2016-06-01
Computer assistance is increasingly common in surgery. However, the amount of information is bound to overload processing abilities of surgeons. We propose methods to recognize the current phase of a surgery for context-aware information filtering. The purpose is to select the most suitable subset of information for surgical situations which require special assistance. We combine formal knowledge, represented by an ontology, and experience-based knowledge, represented by training samples, to recognize phases. For this purpose, we have developed two different methods. Firstly, we use formal knowledge about possible phase transitions to create a composition of random forests. Secondly, we propose a method based on cultural optimization to infer formal rules from experience to recognize phases. The proposed methods are compared with a purely formal knowledge-based approach using rules and a purely experience-based one using regular random forests. The comparative evaluation on laparoscopic pancreas resections and adrenalectomies employs a consistent set of quality criteria on clean and noisy input. The rule-based approaches proved best with noisefree data. The random forest-based ones were more robust in the presence of noise. Formal and experience-based knowledge can be successfully combined for robust phase recognition.
A Natural Language Interface Concordant with a Knowledge Base.
Han, Yong-Jin; Park, Seong-Bae; Park, Se-Young
2016-01-01
The discordance between expressions interpretable by a natural language interface (NLI) system and those answerable by a knowledge base is a critical problem in the field of NLIs. In order to solve this discordance problem, this paper proposes a method to translate natural language questions into formal queries that can be generated from a graph-based knowledge base. The proposed method considers a subgraph of a knowledge base as a formal query. Thus, all formal queries corresponding to a concept or a predicate in the knowledge base can be generated prior to query time and all possible natural language expressions corresponding to each formal query can also be collected in advance. A natural language expression has a one-to-one mapping with a formal query. Hence, a natural language question is translated into a formal query by matching the question with the most appropriate natural language expression. If the confidence of this matching is not sufficiently high the proposed method rejects the question and does not answer it. Multipredicate queries are processed by regarding them as a set of collected expressions. The experimental results show that the proposed method thoroughly handles answerable questions from the knowledge base and rejects unanswerable ones effectively.
ERIC Educational Resources Information Center
Wulff, Shaun S.; Wulff, Donald H.
2004-01-01
This article focuses on one instructor's evolution from formal lecturing to interactive teaching and learning in a statistics course. Student perception data are used to demonstrate the instructor's use of communication to align the content, students, and instructor throughout the course. Results indicate that the students learned, that…
Verification of Emergent Behaviors in Swarm-based Systems
NASA Technical Reports Server (NTRS)
Rouff, Christopher; Vanderbilt, Amy; Hinchey, Mike; Truszkowski, Walt; Rash, James
2004-01-01
The emergent properties of swarms make swarm-based missions powerful, but at the same time more difficult to design and to assure that the proper behaviors will emerge. We are currently investigating formal methods and techniques for verification and validation of swarm-based missions. The Autonomous Nano-Technology Swarm (ANTS) mission is being used as an example and case study for swarm-based missions to experiment and test current formal methods with intelligent swarms. Using the ANTS mission, we have evaluated multiple formal methods to determine their effectiveness in modeling and assuring swarm behavior. This paper introduces how intelligent swarm technology is being proposed for NASA missions, and gives the results of a comparison of several formal methods and approaches for specifying intelligent swarm-based systems and their effectiveness for predicting emergent behavior.
Onyeonoro, Ugochukwu U.; Ogah, Okechukwu S.; Ukegbu, Andrew U.; Chukwuonye, Innocent I.; Madukwe, Okechukwu O.; Moses, Akhimiem O.
2016-01-01
BACKGROUND Understanding the differences in care-seeking pattern is key in designing interventions aimed at improving health-care service delivery, including prevention and control of noncommunicable diseases. The aim of this study was to identify the differences and determinants of care-seeking patterns of urban and rural residents in Abia State in southeast Nigeria. METHODS This was a cross-sectional, community-based, study involving 2999 respondents aged 18 years and above. Data were collected using the modified World Health Organization’s STEPS questionnaire, including data on care seeking following the onset of illness. Descriptive statistics and logistic regressions were used to analyze care-seeking behavior and to identify differences among those seeking care in urban and rural areas. RESULTS In both urban and rural areas, patent medicine vendors (73.0%) were the most common sources of primary care following the onset of illness, while only 20.0% of the participants used formal care. Significant predictors of difference in care-seeking practices between residents in urban and rural communities were educational status, income, occupation, and body mass index. CONCLUSIONS Efforts should be made to reduce barriers to formal health-care service utilization in the state by increasing health insurance coverage, strengthening the health-care system, and increasing the role of patent medicine vendors in the formal health-care delivery system. PMID:27721654
Richter, R; Hartmann, A; Meyer, A E; Rüger, U
1994-01-01
Thomas and Schmitz claim that they "deliver a proof for the effectiveness of humanistic methods" (p. 25) with their study. However, they did not or were not able to verify their claim due to several reasons: The authors did not say if and if so to what extent the treatments carried out within the framework of the TK-regulation were treatments using humanistic methods. The validity of the only criterium used by the authors, the average duration of the inability to work, must be questioned. The inferential statistical treatment of the data is insufficient; a non-parametrical evaluation is necessary. Especially missing are personal details concerning the treatment groups (age, sex, occupation, method, duration and frequency of therapy), which are indispensable for a differentiated interpretation. In addition there are numerous formal faults (wrong quotations, mistakes in tables, unclear terms etc.). In view of this criticism we come to the conclusion that the results are to a large degree worthless, at least until several of our objections have been refuted by further information and adequate inferential statistical methods. This study is especially unsuitable to prove a however defined "effectiveness of out-patient psychotherapies", therefore also not suitable to prove the effectiveness of those treatments conducted within the framework of the TK-regulation and especially not suitable to prove the superiority of humanistic methods in comparison with psychoanalytic methods and behavioural therapy.
Hemmer, Paul A; Dadekian, Gregory A; Terndrup, Christopher; Pangaro, Louis N; Weisbrod, Allison B; Corriere, Mark D; Rodriguez, Rechell; Short, Patricia; Kelly, William F
2015-09-01
Face-to-face formal evaluation sessions between clerkship directors and faculty can facilitate the collection of trainee performance data and provide frame-of-reference training for faculty. We hypothesized that ambulatory faculty who attended evaluation sessions at least once in an academic year (attendees) would use the Reporter-Interpreter-Manager/Educator (RIME) terminology more appropriately than faculty who did not attend evaluation sessions (non-attendees). Investigators conducted a retrospective cohort study using the narrative assessments of ambulatory internal medicine clerkship students during the 2008-2009 academic year. The study included assessments of 49 clerkship medical students, which comprised 293 individual teacher narratives. Single-teacher written and transcribed verbal comments about student performance were masked and reviewed by a panel of experts who, by consensus, (1) determined whether RIME was used, (2) counted the number of RIME utterances, and (3) assigned a grade based on the comments. Analysis included descriptive statistics and Pearson correlation coefficients. The authors reviewed 293 individual teacher narratives regarding the performance of 49 students. Attendees explicitly used RIME more frequently than non-attendees (69.8 vs. 40.4 %; p < 0.0001). Grades recommended by attendees correlated more strongly with grades assigned by experts than grades recommended by non-attendees (r = 0.72; 95 % CI (0.65, 0.78) vs. 0.47; 95 % CI (0.26, 0.64); p = 0.005). Grade recommendations from individual attendees and non-attendees each correlated significantly with overall student clerkship clinical performance [r = 0.63; 95 % CI (0.54, 0.71) vs. 0.52 (0.36, 0.66), respectively], although the difference between the groups was not statistically significant (p = 0.21). On an ambulatory clerkship, teachers who attended evaluation sessions used RIME terminology more frequently and provided more accurate grade recommendations than teachers who did not attend. Formal evaluation sessions may provide frame-of-reference training for the RIME framework, a method that improves the validity and reliability of workplace assessment.
NASA Astrophysics Data System (ADS)
Zheng, Mingfang; He, Cunfu; Lu, Yan; Wu, Bin
2018-01-01
We presented a numerical method to solve phase dispersion curve in general anisotropic plates. This approach involves an exact solution to the problem in the form of the Legendre polynomial of multiple integrals, which we substituted into the state-vector formalism. In order to improve the efficiency of the proposed method, we made a special effort to demonstrate the analytical methodology. Furthermore, we analyzed the algebraic symmetries of the matrices in the state-vector formalism for anisotropic plates. The basic feature of the proposed method was the expansion of field quantities by Legendre polynomials. The Legendre polynomial method avoid to solve the transcendental dispersion equation, which can only be solved numerically. This state-vector formalism combined with Legendre polynomial expansion distinguished the adjacent dispersion mode clearly, even when the modes were very close. We then illustrated the theoretical solutions of the dispersion curves by this method for isotropic and anisotropic plates. Finally, we compared the proposed method with the global matrix method (GMM), which shows excellent agreement.
On the theory of Carriers's Electrostatic Interaction near an Interface
NASA Astrophysics Data System (ADS)
Waters, Michael; Hashemi, Hossein; Kieffer, John
2015-03-01
Heterojunction interfaces are common in non-traditional photovoltaic device designs, such as those based small molecules, polymers, and perovskites. We have examined a number of the effects of the heterojunction interface region on carrier/exciton energetics using a mixture of both semi-classical and quantum electrostatic methods, ab initio methods, and statistical mechanics. Our theoretical analysis has yielded several useful relationships and numerical recipes that should be considered in device design regardless of the particular materials system. As a demonstration, we highlight these formalisms as applied to carriers and polaron pairs near a C60/subphthalocyanine interface. On the regularly ordered areas of the heterojunction, the effect of the interface is a significant set of corrections to the carrier energies, which in turn directly affects device performance.
IEEE/NASA Workshop on Leveraging Applications of Formal Methods, Verification, and Validation
NASA Technical Reports Server (NTRS)
Margaria, Tiziana (Editor); Steffen, Bernhard (Editor); Hichey, Michael G.
2005-01-01
This volume contains the Preliminary Proceedings of the 2005 IEEE ISoLA Workshop on Leveraging Applications of Formal Methods, Verification, and Validation, with a special track on the theme of Formal Methods in Human and Robotic Space Exploration. The workshop was held on 23-24 September 2005 at the Loyola College Graduate Center, Columbia, MD, USA. The idea behind the Workshop arose from the experience and feedback of ISoLA 2004, the 1st International Symposium on Leveraging Applications of Formal Methods held in Paphos (Cyprus) last October-November. ISoLA 2004 served the need of providing a forum for developers, users, and researchers to discuss issues related to the adoption and use of rigorous tools and methods for the specification, analysis, verification, certification, construction, test, and maintenance of systems from the point of view of their different application domains.
Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.
2004-01-01
A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including sensor networks and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.
Condensate statistics in interacting and ideal dilute bose gases
Kocharovsky; Kocharovsky; Scully
2000-03-13
We obtain analytical formulas for the statistics, in particular, for the characteristic function and all cumulants, of the Bose-Einstein condensate in dilute weakly interacting and ideal equilibrium gases in the canonical ensemble via the particle-number-conserving operator formalism of Girardeau and Arnowitt. We prove that the ground-state occupation statistics is not Gaussian even in the thermodynamic limit. We calculate the effect of Bogoliubov coupling on suppression of ground-state occupation fluctuations and show that they are governed by a pair-correlation, squeezing mechanism.
Many-body formalism for fermions: The partition function
NASA Astrophysics Data System (ADS)
Watson, D. K.
2017-09-01
The partition function, a fundamental tenet in statistical thermodynamics, contains in principle all thermodynamic information about a system. It encapsulates both microscopic information through the quantum energy levels and statistical information from the partitioning of the particles among the available energy levels. For identical particles, this statistical accounting is complicated by the symmetry requirements of the allowed quantum states. In particular, for Fermi systems, the enforcement of the Pauli principle is typically a numerically demanding task, responsible for much of the cost of the calculations. The interplay of these three elements—the structure of the many-body spectrum, the statistical partitioning of the N particles among the available levels, and the enforcement of the Pauli principle—drives the behavior of mesoscopic and macroscopic Fermi systems. In this paper, we develop an approach for the determination of the partition function, a numerically difficult task, for systems of strongly interacting identical fermions and apply it to a model system of harmonically confined, harmonically interacting fermions. This approach uses a recently introduced many-body method that is an extension of the symmetry-invariant perturbation method (SPT) originally developed for bosons. It uses group theory and graphical techniques to avoid the heavy computational demands of conventional many-body methods which typically scale exponentially with the number of particles. The SPT application of the Pauli principle is trivial to implement since it is done "on paper" by imposing restrictions on the normal-mode quantum numbers at first order in the perturbation. The method is applied through first order and represents an extension of the SPT method to excited states. Our method of determining the partition function and various thermodynamic quantities is accurate and efficient and has the potential to yield interesting insight into the role played by the Pauli principle and the influence of large degeneracies on the emergence of the thermodynamic behavior of large-N systems.
Systems, methods and apparatus for verification of knowledge-based systems
NASA Technical Reports Server (NTRS)
Rash, James L. (Inventor); Gracinin, Denis (Inventor); Erickson, John D. (Inventor); Rouff, Christopher A. (Inventor); Hinchey, Michael G. (Inventor)
2010-01-01
Systems, methods and apparatus are provided through which in some embodiments, domain knowledge is translated into a knowledge-based system. In some embodiments, a formal specification is derived from rules of a knowledge-based system, the formal specification is analyzed, and flaws in the formal specification are used to identify and correct errors in the domain knowledge, from which a knowledge-based system is translated.
Chemodetection in fluctuating environments: receptor coupling, buffering, and antagonism.
Lalanne, Jean-Benoît; François, Paul
2015-02-10
Variability in the chemical composition of the extracellular environment can significantly degrade the ability of cells to detect rare cognate ligands. Using concepts from statistical detection theory, we formalize the generic problem of detection of small concentrations of ligands in a fluctuating background of biochemically similar ligands binding to the same receptors. We discover that in contrast with expectations arising from considerations of signal amplification, inhibitory interactions between receptors can improve detection performance in the presence of substantial environmental variability, providing an adaptive interpretation to the phenomenon of ligand antagonism. Our results suggest that the structure of signaling pathways responsible for chemodetection in fluctuating and heterogeneous environments might be optimized with respect to the statistics and dynamics of environmental composition. The developed formalism stresses the importance of characterizing nonspecific interactions to understand function in signaling pathways.
Haghshenasfard, Zahra; Cottam, M G
2017-05-17
A microscopic (Hamiltonian-based) method for the quantum statistics of bosonic excitations in a two-mode magnon system is developed. Both the exchange and the dipole-dipole interactions, as well as the Zeeman term for an external applied field, are included in the spin Hamiltonian, and the model also contains the nonlinear effects due to parallel pumping and four-magnon interactions. The quantization of spin operators is achieved through the Holstein-Primakoff formalism, and then a coherent magnon state representation is used to study the occupation magnon number and the quantum statistical behaviour of the system. Particular attention is given to the cross correlation between the two coupled magnon modes in a ferromagnetic nanowire geometry formed by two lines of spins. Manipulation of the collapse-and-revival phenomena for the temporal evolution of the magnon number as well as the control of the cross correlation between the two magnon modes is demonstrated by tuning the parallel pumping field amplitude. The role of the four-magnon interactions is particularly interesting and leads to anti-correlation in some cases with coherent states.
Wooley, Dennis S; Kinner, Tracy J
2016-11-01
The purpose was to compare perceived self-management practices of adult type 2 diabetic patients after completing an American Diabetes Association (ADA) certified diabetes self-management education (DSME) program with unstructured individualized nurse practitioner led DSME. Demographic questions and the Self-Care Inventory-Revised (SCIR) were given to two convenience sample patient groups comprising a formal DSME program group and a group within a clinical setting who received informal and unstructured individual education during patient encounters. A t-test was executed between the formal ADA certified education sample and the informal sample's SCI-R individual scores. A second t-test was performed between the two samples' SCI-R mean scores. A t-test determined no statistically significant difference between the formal ADA structured education and informal education samples' SCI-R individual scores. There was not a statistically significant difference between the samples' SCI-R mean scores. The study results suggest that there are not superior DSME settings and instructional approaches. Copyright © 2016 Elsevier Inc. All rights reserved.
Why Engineers Should Consider Formal Methods
NASA Technical Reports Server (NTRS)
Holloway, C. Michael
1997-01-01
This paper presents a logical analysis of a typical argument favoring the use of formal methods for software development, and suggests an alternative argument that is simpler and stronger than the typical one.
Bramness, Jørgen G; Walby, Fredrik A; Morken, Gunnar; Røislien, Jo
2015-08-01
Seasonal variation in the number of suicides has long been acknowledged. It has been suggested that this seasonality has declined in recent years, but studies have generally used statistical methods incapable of confirming this. We examined all suicides occurring in Norway during 1969-2007 (more than 20,000 suicides in total) to establish whether seasonality decreased over time. Fitting of additive Fourier Poisson time-series regression models allowed for formal testing of a possible linear decrease in seasonality, or a reduction at a specific point in time, while adjusting for a possible smooth nonlinear long-term change without having to categorize time into discrete yearly units. The models were compared using Akaike's Information Criterion and analysis of variance. A model with a seasonal pattern was significantly superior to a model without one. There was a reduction in seasonality during the period. Both the model assuming a linear decrease in seasonality and the model assuming a change at a specific point in time were both superior to a model assuming constant seasonality, thus confirming by formal statistical testing that the magnitude of the seasonality in suicides has diminished. The additive Fourier Poisson time-series regression model would also be useful for studying other temporal phenomena with seasonal components. © The Author 2015. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Evidence Arguments for Using Formal Methods in Software Certification
NASA Technical Reports Server (NTRS)
Denney, Ewen W.; Pai, Ganesh
2013-01-01
We describe a generic approach for automatically integrating the output generated from a formal method/tool into a software safety assurance case, as an evidence argument, by (a) encoding the underlying reasoning as a safety case pattern, and (b) instantiating it using the data produced from the method/tool. We believe this approach not only improves the trustworthiness of the evidence generated from a formal method/tool, by explicitly presenting the reasoning and mechanisms underlying its genesis, but also provides a way to gauge the suitability of the evidence in the context of the wider assurance case. We illustrate our work by application to a real example-an unmanned aircraft system- where we invoke a formal code analysis tool from its autopilot software safety case, automatically transform the verification output into an evidence argument, and then integrate it into the former.
Thermal quantum time-correlation functions from classical-like dynamics
NASA Astrophysics Data System (ADS)
Hele, Timothy J. H.
2017-07-01
Thermal quantum time-correlation functions are of fundamental importance in quantum dynamics, allowing experimentally measurable properties such as reaction rates, diffusion constants and vibrational spectra to be computed from first principles. Since the exact quantum solution scales exponentially with system size, there has been considerable effort in formulating reliable linear-scaling methods involving exact quantum statistics and approximate quantum dynamics modelled with classical-like trajectories. Here, we review recent progress in the field with the development of methods including centroid molecular dynamics , ring polymer molecular dynamics (RPMD) and thermostatted RPMD (TRPMD). We show how these methods have recently been obtained from 'Matsubara dynamics', a form of semiclassical dynamics which conserves the quantum Boltzmann distribution. We also apply the Matsubara formalism to reaction rate theory, rederiving t → 0+ quantum transition-state theory (QTST) and showing that Matsubara-TST, like RPMD-TST, is equivalent to QTST. We end by surveying areas for future progress.
Statistical definition of relapse: case of family drug court.
Alemi, Farrokh; Haack, Mary; Nemes, Susanna
2004-06-01
At any point in time, a patient's return to drug use can be seen either as a temporary event or as a return to persistent use. There is no formal standard for distinguishing persistent drug use from an occasional relapse. This lack of standardization persists although the consequences of either interpretation can be life altering. In a drug court or regulatory situation, for example, misinterpreting relapse as return to drug use could lead to incarceration, loss of child custody, or loss of employment. A clinician who mistakes a client's relapse for persistent drug use may fail to adjust treatment intensity to client's needs. An empirical and standardized method for distinguishing relapse from persistent drug use is needed. This paper provides a tool for clinicians and judges to distinguish relapse from persistent use based on statistical analyses of patterns of client's drug use. To accomplish this, a control chart is created for time-in-between relapses. This paper shows how a statistical limit can be calculated by examining either the client's history or other clients in the same program. If client's time-in-between relapse exceeds the statistical limit, then the client has returned to persistent use. Otherwise, the drug use is temporary. To illustrate the method, it is applied to data from three family drug courts. The approach allows the estimation of control limits based on the client's as well as the court's historical patterns. The approach also allows comparison of courts based on recovery rates.
11.2 YIP Human In the Loop Statistical RelationalLearners
2017-10-23
learning formalisms including inverse reinforcement learning [4] and statistical relational learning [7, 5, 8]. We have also applied our algorithms in...one introduced for label preferences. 4 Figure 2: Active Advice Seeking for Inverse Reinforcement Learning. active advice seeking is in selecting the...learning tasks. 1.2.1 Sequential Decision-Making Our previous work on advice for inverse reinforcement learning (IRL) defined advice as action
Formal methods for dependable real-time systems
NASA Technical Reports Server (NTRS)
Rushby, John
1993-01-01
The motivation for using formal methods to specify and reason about real time properties is outlined and approaches that were proposed and used are sketched. The formal verifications of clock synchronization algorithms are concluded as showing that mechanically supported reasoning about complex real time behavior is feasible. However, there was significant increase in the effectiveness of verification systems since those verifications were performed, at it is to be expected that verifications of comparable difficulty will become fairly routine. The current challenge lies in developing perspicuous and economical approaches to the formalization and specification of real time properties.
Vasconcelos, Hemerson Bruno da Silva; Woods, David John
2017-01-01
This study aimed to identify the knowledge, skills and attitudes of Brazilian hospital pharmacists in the use of information technology and electronic tools to support clinical practice. Methods: A questionnaire was sent by email to clinical pharmacists working public and private hospitals in Brazil. The instrument was validated using the method of Polit and Beck to determine the content validity index. Data (n = 348) were analyzed using descriptive statistics, Pearson's Chi-square test and Gamma correlation tests. Results: Pharmacists had 1–4 electronic devices for personal use, mainly smartphones (84.8%; n = 295) and laptops (81.6%; n = 284). At work, pharmacists had access to a computer (89.4%; n = 311), mostly connected to the internet (83.9%; n = 292). They felt competent (very capable/capable) searching for a web page/web site on a specific subject (100%; n = 348), downloading files (99.7%; n = 347), using spreadsheets (90.2%; n = 314), searching using MeSH terms in PubMed (97.4%; n = 339) and general searching for articles in bibliographic databases (such as Medline/PubMed: 93.4%; n = 325). Pharmacists did not feel competent in using statistical analysis software (somewhat capable/incapable: 78.4%; n = 273). Most pharmacists reported that they had not received formal education to perform most of these actions except searching using MeSH terms. Access to bibliographic databases was available in Brazilian hospitals, however, most pharmacists (78.7%; n = 274) reported daily use of a non-specific search engine such as Google. This result may reflect the lack of formal knowledge and training in the use of bibliographic databases and difficulty with the English language. The need to expand knowledge about information search tools was recognized by most pharmacists in clinical practice in Brazil, especially those with less time dedicated exclusively to clinical activity (Chi-square, p = 0.006). Conclusion: These results will assist in defining minimal competencies for the training of pharmacists in the field of information technology to support clinical practice. Knowledge and skill gaps are evident in the use of bibliographic databases, spreadsheets and statistical tools. PMID:29272292
The Statistical Consulting Center for Astronomy (SCCA)
NASA Technical Reports Server (NTRS)
Akritas, Michael
2001-01-01
The process by which raw astronomical data acquisition is transformed into scientifically meaningful results and interpretation typically involves many statistical steps. Traditional astronomy limits itself to a narrow range of old and familiar statistical methods: means and standard deviations; least-squares methods like chi(sup 2) minimization; and simple nonparametric procedures such as the Kolmogorov-Smirnov tests. These tools are often inadequate for the complex problems and datasets under investigations, and recent years have witnessed an increased usage of maximum-likelihood, survival analysis, multivariate analysis, wavelet and advanced time-series methods. The Statistical Consulting Center for Astronomy (SCCA) assisted astronomers with the use of sophisticated tools, and to match these tools with specific problems. The SCCA operated with two professors of statistics and a professor of astronomy working together. Questions were received by e-mail, and were discussed in detail with the questioner. Summaries of those questions and answers leading to new approaches were posted on the Web (www.state.psu.edu/ mga/SCCA). In addition to serving individual astronomers, the SCCA established a Web site for general use that provides hypertext links to selected on-line public-domain statistical software and services. The StatCodes site (www.astro.psu.edu/statcodes) provides over 200 links in the areas of: Bayesian statistics; censored and truncated data; correlation and regression, density estimation and smoothing, general statistics packages and information; image analysis; interactive Web tools; multivariate analysis; multivariate clustering and classification; nonparametric analysis; software written by astronomers; spatial statistics; statistical distributions; time series analysis; and visualization tools. StatCodes has received a remarkable high and constant hit rate of 250 hits/week (over 10,000/year) since its inception in mid-1997. It is of interest to scientists both within and outside of astronomy. The most popular sections are multivariate techniques, image analysis, and time series analysis. Hundreds of copies of the ASURV, SLOPES and CENS-TAU codes developed by SCCA scientists were also downloaded from the StatCodes site. In addition to formal SCCA duties, SCCA scientists continued a variety of related activities in astrostatistics, including refereeing of statistically oriented papers submitted to the Astrophysical Journal, talks in meetings including Feigelson's talk to science journalists entitled "The reemergence of astrostatistics" at the American Association for the Advancement of Science meeting, and published papers of astrostatistical content.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Punnoose, Ratish J.; Armstrong, Robert C.; Wong, Matthew H.
Formal methods have come into wide use because of their effectiveness in verifying "safety and security" requirements of digital systems; a set of requirements for which testing is mostly ineffective. Formal methods are routinely used in the design and verification of high-consequence digital systems in industry. This report outlines our work in assessing the capabilities of commercial and open source formal tools and the ways in which they can be leveraged in digital design workflows.
A hierarchical model for probabilistic independent component analysis of multi-subject fMRI studies
Tang, Li
2014-01-01
Summary An important goal in fMRI studies is to decompose the observed series of brain images to identify and characterize underlying brain functional networks. Independent component analysis (ICA) has been shown to be a powerful computational tool for this purpose. Classic ICA has been successfully applied to single-subject fMRI data. The extension of ICA to group inferences in neuroimaging studies, however, is challenging due to the unavailability of a pre-specified group design matrix. Existing group ICA methods generally concatenate observed fMRI data across subjects on the temporal domain and then decompose multi-subject data in a similar manner to single-subject ICA. The major limitation of existing methods is that they ignore between-subject variability in spatial distributions of brain functional networks in group ICA. In this paper, we propose a new hierarchical probabilistic group ICA method to formally model subject-specific effects in both temporal and spatial domains when decomposing multi-subject fMRI data. The proposed method provides model-based estimation of brain functional networks at both the population and subject level. An important advantage of the hierarchical model is that it provides a formal statistical framework to investigate similarities and differences in brain functional networks across subjects, e.g., subjects with mental disorders or neurodegenerative diseases such as Parkinson’s as compared to normal subjects. We develop an EM algorithm for model estimation where both the E-step and M-step have explicit forms. We compare the performance of the proposed hierarchical model with that of two popular group ICA methods via simulation studies. We illustrate our method with application to an fMRI study of Zen meditation. PMID:24033125
NASA Astrophysics Data System (ADS)
Zheng, Feifei; Maier, Holger R.; Wu, Wenyan; Dandy, Graeme C.; Gupta, Hoshin V.; Zhang, Tuqiao
2018-02-01
Hydrological models are used for a wide variety of engineering purposes, including streamflow forecasting and flood-risk estimation. To develop such models, it is common to allocate the available data to calibration and evaluation data subsets. Surprisingly, the issue of how this allocation can affect model evaluation performance has been largely ignored in the research literature. This paper discusses the evaluation performance bias that can arise from how available data are allocated to calibration and evaluation subsets. As a first step to assessing this issue in a statistically rigorous fashion, we present a comprehensive investigation of the influence of data allocation on the development of data-driven artificial neural network (ANN) models of streamflow. Four well-known formal data splitting methods are applied to 754 catchments from Australia and the U.S. to develop 902,483 ANN models. Results clearly show that the choice of the method used for data allocation has a significant impact on model performance, particularly for runoff data that are more highly skewed, highlighting the importance of considering the impact of data splitting when developing hydrological models. The statistical behavior of the data splitting methods investigated is discussed and guidance is offered on the selection of the most appropriate data splitting methods to achieve representative evaluation performance for streamflow data with different statistical properties. Although our results are obtained for data-driven models, they highlight the fact that this issue is likely to have a significant impact on all types of hydrological models, especially conceptual rainfall-runoff models.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Messud, J.; Dinh, P. M.; Suraud, Eric
2009-10-15
We propose a simplification of the time-dependent self-interaction correction (TD-SIC) method using two sets of orbitals, applying the optimized effective potential (OEP) method. The resulting scheme is called time-dependent 'generalized SIC-OEP'. A straightforward approximation, using the spatial localization of one set of orbitals, leads to the 'generalized SIC-Slater' formalism. We show that it represents a great improvement compared to the traditional SIC-Slater and Krieger-Li-Iafrate formalisms.
NASA Astrophysics Data System (ADS)
Messud, J.; Dinh, P. M.; Reinhard, P.-G.; Suraud, Eric
2009-10-01
We propose a simplification of the time-dependent self-interaction correction (TD-SIC) method using two sets of orbitals, applying the optimized effective potential (OEP) method. The resulting scheme is called time-dependent “generalized SIC-OEP.” A straightforward approximation, using the spatial localization of one set of orbitals, leads to the “generalized SIC-Slater” formalism. We show that it represents a great improvement compared to the traditional SIC-Slater and Krieger-Li-Iafrate formalisms.
Cost Implications of Organizing Nursing Home Workforce in Teams
Mukamel, Dana B; Cai, Shubing; Temkin-Greener, Helena
2009-01-01
Objective To estimate the costs associated with formal and self-managed daily practice teams in nursing homes. Data Sources/Study Setting Medicaid cost reports for 135 nursing homes in New York State in 2006 and survey data for 6,137 direct care workers. Study Design A retrospective statistical analysis: We estimated hybrid cost functions that include team penetration variables. Inference was based on robust standard errors. Data Collection Formal and self-managed team penetration (i.e., percent of staff working in a team) were calculated from survey responses. Annual variable costs, beds, case mix-adjusted days, admissions, home care visits, outpatient clinic visits, day care days, wages, and ownership were calculated from the cost reports. Principal Findings Formal team penetration was significantly associated with costs, while self-managed teams penetration was not. Costs declined with increasing penetration up to 13 percent of formal teams, and increased above this level. Formal teams in nursing homes in the upward sloping range of the curve were more diverse, with a larger number of participating disciplines and more likely to include physicians. Conclusions Organization of workforce in formal teams may offer nursing homes a cost-saving strategy. More research is required to understand the relationship between team composition and costs. PMID:19486181
Verification of NASA Emergent Systems
NASA Technical Reports Server (NTRS)
Rouff, Christopher; Vanderbilt, Amy K. C. S.; Truszkowski, Walt; Rash, James; Hinchey, Mike
2004-01-01
NASA is studying advanced technologies for a future robotic exploration mission to the asteroid belt. This mission, the prospective ANTS (Autonomous Nano Technology Swarm) mission, will comprise of 1,000 autonomous robotic agents designed to cooperate in asteroid exploration. The emergent properties of swarm type missions make them powerful, but at the same time are more difficult to design and assure that the proper behaviors will emerge. We are currently investigating formal methods and techniques for verification and validation of future swarm-based missions. The advantage of using formal methods is their ability to mathematically assure the behavior of a swarm, emergent or otherwise. The ANT mission is being used as an example and case study for swarm-based missions for which to experiment and test current formal methods with intelligent swam. Using the ANTS mission, we have evaluated multiple formal methods to determine their effectiveness in modeling and assuring swarm behavior.
NASA Astrophysics Data System (ADS)
Kalthoff, Mona; Keim, Frederik; Krull, Holger; Uhrig, Götz S.
2017-05-01
The density matrix formalism and the equation of motion approach are two semi-analytical methods that can be used to compute the non-equilibrium dynamics of correlated systems. While for a bilinear Hamiltonian both formalisms yield the exact result, for any non-bilinear Hamiltonian a truncation is necessary. Due to the fact that the commonly used truncation schemes differ for these two methods, the accuracy of the obtained results depends significantly on the chosen approach. In this paper, both formalisms are applied to the quantum Rabi model. This allows us to compare the approximate results and the exact dynamics of the system and enables us to discuss the accuracy of the approximations as well as the advantages and the disadvantages of both methods. It is shown to which extent the results fulfill physical requirements for the observables and which properties of the methods lead to unphysical results.
The average receiver operating characteristic curve in multireader multicase imaging studies
Samuelson, F W
2014-01-01
Objective: In multireader, multicase (MRMC) receiver operating characteristic (ROC) studies for evaluating medical imaging systems, the area under the ROC curve (AUC) is often used as a summary metric. Owing to the limitations of AUC, plotting the average ROC curve to accompany the rigorous statistical inference on AUC is recommended. The objective of this article is to investigate methods for generating the average ROC curve from ROC curves of individual readers. Methods: We present both a non-parametric method and a parametric method for averaging ROC curves that produce a ROC curve, the area under which is equal to the average AUC of individual readers (a property we call area preserving). We use hypothetical examples, simulated data and a real-world imaging data set to illustrate these methods and their properties. Results: We show that our proposed methods are area preserving. We also show that the method of averaging the ROC parameters, either the conventional bi-normal parameters (a, b) or the proper bi-normal parameters (c, da), is generally not area preserving and may produce a ROC curve that is intuitively not an average of multiple curves. Conclusion: Our proposed methods are useful for making plots of average ROC curves in MRMC studies as a companion to the rigorous statistical inference on the AUC end point. The software implementing these methods is freely available from the authors. Advances in knowledge: Methods for generating the average ROC curve in MRMC ROC studies are formally investigated. The area-preserving criterion we defined is useful to evaluate such methods. PMID:24884728
Thomas-Fermi model for a bulk self-gravitating stellar object in two dimensions
NASA Astrophysics Data System (ADS)
De, Sanchari; Chakrabarty, Somenath
2015-09-01
In this article we have solved a hypothetical problem related to the stability and gross properties of two-dimensional self-gravitating stellar objects using the Thomas-Fermi model. The formalism presented here is an extension of the standard three-dimensional problem discussed in the book on statistical physics, Part I by Landau and Lifshitz. Further, the formalism presented in this article may be considered a class problem for post-graduate-level students of physics or may be assigned as a part of their dissertation project.
2017-04-17
Cyberphysical Systems, Formal Methods , Requirements Patterns, AADL, Assume Guarantee Reasoning Environment 16. SECURITY CLASSIFICATION OF: 17. LIMITATION OF...5 3. Methods , Assumptions, and Procedures...Rockwell Collins has been addressing these challenges by developing compositional reasoning methods that permit the verification of systems that exceed
Dominant partition method. [based on a wave function formalism
NASA Technical Reports Server (NTRS)
Dixon, R. M.; Redish, E. F.
1979-01-01
By use of the L'Huillier, Redish, and Tandy (LRT) wave function formalism, a partially connected method, the dominant partition method (DPM) is developed for obtaining few body reductions of the many body problem in the LRT and Bencze, Redish, and Sloan (BRS) formalisms. The DPM maps the many body problem to a fewer body one by using the criterion that the truncated formalism must be such that consistency with the full Schroedinger equation is preserved. The DPM is based on a class of new forms for the irreducible cluster potential, which is introduced in the LRT formalism. Connectivity is maintained with respect to all partitions containing a given partition, which is referred to as the dominant partition. Degrees of freedom corresponding to the breakup of one or more of the clusters of the dominant partition are treated in a disconnected manner. This approach for simplifying the complicated BRS equations is appropriate for physical problems where a few body reaction mechanism prevails.
Guidelines for the design and statistical analysis of experiments in papers submitted to ATLA.
Festing, M F
2001-01-01
In vitro experiments need to be well designed and correctly analysed if they are to achieve their full potential to replace the use of animals in research. An "experiment" is a procedure for collecting scientific data in order to answer a hypothesis, or to provide material for generating new hypotheses, and differs from a survey because the scientist has control over the treatments that can be applied. Most experiments can be classified into one of a few formal designs, the most common being completely randomised, and randomised block designs. These are quite common with in vitro experiments, which are often replicated in time. Some experiments involve a single independent (treatment) variable, while other "factorial" designs simultaneously vary two or more independent variables, such as drug treatment and cell line. Factorial designs often provide additional information at little extra cost. Experiments need to be carefully planned to avoid bias, be powerful yet simple, provide for a valid statistical analysis and, in some cases, have a wide range of applicability. Virtually all experiments need some sort of statistical analysis in order to take account of biological variation among the experimental subjects. Parametric methods using the t test or analysis of variance are usually more powerful than non-parametric methods, provided the underlying assumptions of normality of the residuals and equal variances are approximately valid. The statistical analyses of data from a completely randomised design, and from a randomised-block design are demonstrated in Appendices 1 and 2, and methods of determining sample size are discussed in Appendix 3. Appendix 4 gives a checklist for authors submitting papers to ATLA.
Unified quantitative characterization of epithelial tissue development
Guirao, Boris; Rigaud, Stéphane U; Bosveld, Floris; Bailles, Anaïs; López-Gay, Jesús; Ishihara, Shuji; Sugimura, Kaoru
2015-01-01
Understanding the mechanisms regulating development requires a quantitative characterization of cell divisions, rearrangements, cell size and shape changes, and apoptoses. We developed a multiscale formalism that relates the characterizations of each cell process to tissue growth and morphogenesis. Having validated the formalism on computer simulations, we quantified separately all morphogenetic events in the Drosophila dorsal thorax and wing pupal epithelia to obtain comprehensive statistical maps linking cell and tissue scale dynamics. While globally cell shape changes, rearrangements and divisions all significantly participate in tissue morphogenesis, locally, their relative participations display major variations in space and time. By blocking division we analyzed the impact of division on rearrangements, cell shape changes and tissue morphogenesis. Finally, by combining the formalism with mechanical stress measurement, we evidenced unexpected interplays between patterns of tissue elongation, cell division and stress. Our formalism provides a novel and rigorous approach to uncover mechanisms governing tissue development. DOI: http://dx.doi.org/10.7554/eLife.08519.001 PMID:26653285
Asano, Masanari; Basieva, Irina; Khrennikov, Andrei; Ohya, Masanori; Tanaka, Yoshiharu; Yamato, Ichiro
2012-06-01
We developed a quantum-like model describing the gene regulation of glucose/lactose metabolism in a bacterium, Escherichia coli. Our quantum-like model can be considered as a kind of the operational formalism for microbiology and genetics. Instead of trying to describe processes in a cell in the very detail, we propose a formal operator description. Such a description may be very useful in situation in which the detailed description of processes is impossible or extremely complicated. We analyze statistical data obtained from experiments, and we compute the degree of E. coli's preference within adaptive dynamics. It is known that there are several types of E. coli characterized by the metabolic system. We demonstrate that the same type of E. coli can be described by the well determined operators; we find invariant operator quantities characterizing each type. Such invariant quantities can be calculated from the obtained statistical data.
Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.
2005-01-01
A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including distributed software systems, sensor networks, robot operation, complex scripts for spacecraft integration and testing, and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.
Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.
2005-01-01
A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including distributed software systems, sensor networks, robot operation, complex scripts for spacecraft integration and testing, and autonomous systems. Currently available tools and methods that start with a formal model of a: system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The "gap" that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the ciasses of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.
Systems, methods and apparatus for pattern matching in procedure development and verification
NASA Technical Reports Server (NTRS)
Hinchey, Michael G. (Inventor); Rouff, Christopher A. (Inventor); Rash, James L. (Inventor)
2011-01-01
Systems, methods and apparatus are provided through which, in some embodiments, a formal specification is pattern-matched from scenarios, the formal specification is analyzed, and flaws in the formal specification are corrected. The systems, methods and apparatus may include pattern-matching an equivalent formal model from an informal specification. Such a model can be analyzed for contradictions, conflicts, use of resources before the resources are available, competition for resources, and so forth. From such a formal model, an implementation can be automatically generated in a variety of notations. The approach can improve the resulting implementation, which, in some embodiments, is provably equivalent to the procedures described at the outset, which in turn can improve confidence that the system reflects the requirements, and in turn reduces system development time and reduces the amount of testing required of a new system. Moreover, in some embodiments, two or more implementations can be "reversed" to appropriate formal models, the models can be combined, and the resulting combination checked for conflicts. Then, the combined, error-free model can be used to generate a new (single) implementation that combines the functionality of the original separate implementations, and may be more likely to be correct.
Plastino, A; Rocca, M C
2017-06-01
Appealing to the 1902 Gibbs formalism for classical statistical mechanics (SM)-the first SM axiomatic theory ever that successfully explained equilibrium thermodynamics-we show that already at the classical level there is a strong correlation between Renyi's exponent α and the number of particles for very simple systems. No reference to heat baths is needed for such a purpose.
Equitability, mutual information, and the maximal information coefficient.
Kinney, Justin B; Atwal, Gurinder S
2014-03-04
How should one quantify the strength of association between two random variables without bias for relationships of a specific form? Despite its conceptual simplicity, this notion of statistical "equitability" has yet to receive a definitive mathematical formalization. Here we argue that equitability is properly formalized by a self-consistency condition closely related to Data Processing Inequality. Mutual information, a fundamental quantity in information theory, is shown to satisfy this equitability criterion. These findings are at odds with the recent work of Reshef et al. [Reshef DN, et al. (2011) Science 334(6062):1518-1524], which proposed an alternative definition of equitability and introduced a new statistic, the "maximal information coefficient" (MIC), said to satisfy equitability in contradistinction to mutual information. These conclusions, however, were supported only with limited simulation evidence, not with mathematical arguments. Upon revisiting these claims, we prove that the mathematical definition of equitability proposed by Reshef et al. cannot be satisfied by any (nontrivial) dependence measure. We also identify artifacts in the reported simulation evidence. When these artifacts are removed, estimates of mutual information are found to be more equitable than estimates of MIC. Mutual information is also observed to have consistently higher statistical power than MIC. We conclude that estimating mutual information provides a natural (and often practical) way to equitably quantify statistical associations in large datasets.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pant, Nidhi; Das, Santanu; Mitra, Sanjit
Mild, unavoidable deviations from circular-symmetry of instrumental beams along with scan strategy can give rise to measurable Statistical Isotropy (SI) violation in Cosmic Microwave Background (CMB) experiments. If not accounted properly, this spurious signal can complicate the extraction of other SI violation signals (if any) in the data. However, estimation of this effect through exact numerical simulation is computationally intensive and time consuming. A generalized analytical formalism not only provides a quick way of estimating this signal, but also gives a detailed understanding connecting the leading beam anisotropy components to a measurable BipoSH characterisation of SI violation. In this paper,more » we provide an approximate generic analytical method for estimating the SI violation generated due to a non-circular (NC) beam and arbitrary scan strategy, in terms of the Bipolar Spherical Harmonic (BipoSH) spectra. Our analytical method can predict almost all the features introduced by a NC beam in a complex scan and thus reduces the need for extensive numerical simulation worth tens of thousands of CPU hours into minutes long calculations. As an illustrative example, we use WMAP beams and scanning strategy to demonstrate the easability, usability and efficiency of our method. We test all our analytical results against that from exact numerical simulations.« less
NASA Langley's Formal Methods Research in Support of the Next Generation Air Transportation System
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Munoz, Cesar A.
2008-01-01
This talk will provide a brief introduction to the formal methods developed at NASA Langley and the National Institute for Aerospace (NIA) for air traffic management applications. NASA Langley's formal methods research supports the Interagency Joint Planning and Development Office (JPDO) effort to define and develop the 2025 Next Generation Air Transportation System (NGATS). The JPDO was created by the passage of the Vision 100 Century of Aviation Reauthorization Act in Dec 2003. The NGATS vision calls for a major transformation of the nation s air transportation system that will enable growth to 3 times the traffic of the current system. The transformation will require an unprecedented level of safety-critical automation used in complex procedural operations based on 4-dimensional (4D) trajectories that enable dynamic reconfiguration of airspace scalable to geographic and temporal demand. The goal of our formal methods research is to provide verification methods that can be used to insure the safety of the NGATS system. Our work has focused on the safety assessment of concepts of operation and fundamental algorithms for conflict detection and resolution (CD&R) and self- spacing in the terminal area. Formal analysis of a concept of operations is a novel area of application of formal methods. Here one must establish that a system concept involving aircraft, pilots, and ground resources is safe. The formal analysis of algorithms is a more traditional endeavor. However, the formal analysis of ATM algorithms involves reasoning about the interaction of algorithmic logic and aircraft trajectories defined over an airspace. These trajectories are described using 2D and 3D vectors and are often constrained by trigonometric relations. Thus, in many cases it has been necessary to unload the full power of an advanced theorem prover. The verification challenge is to establish that the safety-critical algorithms produce valid solutions that are guaranteed to maintain separation under all possible scenarios. Current research has assumed perfect knowledge of the location of other aircraft in the vicinity so absolute guarantees are possible, but increasingly we are relaxing the assumptions to allow incomplete, inaccurate, and/or faulty information from communication sources.
A study on facial expressions recognition
NASA Astrophysics Data System (ADS)
Xu, Jingjing
2017-09-01
In terms of communication, postures and facial expressions of such feelings like happiness, anger and sadness play important roles in conveying information. With the development of the technology, recently a number of algorithms dealing with face alignment, face landmark detection, classification, facial landmark localization and pose estimation have been put forward. However, there are a lot of challenges and problems need to be fixed. In this paper, a few technologies have been concluded and analyzed, and they all relate to handling facial expressions recognition and poses like pose-indexed based multi-view method for face alignment, robust facial landmark detection under significant head pose and occlusion, partitioning the input domain for classification, robust statistics face formalization.
Application of latent variable model in Rosenberg self-esteem scale.
Leung, Shing-On; Wu, Hui-Ping
2013-01-01
Latent Variable Models (LVM) are applied to Rosenberg Self-Esteem Scale (RSES). Parameter estimations automatically give negative signs hence no recoding is necessary for negatively scored items. Bad items can be located through parameter estimate, item characteristic curves and other measures. Two factors are extracted with one on self-esteem and the other on the degree to take moderate views, with the later not often being covered in previous studies. A goodness-of-fit measure based on two-way margins is used but more works are needed. Results show that scaling provided by models with more formal statistical ground correlated highly with conventional method, which may provide justification for usual practice.
The uniform quantized electron gas revisited
NASA Astrophysics Data System (ADS)
Lomba, Enrique; Høye, Johan S.
2017-11-01
In this article we continue and extend our recent work on the correlation energy of the quantized electron gas of uniform density at temperature T=0 . As before, we utilize the methods, properties, and results obtained by means of classical statistical mechanics. These were extended to quantized systems via the Feynman path integral formalism. The latter translates the quantum problem into a classical polymer problem in four dimensions. Again, the well known RPA (random phase approximation) is recovered as a basic result which we then modify and improve upon. Here we analyze the condition of thermodynamic self-consistency. Our numerical calculations exhibit a remarkable agreement with well known results of a standard parameterization of Monte Carlo correlation energies.
Guevara-García, José Antonio; Montiel-Corona, Virginia
2012-03-01
A statistical analysis of a used battery collection campaign in the state of Tlaxcala, Mexico, is presented. This included a study of the metal composition of spent batteries from formal and informal markets, and a critical discussion about the management of spent batteries in Mexico with respect to legislation. A six-month collection campaign was statistically analyzed: 77% of the battery types were "AA" and 30% of the batteries were from the informal market. A substantial percentage (36%) of batteries had residual voltage in the range 1.2-1.4 V, and 70% had more than 1.0 V; this may reflect underutilization. Metal content analysis and recovery experiments were performed with the five formal and four more frequent informal trademarks. The analysis of Hg, Cd and Pb showed there is no significant difference in content between formal and informal commercialized batteries. All of the analyzed trademarks were under the permissible limit levels of the proposed Mexican Official Norm (NOM) NMX-AA-104-SCFI-2006 and would be classified as not dangerous residues (can be thrown to the domestic rubbish); however, compared with the EU directive 2006/66/EC, 8 out of 9 of the selected battery trademarks would be rejected, since the Mexican Norm content limit is 20, 7.5 and 5 fold higher in Hg, Cd and Pb, respectively, than the EU directive. These results outline the necessity for better regulatory criteria in the proposed Mexican NOM in order to minimize the impact on human health and the environment of this type of residues. Copyright © 2010 Elsevier Ltd. All rights reserved.
Quantum Statistics of the Toda Oscillator in the Wigner Function Formalism
NASA Astrophysics Data System (ADS)
Vojta, Günter; Vojta, Matthias
Classical and quantum mechanical Toda systems (Toda molecules, Toda lattices, Toda quantum fields) recently found growing interest as nonlinear systems showing solitons and chaos. In this paper the statistical thermodynamics of a system of quantum mechanical Toda oscillators characterized by a potential energy V(q) = Vo cos h q is treated within the Wigner function formalism (phase space formalism of quantum statistics). The partition function is given as a Wigner- Kirkwood series expansion in terms of powers of h2 (semiclassical expansion). The partition function and all thermodynamic functions are written, with considerable exactness, as simple closed expressions containing only the modified Hankel functions Ko and K1 of the purely imaginary argument i with = Vo/kT.
Caricato, Marco
2013-07-28
The calculation of vertical electronic transition energies of molecular systems in solution with accurate quantum mechanical methods requires the use of approximate and yet reliable models to describe the effect of the solvent on the electronic structure of the solute. The polarizable continuum model (PCM) of solvation represents a computationally efficient way to describe this effect, especially when combined with coupled cluster (CC) methods. Two formalisms are available to compute transition energies within the PCM framework: State-Specific (SS) and Linear-Response (LR). The former provides a more complete account of the solute-solvent polarization in the excited states, while the latter is computationally very efficient (i.e., comparable to gas phase) and transition properties are well defined. In this work, I review the theory for the two formalisms within CC theory with a focus on their computational requirements, and present the first implementation of the LR-PCM formalism with the coupled cluster singles and doubles method (CCSD). Transition energies computed with LR- and SS-CCSD-PCM are presented, as well as a comparison between solvation models in the LR approach. The numerical results show that the two formalisms provide different absolute values of transition energy, but similar relative solvatochromic shifts (from nonpolar to polar solvents). The LR formalism may then be used to explore the solvent effect on multiple states and evaluate transition probabilities, while the SS formalism may be used to refine the description of specific states and for the exploration of excited state potential energy surfaces of solvated systems.
1991-10-01
SUBJECT TERMS 15. NUMBER OF PAGES engineering management information systems method formalization 60 information engineering process modeling 16 PRICE...CODE information systems requirements definition methods knowlede acquisition methods systems engineering 17. SECURITY CLASSIFICATION ji. SECURITY... Management , Inc., Santa Monica, California. CORYNEN, G. C., 1975, A Mathematical Theory of Modeling and Simula- tion. Ph.D. Dissertation, Department
Lfm2000: Fifth NASA Langley Formal Methods Workshop
NASA Technical Reports Server (NTRS)
Holloway, C. Michael (Compiler)
2000-01-01
This is the proceedings of Lfm2000: Fifth NASA Langley Formal Methods Workshop. The workshop was held June 13-15, 2000, in Williamsburg, Virginia. See the web site
Shi, Ran; Guo, Ying
2016-12-01
Human brains perform tasks via complex functional networks consisting of separated brain regions. A popular approach to characterize brain functional networks in fMRI studies is independent component analysis (ICA), which is a powerful method to reconstruct latent source signals from their linear mixtures. In many fMRI studies, an important goal is to investigate how brain functional networks change according to specific clinical and demographic variabilities. Existing ICA methods, however, cannot directly incorporate covariate effects in ICA decomposition. Heuristic post-ICA analysis to address this need can be inaccurate and inefficient. In this paper, we propose a hierarchical covariate-adjusted ICA (hc-ICA) model that provides a formal statistical framework for estimating covariate effects and testing differences between brain functional networks. Our method provides a more reliable and powerful statistical tool for evaluating group differences in brain functional networks while appropriately controlling for potential confounding factors. We present an analytically tractable EM algorithm to obtain maximum likelihood estimates of our model. We also develop a subspace-based approximate EM that runs significantly faster while retaining high accuracy. To test the differences in functional networks, we introduce a voxel-wise approximate inference procedure which eliminates the need of computationally expensive covariance matrix estimation and inversion. We demonstrate the advantages of our methods over the existing method via simulation studies. We apply our method to an fMRI study to investigate differences in brain functional networks associated with post-traumatic stress disorder (PTSD).
The Hierarchical Structure of Formal Operational Tasks.
ERIC Educational Resources Information Center
Bart, William M.; Mertens, Donna M.
1979-01-01
The hierarchical structure of the formal operational period of Piaget's theory of cognitive development was explored through the application of ordering theoretical methods to a set of data that systematically utilized the various formal operational schemes. Results suggested a common structure underlying task performance. (Author/BH)
NASA Technical Reports Server (NTRS)
Farrell, W. M.; Hurley, D. M.; Esposito, V. J.; Mclain, J. L.; Zimmerman, M. I.
2017-01-01
We present a new formalism to describe the outgassing of hydrogen initially implanted by the solar wind protons into exposed soils on airless bodies. The formalism applies a statistical mechanics approach similar to that applied recently to molecular adsorption onto activated surfaces. The key element enabling this formalism is the recognition that the interatomic potential between the implanted H and regolith-residing oxides is not of singular value but possess a distribution of trapped energy values at a given temperature, F(U,T). All subsequent derivations of the outward diffusion and H retention rely on the specific properties of this distribution. We find that solar wind hydrogen can be retained if there are sites in the implantation layer with activation energy values exceeding 0.5eV. We especially examine the dependence of H retention applying characteristic energy values found previously for irradiated silica and mature lunar samples. We also apply the formalism to two cases that differ from the typical solar wind implantation at the Moon. First, we test for a case of implantation in magnetic anomaly regions where significantly lower-energy ions of solar wind origin are expected to be incident with the surface. In magnetic anomalies, H retention is found to be reduced due to the reduced ion flux and shallower depth of implantation. Second, we also apply the model to Phobos where the surface temperature range is not as extreme as the Moon. We find the H atom retention in this second case is higher than the lunar case due to the reduced thermal extremes (that reduces outgassing).
What Influences Mental Illness? Discrepancies Between Medical Education and Conception.
Einstein, Evan Hy; Klepacz, Lidia
2017-01-01
This preliminary study examined the differences between what was taught during a formal medical education and medical students' and psychiatry residents' conceptions of notions regarding the causes and determinants of mental illness. The authors surveyed 74 medical students and 11 residents via convenience sampling. The survey contained 18 statements which were rated twice based on truthfulness in terms of a participant's formal education and conception, respectively. Descriptive statistics and a Wilcoxon signed rank test determined differences between education and conception. Results showed that students were less likely to perceive a neurotransmitter imbalance to cause mental illness, as opposed to what was emphasized during a formal medical education. Students and residents also understood the importance of factors such as systemic racism and socioeconomic status in the development of mental illness, which were factors that did not receive heavy emphasis during medical education. Furthermore, students and residents believed that not only did mental illnesses have nonuniform pathologies, but that the Diagnostic and Statistical Manual of Mental Disorders also had the propensity to sometimes arbitrarily categorize individuals with potentially negative consequences. If these notions are therefore part of students' and residents' conceptions, as well as documented in the literature, then it seems appropriate for medical education to be further developed to emphasize these ideas.
A Formal Semantics for the WS-BPEL Recovery Framework
NASA Astrophysics Data System (ADS)
Dragoni, Nicola; Mazzara, Manuel
While current studies on Web services composition are mostly focused - from the technical viewpoint - on standards and protocols, this work investigates the adoption of formal methods for dependable composition. The Web Services Business Process Execution Language (WS-BPEL) - an OASIS standard widely adopted both in academic and industrial environments - is considered as a touchstone for concrete composition languages and an analysis of its ambiguous Recovery Framework specification is offered. In order to show the use of formal methods, a precise and unambiguous description of its (simplified) mechanisms is provided by means of a conservative extension of the π-calculus. This has to be intended as a well known case study providing methodological arguments for the adoption of formal methods in software specification. The aspect of verification is not the main topic of the paper but some hints are given.
Orr, Lindsay; Hernández de la Peña, Lisandro; Roy, Pierre-Nicholas
2017-06-07
A derivation of quantum statistical mechanics based on the concept of a Feynman path centroid is presented for the case of generalized density operators using the projected density operator formalism of Blinov and Roy [J. Chem. Phys. 115, 7822-7831 (2001)]. The resulting centroid densities, centroid symbols, and centroid correlation functions are formulated and analyzed in the context of the canonical equilibrium picture of Jang and Voth [J. Chem. Phys. 111, 2357-2370 (1999)]. The case where the density operator projects onto a particular energy eigenstate of the system is discussed, and it is shown that one can extract microcanonical dynamical information from double Kubo transformed correlation functions. It is also shown that the proposed projection operator approach can be used to formally connect the centroid and Wigner phase-space distributions in the zero reciprocal temperature β limit. A Centroid Molecular Dynamics (CMD) approximation to the state-projected exact quantum dynamics is proposed and proven to be exact in the harmonic limit. The state projected CMD method is also tested numerically for a quartic oscillator and a double-well potential and found to be more accurate than canonical CMD. In the case of a ground state projection, this method can resolve tunnelling splittings of the double well problem in the higher barrier regime where canonical CMD fails. Finally, the state-projected CMD framework is cast in a path integral form.
NASA Astrophysics Data System (ADS)
Orr, Lindsay; Hernández de la Peña, Lisandro; Roy, Pierre-Nicholas
2017-06-01
A derivation of quantum statistical mechanics based on the concept of a Feynman path centroid is presented for the case of generalized density operators using the projected density operator formalism of Blinov and Roy [J. Chem. Phys. 115, 7822-7831 (2001)]. The resulting centroid densities, centroid symbols, and centroid correlation functions are formulated and analyzed in the context of the canonical equilibrium picture of Jang and Voth [J. Chem. Phys. 111, 2357-2370 (1999)]. The case where the density operator projects onto a particular energy eigenstate of the system is discussed, and it is shown that one can extract microcanonical dynamical information from double Kubo transformed correlation functions. It is also shown that the proposed projection operator approach can be used to formally connect the centroid and Wigner phase-space distributions in the zero reciprocal temperature β limit. A Centroid Molecular Dynamics (CMD) approximation to the state-projected exact quantum dynamics is proposed and proven to be exact in the harmonic limit. The state projected CMD method is also tested numerically for a quartic oscillator and a double-well potential and found to be more accurate than canonical CMD. In the case of a ground state projection, this method can resolve tunnelling splittings of the double well problem in the higher barrier regime where canonical CMD fails. Finally, the state-projected CMD framework is cast in a path integral form.
NASA Astrophysics Data System (ADS)
Zakaria, Chahnez; Curé, Olivier; Salzano, Gabriella; Smaïli, Kamel
In Computer Supported Cooperative Work (CSCW), it is crucial for project leaders to detect conflicting situations as early as possible. Generally, this task is performed manually by studying a set of documents exchanged between team members. In this paper, we propose a full-fledged automatic solution that identifies documents, subjects and actors involved in relational conflicts. Our approach detects conflicts in emails, probably the most popular type of documents in CSCW, but the methods used can handle other text-based documents. These methods rely on the combination of statistical and ontological operations. The proposed solution is decomposed in several steps: (i) we enrich a simple negative emotion ontology with terms occuring in the corpus of emails, (ii) we categorize each conflicting email according to the concepts of this ontology and (iii) we identify emails, subjects and team members involved in conflicting emails using possibilistic description logic and a set of proposed measures. Each of these steps are evaluated and validated on concrete examples. Moreover, this approach's framework is generic and can be easily adapted to domains other than conflicts, e.g. security issues, and extended with operations making use of our proposed set of measures.
Towards an Obesity-Cancer Knowledge Base: Biomedical Entity Identification and Relation Detection
Lossio-Ventura, Juan Antonio; Hogan, William; Modave, François; Hicks, Amanda; Hanna, Josh; Guo, Yi; He, Zhe; Bian, Jiang
2017-01-01
Obesity is associated with increased risks of various types of cancer, as well as a wide range of other chronic diseases. On the other hand, access to health information activates patient participation, and improve their health outcomes. However, existing online information on obesity and its relationship to cancer is heterogeneous ranging from pre-clinical models and case studies to mere hypothesis-based scientific arguments. A formal knowledge representation (i.e., a semantic knowledge base) would help better organizing and delivering quality health information related to obesity and cancer that consumers need. Nevertheless, current ontologies describing obesity, cancer and related entities are not designed to guide automatic knowledge base construction from heterogeneous information sources. Thus, in this paper, we present methods for named-entity recognition (NER) to extract biomedical entities from scholarly articles and for detecting if two biomedical entities are related, with the long term goal of building a obesity-cancer knowledge base. We leverage both linguistic and statistical approaches in the NER task, which supersedes the state-of-the-art results. Further, based on statistical features extracted from the sentences, our method for relation detection obtains an accuracy of 99.3% and a f-measure of 0.993. PMID:28503356
Non-ad-hoc decision rule for the Dempster-Shafer method of evidential reasoning
NASA Astrophysics Data System (ADS)
Cheaito, Ali; Lecours, Michael; Bosse, Eloi
1998-03-01
This paper is concerned with the fusion of identity information through the use of statistical analysis rooted in Dempster-Shafer theory of evidence to provide automatic identification aboard a platform. An identity information process for a baseline Multi-Source Data Fusion (MSDF) system is defined. The MSDF system is applied to information sources which include a number of radars, IFF systems, an ESM system, and a remote track source. We use a comprehensive Platform Data Base (PDB) containing all the possible identity values that the potential target may take, and we use the fuzzy logic strategies which enable the fusion of subjective attribute information from sensor and the PDB to make the derivation of target identity more quickly, more precisely, and with statistically quantifiable measures of confidence. The conventional Dempster-Shafer lacks a formal basis upon which decision can be made in the face of ambiguity. We define a non-ad hoc decision rule based on the expected utility interval for pruning the `unessential' propositions which would otherwise overload the real-time data fusion systems. An example has been selected to demonstrate the implementation of our modified Dempster-Shafer method of evidential reasoning.
Automatically Grading Customer Confidence in a Formal Specification.
ERIC Educational Resources Information Center
Shukur, Zarina; Burke, Edmund; Foxley, Eric
1999-01-01
Describes an automatic grading system for a formal methods computer science course that is able to evaluate a formal specification written in the Z language. Quality is measured by considering first, specification correctness (syntax, semantics, and satisfaction of customer requirements), and second, specification maintainability (comparison of…
Formal Solutions for Polarized Radiative Transfer. II. High-order Methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Janett, Gioele; Steiner, Oskar; Belluzzi, Luca, E-mail: gioele.janett@irsol.ch
When integrating the radiative transfer equation for polarized light, the necessity of high-order numerical methods is well known. In fact, well-performing high-order formal solvers enable higher accuracy and the use of coarser spatial grids. Aiming to provide a clear comparison between formal solvers, this work presents different high-order numerical schemes and applies the systematic analysis proposed by Janett et al., emphasizing their advantages and drawbacks in terms of order of accuracy, stability, and computational cost.
Formalizing New Navigation Requirements for NASA's Space Shuttle
NASA Technical Reports Server (NTRS)
DiVito, Ben L.
1996-01-01
We describe a recent NASA-sponsored pilot project intended to gauge the effectiveness of using formal methods in Space Shuttle software requirements analysis. Several Change Requests (CRs) were selected as promising targets to demonstrate the utility of formal methods in this demanding application domain. A CR to add new navigation capabilities to the Shuttle, based on Global Positioning System (GPS) technology, is the focus of this industrial usage report. Portions of the GPS CR were modeled using the language of SRI's Prototype Verification System (PVS). During a limited analysis conducted on the formal specifications, numerous requirements issues were discovered. We present a summary of these encouraging results and conclusions we have drawn from the pilot project.
Adaptive clinical trial design.
Chow, Shein-Chung
2014-01-01
In recent years, the use of adaptive design methods in clinical trials based on accumulated data at interim has received much attention because of its flexibility and efficiency in pharmaceutical/clinical development. In practice, adaptive design may provide the investigators a second chance to modify or redesign the trial while the study is still ongoing. However, it is a concern that a shift in target patient population may occur after significant adaptations are made. In addition, the overall type I error rate may not be preserved. Moreover, the results may not be reliable and hence are difficult to interpret. As indicated by the US Food and Drug Administration draft guidance on adaptive design clinical trials, the adaptive design has to be a prospectively planned opportunity and should be based on information collected within the study, with or without formal statistical hypothesis testing. This article reviews the relative advantages, limitations, and feasibility of commonly considered adaptive designs in clinical trials. Statistical concerns when implementing adaptive designs are also discussed.
Ball, Robert; Horne, Dale; Izurieta, Hector; Sutherland, Andrea; Walderhaug, Mark; Hsu, Henry
2011-05-01
The public health community faces increasing demands for improving vaccine safety while simultaneously increasing the number of vaccines available to prevent infectious diseases. The passage of the US Food and Drug Administration (FDA) Amendment Act of 2007 formalized the concept of life-cycle management of the risks and benefits of vaccines, from early clinical development through many years of use in large numbers of people. Harnessing scientific and technologic advances is necessary to improve vaccine-safety evaluation. The Office of Biostatistics and Epidemiology in the Center for Biologics Evaluation and Research is working to improve the FDA's ability to monitor vaccine safety by improving statistical, epidemiologic, and risk-assessment methods, gaining access to new sources of data, and exploring the use of genomics data. In this article we describe the current approaches, new resources, and future directions that the FDA is taking to improve the evaluation of vaccine safety.
Job Search Methods: Consequences for Gender-based Earnings Inequality.
ERIC Educational Resources Information Center
Huffman, Matt L.; Torres, Lisa
2001-01-01
Data from adults in Atlanta, Boston, and Los Angeles (n=1,942) who searched for work using formal (ads, agencies) or informal (networks) methods indicated that type of method used did not contribute to the gender gap in earnings. Results do not support formal job search as a way to reduce gender inequality. (Contains 55 references.) (SK)
NASA Technical Reports Server (NTRS)
Manning, Robert M.
1986-01-01
A rain attenuation prediction model is described for use in calculating satellite communication link availability for any specific location in the world that is characterized by an extended record of rainfall. Such a formalism is necessary for the accurate assessment of such availability predictions in the case of the small user-terminal concept of the Advanced Communication Technology Satellite (ACTS) Project. The model employs the theory of extreme value statistics to generate the necessary statistical rainrate parameters from rain data in the form compiled by the National Weather Service. These location dependent rain statistics are then applied to a rain attenuation model to obtain a yearly prediction of the occurrence of attenuation on any satellite link at that location. The predictions of this model are compared to those of the Crane Two-Component Rain Model and some empirical data and found to be very good. The model is then used to calculate rain attenuation statistics at 59 locations in the United States (including Alaska and Hawaii) for the 20 GHz downlinks and 30 GHz uplinks of the proposed ACTS system. The flexibility of this modeling formalism is such that it allows a complete and unified treatment of the temporal aspects of rain attenuation that leads to the design of an optimum stochastic power control algorithm, the purpose of which is to efficiently counter such rain fades on a satellite link.
Deep first formal concept search.
Zhang, Tao; Li, Hui; Hong, Wenxue; Yuan, Xiamei; Wei, Xinyu
2014-01-01
The calculation of formal concepts is a very important part in the theory of formal concept analysis (FCA); however, within the framework of FCA, computing all formal concepts is the main challenge because of its exponential complexity and difficulty in visualizing the calculating process. With the basic idea of Depth First Search, this paper presents a visualization algorithm by the attribute topology of formal context. Limited by the constraints and calculation rules, all concepts are achieved by the visualization global formal concepts searching, based on the topology degenerated with the fixed start and end points, without repetition and omission. This method makes the calculation of formal concepts precise and easy to operate and reflects the integrity of the algorithm, which enables it to be suitable for visualization analysis.
Software Tools for Formal Specification and Verification of Distributed Real-Time Systems.
1997-09-30
set of software tools for specification and verification of distributed real time systems using formal methods. The task of this SBIR Phase II effort...to be used by designers of real - time systems for early detection of errors. The mathematical complexity of formal specification and verification has
Formalisms for user interface specification and design
NASA Technical Reports Server (NTRS)
Auernheimer, Brent J.
1989-01-01
The application of formal methods to the specification and design of human-computer interfaces is described. A broad outline of human-computer interface problems, a description of the field of cognitive engineering and two relevant research results, the appropriateness of formal specification techniques, and potential NASA application areas are described.
A STUDY OF FORMALLY ADVERTISED PROCUREMENT
As a method of procuring goods and services, formally advertised procurement offers a number of advantages. These include the prevention of fraud and...two-thirds of all contracts are let in these cases. This is done by examining over 2,300 contracts let under formal advertising procedures. A measure of
Geometry and Formal Linguistics.
ERIC Educational Resources Information Center
Huff, George A.
This paper presents a method of encoding geometric line-drawings in a way which allows sets of such drawings to be interpreted as formal languages. A characterization of certain geometric predicates in terms of their properties as languages is obtained, and techniques usually associated with generative grammars and formal automata are then applied…
Immersive Theater - a Proven Way to Enhance Learning Retention
NASA Astrophysics Data System (ADS)
Reiff, P. H.; Zimmerman, L.; Spillane, S.; Sumners, C.
2014-12-01
The portable immersive theater has gone from our first demonstration at fall AGU 2003 to a product offered by multiple companies in various versions to literally millions of users per year. As part of our NASA funded outreach program, we conducted a test of learning in a portable Discovery Dome as contrasted with learning the same materials (visuals and sound track) on a computer screen. We tested 200 middle school students (primarily underserved minorities). Paired t-tests and an independent t-test were used to compare the amount of learning that students achieved. Interest questionnaires were administered to participants in formal (public school) settings and focus groups were conducted in informal (museum camp and educational festival) settings. Overall results from the informal and formal educational setting indicated that there was a statistically significant increase in test scores after viewing We Choose Space. There was a statistically significant increase in test scores for students who viewed We Choose Space in the portable Discovery Dome (9.75) as well as with the computer (8.88). However, long-term retention of the material tested on the questionnaire indicated that for students who watched We Choose Space in the portable Discovery Dome, there was a statistically significant long-term increase in test scores (10.47), whereas, six weeks after learning on the computer, the improvements over the initial baseline (3.49) were far less and were not statistically significant. The test score improvement six weeks after learning in the dome was essentially the same as the post test immediately after watching the show, demonstrating virtually no loss of gained information in the six week interval. In the formal educational setting, approximately 34% of the respondents indicated that they wanted to learn more about becoming a scientist, while 35% expressed an interest in a career in space science. In the informal setting, 26% indicated that they were interested in pursuing a career in space science.
Assessment and statistics of surgically induced astigmatism.
Naeser, Kristian
2008-05-01
The aim of the thesis was to develop methods for assessment of surgically induced astigmatism (SIA) in individual eyes, and in groups of eyes. The thesis is based on 12 peer-reviewed publications, published over a period of 16 years. In these publications older and contemporary literature was reviewed(1). A new method (the polar system) for analysis of SIA was developed. Multivariate statistical analysis of refractive data was described(2-4). Clinical validation studies were performed. The description of a cylinder surface with polar values and differential geometry was compared. The main results were: refractive data in the form of sphere, cylinder and axis may define an individual patient or data set, but are unsuited for mathematical and statistical analyses(1). The polar value system converts net astigmatisms to orthonormal components in dioptric space. A polar value is the difference in meridional power between two orthogonal meridians(5,6). Any pair of polar values, separated by an arch of 45 degrees, characterizes a net astigmatism completely(7). The two polar values represent the net curvital and net torsional power over the chosen meridian(8). The spherical component is described by the spherical equivalent power. Several clinical studies demonstrated the efficiency of multivariate statistical analysis of refractive data(4,9-11). Polar values and formal differential geometry describe astigmatic surfaces with similar concepts and mathematical functions(8). Other contemporary methods, such as Long's power matrix, Holladay's and Alpins' methods, Zernike(12) and Fourier analyses(8), are correlated to the polar value system. In conclusion, analysis of SIA should be performed with polar values or other contemporary component systems. The study was supported by Statens Sundhedsvidenskabeligt Forskningsråd, Cykelhandler P. Th. Rasmussen og Hustrus Mindelegat, Hotelejer Carl Larsen og Hustru Nicoline Larsens Mindelegat, Landsforeningen til Vaern om Synet, Forskningsinitiativet for Arhus Amt, Alcon Denmark, and Desirée and Niels Ydes Fond.
NASA Astrophysics Data System (ADS)
Sargolzaeipor, S.; Hassanabadi, H.; Chung, W. S.
2018-04-01
The Klein-Gordon equation is extended in the presence of an Aharonov-Bohm magnetic field for the Cornell potential and the corresponding wave functions as well as the spectra are obtained. After introducing the superstatistics in the statistical mechanics, we first derived the effective Boltzmann factor in the deformed formalism with modified Dirac delta distribution. We then use the concepts of the superstatistics to calculate the thermodynamics properties of the system. The well-known results are recovered by the vanishing of deformation parameter and some graphs are plotted for the clarity of our results.
Shen, Li; Saykin, Andrew J.; Williams, Scott M.; Moore, Jason H.
2016-01-01
ABSTRACT Although gene‐environment (G× E) interactions play an important role in many biological systems, detecting these interactions within genome‐wide data can be challenging due to the loss in statistical power incurred by multiple hypothesis correction. To address the challenge of poor power and the limitations of existing multistage methods, we recently developed a screening‐testing approach for G× E interaction detection that combines elastic net penalized regression with joint estimation to support a single omnibus test for the presence of G× E interactions. In our original work on this technique, however, we did not assess type I error control or power and evaluated the method using just a single, small bladder cancer data set. In this paper, we extend the original method in two important directions and provide a more rigorous performance evaluation. First, we introduce a hierarchical false discovery rate approach to formally assess the significance of individual G× E interactions. Second, to support the analysis of truly genome‐wide data sets, we incorporate a score statistic‐based prescreening step to reduce the number of single nucleotide polymorphisms prior to fitting the first stage penalized regression model. To assess the statistical properties of our method, we compare the type I error rate and statistical power of our approach with competing techniques using both simple simulation designs as well as designs based on real disease architectures. Finally, we demonstrate the ability of our approach to identify biologically plausible SNP‐education interactions relative to Alzheimer's disease status using genome‐wide association study data from the Alzheimer's Disease Neuroimaging Initiative (ADNI). PMID:27578615
Extending local canonical correlation analysis to handle general linear contrasts for FMRI data.
Jin, Mingwu; Nandy, Rajesh; Curran, Tim; Cordes, Dietmar
2012-01-01
Local canonical correlation analysis (CCA) is a multivariate method that has been proposed to more accurately determine activation patterns in fMRI data. In its conventional formulation, CCA has several drawbacks that limit its usefulness in fMRI. A major drawback is that, unlike the general linear model (GLM), a test of general linear contrasts of the temporal regressors has not been incorporated into the CCA formalism. To overcome this drawback, a novel directional test statistic was derived using the equivalence of multivariate multiple regression (MVMR) and CCA. This extension will allow CCA to be used for inference of general linear contrasts in more complicated fMRI designs without reparameterization of the design matrix and without reestimating the CCA solutions for each particular contrast of interest. With the proper constraints on the spatial coefficients of CCA, this test statistic can yield a more powerful test on the inference of evoked brain regional activations from noisy fMRI data than the conventional t-test in the GLM. The quantitative results from simulated and pseudoreal data and activation maps from fMRI data were used to demonstrate the advantage of this novel test statistic.
León, Larry F; Cai, Tianxi
2012-04-01
In this paper we develop model checking techniques for assessing functional form specifications of covariates in censored linear regression models. These procedures are based on a censored data analog to taking cumulative sums of "robust" residuals over the space of the covariate under investigation. These cumulative sums are formed by integrating certain Kaplan-Meier estimators and may be viewed as "robust" censored data analogs to the processes considered by Lin, Wei & Ying (2002). The null distributions of these stochastic processes can be approximated by the distributions of certain zero-mean Gaussian processes whose realizations can be generated by computer simulation. Each observed process can then be graphically compared with a few realizations from the Gaussian process. We also develop formal test statistics for numerical comparison. Such comparisons enable one to assess objectively whether an apparent trend seen in a residual plot reects model misspecification or natural variation. We illustrate the methods with a well known dataset. In addition, we examine the finite sample performance of the proposed test statistics in simulation experiments. In our simulation experiments, the proposed test statistics have good power of detecting misspecification while at the same time controlling the size of the test.
Extending Local Canonical Correlation Analysis to Handle General Linear Contrasts for fMRI Data
Jin, Mingwu; Nandy, Rajesh; Curran, Tim; Cordes, Dietmar
2012-01-01
Local canonical correlation analysis (CCA) is a multivariate method that has been proposed to more accurately determine activation patterns in fMRI data. In its conventional formulation, CCA has several drawbacks that limit its usefulness in fMRI. A major drawback is that, unlike the general linear model (GLM), a test of general linear contrasts of the temporal regressors has not been incorporated into the CCA formalism. To overcome this drawback, a novel directional test statistic was derived using the equivalence of multivariate multiple regression (MVMR) and CCA. This extension will allow CCA to be used for inference of general linear contrasts in more complicated fMRI designs without reparameterization of the design matrix and without reestimating the CCA solutions for each particular contrast of interest. With the proper constraints on the spatial coefficients of CCA, this test statistic can yield a more powerful test on the inference of evoked brain regional activations from noisy fMRI data than the conventional t-test in the GLM. The quantitative results from simulated and pseudoreal data and activation maps from fMRI data were used to demonstrate the advantage of this novel test statistic. PMID:22461786
Considerations in the statistical analysis of clinical trials in periodontitis.
Imrey, P B
1986-05-01
Adult periodontitis has been described as a chronic infectious process exhibiting sporadic, acute exacerbations which cause quantal, localized losses of dental attachment. Many analytic problems of periodontal trials are similar to those of other chronic diseases. However, the episodic, localized, infrequent, and relatively unpredictable behavior of exacerbations, coupled with measurement error difficulties, cause some specific problems. Considerable controversy exists as to the proper selection and treatment of multiple site data from the same patient for group comparisons for epidemiologic or therapeutic evaluative purposes. This paper comments, with varying degrees of emphasis, on several issues pertinent to the analysis of periodontal trials. Considerable attention is given to the ways in which measurement variability may distort analytic results. Statistical treatments of multiple site data for descriptive summaries are distinguished from treatments for formal statistical inference to validate therapeutic effects. Evidence suggesting that sites behave independently is contested. For inferential analyses directed at therapeutic or preventive effects, analytic models based on site independence are deemed unsatisfactory. Methods of summarization that may yield more powerful analyses than all-site mean scores, while retaining appropriate treatment of inter-site associations, are suggested. Brief comments and opinions on an assortment of other issues in clinical trial analysis are preferred.
Stopping power of dense plasmas: The collisional method and limitations of the dielectric formalism.
Clauser, C F; Arista, N R
2018-02-01
We present a study of the stopping power of plasmas using two main approaches: the collisional (scattering theory) and the dielectric formalisms. In the former case, we use a semiclassical method based on quantum scattering theory. In the latter case, we use the full description given by the extension of the Lindhard dielectric function for plasmas of all degeneracies. We compare these two theories and show that the dielectric formalism has limitations when it is used for slow heavy ions or atoms in dense plasmas. We present a study of these limitations and show the regimes where the dielectric formalism can be used, with appropriate corrections to include the usual quantum and classical limits. On the other hand, the semiclassical method shows the correct behavior for all plasma conditions and projectile velocity and charge. We consider different models for the ion charge distributions, including bare and dressed ions as well as neutral atoms.
Russell, Richard A; Adams, Niall M; Stephens, David A; Batty, Elizabeth; Jensen, Kirsten; Freemont, Paul S
2009-04-22
Considerable advances in microscopy, biophysics, and cell biology have provided a wealth of imaging data describing the functional organization of the cell nucleus. Until recently, cell nuclear architecture has largely been assessed by subjective visual inspection of fluorescently labeled components imaged by the optical microscope. This approach is inadequate to fully quantify spatial associations, especially when the patterns are indistinct, irregular, or highly punctate. Accurate image processing techniques as well as statistical and computational tools are thus necessary to interpret this data if meaningful spatial-function relationships are to be established. Here, we have developed a thresholding algorithm, stable count thresholding (SCT), to segment nuclear compartments in confocal laser scanning microscopy image stacks to facilitate objective and quantitative analysis of the three-dimensional organization of these objects using formal statistical methods. We validate the efficacy and performance of the SCT algorithm using real images of immunofluorescently stained nuclear compartments and fluorescent beads as well as simulated images. In all three cases, the SCT algorithm delivers a segmentation that is far better than standard thresholding methods, and more importantly, is comparable to manual thresholding results. By applying the SCT algorithm and statistical analysis, we quantify the spatial configuration of promyelocytic leukemia nuclear bodies with respect to irregular-shaped SC35 domains. We show that the compartments are closer than expected under a null model for their spatial point distribution, and furthermore that their spatial association varies according to cell state. The methods reported are general and can readily be applied to quantify the spatial interactions of other nuclear compartments.
Russell, Richard A.; Adams, Niall M.; Stephens, David A.; Batty, Elizabeth; Jensen, Kirsten; Freemont, Paul S.
2009-01-01
Abstract Considerable advances in microscopy, biophysics, and cell biology have provided a wealth of imaging data describing the functional organization of the cell nucleus. Until recently, cell nuclear architecture has largely been assessed by subjective visual inspection of fluorescently labeled components imaged by the optical microscope. This approach is inadequate to fully quantify spatial associations, especially when the patterns are indistinct, irregular, or highly punctate. Accurate image processing techniques as well as statistical and computational tools are thus necessary to interpret this data if meaningful spatial-function relationships are to be established. Here, we have developed a thresholding algorithm, stable count thresholding (SCT), to segment nuclear compartments in confocal laser scanning microscopy image stacks to facilitate objective and quantitative analysis of the three-dimensional organization of these objects using formal statistical methods. We validate the efficacy and performance of the SCT algorithm using real images of immunofluorescently stained nuclear compartments and fluorescent beads as well as simulated images. In all three cases, the SCT algorithm delivers a segmentation that is far better than standard thresholding methods, and more importantly, is comparable to manual thresholding results. By applying the SCT algorithm and statistical analysis, we quantify the spatial configuration of promyelocytic leukemia nuclear bodies with respect to irregular-shaped SC35 domains. We show that the compartments are closer than expected under a null model for their spatial point distribution, and furthermore that their spatial association varies according to cell state. The methods reported are general and can readily be applied to quantify the spatial interactions of other nuclear compartments. PMID:19383481
ERIC Educational Resources Information Center
Crestani, Fabio; Dominich, Sandor; Lalmas, Mounia; van Rijsbergen, Cornelis Joost
2003-01-01
Discusses the importance of research on the use of mathematical, logical, and formal methods in information retrieval to help enhance retrieval effectiveness and clarify underlying concepts of information retrieval. Highlights include logic; probability; spaces; and future research needs. (Author/LRW)
Teaching medical students ultrasound-guided vascular access - which learning method is best?
Lian, Alwin; Rippey, James C R; Carr, Peter J
2017-05-15
Ultrasound is recommended to guide insertion of peripheral intravenous vascular cannulae (PIVC) where difficulty is experienced. Ultrasound machines are now common-place and junior doctors are often expected to be able to use them. The educational standards for this skill are highly varied, ranging from no education, to self-guided internet-based education, to formal, face-to-face traditional education. In an attempt to decide which educational technique our institution should introduce, a small pilot trial comparing educational techniques was designed. Thirty medical students were enrolled and allocated to one of three groups. PIVC placing ability was then observed, tested and graded on vascular access phantoms. The formal, face-to-face traditional education was rated best by the students, and had the highest success rate in PIVC placement, the improvement statistically significant compared to no education (p = 0.01) and trending towards significance when compared to self-directed internet-based education (p<0.06). The group receiving traditional face-to-face teaching on ultrasound-guided vascular access, performed significantly better than those not receiving education. As the number of ultrasound machines in clinical areas increases, it is important that education programs to support their safe and appropriate use are developed.
The Epidemiology and Associated Phenomenology of Formal Thought Disorder: A Systematic Review
Roche, Eric; Creed, Lisa; MacMahon, Donagh; Brennan, Daria; Clarke, Mary
2015-01-01
Background: Authors of the Diagnostic and Statistical Manual, Fifth Edition (DSM-V) have recommended to “integrate dimensions into clinical practice.” The epidemiology and associated phenomenology of formal thought disorder (FTD) have been described but not reviewed. We aimed to carry out a systematic review of FTD to this end. Methods: A systematic review of FTD literature, from 1978 to 2013, using Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. Results: A total of 881 abstracts were reviewed and 120 articles met inclusion criteria; articles describing FTD factor structure (n = 15), prevalence and longitudinal course (n = 41), role in diagnosis (n = 22), associated clinical variables (n = 56), and influence on outcome (n = 35) were included. Prevalence estimates for FTD in psychosis range from 5% to 91%. Dividing FTD into domains, by factor analysis, can accurately identify 91% of psychotic diagnoses. FTD is associated with increased clinical severity. Poorer outcomes are predicted by negative thought disorder, more so than the typical construct of “disorganized speech.” Conclusion: FTD is a common symptom of psychosis and may be considered a marker of illness severity. Detailed dimensional assessment of FTD can clarify diagnosis and may help predict prognosis. PMID:25180313
The influence of education on performance of adults on the Clock Drawing Test
de Noronha, Ísis Franci Cavalcanti; Barreto, Simone dos Santos; Ortiz, Karin Zazo
2018-01-01
ABSTRACT The Clock Drawing Test (CDT) is an important instrument for screening individuals suspected of having cognitive impairment. Objective: To determine the influence of education on the performance of healthy adults on the CDT. Methods: A total of 121 drawings by healthy adults without neurological complaints or impairments were analysed. Participants were stratified by educational level into 4 subgroups: 27 illiterate adults, 34 individuals with 1-4 years of formal education, 30 with 5-11 years, and 30 adults with >11 years' formal education. Scores on the CDT were analyzed based on a scale of 1-10 points according to the criteria of Sunderland et al. (1989).¹ The Kruskal-Wallis test was applied to compare the different education groups. Tukey's multiple comparisons test was used when a significant factor was found. Results: Although scores were higher with greater education, statistically significant differences on the CDT were found only between the illiterate and other educated groups. Conclusion: The CDT proved especially difficult for illiterate individuals, who had lower scores. These results suggest that this screening test is suitable for assessing mainly visuoconstructional praxis and providing an overall impression of cognitive function among individuals, independently of years of education. PMID:29682235
Tiger in the fault tree jungle
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rubel, P.
1976-01-01
There is yet little evidence of serious efforts to apply formal reliability analysis methods to evaluate, or even to identify, potential common-mode failures (CMF) of reactor safeguard systems. The prospects for event logic modeling in this regard are examined by the primitive device of reviewing actual CMF experience in terms of what the analyst might have perceived a priori. Further insights of the probability and risks aspects of CMFs are sought through consideration of three key likelihood factors: (1) prior probability of cause ever existing, (2) opportunities for removing cause, and (3) probability that a CMF cause will be activatedmore » by conditions associated with a real system challenge. It was concluded that the principal needs for formal logical discipline in the endeavor to decrease CMF-related risks are to discover and to account for strong ''energetic'' dependency couplings that could arise in the major accidents usually classed as ''hypothetical.'' This application would help focus research, design and quality assurance efforts to cope with major CMF causes. But without extraordinary challenges to the reactor safeguard systems, there must continue to be virtually no statistical evidence pertinent to that class of failure dependencies.« less
Workplace Skills Taught in a Simulated Analytical Department
NASA Astrophysics Data System (ADS)
Sonchik Marine, Susan
2001-11-01
Integration of workplace skills into the academic setting is paramount for any chemical technology program. In addition to the expected chemistry content, courses must build proficiency in oral and written communication skills, computer skills, laboratory safety, and logical troubleshooting. Miami University's Chemical Technology II course is set up as a contract analytical laboratory. Students apply the advanced sampling techniques, quality assurance, standard methods, and statistical analyses they have studied. For further integration of workplace skills, weekly "department meetings" are held where the student, as members of the department, report on their work in process, present completed projects, and share what they have learned and what problems they have encountered. Information is shared between the experienced members of the department and those encountering problems or starting a new project. The instructor as department manager makes announcements, reviews company and department status, and assigns work for the coming week. The department members report results to clients in formal reports or in short memos. Factors affecting the success of the "department meeting" approach include the formality of the meeting room, use of an official agenda, the frequency, time, and duration of the meeting, and accountability of the students.
Control by quality: proposition of a typology.
Pujo, P; Pillet, M
The application of Quality tools and methods in industrial management has always had a fundamental impact on the control of production. It influences the behavior of the actors concerned, while introducing the necessary notions and formalizations, especially for production systems with little or no automation, which constitute a large part of the industrial activity. Several quality approaches are applied in the workshop and are implemented at the level of the control. In this paper, the authors present a typology of the various approaches that have successively influenced control, such as statistical process control, quality assurance, and continuous improvement. First the authors present a parallel between production control and quality organizational structure. They note the duality between control, which is aimed at increasing productivity, and quality, which aims to satisfy the needs of the customer. They also note the hierarchical organizational structure of these two systems of management with, at each level, the notion of a feedback loop. This notion is fundamental to any kind of decision making. The paper is organized around the operational, tactical, and strategic levels, by describing for each level the main methods and tools for control by quality. The overview of these tools and methods starts at the operational level, with the Statistical Process Control, the Taguchi technique, and the "six sigma" approach. On the tactical level, we find a quality system approach, with a documented description of the procedures introduced in the firm. The management system can refer here to Quality Assurance, Total Productive Maintenance, or Management by Total Quality. The formalization through procedures of the rules of decision governing the process control enhances the validity of these rules. This leads to the enhancement of their reliability and to their consolidation. All this counterbalances the human, intrinsically fluctuating, behavior of the control operators. Strategic control by quality is then detailed, and the two main approaches, the continuous improvement approach and the proactive improvement approach, are introduced. Finally, the authors observe that at each of the three levels, the continuous process improvement, which is a component of Total Quality, becomes an essential preoccupation for the control. Ultimately, the recursive utilization of the Deming cycle remains the best practice for the control by quality.
Twenty-five years of maximum-entropy principle
NASA Astrophysics Data System (ADS)
Kapur, J. N.
1983-04-01
The strengths and weaknesses of the maximum entropy principle (MEP) are examined and some challenging problems that remain outstanding at the end of the first quarter century of the principle are discussed. The original formalism of the MEP is presented and its relationship to statistical mechanics is set forth. The use of MEP for characterizing statistical distributions, in statistical inference, nonlinear spectral analysis, transportation models, population density models, models for brand-switching in marketing and vote-switching in elections is discussed. Its application to finance, insurance, image reconstruction, pattern recognition, operations research and engineering, biology and medicine, and nonparametric density estimation is considered.
Quality assurance software inspections at NASA Ames: Metrics for feedback and modification
NASA Technical Reports Server (NTRS)
Wenneson, G.
1985-01-01
Software inspections are a set of formal technical review procedures held at selected key points during software development in order to find defects in software documents--is described in terms of history, participants, tools, procedures, statistics, and database analysis.
Formal Methods in Air Traffic Management: The Case of Unmanned Aircraft Systems
NASA Technical Reports Server (NTRS)
Munoz, Cesar A.
2015-01-01
As the technological and operational capabilities of unmanned aircraft systems (UAS) continue to grow, so too does the need to introduce these systems into civil airspace. Unmanned Aircraft Systems Integration in the National Airspace System is a NASA research project that addresses the integration of civil UAS into non-segregated airspace operations. One of the major challenges of this integration is the lack of an onboard pilot to comply with the legal requirement that pilots see and avoid other aircraft. The need to provide an equivalent to this requirement for UAS has motivated the development of a detect and avoid (DAA) capability to provide the appropriate situational awareness and maneuver guidance in avoiding and remaining well clear of traffic aircraft. Formal methods has played a fundamental role in the development of this capability. This talk reports on the formal methods work conducted under NASA's Safe Autonomous System Operations project in support of the development of DAA for UAS. This work includes specification of low-level and high-level functional requirements, formal verification of algorithms, and rigorous validation of software implementations. The talk also discusses technical challenges in formal methods research in the context of the development and safety analysis of advanced air traffic management concepts.
A Formal Methods Approach to the Analysis of Mode Confusion
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Miller, Steven P.; Potts, James N.; Carreno, Victor A.
2004-01-01
The goal of the new NASA Aviation Safety Program (AvSP) is to reduce the civil aviation fatal accident rate by 80% in ten years and 90% in twenty years. This program is being driven by the accident data with a focus on the most recent history. Pilot error is the most commonly cited cause for fatal accidents (up to 70%) and obviously must be given major consideration in this program. While the greatest source of pilot error is the loss of situation awareness , mode confusion is increasingly becoming a major contributor as well. The January 30, 1995 issue of Aviation Week lists 184 incidents and accidents involving mode awareness including the Bangalore A320 crash 2/14/90, the Strasbourg A320 crash 1/20/92, the Mulhouse-Habsheim A320 crash 6/26/88, and the Toulouse A330 crash 6/30/94. These incidents and accidents reveal that pilots sometimes become confused about what the cockpit automation is doing. Consequently, human factors research is an obvious investment area. However, even a cursory look at the accident data reveals that the mode confusion problem is much deeper than just training deficiencies and a lack of human-oriented design. This is readily acknowledged by human factors experts. It seems that further progress in human factors must come through a deeper scrutiny of the internals of the automation. It is in this arena that formal methods can contribute. Formal methods refers to the use of techniques from logic and discrete mathematics in the specification, design, and verification of computer systems, both hardware and software. The fundamental goal of formal methods is to capture requirements, designs and implementations in a mathematically based model that can be analyzed in a rigorous manner. Research in formal methods is aimed at automating this analysis as much as possible. By capturing the internal behavior of a flight deck in a rigorous and detailed formal model, the dark corners of a design can be analyzed. This paper will explore how formal models and analyses can be used to help eliminate mode confusion from flight deck designs and at the same time increase our confidence in the safety of the implementation. The paper is based upon interim results from a new project involving NASA Langley and Rockwell Collins in applying formal methods to a realistic business jet Flight Guidance System (FGS).
Working the College System: Six Strategies for Building a Personal Powerbase
ERIC Educational Resources Information Center
Simplicio, Joseph S. C.
2008-01-01
Within each college system there are prescribed formalized methods for accomplishing tasks and achieving established goals. To truly understand how a college, or any large organization functions, it is vital to understand the basis of the formal structure. Those individuals who understand formal systems within a college can use this knowledge to…
An Educational Development Tool Based on Principles of Formal Ontology
ERIC Educational Resources Information Center
Guzzi, Rodolfo; Scarpanti, Stefano; Ballista, Giovanni; Di Nicolantonio, Walter
2005-01-01
Computer science provides with virtual laboratories, places where one can merge real experiments with the formalism of algorithms and mathematics and where, with the advent of multimedia, sounds and movies can also be added. In this paper we present a method, based on principles of formal ontology, allowing one to develop interactive educational…
NASA Astrophysics Data System (ADS)
Alimi, Jean-Michel; de Fromont, Paul
2018-04-01
The statistical properties of cosmic structures are well known to be strong probes for cosmology. In particular, several studies tried to use the cosmic void counting number to obtain tight constrains on dark energy. In this paper, we model the statistical properties of these regions using the CoSphere formalism (de Fromont & Alimi) in both primordial and non-linearly evolved Universe in the standard Λ cold dark matter model. This formalism applies similarly for minima (voids) and maxima (such as DM haloes), which are here considered symmetrically. We first derive the full joint Gaussian distribution of CoSphere's parameters in the Gaussian random field. We recover the results of Bardeen et al. only in the limit where the compensation radius becomes very large, i.e. when the central extremum decouples from its cosmic environment. We compute the probability distribution of the compensation size in this primordial field. We show that this distribution is redshift independent and can be used to model cosmic voids size distribution. We also derive the statistical distribution of the peak parameters introduced by Bardeen et al. and discuss their correlation with the cosmic environment. We show that small central extrema with low density are associated with narrow compensation regions with deep compensation density, while higher central extrema are preferentially located in larger but smoother over/under massive regions.
NASA Astrophysics Data System (ADS)
Devetak, Iztok; Aleksij Glažar, Saša
2010-08-01
Submicrorepresentations (SMRs) are a powerful tool for identifying misconceptions of chemical concepts and for generating proper mental models of chemical phenomena in students' long-term memory during chemical education. The main purpose of the study was to determine which independent variables (gender, formal reasoning abilities, visualization abilities, and intrinsic motivation for learning chemistry) have the maximum influence on students' reading and drawing SMRs. A total of 386 secondary school students (aged 16.3 years) participated in the study. The instruments used in the study were: test of Chemical Knowledge, Test of Logical Thinking, two tests of visualization abilities Patterns and Rotations, and questionnaire on Intrinsic Motivation for Learning Science. The results show moderate, but statistically significant correlations between students' intrinsic motivation, formal reasoning abilities and chemical knowledge at submicroscopic level based on reading and drawing SMRs. Visualization abilities are not statistically significantly correlated with students' success on items that comprise reading or drawing SMRs. It can be also concluded that there is a statistically significant difference between male and female students in solving problems that include reading or drawing SMRs. Based on these statistical results and content analysis of the sample problems, several educational strategies can be implemented for students to develop adequate mental models of chemical concepts on all three levels of representations.
Teaching Basic Quantum Mechanics in Secondary School Using Concepts of Feynman Path Integrals Method
ERIC Educational Resources Information Center
Fanaro, Maria de los Angeles; Otero, Maria Rita; Arlego, Marcelo
2012-01-01
This paper discusses the teaching of basic quantum mechanics in high school. Rather than following the usual formalism, our approach is based on Feynman's path integral method. Our presentation makes use of simulation software and avoids sophisticated mathematical formalism. (Contains 3 figures.)
On the Need for Practical Formal Methods
1998-01-01
additional research and engineering that is needed to make the current set of formal methods more practical. To illustrate the ideas, I present several exam ...either a good violin or a highly talented violinist. Light-weight techniques o er software developers good violins . A user need not be a talented
A Vector Representation for Thermodynamic Relationships
ERIC Educational Resources Information Center
Pogliani, Lionello
2006-01-01
The existing vector formalism method for thermodynamic relationship maintains tractability and uses accessible mathematics, which can be seen as a diverting and entertaining step into the mathematical formalism of thermodynamics and as an elementary application of matrix algebra. The method is based on ideas and operations apt to improve the…
Local quantum thermal susceptibility
De Pasquale, Antonella; Rossini, Davide; Fazio, Rosario; Giovannetti, Vittorio
2016-01-01
Thermodynamics relies on the possibility to describe systems composed of a large number of constituents in terms of few macroscopic variables. Its foundations are rooted into the paradigm of statistical mechanics, where thermal properties originate from averaging procedures which smoothen out local details. While undoubtedly successful, elegant and formally correct, this approach carries over an operational problem, namely determining the precision at which such variables are inferred, when technical/practical limitations restrict our capabilities to local probing. Here we introduce the local quantum thermal susceptibility, a quantifier for the best achievable accuracy for temperature estimation via local measurements. Our method relies on basic concepts of quantum estimation theory, providing an operative strategy to address the local thermal response of arbitrary quantum systems at equilibrium. At low temperatures, it highlights the local distinguishability of the ground state from the excited sub-manifolds, thus providing a method to locate quantum phase transitions. PMID:27681458
Local quantum thermal susceptibility
NASA Astrophysics Data System (ADS)
de Pasquale, Antonella; Rossini, Davide; Fazio, Rosario; Giovannetti, Vittorio
2016-09-01
Thermodynamics relies on the possibility to describe systems composed of a large number of constituents in terms of few macroscopic variables. Its foundations are rooted into the paradigm of statistical mechanics, where thermal properties originate from averaging procedures which smoothen out local details. While undoubtedly successful, elegant and formally correct, this approach carries over an operational problem, namely determining the precision at which such variables are inferred, when technical/practical limitations restrict our capabilities to local probing. Here we introduce the local quantum thermal susceptibility, a quantifier for the best achievable accuracy for temperature estimation via local measurements. Our method relies on basic concepts of quantum estimation theory, providing an operative strategy to address the local thermal response of arbitrary quantum systems at equilibrium. At low temperatures, it highlights the local distinguishability of the ground state from the excited sub-manifolds, thus providing a method to locate quantum phase transitions.
Spatial correlations in driven-dissipative photonic lattices
NASA Astrophysics Data System (ADS)
Biondi, Matteo; Lienhard, Saskia; Blatter, Gianni; Türeci, Hakan E.; Schmidt, Sebastian
2017-12-01
We study the nonequilibrium steady-state of interacting photons in cavity arrays as described by the driven-dissipative Bose–Hubbard and spin-1/2 XY model. For this purpose, we develop a self-consistent expansion in the inverse coordination number of the array (∼ 1/z) to solve the Lindblad master equation of these systems beyond the mean-field approximation. Our formalism is compared and benchmarked with exact numerical methods for small systems based on an exact diagonalization of the Liouvillian and a recently developed corner-space renormalization technique. We then apply this method to obtain insights beyond mean-field in two particular settings: (i) we show that the gas–liquid transition in the driven-dissipative Bose–Hubbard model is characterized by large density fluctuations and bunched photon statistics. (ii) We study the antibunching–bunching transition of the nearest-neighbor correlator in the driven-dissipative spin-1/2 XY model and provide a simple explanation of this phenomenon.
A Spatial Dashboard for Alzheimer's Disease in New South Wales.
Robertson, Hamish; Nicholas, Nick; Dhagat, Amit; Travaglia, Joanne
2017-01-01
This paper illustrates a proof of concept scenario for the application of comprehensive data visualisation methods in the rapidly changing aged care sector. The scenario we explored is population ageing and the dementias with an emphasis on the spatial effects of change over time at the Statistical Area 2 (SA2) level for the state of New South Wales. We did this using a combination of methods, culminating in the use of the Tableau software environment to explore the intersections of demography, epidemiology and their formal cost of care implications. In addition, we briefly illustrate how key infrastructure data can be included in the same data management context by showing how service providers can be integrated and mapped in conjunction with other analyses. This is an innovative and practical approach to some of the complex issues already faced in the health and aged care sectors which can only become more pronounced as population ageing progresses.
Maximum Likelihood Reconstruction for Magnetic Resonance Fingerprinting
Zhao, Bo; Setsompop, Kawin; Ye, Huihui; Cauley, Stephen; Wald, Lawrence L.
2017-01-01
This paper introduces a statistical estimation framework for magnetic resonance (MR) fingerprinting, a recently proposed quantitative imaging paradigm. Within this framework, we present a maximum likelihood (ML) formalism to estimate multiple parameter maps directly from highly undersampled, noisy k-space data. A novel algorithm, based on variable splitting, the alternating direction method of multipliers, and the variable projection method, is developed to solve the resulting optimization problem. Representative results from both simulations and in vivo experiments demonstrate that the proposed approach yields significantly improved accuracy in parameter estimation, compared to the conventional MR fingerprinting reconstruction. Moreover, the proposed framework provides new theoretical insights into the conventional approach. We show analytically that the conventional approach is an approximation to the ML reconstruction; more precisely, it is exactly equivalent to the first iteration of the proposed algorithm for the ML reconstruction, provided that a gridding reconstruction is used as an initialization. PMID:26915119
Maximum Likelihood Reconstruction for Magnetic Resonance Fingerprinting.
Zhao, Bo; Setsompop, Kawin; Ye, Huihui; Cauley, Stephen F; Wald, Lawrence L
2016-08-01
This paper introduces a statistical estimation framework for magnetic resonance (MR) fingerprinting, a recently proposed quantitative imaging paradigm. Within this framework, we present a maximum likelihood (ML) formalism to estimate multiple MR tissue parameter maps directly from highly undersampled, noisy k-space data. A novel algorithm, based on variable splitting, the alternating direction method of multipliers, and the variable projection method, is developed to solve the resulting optimization problem. Representative results from both simulations and in vivo experiments demonstrate that the proposed approach yields significantly improved accuracy in parameter estimation, compared to the conventional MR fingerprinting reconstruction. Moreover, the proposed framework provides new theoretical insights into the conventional approach. We show analytically that the conventional approach is an approximation to the ML reconstruction; more precisely, it is exactly equivalent to the first iteration of the proposed algorithm for the ML reconstruction, provided that a gridding reconstruction is used as an initialization.
NASA Astrophysics Data System (ADS)
Noguere, Gilles; Archier, Pascal; Bouland, Olivier; Capote, Roberto; Jean, Cyrille De Saint; Kopecky, Stefan; Schillebeeckx, Peter; Sirakov, Ivan; Tamagno, Pierre
2017-09-01
A consistent description of the neutron cross sections from thermal energy up to the MeV region is challenging. One of the first steps consists in optimizing the optical model parameters using average resonance parameters, such as the neutron strength functions. They can be derived from a statistical analysis of the resolved resonance parameters, or calculated with the generalized form of the SPRT method by using scattering matrix elements provided by optical model calculations. One of the difficulties is to establish the contributions of the direct and compound nucleus reactions. This problem was solved by using a slightly modified average R-Matrix formula with an equivalent hard sphere radius deduced from the phase shift originating from the potential. The performances of the proposed formalism are illustrated with results obtained for the 238U+n nuclear systems.
Barnes, Geoffrey D; Gu, Xiaokui; Haymart, Brian; Kline-Rogers, Eva; Almany, Steve; Kozlowski, Jay; Besley, Dennis; Krol, Gregory D; Froehlich, James B; Kaatz, Scott
2014-08-01
Guidelines recommend the assessment of stroke and bleeding risk before initiating warfarin anticoagulation in patients with atrial fibrillation. Many of the elements used to predict stroke also overlap with bleeding risk in atrial fibrillation patients and it is tempting to use stroke risk scores to efficiently estimate bleeding risk. Comparison of stroke risk scores to bleeding risk scores to predict bleeding has not been thoroughly assessed. 2600 patients followed at seven anticoagulation clinics were followed from October 2009-May 2013. Five risk models (CHADS2, CHA2DS2-VASc, HEMORR2HAGES, HAS-BLED and ATRIA) were retrospectively applied to each patient. The primary outcome was the first major bleeding event. Area under the ROC curves were compared with C statistic and net reclassification improvement (NRI) analysis was performed. 110 patients experienced a major bleeding event in 2581.6 patient-years (4.5%/year). Mean follow up was 1.0±0.8years. All of the formal bleeding risk scores had a modest predictive value for first major bleeding events (C statistic 0.66-0.69), performing better than CHADS2 and CHA2DS2-VASc scores (C statistic difference 0.10 - 0.16). NRI analysis demonstrated a 52-69% and 47-64% improvement of the formal bleeding risk scores over the CHADS2 score and CHA2DS2-VASc score, respectively. The CHADS2 and CHA2DS2-VASc scores did not perform as well as formal bleeding risk scores for prediction of major bleeding in non-valvular atrial fibrillation patients treated with warfarin. All three bleeding risk scores (HAS-BLED, ATRIA and HEMORR2HAGES) performed moderately well. Copyright © 2014 Elsevier Ltd. All rights reserved.
Operator Approach to the Master Equation for the One-Step Process
NASA Astrophysics Data System (ADS)
Hnatič, M.; Eferina, E. G.; Korolkova, A. V.; Kulyabov, D. S.; Sevastyanov, L. A.
2016-02-01
Background. Presentation of the probability as an intrinsic property of the nature leads researchers to switch from deterministic to stochastic description of the phenomena. The kinetics of the interaction has recently attracted attention because it often occurs in the physical, chemical, technical, biological, environmental, economic, and sociological systems. However, there are no general methods for the direct study of this equation. The expansion of the equation in a formal Taylor series (the so called Kramers-Moyal's expansion) is used in the procedure of stochastization of one-step processes. Purpose. However, this does not eliminate the need for the study of the master equation. Method. It is proposed to use quantum field perturbation theory for the statistical systems (the so-called Doi method). Results: This work is a methodological material that describes the principles of master equation solution based on quantum field perturbation theory methods. The characteristic property of the work is that it is intelligible for non-specialists in quantum field theory. Conclusions: We show the full equivalence of the operator and combinatorial methods of obtaining and study of the one-step process master equation.
NASA Technical Reports Server (NTRS)
Weber, Doug; Jamsek, Damir
1994-01-01
The goal of this task was to investigate how formal methods could be incorporated into a software engineering process for flight-control systems under DO-178B and to demonstrate that process by developing a formal specification for NASA's Guidance and Controls Software (GCS) Experiment. GCS is software to control the descent of a spacecraft onto a planet's surface. The GCS example is simplified from a real example spacecraft, but exhibits the characteristics of realistic spacecraft control software. The formal specification is written in Larch.
Qu, Conghui; Schuetz, Johanna M.; Min, Jeong Eun; Leach, Stephen; Daley, Denise; Spinelli, John J.; Brooks-Wilson, Angela; Graham, Jinko
2011-01-01
We describe a statistical approach to predict gender-labeling errors in candidate-gene association studies, when Y-chromosome markers have not been included in the genotyping set. The approach adds value to methods that consider only the heterozygosity of X-chromosome SNPs, by incorporating available information about the intensity of X-chromosome SNPs in candidate genes relative to autosomal SNPs from the same individual. To our knowledge, no published methods formalize a framework in which heterozygosity and relative intensity are simultaneously taken into account. Our method offers the advantage that, in the genotyping set, no additional space is required beyond that already assigned to X-chromosome SNPs in the candidate genes. We also show how the predictions can be used in a two-phase sampling design to estimate the gender-labeling error rates for an entire study, at a fraction of the cost of a conventional design. PMID:22303327
Measuring and monitoring biological diversity: Standard methods for mammals
Wilson, Don E.; Cole, F. Russell; Nichols, James D.; Rudran, Rasanayagam; Foster, Mercedes S.
1996-01-01
Measuring and Monitoring Biological Diversity: Standard Methods for Mammals provides a comprehensive manual for designing and implementing inventories of mammalian biodiversity anywhere in the world and for any group, from rodents to open-country grazers. The book emphasizes formal estimation approaches, which supply data that can be compared across habitats and over time. Beginning with brief natural histories of the twenty-six orders of living mammals, the book details the field techniques—observation, capture, and sign interpretation—appropriate to different species. The contributors provide guidelines for study design, discuss survey planning, describe statistical techniques, and outline methods of translating field data into electronic formats. Extensive appendixes address such issues as the ethical treatment of animals in research, human health concerns, preserving voucher specimens, and assessing age, sex, and reproductive condition in mammals.Useful in both developed and developing countries, this volume and the Biological Diversity Handbook Series as a whole establish essential standards for a key aspect of conservation biology and resource management.
Model selection and assessment for multi-species occupancy models
Broms, Kristin M.; Hooten, Mevin B.; Fitzpatrick, Ryan M.
2016-01-01
While multi-species occupancy models (MSOMs) are emerging as a popular method for analyzing biodiversity data, formal checking and validation approaches for this class of models have lagged behind. Concurrent with the rise in application of MSOMs among ecologists, a quiet regime shift is occurring in Bayesian statistics where predictive model comparison approaches are experiencing a resurgence. Unlike single-species occupancy models that use integrated likelihoods, MSOMs are usually couched in a Bayesian framework and contain multiple levels. Standard model checking and selection methods are often unreliable in this setting and there is only limited guidance in the ecological literature for this class of models. We examined several different contemporary Bayesian hierarchical approaches for checking and validating MSOMs and applied these methods to a freshwater aquatic study system in Colorado, USA, to better understand the diversity and distributions of plains fishes. Our findings indicated distinct differences among model selection approaches, with cross-validation techniques performing the best in terms of prediction.
Soyyılmaz, Demet; Griffin, Laura M.; Martín, Miguel H.; Kucharský, Šimon; Peycheva, Ekaterina D.; Vaupotič, Nina; Edelsbrunner, Peter A.
2017-01-01
Scientific thinking is a predicate for scientific inquiry, and thus important to develop early in psychology students as potential future researchers. The present research is aimed at fathoming the contributions of formal and informal learning experiences to psychology students’ development of scientific thinking during their 1st-year of study. We hypothesize that informal experiences are relevant beyond formal experiences. First-year psychology student cohorts from various European countries will be assessed at the beginning and again at the end of the second semester. Assessments of scientific thinking will include scientific reasoning skills, the understanding of basic statistics concepts, and epistemic cognition. Formal learning experiences will include engagement in academic activities which are guided by university authorities. Informal learning experiences will include non-compulsory, self-guided learning experiences. Formal and informal experiences will be assessed with a newly developed survey. As dispositional predictors, students’ need for cognition and self-efficacy in psychological science will be assessed. In a structural equation model, students’ learning experiences and personal dispositions will be examined as predictors of their development of scientific thinking. Commonalities and differences in predictive weights across universities will be tested. The project is aimed at contributing information for designing university environments to optimize the development of students’ scientific thinking. PMID:28239363
Structure of multiphoton quantum optics. I. Canonical formalism and homodyne squeezed states
NASA Astrophysics Data System (ADS)
dell'Anno, Fabio; de Siena, Silvio; Illuminati, Fabrizio
2004-03-01
We introduce a formalism of nonlinear canonical transformations for general systems of multiphoton quantum optics. For single-mode systems the transformations depend on a tunable free parameter, the homodyne local-oscillator angle; for n -mode systems they depend on n heterodyne mixing angles. The canonical formalism realizes nontrivial mixing of pairs of conjugate quadratures of the electromagnetic field in terms of homodyne variables for single-mode systems, and in terms of heterodyne variables for multimode systems. In the first instance the transformations yield nonquadratic model Hamiltonians of degenerate multiphoton processes and define a class of non-Gaussian, nonclassical multiphoton states that exhibit properties of coherence and squeezing. We show that such homodyne multiphoton squeezed states are generated by unitary operators with a nonlinear time evolution that realizes the homodyne mixing of a pair of conjugate quadratures. Tuning of the local-oscillator angle allows us to vary at will the statistical properties of such states. We discuss the relevance of the formalism for the study of degenerate (up-)down-conversion processes. In a companion paper [
Structure of multiphoton quantum optics. I. Canonical formalism and homodyne squeezed states
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dell'Anno, Fabio; De Siena, Silvio; Illuminati, Fabrizio
2004-03-01
We introduce a formalism of nonlinear canonical transformations for general systems of multiphoton quantum optics. For single-mode systems the transformations depend on a tunable free parameter, the homodyne local-oscillator angle; for n-mode systems they depend on n heterodyne mixing angles. The canonical formalism realizes nontrivial mixing of pairs of conjugate quadratures of the electromagnetic field in terms of homodyne variables for single-mode systems, and in terms of heterodyne variables for multimode systems. In the first instance the transformations yield nonquadratic model Hamiltonians of degenerate multiphoton processes and define a class of non-Gaussian, nonclassical multiphoton states that exhibit properties of coherencemore » and squeezing. We show that such homodyne multiphoton squeezed states are generated by unitary operators with a nonlinear time evolution that realizes the homodyne mixing of a pair of conjugate quadratures. Tuning of the local-oscillator angle allows us to vary at will the statistical properties of such states. We discuss the relevance of the formalism for the study of degenerate (up-)down-conversion processes. In a companion paper [F. Dell'Anno, S. De Siena, and F. Illuminati, 69, 033813 (2004)], we provide the extension of the nonlinear canonical formalism to multimode systems, we introduce the associated heterodyne multiphoton squeezed states, and we discuss their possible experimental realization.« less
Soyyılmaz, Demet; Griffin, Laura M; Martín, Miguel H; Kucharský, Šimon; Peycheva, Ekaterina D; Vaupotič, Nina; Edelsbrunner, Peter A
2017-01-01
Scientific thinking is a predicate for scientific inquiry, and thus important to develop early in psychology students as potential future researchers. The present research is aimed at fathoming the contributions of formal and informal learning experiences to psychology students' development of scientific thinking during their 1st-year of study. We hypothesize that informal experiences are relevant beyond formal experiences. First-year psychology student cohorts from various European countries will be assessed at the beginning and again at the end of the second semester. Assessments of scientific thinking will include scientific reasoning skills, the understanding of basic statistics concepts, and epistemic cognition. Formal learning experiences will include engagement in academic activities which are guided by university authorities. Informal learning experiences will include non-compulsory, self-guided learning experiences. Formal and informal experiences will be assessed with a newly developed survey. As dispositional predictors, students' need for cognition and self-efficacy in psychological science will be assessed. In a structural equation model, students' learning experiences and personal dispositions will be examined as predictors of their development of scientific thinking. Commonalities and differences in predictive weights across universities will be tested. The project is aimed at contributing information for designing university environments to optimize the development of students' scientific thinking.
A spatial exploration of informal trail networks within Great Falls Park, VA
Wimpey, Jeremy; Marion, Jeffrey L.
2011-01-01
Informal (visitor-created) trails represent a threat to the natural resources of protected natural areas around the globe. These trails can remove vegetation, displace wildlife, alter hydrology, alter habitat, spread invasive species, and fragment landscapes. This study examines informal and formal trails within Great Falls Park, VA, a sub-unit of the George Washington Memorial Parkway, managed by the U.S. National Park Service. This study sought to answer three specific questions: 1) Are the physical characteristics and topographic alignments of informal trails significantly different from formal trails, 2) Can landscape fragmentation metrics be used to summarize the relative impacts of formal and informal trail networks on a protected natural area? and 3) What can we learn from examining the spatial distribution of the informal trails within protected natural areas? Statistical comparisons between formal and informal trails in this park indicate that informal trails have less sustainable topographic alignments than their formal counterparts. Spatial summaries of the lineal and areal extent and fragmentation associated with the trail networks by park management zones compare park management goals to the assessed attributes. Hot spot analyses highlight areas of high trail density within the park and findings provide insights regarding potential causes for development of dense informal trail networks.
Formal Requirements-Based Programming for Complex Systems
NASA Technical Reports Server (NTRS)
Rash, James L.; Hinchey, Michael G.; Rouff, Christopher A.; Gracanin, Denis
2005-01-01
Computer science as a field has not yet produced a general method to mechanically transform complex computer system requirements into a provably equivalent implementation. Such a method would be one major step towards dealing with complexity in computing, yet it remains the elusive holy grail of system development. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that such tools and methods leave unfilled is that the formal models cannot be proven to be equivalent to the system requirements as originated by the customer For the classes of complex systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations. While other techniques are available, this method is unique in offering full mathematical tractability while using notations and techniques that are well known and well trusted. We illustrate the application of the method to an example procedure from the Hubble Robotic Servicing Mission currently under study and preliminary formulation at NASA Goddard Space Flight Center.
What can formal methods offer to digital flight control systems design
NASA Technical Reports Server (NTRS)
Good, Donald I.
1990-01-01
Formal methods research begins to produce methods which will enable mathematic modeling of the physical behavior of digital hardware and software systems. The development of these methods directly supports the NASA mission of increasing the scope and effectiveness of flight system modeling capabilities. The conventional, continuous mathematics that is used extensively in modeling flight systems is not adequate for accurate modeling of digital systems. Therefore, the current practice of digital flight control system design has not had the benefits of extensive mathematical modeling which are common in other parts of flight system engineering. Formal methods research shows that by using discrete mathematics, very accurate modeling of digital systems is possible. These discrete modeling methods will bring the traditional benefits of modeling to digital hardware and hardware design. Sound reasoning about accurate mathematical models of flight control systems can be an important part of reducing risk of unsafe flight control.
Galaxy Redshifts from Discrete Optimization of Correlation Functions
NASA Astrophysics Data System (ADS)
Lee, Benjamin C. G.; Budavári, Tamás; Basu, Amitabh; Rahman, Mubdi
2016-12-01
We propose a new method of constraining the redshifts of individual extragalactic sources based on celestial coordinates and their ensemble statistics. Techniques from integer linear programming (ILP) are utilized to optimize simultaneously for the angular two-point cross- and autocorrelation functions. Our novel formalism introduced here not only transforms the otherwise hopelessly expensive, brute-force combinatorial search into a linear system with integer constraints but also is readily implementable in off-the-shelf solvers. We adopt Gurobi, a commercial optimization solver, and use Python to build the cost function dynamically. The preliminary results on simulated data show potential for future applications to sky surveys by complementing and enhancing photometric redshift estimators. Our approach is the first application of ILP to astronomical analysis.
Energy Exchange in Driven Open Quantum Systems at Strong Coupling
NASA Astrophysics Data System (ADS)
Carrega, Matteo; Solinas, Paolo; Sassetti, Maura; Weiss, Ulrich
2016-06-01
The time-dependent energy transfer in a driven quantum system strongly coupled to a heat bath is studied within an influence functional approach. Exact formal expressions for the statistics of energy dissipation into the different channels are derived. The general method is applied to the driven dissipative two-state system. It is shown that the energy flows obey a balance relation, and that, for strong coupling, the interaction may constitute the major dissipative channel. Results in analytic form are presented for the particular value K =1/2 of strong Ohmic dissipation. The energy flows show interesting behaviors including driving-induced coherences and quantum stochastic resonances. It is found that the general characteristics persists for K near 1/2 .
NASA Astrophysics Data System (ADS)
Donnay, Karsten
2015-03-01
The past several years have seen a rapidly growing interest in the use of advanced quantitative methodologies and formalisms adapted from the natural sciences to study a broad range of social phenomena. The research field of computational social science [1,2], for example, uses digital artifacts of human online activity to cast a new light on social dynamics. Similarly, the studies reviewed by D'Orsogna and Perc showcase a diverse set of advanced quantitative techniques to study the dynamics of crime. Methods used range from partial differential equations and self-exciting point processes to agent-based models, evolutionary game theory and network science [3].
Who Cares? Infant Educators' Responses to Professional Discourses of Care
ERIC Educational Resources Information Center
Davis, Belinda; Degotardi, Sheila
2015-01-01
This paper explores the construction of "care" in early childhood curriculum and practice. An increasing number of infants are attending formal early childhood settings in Australia (Australian Bureau of Statistics, 2011. "Childhood education and care, Australia, June 2011." (4402.0). Retrieved from…
Tertiary Education and Training in Australia, 2010
ERIC Educational Resources Information Center
National Centre for Vocational Education Research (NCVER), 2012
2012-01-01
This publication presents information on tertiary education and training during 2010, including statistics on participation and outcomes. The definition of tertiary education and training adopted for this publication is formal study in vocational education and training (VET) and higher education, including enrolments in Australian Qualifications…
NASA Astrophysics Data System (ADS)
Reimberg, Paulo; Bernardeau, Francis
2018-01-01
We present a formalism based on the large deviation principle (LDP) applied to cosmological density fields, and more specifically to the arbitrary functional of density profiles, and we apply it to the derivation of the cumulant generating function and one-point probability distribution function (PDF) of the aperture mass (Map ), a common observable for cosmic shear observations. We show that the LDP can indeed be used in practice for a much larger family of observables than previously envisioned, such as those built from continuous and nonlinear functionals of density profiles. Taking advantage of this formalism, we can extend previous results, which were based on crude definitions of the aperture mass, with top-hat windows and the use of the reduced shear approximation (replacing the reduced shear with the shear itself). We were precisely able to quantify how this latter approximation affects the Map statistical properties. In particular, we derive the corrective term for the skewness of the Map and reconstruct its one-point PDF.
Systematic Model-in-the-Loop Test of Embedded Control Systems
NASA Astrophysics Data System (ADS)
Krupp, Alexander; Müller, Wolfgang
Current model-based development processes offer new opportunities for verification automation, e.g., in automotive development. The duty of functional verification is the detection of design flaws. Current functional verification approaches exhibit a major gap between requirement definition and formal property definition, especially when analog signals are involved. Besides lack of methodical support for natural language formalization, there does not exist a standardized and accepted means for formal property definition as a target for verification planning. This article addresses several shortcomings of embedded system verification. An Enhanced Classification Tree Method is developed based on the established Classification Tree Method for Embeded Systems CTM/ES which applies a hardware verification language to define a verification environment.
Apes are intuitive statisticians.
Rakoczy, Hannes; Clüver, Annette; Saucke, Liane; Stoffregen, Nicole; Gräbener, Alice; Migura, Judith; Call, Josep
2014-04-01
Inductive learning and reasoning, as we use it both in everyday life and in science, is characterized by flexible inferences based on statistical information: inferences from populations to samples and vice versa. Many forms of such statistical reasoning have been found to develop late in human ontogeny, depending on formal education and language, and to be fragile even in adults. New revolutionary research, however, suggests that even preverbal human infants make use of intuitive statistics. Here, we conducted the first investigation of such intuitive statistical reasoning with non-human primates. In a series of 7 experiments, Bonobos, Chimpanzees, Gorillas and Orangutans drew flexible statistical inferences from populations to samples. These inferences, furthermore, were truly based on statistical information regarding the relative frequency distributions in a population, and not on absolute frequencies. Intuitive statistics in its most basic form is thus an evolutionarily more ancient rather than a uniquely human capacity. Copyright © 2014 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Ayu Nurul Handayani, Hemas; Waspada, Indra
2018-05-01
Non-formal Early Childhood Education (non-formal ECE) is an education that is held for children under 4 years old. The implementation in District of Banyumas, Non-formal ECE is monitored by The District Government of Banyumas and helped by Sanggar Kegiatan Belajar (SKB) Purwokerto as one of the organizer of Non-formal Education. The government itself has a program for distributing ECE to all villages in Indonesia. However, The location to construct the ECE school in several years ahead is not arranged yet. Therefore, for supporting that program, a decision support system is made to give some recommendation villages for constructing The ECE building. The data are projected based on Brown’s Double Exponential Smoothing Method and utilizing Preference Ranking Organization Method for Enrichment Evaluation (Promethee) to generate priority order. As the recommendations system, it generates map visualization which is colored according to the priority level of sub-district and village area. The system was tested with black box testing, Promethee testing, and usability testing. The results showed that the system functionality and Promethee algorithm were working properly, and the user was satisfied.
Generalizability of the Ordering among Five Formal Reasoning Tasks by an Ordering-Theoretic Method.
ERIC Educational Resources Information Center
Bart, William M.; And Others
1979-01-01
Five Inhelder-Piaget formal operations tasks were analyzed to determine the extent that the formal operational skills they assess were ordered into a stable hierarchy generalizable across samples of subjects. Subjects were 34 collegiate gymnasts (19 males, 15 females), and 22 students (1 male, 21 females) from a university nursing program.…
ERIC Educational Resources Information Center
Goldratt, Miri; Cohen, Eric H.
2016-01-01
This article explores encounters between formal, informal, and non-formal education and the role of mentor-educators in creating values education in which such encounters take place. Mixed-methods research was conducted in Israeli public schools participating in the Personal Education Model, which combines educational modes. Ethnographic and…
NASA Technical Reports Server (NTRS)
Sundqvist, Jon O.; Owocki, Stanley P.; Cohen, David H.; Leutenegger, Maurice A.; Townsend, Richard H. D.
2002-01-01
We present a generalised formalism for treating the porosity-associated reduction in continuum opacity that occurs when individual clumps in a stochastic medium become optically thick. As in previous work, we concentrate on developing bridging laws between the limits of optically thin and thick clumps. We consider geometries resulting in either isotropic or anisotropic effective opacity, and, in addition to an idealised model in which all clumps have the same local overdensity and scale, we also treat an ensemble of clumps with optical depths set by Markovian statistics. This formalism is then applied to the specific case of bound-free absorption of X- rays in hot star winds, a process not directly affected by clumping in the optically thin limit. We find that the Markov model gives surprisingly similar results to those found previously for the single clump model, suggesting that porous opacity is not very sensitive to details of the assumed clump distribution function. Further, an anisotropic effective opacity favours escape of X-rays emitted in the tangential direction (the venetian blind effect), resulting in a bump of higher flux close to line centre as compared to profiles computed from isotropic porosity models. We demonstrate how this characteristic line shape may be used to diagnose the clump geometry, and we confirm previous results that for optically thick clumping to significantly influence X-ray line profiles, very large porosity lengths, defined as the mean free path between clumps, are required. Moreover, we present the first X-ray line profiles computed directly from line-driven instability simulations using a 3-D patch method, and find that porosity effects from such models also are very small. This further supports the view that porosity has, at most, a marginal effect on X-ray line diagnostics in O stars, and therefore that these diagnostics do indeed provide a good clumping insensitive method for deriving O star mass-loss rates.
NASA Technical Reports Server (NTRS)
Sundqvist, Jon O.; Owocki, Stanley P.; Cohen, David H.; Leutenegger, Maurice A.
2011-01-01
We present a generalised formalism for treating the porosity-associated reduction in continuum opacity that occurs when individual clumps in a stochastic medium become optically thick. As in previous work, we concentrate on developing bridging laws between the limits of optically thin and thick clumps. We consider geometries resulting in either isotropic or anisotropic effective opacity, and, in addition to an idealised model in which all clumps have the same local overdensity and scale, we also treat an ensemble of clumps with optical depths set by Markovian statistics. This formalism is then applied to the specific case of bound-free absorption of X- rays in hot star winds, a process not directly affected by clumping in the optically thin limit. We find that the Markov model gives surprisingly similar results to those found previously for the single clump model, suggesting that porous opacity is not very sensitive to details of the assumed clump distribution function. Further, an anisotropic effective opacity favours escape of X-rays emitted in the tangential direction (the venetian blind effect), resulting in a bump of higher flux close to line centre as compared to profiles computed from isotropic porosity models. We demonstrate how this characteristic line shape may be used to diagnose the clump geometry, and we confirm previous results that for optically thick clumping to significantly influence X-ray line profiles, very large porosity lengths, defined as the mean free path between clumps, are required. Moreover, we present the first X-ray line profiles computed directly from line-driven instability simulations using a 3-D patch method, and find that porosity effects from such models also are very small. This further supports the view that porosity has, at most, a marginal effect on X-ray line diagnostics in O stars, and therefore that these diagnostics do indeed provide a good clumping insensitive method for deriving O star mass-loss rates.
Alexandraki, Irene; Hernandez, Caridad A; Torre, Dario M; Chretien, Katherine C
2017-08-01
Several decades of work have detailed the value and goals of interprofessional education (IPE) within the health professions, defining IPE competencies and best practices. In 2013, the Liaison Committee for Medical Education (LCME) elevated IPE to a U.S. medical school accreditation standard. To examine the status of IPE within internal medicine (IM) clerkships including perspectives, curricular content, barriers, and assessment a year after the LCME standard issuance. Anonymous online survey. IM clerkship directors from each of the Clerkship Directors in Internal Medicine's 121 U.S. and Canadian member medical schools in 2014. In 2014, a section on IPE (18 items) was included in the Clerkship Directors in Internal Medicine annual survey of its 121 U.S. and Canadian member medical schools. Items (18) assessed clerkship director (CD) perspectives, status of IPE curricula in IM clerkships, and barriers to IPE implementation. Data were analyzed using descriptive statistics and qualitative analysis of free-text responses to one of the survey questions. The overall survey response rate was 78% (94/121). The majority (88%) agreed that IPE is important to the practice of IM, and 71% believed IPE should be part of the IM clerkship. Most (76%) CDs agreed there is need for faculty development programs in IPE; 27% had such a program at their institution. Lack of curricular time, scheduling conflicts, and lack of faculty trained in IPE were the most frequently cited barriers. Twenty-nine percent had formal IPE activities within their IM clerkships, and 38% were planning to make changes. Of those with formal IPE activities, over a third (37%) did not involve student assessment. Since LCME standard issuance, only a minority of IM clerkships have included formal IPE activities, with lectures as the predominant method. Opportunities exist for enhancing educational methods as well as IPE faculty development.
Formal Method of Description Supporting Portfolio Assessment
ERIC Educational Resources Information Center
Morimoto, Yasuhiko; Ueno, Maomi; Kikukawa, Isao; Yokoyama, Setsuo; Miyadera, Youzou
2006-01-01
Teachers need to assess learner portfolios in the field of education. However, they need support in the process of designing and practicing what kind of portfolios are to be assessed. To solve the problem, a formal method of describing the relations between the lesson forms and portfolios that need to be collected and the relations between…
Modulatory Effect of Inflammation on Blood Pressure Reduction via Therapeutic Lifestyle Change
Milani, Richard V.; Lavie, Carl J.
2009-01-01
Purpose: Since inflammatory status, as determined by C-reactive protein (CRP) levels, is correlated with many cardiovascular (CV) disease risk factors and major CV events, we sought to determine if median levels of CRP can modulate blood pressure changes as well as other CV risk factors that are typically improved by therapeutic lifestyle changes with formal cardiac rehabilitation and exercise training (CRET) programs. Methods: We retrospectively evaluated CRP status and standard CV risk factors both before and after formal, phase II CRET programs (12 weeks; 36 educational and exercise sessions) in 635 consecutive patients with coronary artery disease after major CV events. Results: The median CRP level at baseline was 3.2 mg/L (range, 0.2–80.1 mg/L; mean, 5.8±8.4 mg/L). After CRET, both the patients with high and those with low CRP concentrations exhibited statistically significant improvements in most CV risk factors when their CRP levels were divided by median levels. However, systolic, diastolic, and mean arterial blood pressure improved in patients with low CRP levels (each by −4%) but did not change significantly in patients with high CRP levels. In multiple regression models, only young age, low CRP levels, and low body mass index were significant independent predictors of improved mean arterial blood pressure after CRET. Conclusions: In contrast to patients with coronary artery disease and low levels of CRP, patients with high baseline CRP levels did not demonstrate significant reductions in blood pressure after therapeutic lifestyle changes via formal CRET programs. PMID:21603441
Jordan, Jaime; Coates, Wendy C; Clarke, Samuel; Runde, Daniel; Fowlkes, Emilie; Kurth, Jaqueline; Yarris, Lalena
2018-05-01
Educators and education researchers report that their scholarship is limited by lack of time, funding, mentorship, expertise, and reward. This study aims to evaluate these groups' perceptions regarding barriers to scholarship and potential strategies for success. Core emergency medicine (EM) educators and education researchers completed an online survey consisting of multiple-choice, 10-point Likert scale, and free-response items in 2015. Descriptive statistics were reported. We used qualitative analysis applying a thematic approach to free-response items. A total of 204 educators and 42 education researchers participated. Education researchers were highly productive: 19/42 reported more than 20 peer-reviewed education scholarship publications on their curricula vitae. In contrast, 68/197 educators reported no education publications within five years. Only a minority, 61/197 had formal research training compared to 25/42 education researchers. Barriers to performing research for both groups were lack of time, competing demands, lack of support, lack of funding, and challenges achieving scientifically rigorous methods and publication. The most common motivators identified were dissemination of knowledge, support of evidence-based practices, and promotion. Respondents advised those who seek greater education research involvement to pursue mentorship, formal research training, collaboration, and rigorous methodological standards. The most commonly cited barriers were lack of time and competing demands. Stakeholders were motivated by the desire to disseminate knowledge, support evidence-based practices, and achieve promotion. Suggested strategies for success included formal training, mentorship, and collaboration. This information may inform interventions to support educators in their scholarly pursuits and improve the overall quality of education research in EM.
Noe, R; Rocha, J; Clavel-Arcas, C; Aleman, C; Gonzales, M; Mock, C
2004-01-01
Objectives: To identify and describe the work related injuries in both the formal and informal work sectors captured in an emergency department based injury surveillance system in Managua, Nicaragua. Setting: Urban emergency department in Managua, Nicaragua serving 200–300 patients per day. Methods: Secondary analysis from the surveillance system data. All cases indicating an injury while working and seen for treatment at the emergency department between 1 August 2001 and 31 July 2002 were included. There was no exclusion based on place of occurrence (home, work, school), age, or gender. Results: There were 3801 work related injuries identified which accounted for 18.6% of the total 20 425 injures captured by the surveillance system. Twenty seven work related fatalities were recorded, compared with the 1998 International Labor Organization statistic of 25 occupational fatalities for all of Nicaragua. Injuries occurring outside of a formal work location accounted for more than 60% of the work related injuries. Almost half of these occurred at home, while 19% occurred on the street. The leading mechanisms for work related injuries were falls (30%), blunt objects (28%), and stabs/cuts (23%). Falls were by far the most severe mechanism in the study, causing 37% of the work related deaths and more than half of the fractures. Conclusions: Occupational injuries are grossly underreported in Nicaragua. This study demonstrated that an emergency department can be a data source for work related injuries in developing countries because it captures both the formal and informal workforce injuries. Fall prevention initiatives could significantly reduce the magnitude and severity of occupational injuries in Managua, Nicaragua. PMID:15314050
Testing for voter rigging in small polling stations
Jimenez, Raúl; Hidalgo, Manuel; Klimek, Peter
2017-01-01
Nowadays, a large number of countries combine formal democratic institutions with authoritarian practices. Although in these countries the ruling elites may receive considerable voter support, they often use several manipulation tools to control election outcomes. A common practice of these regimes is the coercion and mobilization of large numbers of voters. This electoral irregularity is known as voter rigging, distinguishing it from vote rigging, which involves ballot stuffing or stealing. We develop a statistical test to quantify the extent to which the results of a particular election display traces of voter rigging. Our key hypothesis is that small polling stations are more susceptible to voter rigging because it is easier to identify opposing individuals, there are fewer eyewitnesses, and interested parties might reasonably expect fewer visits from election observers. We devise a general statistical method for testing whether voting behavior in small polling stations is significantly different from the behavior in their neighbor stations in a way that is consistent with the widespread occurrence of voter rigging. On the basis of a comparative analysis, the method enables third parties to conclude that an explanation other than simple variability is needed to explain geographic heterogeneities in vote preferences. We analyze 21 elections in 10 countries and find significant statistical anomalies compatible with voter rigging in Russia from 2007 to 2011, in Venezuela from 2006 to 2013, and in Uganda in 2011. Particularly disturbing is the case of Venezuela, where the smallest polling stations were decisive to the outcome of the 2013 presidential elections. PMID:28695193
Testing for voter rigging in small polling stations.
Jimenez, Raúl; Hidalgo, Manuel; Klimek, Peter
2017-06-01
Nowadays, a large number of countries combine formal democratic institutions with authoritarian practices. Although in these countries the ruling elites may receive considerable voter support, they often use several manipulation tools to control election outcomes. A common practice of these regimes is the coercion and mobilization of large numbers of voters. This electoral irregularity is known as voter rigging, distinguishing it from vote rigging, which involves ballot stuffing or stealing. We develop a statistical test to quantify the extent to which the results of a particular election display traces of voter rigging. Our key hypothesis is that small polling stations are more susceptible to voter rigging because it is easier to identify opposing individuals, there are fewer eyewitnesses, and interested parties might reasonably expect fewer visits from election observers. We devise a general statistical method for testing whether voting behavior in small polling stations is significantly different from the behavior in their neighbor stations in a way that is consistent with the widespread occurrence of voter rigging. On the basis of a comparative analysis, the method enables third parties to conclude that an explanation other than simple variability is needed to explain geographic heterogeneities in vote preferences. We analyze 21 elections in 10 countries and find significant statistical anomalies compatible with voter rigging in Russia from 2007 to 2011, in Venezuela from 2006 to 2013, and in Uganda in 2011. Particularly disturbing is the case of Venezuela, where the smallest polling stations were decisive to the outcome of the 2013 presidential elections.
Enrichment analysis in high-throughput genomics - accounting for dependency in the NULL.
Gold, David L; Coombes, Kevin R; Wang, Jing; Mallick, Bani
2007-03-01
Translating the overwhelming amount of data generated in high-throughput genomics experiments into biologically meaningful evidence, which may for example point to a series of biomarkers or hint at a relevant pathway, is a matter of great interest in bioinformatics these days. Genes showing similar experimental profiles, it is hypothesized, share biological mechanisms that if understood could provide clues to the molecular processes leading to pathological events. It is the topic of further study to learn if or how a priori information about the known genes may serve to explain coexpression. One popular method of knowledge discovery in high-throughput genomics experiments, enrichment analysis (EA), seeks to infer if an interesting collection of genes is 'enriched' for a Consortium particular set of a priori Gene Ontology Consortium (GO) classes. For the purposes of statistical testing, the conventional methods offered in EA software implicitly assume independence between the GO classes. Genes may be annotated for more than one biological classification, and therefore the resulting test statistics of enrichment between GO classes can be highly dependent if the overlapping gene sets are relatively large. There is a need to formally determine if conventional EA results are robust to the independence assumption. We derive the exact null distribution for testing enrichment of GO classes by relaxing the independence assumption using well-known statistical theory. In applications with publicly available data sets, our test results are similar to the conventional approach which assumes independence. We argue that the independence assumption is not detrimental.
General practitioners’ views of clinically led commissioning: cross-sectional survey in England
Moran, Valerie; Checkland, Kath; Coleman, Anna; Spooner, Sharon; Gibson, Jonathan; Sutton, Matt
2017-01-01
Objectives Involving general practitioners (GPs) in the commissioning/purchasing of services has been an important element in English health policy for many years. The Health and Social Care Act 2012 handed responsibility for commissioning of the majority of care for local populations to GP-led Clinical Commissioning Groups (CCGs). In this paper, we explore GP attitudes to involvement in commissioning and future intentions for engagement. Design and setting Survey of a random sample of GPs across England in 2015. Method The Eighth National GP Worklife Survey was distributed to GPs in spring 2015. Responses were received from 2611 respondents (response rate = 46%). We compared responses across different GP characteristics and conducted two sample tests of proportions to identify statistically significant differences in responses across groups. We also used multivariate logistic regression to identify the characteristics associated with wanting a formal CCG role in the future. Results While GPs generally agree that they can add value to aspects of commissioning, only a minority feel that this is an important part of their role. Many current leaders intend to quit in the next 5 years, and there is limited appetite among those not currently in a formal role to take up such a role in the future. CCGs were set up as ‘membership organisations’ but only a minority of respondents reported feeling that they had ‘ownership’ of their local CCG and these were often GPs with formal CCG roles. However, respondents generally agree that the CCG has a legitimate role in influencing the work that they do. Conclusion CCGs need to engage in active succession planning to find the next generation of GP leaders. GPs believe that CCGs have a legitimate role in influencing their work, suggesting that there may be scope for CCGs to involve GPs more fully in roles short of formal leadership. PMID:28596217
Adams, Alayne M; Islam, Rubana; Ahmed, Tanvir
2015-01-01
In Bangladesh, the health risks of unplanned urbanization are disproportionately shouldered by the urban poor. At the same time, affordable formal primary care services are scarce, and what exists is almost exclusively provided by non-government organizations (NGOs) working on a project basis. So where do the poor go for health care? A health facility mapping of six urban slum settlements in Dhaka was undertaken to explore the configuration of healthcare services proximate to where the poor reside. Three methods were employed: (1) Social mapping and listing of all Health Service Delivery Points (HSDPs); (2) Creation of a geospatial map including Global Positioning System (GPS) co-ordinates of all HSPDs in the six study areas and (3) Implementation of a facility survey of all HSDPs within six study areas. Descriptive statistics are used to examine the number, type and concentration of service provider types, as well as indicators of their accessibility in terms of location and hours of service. A total of 1041 HSDPs were mapped, of which 80% are privately operated and the rest by NGOs and the public sector. Phamacies and non-formal or traditional doctors make up 75% of the private sector while consultation chambers account for 20%. Most NGO and Urban Primary Health Care Project (UPHCP) static clinics are open 5–6 days/week, but close by 4–5 pm in the afternoon. Evening services are almost exclusively offered by private HSDPs; however, only 37% of private sector health staff possess some kind of formal medical qualification. This spatial analysis of health service supply in poor urban settlements emphasizes the importance of taking the informal private sector into account in efforts to increase effective coverage of quality services. Features of informal private sector service provision that have facilitated market penetration may be relevant in designing formal services that better meet the needs of the urban poor. PMID:25759453
NASA Astrophysics Data System (ADS)
Fitzpatrick, Matthew R. C.; Kennett, Malcolm P.
2018-05-01
We develop a formalism that allows the study of correlations in space and time in both the superfluid and Mott insulating phases of the Bose-Hubbard Model. Specifically, we obtain a two particle irreducible effective action within the contour-time formalism that allows for both equilibrium and out of equilibrium phenomena. We derive equations of motion for both the superfluid order parameter and two-point correlation functions. To assess the accuracy of this formalism, we study the equilibrium solution of the equations of motion and compare our results to existing strong coupling methods as well as exact methods where possible. We discuss applications of this formalism to out of equilibrium situations.
On testing for spatial correspondence between maps of human brain structure and function.
Alexander-Bloch, Aaron F; Shou, Haochang; Liu, Siyuan; Satterthwaite, Theodore D; Glahn, David C; Shinohara, Russell T; Vandekar, Simon N; Raznahan, Armin
2018-06-01
A critical issue in many neuroimaging studies is the comparison between brain maps. Nonetheless, it remains unclear how one should test hypotheses focused on the overlap or spatial correspondence between two or more brain maps. This "correspondence problem" affects, for example, the interpretation of comparisons between task-based patterns of functional activation, resting-state networks or modules, and neuroanatomical landmarks. To date, this problem has been addressed with remarkable variability in terms of methodological approaches and statistical rigor. In this paper, we address the correspondence problem using a spatial permutation framework to generate null models of overlap by applying random rotations to spherical representations of the cortical surface, an approach for which we also provide a theoretical statistical foundation. We use this method to derive clusters of cognitive functions that are correlated in terms of their functional neuroatomical substrates. In addition, using publicly available data, we formally demonstrate the correspondence between maps of task-based functional activity, resting-state fMRI networks and gyral-based anatomical landmarks. We provide open-access code to implement the methods presented for two commonly-used tools for surface based cortical analysis (https://www.github.com/spin-test). This spatial permutation approach constitutes a useful advance over widely-used methods for the comparison of cortical maps, thereby opening new possibilities for the integration of diverse neuroimaging data. Copyright © 2018 Elsevier Inc. All rights reserved.
Medical statistics and hospital medicine: the case of the smallpox vaccination.
Rusnock, Andrea
2007-01-01
Between 1799 and 1806, trials of vaccination to determine its safety and efficacy were undertaken in hospitals in London, Paris, Vienna, and Boston. These trials were among the first instances of formal hospital evaluations of a medical procedure and signal a growing acceptance of a relatively new approach to medical practice. These early evaluations of smallpox vaccination also relied on descriptive and quantitative accounts, as well as probabilistic analyses, and thus occupy a significant, yet hitherto unexamined, place in the history of medical statistics.
Orchestrating high-throughput genomic analysis with Bioconductor
Huber, Wolfgang; Carey, Vincent J.; Gentleman, Robert; Anders, Simon; Carlson, Marc; Carvalho, Benilton S.; Bravo, Hector Corrada; Davis, Sean; Gatto, Laurent; Girke, Thomas; Gottardo, Raphael; Hahne, Florian; Hansen, Kasper D.; Irizarry, Rafael A.; Lawrence, Michael; Love, Michael I.; MacDonald, James; Obenchain, Valerie; Oleś, Andrzej K.; Pagès, Hervé; Reyes, Alejandro; Shannon, Paul; Smyth, Gordon K.; Tenenbaum, Dan; Waldron, Levi; Morgan, Martin
2015-01-01
Bioconductor is an open-source, open-development software project for the analysis and comprehension of high-throughput data in genomics and molecular biology. The project aims to enable interdisciplinary research, collaboration and rapid development of scientific software. Based on the statistical programming language R, Bioconductor comprises 934 interoperable packages contributed by a large, diverse community of scientists. Packages cover a range of bioinformatic and statistical applications. They undergo formal initial review and continuous automated testing. We present an overview for prospective users and contributors. PMID:25633503
Formal methods demonstration project for space applications
NASA Technical Reports Server (NTRS)
Divito, Ben L.
1995-01-01
The Space Shuttle program is cooperating in a pilot project to apply formal methods to live requirements analysis activities. As one of the larger ongoing shuttle Change Requests (CR's), the Global Positioning System (GPS) CR involves a significant upgrade to the Shuttle's navigation capability. Shuttles are to be outfitted with GPS receivers and the primary avionics software will be enhanced to accept GPS-provided positions and integrate them into navigation calculations. Prior to implementing the CR, requirements analysts at Loral Space Information Systems, the Shuttle software contractor, must scrutinize the CR to identify and resolve any requirements issues. We describe an ongoing task of the Formal Methods Demonstration Project for Space Applications whose goal is to find an effective way to use formal methods in the GPS CR requirements analysis phase. This phase is currently under way and a small team from NASA Langley, ViGYAN Inc. and Loral is now engaged in this task. Background on the GPS CR is provided and an overview of the hardware/software architecture is presented. We outline the approach being taken to formalize the requirements, only a subset of which is being attempted. The approach features the use of the PVS specification language to model 'principal functions', which are major units of Shuttle software. Conventional state machine techniques form the basis of our approach. Given this background, we present interim results based on a snapshot of work in progress. Samples of requirements specifications rendered in PVS are offered to illustration. We walk through a specification sketch for the principal function known as GPS Receiver State processing. Results to date are summarized and feedback from Loral requirements analysts is highlighted. Preliminary data is shown comparing issues detected by the formal methods team versus those detected using existing requirements analysis methods. We conclude by discussing our plan to complete the remaining activities of this task.
Mayer, B; Muche, R
2013-01-01
Animal studies are highly relevant for basic medical research, although their usage is discussed controversially in public. Thus, an optimal sample size for these projects should be aimed at from a biometrical point of view. Statistical sample size calculation is usually the appropriate methodology in planning medical research projects. However, required information is often not valid or only available during the course of an animal experiment. This article critically discusses the validity of formal sample size calculation for animal studies. Within the discussion, some requirements are formulated to fundamentally regulate the process of sample size determination for animal experiments.
Collective Bargaining: Its Impact on Educational Cost.
ERIC Educational Resources Information Center
Atherton, P. J.
Since the Ontario (Canada) legislation in 1975 that formalized collective bargaining for teachers, public concern has focused on collective bargaining as the possible cause of recent enrollment declines and increases in schooling costs. However, according to Ontario provincial statistics, enrollment in elementary schools had begun to decline…
Heuristic Elements of Plausible Reasoning.
ERIC Educational Resources Information Center
Dudczak, Craig A.
At least some of the reasoning processes involved in argumentation rely on inferences which do not fit within the traditional categories of inductive or deductive reasoning. The reasoning processes involved in plausibility judgments have neither the formal certainty of deduction nor the imputed statistical probability of induction. When utilizing…
Building Intuitions about Statistical Inference Based on Resampling
ERIC Educational Resources Information Center
Watson, Jane; Chance, Beth
2012-01-01
Formal inference, which makes theoretical assumptions about distributions and applies hypothesis testing procedures with null and alternative hypotheses, is notoriously difficult for tertiary students to master. The debate about whether this content should appear in Years 11 and 12 of the "Australian Curriculum: Mathematics" has gone on…
ERIC Educational Resources Information Center
Chunko, John A.
1973-01-01
LSD NOW is a nationwide, statistical survey and analysis of hallucinogenic drug use by individuals presently in formal educational surroundings. Analysis, concentrating on the extent and rationale related to the use of such drugs, now offers a deeper and more meaningful understanding of a particular facet of the drug culture. This understanding…
Love and Sex: Can We Talk About That in School?
ERIC Educational Resources Information Center
Vance, Paul C.
1985-01-01
Gives statistical information on the "national epidemic" of teenage sexual activity and pregnancy and its consequences. Discusses social causes of this problem. Proposes that schools can help solve the problem by providing a formal sex education curriculum for pupils in kindergarten through grade 12. (CB)
Approaching Bose-Einstein Condensation
ERIC Educational Resources Information Center
Ferrari, Loris
2011-01-01
Bose-Einstein condensation (BEC) is discussed at the level of an advanced course of statistical thermodynamics, clarifying some formal and physical aspects that are usually not covered by the standard pedagogical literature. The non-conventional approach adopted starts by showing that the continuum limit, in certain cases, cancels out the crucial…
Unmanned Aircraft Systems in the National Airspace System: A Formal Methods Perspective
NASA Technical Reports Server (NTRS)
Munoz, Cesar A.; Dutle, Aaron; Narkawicz, Anthony; Upchurch, Jason
2016-01-01
As the technological and operational capabilities of unmanned aircraft systems (UAS) have grown, so too have international efforts to integrate UAS into civil airspace. However, one of the major concerns that must be addressed in realizing this integration is that of safety. For example, UAS lack an on-board pilot to comply with the legal requirement that pilots see and avoid other aircraft. This requirement has motivated the development of a detect and avoid (DAA) capability for UAS that provides situational awareness and maneuver guidance to UAS operators to aid them in avoiding and remaining well clear of other aircraft in the airspace. The NASA Langley Research Center Formal Methods group has played a fundamental role in the development of this capability. This article gives a selected survey of the formal methods work conducted in support of the development of a DAA concept for UAS. This work includes specification of low-level and high-level functional requirements, formal verification of algorithms, and rigorous validation of software implementations.
Memory sparing, fast scattering formalism for rigorous diffraction modeling
NASA Astrophysics Data System (ADS)
Iff, W.; Kämpfe, T.; Jourlin, Y.; Tishchenko, A. V.
2017-07-01
The basics and algorithmic steps of a novel scattering formalism suited for memory sparing and fast electromagnetic calculations are presented. The formalism, called ‘S-vector algorithm’ (by analogy with the known scattering-matrix algorithm), allows the calculation of the collective scattering spectra of individual layered micro-structured scattering objects. A rigorous method of linear complexity is applied to model the scattering at individual layers; here the generalized source method (GSM) resorting to Fourier harmonics as basis functions is used as one possible method of linear complexity. The concatenation of the individual scattering events can be achieved sequentially or in parallel, both having pros and cons. The present development will largely concentrate on a consecutive approach based on the multiple reflection series. The latter will be reformulated into an implicit formalism which will be associated with an iterative solver, resulting in improved convergence. The examples will first refer to 1D grating diffraction for the sake of simplicity and intelligibility, with a final 2D application example.
Egidi, Franco; Sun, Shichao; Goings, Joshua J; Scalmani, Giovanni; Frisch, Michael J; Li, Xiaosong
2017-06-13
We present a linear response formalism for the description of the electronic excitations of a noncollinear reference defined via Kohn-Sham spin density functional methods. A set of auxiliary variables, defined using the density and noncollinear magnetization density vector, allows the generalization of spin density functional kernels commonly used in collinear DFT to noncollinear cases, including local density, GGA, meta-GGA and hybrid functionals. Working equations and derivations of functional second derivatives with respect to the noncollinear density, required in the linear response noncollinear TDDFT formalism, are presented in this work. This formalism takes all components of the spin magnetization into account independent of the type of reference state (open or closed shell). As a result, the method introduced here is able to afford a nonzero local xc torque on the spin magnetization while still satisfying the zero-torque theorem globally. The formalism is applied to a few test cases using the variational exact-two-component reference including spin-orbit coupling to illustrate the capabilities of the method.
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.
2005-01-01
A general-purpose method to mechanically transform system requirements into a probably equivalent model has yet to appeal: Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including sensor networks and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a probably equivalent implementation are valuable but not su8cient. The "gap" unfilled by such tools and methods is that their. formal models cannot be proven to be equivalent to the system requirements as originated by the customel: For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a probably equivalent formal model that can be used as the basis for code generation and other transformations.
ERIC Educational Resources Information Center
Ugwu, Chinwe U.
2015-01-01
The National Commission for Mass Literacy, Adult and Non-Formal Education (NMEC) is the Federal Statutory Agency set up to co-ordinate all aspects of Non-Formal Education in Nigeria whether offered by government agencies or non-governmental organisations. This study looked at the existing Capacity Building Programme, the delivery methods, impact…
ERIC Educational Resources Information Center
Penning, Margaret J.
2002-01-01
Purpose: In response to concerns among policymakers and others that increases in the availability of publicly funded formal services will lead to reductions in self- and informal care, this study examines the relationship between the extent of formal in-home care received and levels of self- and informal care. Design and Methods: Two-stage least…
The Influence of Rural Location on Utilization of Formal Home Care: The Role of Medicaid
ERIC Educational Resources Information Center
McAuley, William J.; Spector, William D.; Van Nostrand, Joan; Shaffer, Tom
2004-01-01
Purpose: This research examines the impact of rural-urban residence on formal home-care utilization among older people and determines whether and how Medicaid coverage influences the association between, rural-urban location and risk of formal home-care use. Design and Methods: We combined data from the 1998 consolidated file of the Medical…
Statistical inference of the generation probability of T-cell receptors from sequence repertoires.
Murugan, Anand; Mora, Thierry; Walczak, Aleksandra M; Callan, Curtis G
2012-10-02
Stochastic rearrangement of germline V-, D-, and J-genes to create variable coding sequence for certain cell surface receptors is at the origin of immune system diversity. This process, known as "VDJ recombination", is implemented via a series of stochastic molecular events involving gene choices and random nucleotide insertions between, and deletions from, genes. We use large sequence repertoires of the variable CDR3 region of human CD4+ T-cell receptor beta chains to infer the statistical properties of these basic biochemical events. Because any given CDR3 sequence can be produced in multiple ways, the probability distribution of hidden recombination events cannot be inferred directly from the observed sequences; we therefore develop a maximum likelihood inference method to achieve this end. To separate the properties of the molecular rearrangement mechanism from the effects of selection, we focus on nonproductive CDR3 sequences in T-cell DNA. We infer the joint distribution of the various generative events that occur when a new T-cell receptor gene is created. We find a rich picture of correlation (and absence thereof), providing insight into the molecular mechanisms involved. The generative event statistics are consistent between individuals, suggesting a universal biochemical process. Our probabilistic model predicts the generation probability of any specific CDR3 sequence by the primitive recombination process, allowing us to quantify the potential diversity of the T-cell repertoire and to understand why some sequences are shared between individuals. We argue that the use of formal statistical inference methods, of the kind presented in this paper, will be essential for quantitative understanding of the generation and evolution of diversity in the adaptive immune system.
Testing New Physics with the Cosmic Microwave Background
NASA Astrophysics Data System (ADS)
Gluscevic, Vera
2013-01-01
In my thesis work, I have developed and applied tests of new fundamental physics that utilize high-precision CMB polarization measurements. I especially focused on a wide class of dark energy models that propose existence of new scalar fields to explain accelerated expansion of the Universe. Such fields naturally exhibit a weak interaction with photons, giving rise to "cosmic birefringence"---a rotation of the polarization plane of light traveling cosmological distances, which alters the statistics of the CMB fluctuations in the sky by inducing a characteristic B-mode polarization. A birefringent rotation of the CMB would be smoking-gun evidence that dark energy is a dynamical component rather than a cosmological constant, while its absence gives clues about the allowed regions of the parameter space for new models. I developed a full-sky formalism to search for cosmic birefringence by cross-correlating CMB temperature and polarization maps, after allowing for the rotation angle to vary across the sky. With my collaborators, I also proposed a cross-correlation of the rotation-angle estimator with the CMB temperature as a novel statistical probe which can boost signal-to-noise in the case of marginal detection and help disentangle the underlying physical models. I then investigated the degeneracy between the rotation signal and the signals from other exotic scenarios that induce a similar B-mode polarization signature, such as chiral primordial gravitational waves, and demonstrated that these effects are completely separable. Finally, I applied this formalism to WMAP-7 data and derived the first CMB constraint on the power spectrum of the birefringent-rotation angle and presented forecasts for future experiments. To demonstrate the value of this analysis method beyond the search for direction-dependent cosmic birefringence, I have also used it to probe patchy screening from the epoch of cosmic reionization with WMAP-7 data.
Follow-up of cancer in primary care versus secondary care: systematic review
Lewis, Ruth A; Neal, Richard D; Williams, Nefyn H; France, Barbara; Hendry, Maggie; Russell, Daphne; Hughes, Dyfrig A; Russell, Ian; Stuart, Nicholas SA; Weller, David; Wilkinson, Clare
2009-01-01
Background Cancer follow-up has traditionally been undertaken in secondary care, but there are increasing calls to deliver it in primary care. Aim To compare the effectiveness and cost-effectiveness of primary versus secondary care follow-up of cancer patients, determine the effectiveness of the integration of primary care in routine hospital follow-up, and evaluate the impact of patient-initiated follow-up on primary care. Design of study Systematic review. Setting Primary and secondary care settings. Method A search was carried out of 19 electronic databases, online trial registries, conference proceedings, and bibliographies of included studies. The review included comparative studies or economic evaluations of primary versus secondary care follow-up, hospital follow-up with formal primary care involvement versus conventional hospital follow-up, and hospital follow-up versus patient-initiated or minimal follow-up if the study reported the impact on primary care. Results There was no statistically significant difference for patient wellbeing, recurrence rate, survival, recurrence-related serious clinical events, diagnostic delay, or patient satisfaction. GP-led breast cancer follow-up was cheaper than hospital follow-up. Intensified primary health care resulted in increased home-care nurse contact, and improved discharge summary led to increased GP contact. Evaluation of patient-initiated or minimal follow-up found no statistically significant impact on the number of GP consultations or cancer-related referrals. Conclusion Weak evidence suggests that breast cancer follow-up in primary care is effective. Interventions improving communication between primary and secondary care could lead to greater GP involvement. Discontinuation of formal follow-up may not increase GP workload. However, the quality of the data in general was poor, and no firm conclusions can be reached. PMID:19566990
Boundary based on exchange symmetry theory for multilevel simulations. I. Basic theory.
Shiga, Motoyuki; Masia, Marco
2013-07-28
In this paper, we lay the foundations for a new method that allows multilevel simulations of a diffusive system, i.e., a system where a flux of particles through the boundaries might disrupt the primary region. The method is based on the use of flexible restraints that maintain the separation between inner and outer particles. It is shown that, by introducing a bias potential that accounts for the exchange symmetry of the system, the correct statistical distribution is preserved. Using a toy model consisting of non-interacting particles in an asymmetric potential well, we prove that the method is formally exact, and that it could be simplified by considering only up to a couple of particle exchanges without a loss of accuracy. A real-world test is then made by considering a hybrid MM(∗)/MM calculation of cesium ion in water. In this case, the single exchange approximation is sound enough that the results superimpose to the exact solutions. Potential applications of this method to many different hybrid QM/MM systems are discussed, as well as its limitations and strengths in comparison to existing approaches.
Nonequilibrium Green's function method for quantum thermal transport
NASA Astrophysics Data System (ADS)
Wang, Jian-Sheng; Agarwalla, Bijay Kumar; Li, Huanan; Thingna, Juzar
2014-12-01
This review deals with the nonequilibrium Green's function (NEGF) method applied to the problems of energy transport due to atomic vibrations (phonons), primarily for small junction systems. We present a pedagogical introduction to the subject, deriving some of the well-known results such as the Laudauer-like formula for heat current in ballistic systems. The main aim of the review is to build the machinery of the method so that it can be applied to other situations, which are not directly treated here. In addition to the above, we consider a number of applications of NEGF, not in routine model system calculations, but in a few new aspects showing the power and usefulness of the formalism. In particular, we discuss the problems of multiple leads, coupled left-right-lead system, and system without a center. We also apply the method to the problem of full counting statistics. In the case of nonlinear systems, we make general comments on the thermal expansion effect, phonon relaxation time, and a certain class of mean-field approximations. Lastly, we examine the relationship between NEGF, reduced density matrix, and master equation approaches to thermal transport.
Fienup, Daniel M; Critchfield, Thomas S
2010-01-01
Computerized lessons that reflect stimulus equivalence principles were used to teach college students concepts related to inferential statistics and hypothesis decision making. Lesson 1 taught participants concepts related to inferential statistics, and Lesson 2 taught them to base hypothesis decisions on a scientific hypothesis and the direction of an effect. Lesson 3 taught the conditional influence of inferential statistics over decisions regarding the scientific and null hypotheses. Participants entered the study with low scores on the targeted skills and left the study demonstrating a high level of accuracy on these skills, which involved mastering more relations than were taught formally. This study illustrates the efficiency of equivalence-based instruction in establishing academic skills in sophisticated learners. PMID:21358904
Lopez, Joseph; Ameri, Afshin; Susarla, Srinivas M; Reddy, Sashank; Soni, Ashwin; Tong, J W; Amini, Neda; Ahmed, Rizwan; May, James W; Lee, W P Andrew; Dorafshar, Amir
2016-01-01
It is currently unknown whether formal research training has an influence on academic advancement in plastic surgery. The purpose of this study was to determine whether formal research training was associated with higher research productivity, academic rank, and procurement of extramural National Institutes of Health (NIH) funding in plastic surgery, comparing academic surgeons who completed said research training with those without. This was a cross-sectional study of full-time academic plastic surgeons in the United States. The main predictor variable was formal research training, defined as completion of a postdoctoral research fellowship or attainment of a Doctor of Philosophy (PhD). The primary outcome was scientific productivity measured by the Hirsh-index (h-index, the number of publications, h that have at least h citations each). The secondary outcomes were academic rank and NIH funding. Descriptive, bivariate, and multiple regression statistics were computed. A total of 607 academic surgeons were identified from 94 Accreditation Council for Graduate Medical Education-accredited plastic surgery training programs. In all, 179 (29.5%) surgeons completed formal research training. The mean h-index was 11.7 ± 9.9. And, 58 (9.6%) surgeons successfully procured NIH funding. The distribution of academic rank was the following: endowed professor (5.4%), professor (23.9%), associate professor (23.4%), assistant professor (46.0%), and instructor (1.3%). In a multiple regression analysis, completion of formal research training was significantly predictive of a higher h-index and successful procurement of NIH funding. Current evidence demonstrates that formal research training is associated with higher scientific productivity and increased likelihood of future NIH funding. Copyright © 2016 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Johnson, Christopher W.
1996-01-01
The development of safety-critical systems (aircraft cockpits and reactor control rooms) is qualitatively different from that of other interactive systems. These differences impose burdens on design teams that must ensure the development of human-machine interfaces. Analyzes strengths and weaknesses of formal methods for the design of user…
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.
2005-01-01
The manual application of formal methods in system specification has produced successes, but in the end, despite any claims and assertions by practitioners, there is no provable relationship between a manually derived system specification or formal model and the customer's original requirements. Complex parallel and distributed system present the worst case implications for today s dearth of viable approaches for achieving system dependability. No avenue other than formal methods constitutes a serious contender for resolving the problem, and so recognition of requirements-based programming has come at a critical juncture. We describe a new, NASA-developed automated requirement-based programming method that can be applied to certain classes of systems, including complex parallel and distributed systems, to achieve a high degree of dependability.
Does History Repeat Itself? Wavelets and the Phylodynamics of Influenza A
Tom, Jennifer A.; Sinsheimer, Janet S.; Suchard, Marc A.
2012-01-01
Unprecedented global surveillance of viruses will result in massive sequence data sets that require new statistical methods. These data sets press the limits of Bayesian phylogenetics as the high-dimensional parameters that comprise a phylogenetic tree increase the already sizable computational burden of these techniques. This burden often results in partitioning the data set, for example, by gene, and inferring the evolutionary dynamics of each partition independently, a compromise that results in stratified analyses that depend only on data within a given partition. However, parameter estimates inferred from these stratified models are likely strongly correlated, considering they rely on data from a single data set. To overcome this shortfall, we exploit the existing Monte Carlo realizations from stratified Bayesian analyses to efficiently estimate a nonparametric hierarchical wavelet-based model and learn about the time-varying parameters of effective population size that reflect levels of genetic diversity across all partitions simultaneously. Our methods are applied to complete genome influenza A sequences that span 13 years. We find that broad peaks and trends, as opposed to seasonal spikes, in the effective population size history distinguish individual segments from the complete genome. We also address hypotheses regarding intersegment dynamics within a formal statistical framework that accounts for correlation between segment-specific parameters. PMID:22160768
The Evolution of Organization Analysis in ASQ, 1959-1979.
ERIC Educational Resources Information Center
Daft, Richard L.
1980-01-01
During the period 1959-1979, a sharp trend toward low-variety statistical languages has taken place, which may represent an organizational mapping phase in which simple, quantifiable relationships have been formally defined and measured. A broader scope of research languages will be needed in the future. (Author/IRT)
Beginning Teacher Induction: A Report on Beginning Teacher Effectiveness and Retention.
ERIC Educational Resources Information Center
Serpell, Zewelanji; Bozeman, Leslie A.
National statistics show a rise in the number of beginning teachers undergoing formal induction in their first year of teaching. This report discusses the effectiveness of induction programs and resulting outcomes for beginning teacher retention, beginning teacher effectiveness, and mentor participation. The various components of induction…
Statistical Knowledge and Learning in Phonology
ERIC Educational Resources Information Center
Dunbar, Ewan Michael
2013-01-01
This dissertation deals with the theory of the phonetic component of grammar in a formal probabilistic inference framework: (1) it has been recognized since the beginning of generative phonology that some language-specific phonetic implementation is actually context-dependent, and thus it can be said that there are gradient "phonetic…
Mathematical Literacy--It's Become Fundamental
ERIC Educational Resources Information Center
McCrone, Sharon Soucy; Dossey, John A.
2007-01-01
The rising tide of numbers and statistics in daily life signals a need for a fundamental broadening of the concept of literacy: mathematical literacy assuming a coequal role in the curriculum alongside language-based literacy. Mathematical literacy is not about studying higher levels of formal mathematics, but about making math relevant and…
ERIC Educational Resources Information Center
Mano, Quintino R.
2016-01-01
Accumulating evidence suggests that literacy acquisition involves developing sensitivity to the statistical regularities of the textual environment. To organize accumulating evidence and help guide future inquiry, this article integrates data from disparate fields of study and formalizes a new two-process framework for developing sensitivity to…
Prison Clinicians' Perceptions of Antisocial Personality Disorder as a Formal Diagnosis.
ERIC Educational Resources Information Center
Stevens, Gail Flint
1994-01-01
Surveyed and interviewed 53 clinicians who work with prison inmates. Results indicated that clinicians used diagnosis of antisocial personality disorder liberally among inmates and felt majority of inmates could be so diagnosed. Large minority of clinicians went beyond Diagnostic and Statistical Manual of Mental Disorders criteria and reported…
Ethical Reasoning Instruction in Non-Ethics Business Courses: A Non-Intrusive Approach
ERIC Educational Resources Information Center
Wilhelm, William J.
2010-01-01
This article discusses four confirmatory studies designed to corroborate findings from prior developmental research which yielded statistically significant improvements in student moral reasoning when specific instructional strategies and content materials were utilized in non-ethics business courses by instructors not formally trained in business…
The Lay Concept of Childhood Mental Disorder
ERIC Educational Resources Information Center
Giummarra, Melita J.; Haslam, Nick
2005-01-01
The structure of lay people's concepts of childhood mental disorder was investigated in a questionnaire study and examined for convergence with the Diagnostic and Statistical Manual (DSM-IV). Eighty-four undergraduates who had no formal education in abnormal psychology rated 54 conditions--36 DSM-IV childhood disorders and 18 non-disorders--on…
From Mere Coincidences to Meaningful Discoveries
ERIC Educational Resources Information Center
Griffiths, Thomas L.; Tenenbaum, Joshua B.
2007-01-01
People's reactions to coincidences are often cited as an illustration of the irrationality of human reasoning about chance. We argue that coincidences may be better understood in terms of rational statistical inference, based on their functional role in processes of causal discovery and theory revision. We present a formal definition of…
Structured Statistical Models of Inductive Reasoning
ERIC Educational Resources Information Center
Kemp, Charles; Tenenbaum, Joshua B.
2009-01-01
Everyday inductive inferences are often guided by rich background knowledge. Formal models of induction should aim to incorporate this knowledge and should explain how different kinds of knowledge lead to the distinctive patterns of reasoning found in different inductive contexts. This article presents a Bayesian framework that attempts to meet…
Evaluating Teachers and Schools Using Student Growth Models
ERIC Educational Resources Information Center
Schafer, William D.; Lissitz, Robert W.; Zhu, Xiaoshu; Zhang, Yuan; Hou, Xiaodong; Li, Ying
2012-01-01
Interest in Student Growth Modeling (SGM) and Value Added Modeling (VAM) arises from educators concerned with measuring the effectiveness of teaching and other school activities through changes in student performance as a companion and perhaps even an alternative to status. Several formal statistical models have been proposed for year-to-year…
(Finite) statistical size effects on compressive strength.
Weiss, Jérôme; Girard, Lucas; Gimbert, Florent; Amitrano, David; Vandembroucq, Damien
2014-04-29
The larger structures are, the lower their mechanical strength. Already discussed by Leonardo da Vinci and Edmé Mariotte several centuries ago, size effects on strength remain of crucial importance in modern engineering for the elaboration of safety regulations in structural design or the extrapolation of laboratory results to geophysical field scales. Under tensile loading, statistical size effects are traditionally modeled with a weakest-link approach. One of its prominent results is a prediction of vanishing strength at large scales that can be quantified in the framework of extreme value statistics. Despite a frequent use outside its range of validity, this approach remains the dominant tool in the field of statistical size effects. Here we focus on compressive failure, which concerns a wide range of geophysical and geotechnical situations. We show on historical and recent experimental data that weakest-link predictions are not obeyed. In particular, the mechanical strength saturates at a nonzero value toward large scales. Accounting explicitly for the elastic interactions between defects during the damage process, we build a formal analogy of compressive failure with the depinning transition of an elastic manifold. This critical transition interpretation naturally entails finite-size scaling laws for the mean strength and its associated variability. Theoretical predictions are in remarkable agreement with measurements reported for various materials such as rocks, ice, coal, or concrete. This formalism, which can also be extended to the flowing instability of granular media under multiaxial compression, has important practical consequences for future design rules.
Saito, A; Fujinami, K
2011-02-01
To evaluate the formal debate as an active learning strategy within a postgraduate specialty track education programme in periodontics. A formal debate was implemented as an active learning strategy in the programme. The participants were full-time faculty, residents and dentists attending special courses at a teaching hospital in Japan. They were grouped into two evenly matched opposing teams, judges and audience. As a preparation for the debate, the participants attended a lecture on critical thinking. At the time of debate, each team provided a theme report with a list of references. Performances and contents of the debate were evaluated by the course instructors and audience. Pre- and post-debate testing was used to assess the participants' objective knowledge on clinical periodontology. Evaluation of the debate by the participants revealed that scores for criteria, such as presentation performance, response with logic and rebuttal effectiveness were relatively low. Thirty-eight per cent of the participants demonstrated higher test scores after the debate, although there was no statistically significant difference in the mean scores between pre- and post-tests. At the end of the debate, vast majority of participants recognised the significance and importance of the formal debate in the programme. It was suggested that the incorporation of the formal debate could serve as an educational tool for the postgraduate specialty track programme. © 2011 John Wiley & Sons A/S.
Changes in formal sex education: 1995-2002.
Lindberg, Laura Duberstein; Santelli, John S; Singh, Susheela
2006-12-01
Although comprehensive sex education is broadly supported by health professionals, funding for abstinence-only education has increased. Using data from the 1995 National Survey of Adolescent Males, the 1995 National Survey of Family Growth (NSFG) and the 2002 NSFG, changes in male and female adolescents' reports of the sex education they have received from formal sources were examined. Life-table methods were used to measure the timing of instruction, and t tests were used for changes over time. From 1995 to 2002, reports of formal instruction about birth control methods declined among both genders (males, from 81% to 66%; females, from 87% to 70%). This, combined with increases in reports of abstinence education among males (from 74% to 83%), resulted in a lower proportion of teenagers' overall receiving formal instruction about both abstinence and birth control methods (males, 65% to 59%; females, 84% to 65%), and a higher proportion of teenagers' receiving instruction only about abstinence (males, 9% to 24%; females, 8% to 21%). Teenagers in 2002 had received abstinence education about two years earlier (median age, 11.4 for males, 11.8 for females) than they had received birth control instruction (median age, 13.5 for both males and females). Among sexually experienced adolescents, 62% of females and 54% of males had received instruction about birth control methods prior to first sex. A substantial retreat from formal instruction about birth control methods has left increasing proportions of adolescents receiving only abstinence education. Efforts are needed to expand teenagers' access to medically accurate and comprehensive reproductive health information.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vrugt, Jasper A; Robinson, Bruce A; Ter Braak, Cajo J F
In recent years, a strong debate has emerged in the hydrologic literature regarding what constitutes an appropriate framework for uncertainty estimation. Particularly, there is strong disagreement whether an uncertainty framework should have its roots within a proper statistical (Bayesian) context, or whether such a framework should be based on a different philosophy and implement informal measures and weaker inference to summarize parameter and predictive distributions. In this paper, we compare a formal Bayesian approach using Markov Chain Monte Carlo (MCMC) with generalized likelihood uncertainty estimation (GLUE) for assessing uncertainty in conceptual watershed modeling. Our formal Bayesian approach is implemented usingmore » the recently developed differential evolution adaptive metropolis (DREAM) MCMC scheme with a likelihood function that explicitly considers model structural, input and parameter uncertainty. Our results demonstrate that DREAM and GLUE can generate very similar estimates of total streamflow uncertainty. This suggests that formal and informal Bayesian approaches have more common ground than the hydrologic literature and ongoing debate might suggest. The main advantage of formal approaches is, however, that they attempt to disentangle the effect of forcing, parameter and model structural error on total predictive uncertainty. This is key to improving hydrologic theory and to better understand and predict the flow of water through catchments.« less
Armstrong, Kirk J.; Weidner, Thomas G.
2011-01-01
Context: Our previous research determined the frequency of participation and perceived effect of formal and informal continuing education (CE) activities. However, actual preferences for and barriers to CE must be characterized. Objective: To determine the types of formal and informal CE activities preferred by athletic trainers (ATs) and barriers to their participation in these activities. Design: Cross-sectional study. Setting: Athletic training practice settings. Patients or Other Participants: Of a geographically stratified random sample of 1000 ATs, 427 ATs (42.7%) completed the survey. Main Outcome Measure(s): As part of a larger study, the Survey of Formal and Informal Athletic Training Continuing Education Activities (FIATCEA) was developed and administered electronically. The FIATCEA consists of demographic characteristics and Likert scale items (1 = strongly disagree, 5 = strongly agree) about preferred CE activities and barriers to these activities. Internal consistency of survey items, as determined by Cronbach α, was 0.638 for preferred CE activities and 0.860 for barriers to these activities. Descriptive statistics were computed for all items. Differences between respondent demographic characteristics and preferred CE activities and barriers to these activities were determined via analysis of variance and dependent t tests. The α level was set at .05. Results: Hands-on clinical workshops and professional networking were the preferred formal and informal CE activities, respectively. The most frequently reported barriers to formal CE were the cost of attending and travel distance, whereas the most frequently reported barriers to informal CE were personal and job-specific factors. Differences were noted between both the cost of CE and travel distance to CE and all other barriers to CE participation (F1,411 = 233.54, P < .001). Conclusions: Overall, ATs preferred formal CE activities. The same barriers (eg, cost, travel distance) to formal CE appeared to be universal to all ATs. Informal CE was highly valued by ATs because it could be individualized. PMID:22488195
Formal and Informal Continuing Education Activities and Athletic Training Professional Practice
Armstrong, Kirk J.; Weidner, Thomas G.
2010-01-01
Abstract Context: Continuing education (CE) is intended to promote professional growth and, ultimately, to enhance professional practice. Objective: To determine certified athletic trainers' participation in formal (ie, approved for CE credit) and informal (ie, not approved for CE credit) CE activities and the perceived effect these activities have on professional practice with regard to improving knowledge, clinical skills and abilities, attitudes toward patient care, and patient care itself. Design: Cross-sectional study. Setting: Athletic training practice settings. Patients or Other Participants: Of a geographic, stratified random sample of 1000 athletic trainers, 427 (42.7%) completed the survey. Main Outcome Measure(s): The Survey of Formal and Informal Athletic Training Continuing Education Activities was developed and administered electronically. The survey consisted of demographic characteristics and Likert-scale items regarding CE participation and perceived effect of CE on professional practice. Internal consistency of survey items was determined using the Cronbach α (α = 0.945). Descriptive statistics were computed for all items. An analysis of variance and dependent t tests were calculated to determine differences among respondents' demographic characteristics and their participation in, and perceived effect of, CE activities. The α level was set at .05. Results: Respondents completed more informal CE activities than formal CE activities. Participation in informal CE activities included reading athletic training journals (75.4%), whereas formal CE activities included attending a Board of Certification–approved workshop, seminar, or professional conference not conducted by the National Athletic Trainers' Association or affiliates or committees (75.6%). Informal CE activities were perceived to improve clinical skills or abilities and attitudes toward patient care. Formal CE activities were perceived to enhance knowledge. Conclusions: More respondents completed informal CE activities than formal CE activities. Both formal and informal CE activities were perceived to enhance athletic training professional practice. Informal CE activities should be explored and considered for CE credit. PMID:20446842
The Effects of Express Lane Eligibility on Medicaid and CHIP Enrollment among Children
Blavin, Fredric; Kenney, Genevieve M; Huntress, Michael
2014-01-01
Objective To estimate the impact of Express Lane Eligible (ELE) implementation on Medicaid/CHIP enrollment in eight states. Data Sources/Study Setting 2007 to 2011 data from the Statistical Enrollment Data System (SEDS) on Medicaid/CHIP enrollment. Study Design We estimate difference-in-difference equations, with quarter and state fixed effects. The key independent variable is an indicator for whether the state had ELE in place in the given quarter, allowing the experience of statistically matched non-ELE states to serve as a formal counterfactual against which to assess the changes in the eight ELE states. The model also controls for time-varying economic and policy factors within each state. Data Collection/Extraction Methods We obtained SEDS enrollment data from CMS. Principal Findings Across model specifications, the ELE effects on Medicaid enrollment among children were consistently positive, ranging between 4.0 and 7.3 percent, with most estimates statistically significant at the 5 percent level. We also find that ELE increased combined Medicaid/CHIP enrollment. Conclusions Our results imply that ELE has been an effective way for states to increase enrollment and retention among children eligible for Medicaid/CHIP. These results also imply that ELE-like policies could improve take-up of subsidized coverage under the ACA. PMID:24476128
Statistical quantifiers of memory for an analysis of human brain and neuro-system diseases
NASA Astrophysics Data System (ADS)
Demin, S. A.; Yulmetyev, R. M.; Panischev, O. Yu.; Hänggi, Peter
2008-03-01
On the basis of a memory function formalism for correlation functions of time series we investigate statistical memory effects by the use of appropriate spectral and relaxation parameters of measured stochastic data for neuro-system diseases. In particular, we study the dynamics of the walk of a patient who suffers from Parkinson's disease (PD), Huntington's disease (HD), amyotrophic lateral sclerosis (ALS), and compare against the data of healthy people (CO - control group). We employ an analytical method which is able to characterize the stochastic properties of stride-to-stride variations of gait cycle timing. Our results allow us to estimate quantitatively a few human locomotion function abnormalities occurring in the human brain and in the central nervous system (CNS). Particularly, the patient's gait dynamics are characterized by an increased memory behavior together with sizable fluctuations as compared with the locomotion dynamics of healthy patients. Moreover, we complement our findings with peculiar features as detected in phase-space portraits and spectral characteristics for the different data sets (PD, HD, ALS and healthy people). The evaluation of statistical quantifiers of the memory function is shown to provide a useful toolkit which can be put to work to identify various abnormalities of locomotion dynamics. Moreover, it allows one to diagnose qualitatively and quantitatively serious brain and central nervous system diseases.
Regression Models for Identifying Noise Sources in Magnetic Resonance Images
Zhu, Hongtu; Li, Yimei; Ibrahim, Joseph G.; Shi, Xiaoyan; An, Hongyu; Chen, Yashen; Gao, Wei; Lin, Weili; Rowe, Daniel B.; Peterson, Bradley S.
2009-01-01
Stochastic noise, susceptibility artifacts, magnetic field and radiofrequency inhomogeneities, and other noise components in magnetic resonance images (MRIs) can introduce serious bias into any measurements made with those images. We formally introduce three regression models including a Rician regression model and two associated normal models to characterize stochastic noise in various magnetic resonance imaging modalities, including diffusion-weighted imaging (DWI) and functional MRI (fMRI). Estimation algorithms are introduced to maximize the likelihood function of the three regression models. We also develop a diagnostic procedure for systematically exploring MR images to identify noise components other than simple stochastic noise, and to detect discrepancies between the fitted regression models and MRI data. The diagnostic procedure includes goodness-of-fit statistics, measures of influence, and tools for graphical display. The goodness-of-fit statistics can assess the key assumptions of the three regression models, whereas measures of influence can isolate outliers caused by certain noise components, including motion artifacts. The tools for graphical display permit graphical visualization of the values for the goodness-of-fit statistic and influence measures. Finally, we conduct simulation studies to evaluate performance of these methods, and we analyze a real dataset to illustrate how our diagnostic procedure localizes subtle image artifacts by detecting intravoxel variability that is not captured by the regression models. PMID:19890478
On the analysis of studies of choice
Mullins, Eamonn; Agunwamba, Christian C.; Donohoe, Anthony J.
1982-01-01
In a review of 103 sets of data from 23 different studies of choice, Baum (1979) concluded that whereas undermatching was most commonly observed for responses, the time measure generally conformed to the matching relation. A reexamination of the evidence presented by Baum concludes that undermatching is the most commonly observed finding for both measures. Use of the coefficient of determination by both Baum (1979) and de Villiers (1977) for assessing when matching occurs is criticized on statistical grounds. An alternative to the loss-in-predictability criterion used by Baum (1979) is proposed. This alternative statistic has a simple operational meaning and is related to the usual F-ratio test. It can therefore be used as a formal test of the hypothesis that matching occurs. Baum (1979) also suggests that slope values of between .90 and 1.11 can be considered good approximations to matching. It is argued that the establishment of a fixed interval as a criterion for determining when matching occurs, is inappropriate. A confidence interval based on the data from any given experiment is suggested as a more useful method of assessment. PMID:16812271
NASA Astrophysics Data System (ADS)
Vavylonis, Dimitrios
2009-03-01
I will describe my experience in developing an interdisciplinary biophysics course addressed to students at the upper undergraduate and graduate level, in collaboration with colleagues in physics and biology. The students had a background in physics, biology and engineering, and for many the course was their first exposure to interdisciplinary topics. The course did not depend on a formal knowledge of equilibrium statistical mechanics. Instead, the approach was based on dynamics. I used diffusion as a universal ``long time'' law to illustrate scaling concepts. The importance of statistics and proper counting of states/paths was introduced by calculating the maximum accuracy with which bacteria can measure the concentration of diffuse chemicals. The use of quantitative concepts and methods was introduced through specific biological examples, focusing on model organisms and extremes at the cell level. Examples included microtubule dynamic instability, the search and capture model, molecular motor cooperativity in muscle cells, mitotic spindle oscillations in C. elegans, polymerization forces and propulsion of pathogenic bacteria, Brownian ratchets, bacterial cell division and MinD oscillations.
NASA Astrophysics Data System (ADS)
Andersen, Anders H.; Rayens, William S.; Li, Ren-Cang; Blonder, Lee X.
2000-10-01
In this paper we describe the enormous potential that multilinear models hold for the analysis of data from neuroimaging experiments that rely on functional magnetic resonance imaging (MRI) or other imaging modalities. A case is made for why one might fully expect that the successful introduction of these models to the neuroscience community could define the next generation of structure-seeking paradigms in the area. In spite of the potential for immediate application, there is much to do from the perspective of statistical science. That is, although multilinear models have already been particularly successful in chemistry and psychology, relatively little is known about their statistical properties. To that end, our research group at the University of Kentucky has made significant progress. In particular, we are in the process of developing formal influence measures for multilinear methods as well as associated classification models and effective implementations. We believe that these problems will be among the most important and useful to the scientific community. Details are presented herein and an application is given in the context of facial emotion processing experiments.
Gronewold, Andrew D; Sobsey, Mark D; McMahan, Lanakila
2017-06-01
For the past several years, the compartment bag test (CBT) has been employed in water quality monitoring and public health protection around the world. To date, however, the statistical basis for the design and recommended procedures for enumerating fecal indicator bacteria (FIB) concentrations from CBT results have not been formally documented. Here, we provide that documentation following protocols for communicating the evolution of similar water quality testing procedures. We begin with an overview of the statistical theory behind the CBT, followed by a description of how that theory was applied to determine an optimal CBT design. We then provide recommendations for interpreting CBT results, including procedures for estimating quantiles of the FIB concentration probability distribution, and the confidence of compliance with recognized water quality guidelines. We synthesize these values in custom user-oriented 'look-up' tables similar to those developed for other FIB water quality testing methods. Modified versions of our tables are currently distributed commercially as part of the CBT testing kit. Published by Elsevier B.V.
Mapping proteins in the presence of paralogs using units of coevolution
2013-01-01
Background We study the problem of mapping proteins between two protein families in the presence of paralogs. This problem occurs as a difficult subproblem in coevolution-based computational approaches for protein-protein interaction prediction. Results Similar to prior approaches, our method is based on the idea that coevolution implies equal rates of sequence evolution among the interacting proteins, and we provide a first attempt to quantify this notion in a formal statistical manner. We call the units that are central to this quantification scheme the units of coevolution. A unit consists of two mapped protein pairs and its score quantifies the coevolution of the pairs. This quantification allows us to provide a maximum likelihood formulation of the paralog mapping problem and to cast it into a binary quadratic programming formulation. Conclusion CUPID, our software tool based on a Lagrangian relaxation of this formulation, makes it, for the first time, possible to compute state-of-the-art quality pairings in a few minutes of runtime. In summary, we suggest a novel alternative to the earlier available approaches, which is statistically sound and computationally feasible. PMID:24564758
ERIC Educational Resources Information Center
Rienties, Bart; Hosein, Anesa
2015-01-01
How and with whom academics develop and maintain formal and informal networks for reflecting on their teaching practice has received limited attention even though academic development (AD) programmes have become an almost ubiquitous feature of higher education. The primary goal of this mixed-method study is to unpack how 114 academics in an AD…
Are we getting through? A national survey on the CanMEDS communicator role in urology residency
Roberts, Gregory; Beiko, Darren; Touma, Naji; Siemens, D. Robert
2013-01-01
Introduction: Physician communication skills are paramount to patient satisfaction and are linked to important clinical outcomes. Although well-codified in the Royal College of Physicians and Surgeons of Canada (RCPSC) CanMEDS program, the knowledge, skills, and assessment of communication skills in surgical specialty training are rarely addressed. We assess Canadian urology residents’ experience of and attitudes towards this crucial competency in training and practice. Methods: An anonymous, cross-sectional, self-reported questionnaire was administered to all final year urology residents in Canada from 2 consecutive graduating years (2010 and 2011). A closed-ended 5-point Likert scale was used to assess familiarity with the concept of the RCPSC Communicator role and its application and importance to training and practice. Descriptive and correlative statistics were used to analyze the responses, such as the availability of formal training and resident participation in activities involving health communication. For ease of reporting, an agreement score was created for those responding with “strongly agree” and “agree” on the Likert scale. Results: There was a 100% response rate from the chief residents for both of the 2 years of the survey (n = 58). When questioned about the RCPSC CanMEDS roles, only 45% could identify the correct number of roles, and only 19% could correctly list all 7 roles. However, most residents were well aware of the Communicator role (90% agreement [mean 4.47 ± 0.78]), and most agreed that it plays an important role during training and future practice (83% [4.16 ± 0.84], 90% [4.39 ± 0.84] respectively). This is in stark contrast to perceived formal training. Only 31% (3.00 ± 1.04) agreed that formal training or mentorship in communication was available at their institution, and only 38% (3.14 ± 1.19) felt that communication had been formally addressed during explicit sessions. Despite most of the respondents agreeing they had a significant mentor/role model to emulate regarding communication skills, only 48% believed that faculty frequently addressed communication during clinical learning experiences. Conclusions: Despite knowledge and acceptance of the importance of the Communicator role, there is a perceived lack of formal and explicit training in this essential non-medical expert role of urology residency. It would seem apparent from this needs assessment that there may be an opportunity to coordinate efforts to ensure formal instruction and evaluation in our training programs. PMID:24381664
Exoplanet Biosignatures: Future Directions
Bains, William; Cronin, Leroy; DasSarma, Shiladitya; Danielache, Sebastian; Domagal-Goldman, Shawn; Kacar, Betul; Kiang, Nancy Y.; Lenardic, Adrian; Reinhard, Christopher T.; Moore, William; Schwieterman, Edward W.; Shkolnik, Evgenya L.; Smith, Harrison B.
2018-01-01
Abstract We introduce a Bayesian method for guiding future directions for detection of life on exoplanets. We describe empirical and theoretical work necessary to place constraints on the relevant likelihoods, including those emerging from better understanding stellar environment, planetary climate and geophysics, geochemical cycling, the universalities of physics and chemistry, the contingencies of evolutionary history, the properties of life as an emergent complex system, and the mechanisms driving the emergence of life. We provide examples for how the Bayesian formalism could guide future search strategies, including determining observations to prioritize or deciding between targeted searches or larger lower resolution surveys to generate ensemble statistics and address how a Bayesian methodology could constrain the prior probability of life with or without a positive detection. Key Words: Exoplanets—Biosignatures—Life detection—Bayesian analysis. Astrobiology 18, 779–824. PMID:29938538
Parametric reduced models for the nonlinear Schrödinger equation
NASA Astrophysics Data System (ADS)
Harlim, John; Li, Xiantao
2015-05-01
Reduced models for the (defocusing) nonlinear Schrödinger equation are developed. In particular, we develop reduced models that only involve the low-frequency modes given noisy observations of these modes. The ansatz of the reduced parametric models are obtained by employing a rational approximation and a colored-noise approximation, respectively, on the memory terms and the random noise of a generalized Langevin equation that is derived from the standard Mori-Zwanzig formalism. The parameters in the resulting reduced models are inferred from noisy observations with a recently developed ensemble Kalman filter-based parametrization method. The forecasting skill across different temperature regimes are verified by comparing the moments up to order four, a two-time correlation function statistics, and marginal densities of the coarse-grained variables.
NASA Astrophysics Data System (ADS)
Shakhmuratova, L. N.; Hutchison, W. D.; Isbister, D. J.; Chaplin, D. H.
1997-07-01
A new coherent transient in pulsed NMR, the two-pulse nutational stimulated echo, is reported for the ferromagnetic system 60CoFe using resonant perturbations on the directional emission of anisotropic γ-radiation from thermally oriented nuclei. The new spin echo is a result of non-linear nuclear spin dynamics due to large Larmor inhomogeneity active during radiofrequency pulse application. It is made readily observable through the gross detuning between NMR radiofrequency excitation and gamma radiation detection, and inhomogeneity in the Rabi frequency caused by metallic skin-effect. The method of concatenation of perturbation factors in a statistical tensor formalism is quantitatively applied to successfully predict and then fit in detail the experimental time-domain data.
Parametric reduced models for the nonlinear Schrödinger equation.
Harlim, John; Li, Xiantao
2015-05-01
Reduced models for the (defocusing) nonlinear Schrödinger equation are developed. In particular, we develop reduced models that only involve the low-frequency modes given noisy observations of these modes. The ansatz of the reduced parametric models are obtained by employing a rational approximation and a colored-noise approximation, respectively, on the memory terms and the random noise of a generalized Langevin equation that is derived from the standard Mori-Zwanzig formalism. The parameters in the resulting reduced models are inferred from noisy observations with a recently developed ensemble Kalman filter-based parametrization method. The forecasting skill across different temperature regimes are verified by comparing the moments up to order four, a two-time correlation function statistics, and marginal densities of the coarse-grained variables.
Population forecasts for Bangladesh, using a Bayesian methodology.
Mahsin, Md; Hossain, Syed Shahadat
2012-12-01
Population projection for many developing countries could be quite a challenging task for the demographers mostly due to lack of availability of enough reliable data. The objective of this paper is to present an overview of the existing methods for population forecasting and to propose an alternative based on the Bayesian statistics, combining the formality of inference. The analysis has been made using Markov Chain Monte Carlo (MCMC) technique for Bayesian methodology available with the software WinBUGS. Convergence diagnostic techniques available with the WinBUGS software have been applied to ensure the convergence of the chains necessary for the implementation of MCMC. The Bayesian approach allows for the use of observed data and expert judgements by means of appropriate priors, and a more realistic population forecasts, along with associated uncertainty, has been possible.
Exoplanet Biosignatures: Future Directions.
Walker, Sara I; Bains, William; Cronin, Leroy; DasSarma, Shiladitya; Danielache, Sebastian; Domagal-Goldman, Shawn; Kacar, Betul; Kiang, Nancy Y; Lenardic, Adrian; Reinhard, Christopher T; Moore, William; Schwieterman, Edward W; Shkolnik, Evgenya L; Smith, Harrison B
2018-06-01
We introduce a Bayesian method for guiding future directions for detection of life on exoplanets. We describe empirical and theoretical work necessary to place constraints on the relevant likelihoods, including those emerging from better understanding stellar environment, planetary climate and geophysics, geochemical cycling, the universalities of physics and chemistry, the contingencies of evolutionary history, the properties of life as an emergent complex system, and the mechanisms driving the emergence of life. We provide examples for how the Bayesian formalism could guide future search strategies, including determining observations to prioritize or deciding between targeted searches or larger lower resolution surveys to generate ensemble statistics and address how a Bayesian methodology could constrain the prior probability of life with or without a positive detection. Key Words: Exoplanets-Biosignatures-Life detection-Bayesian analysis. Astrobiology 18, 779-824.
Su, Hongling; Li, Shengtai
2016-02-03
In this study, we propose two new energy/dissipation-preserving Birkhoffian multi-symplectic methods (Birkhoffian and Birkhoffian box) for Maxwell's equations with dissipation terms. After investigating the non-autonomous and autonomous Birkhoffian formalism for Maxwell's equations with dissipation terms, we first apply a novel generating functional theory to the non-autonomous Birkhoffian formalism to propose our Birkhoffian scheme, and then implement a central box method to the autonomous Birkhoffian formalism to derive the Birkhoffian box scheme. We have obtained four formal local conservation laws and three formal energy global conservation laws. We have also proved that both of our derived schemes preserve the discrete versionmore » of the global/local conservation laws. Furthermore, the stability, dissipation and dispersion relations are also investigated for the schemes. Theoretical analysis shows that the schemes are unconditionally stable, dissipation-preserving for Maxwell's equations in a perfectly matched layer (PML) medium and have second order accuracy in both time and space. Numerical experiments for problems with exact theoretical results are given to demonstrate that the Birkhoffian multi-symplectic schemes are much more accurate in preserving energy than both the exponential finite-difference time-domain (FDTD) method and traditional Hamiltonian scheme. Finally, we also solve the electromagnetic pulse (EMP) propagation problem and the numerical results show that the Birkhoffian scheme recovers the magnitude of the current source and reaction history very well even after long time propagation.« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
Su, Hongling; Li, Shengtai
In this study, we propose two new energy/dissipation-preserving Birkhoffian multi-symplectic methods (Birkhoffian and Birkhoffian box) for Maxwell's equations with dissipation terms. After investigating the non-autonomous and autonomous Birkhoffian formalism for Maxwell's equations with dissipation terms, we first apply a novel generating functional theory to the non-autonomous Birkhoffian formalism to propose our Birkhoffian scheme, and then implement a central box method to the autonomous Birkhoffian formalism to derive the Birkhoffian box scheme. We have obtained four formal local conservation laws and three formal energy global conservation laws. We have also proved that both of our derived schemes preserve the discrete versionmore » of the global/local conservation laws. Furthermore, the stability, dissipation and dispersion relations are also investigated for the schemes. Theoretical analysis shows that the schemes are unconditionally stable, dissipation-preserving for Maxwell's equations in a perfectly matched layer (PML) medium and have second order accuracy in both time and space. Numerical experiments for problems with exact theoretical results are given to demonstrate that the Birkhoffian multi-symplectic schemes are much more accurate in preserving energy than both the exponential finite-difference time-domain (FDTD) method and traditional Hamiltonian scheme. Finally, we also solve the electromagnetic pulse (EMP) propagation problem and the numerical results show that the Birkhoffian scheme recovers the magnitude of the current source and reaction history very well even after long time propagation.« less
A Benchmark for Comparing Different Approaches for Specifying and Verifying Real-Time Systems
1993-01-01
To be considered correct or useful, real - time systems must deliver results within specified time intervals, either without exception or with high...probability. Recently, a large number of formal methods have been invented for specifying and verifying real - time systems . It has been suggested that...these formal methods need to be tested out on actual real - time systems . Such testing will allow the scalability of the methods to be assessed and also
Model Checking Artificial Intelligence Based Planners: Even the Best Laid Plans Must Be Verified
NASA Technical Reports Server (NTRS)
Smith, Margaret H.; Holzmann, Gerard J.; Cucullu, Gordon C., III; Smith, Benjamin D.
2005-01-01
Automated planning systems (APS) are gaining acceptance for use on NASA missions as evidenced by APS flown On missions such as Orbiter and Deep Space 1 both of which were commanded by onboard planning systems. The planning system takes high level goals and expands them onboard into a detailed of action fiat the spacecraft executes. The system must be verified to ensure that the automatically generated plans achieve the goals as expected and do not generate actions that would harm the spacecraft or mission. These systems are typically tested using empirical methods. Formal methods, such as model checking, offer exhaustive or measurable test coverage which leads to much greater confidence in correctness. This paper describes a formal method based on the SPIN model checker. This method guarantees that possible plans meet certain desirable properties. We express the input model in Promela, the language of SPIN and express the properties of desirable plans formally.
Matching biomedical ontologies based on formal concept analysis.
Zhao, Mengyi; Zhang, Songmao; Li, Weizhuo; Chen, Guowei
2018-03-19
The goal of ontology matching is to identify correspondences between entities from different yet overlapping ontologies so as to facilitate semantic integration, reuse and interoperability. As a well developed mathematical model for analyzing individuals and structuring concepts, Formal Concept Analysis (FCA) has been applied to ontology matching (OM) tasks since the beginning of OM research, whereas ontological knowledge exploited in FCA-based methods is limited. This motivates the study in this paper, i.e., to empower FCA with as much as ontological knowledge as possible for identifying mappings across ontologies. We propose a method based on Formal Concept Analysis to identify and validate mappings across ontologies, including one-to-one mappings, complex mappings and correspondences between object properties. Our method, called FCA-Map, incrementally generates a total of five types of formal contexts and extracts mappings from the lattices derived. First, the token-based formal context describes how class names, labels and synonyms share lexical tokens, leading to lexical mappings (anchors) across ontologies. Second, the relation-based formal context describes how classes are in taxonomic, partonomic and disjoint relationships with the anchors, leading to positive and negative structural evidence for validating the lexical matching. Third, the positive relation-based context can be used to discover structural mappings. Afterwards, the property-based formal context describes how object properties are used in axioms to connect anchor classes across ontologies, leading to property mappings. Last, the restriction-based formal context describes co-occurrence of classes across ontologies in anonymous ancestors of anchors, from which extended structural mappings and complex mappings can be identified. Evaluation on the Anatomy, the Large Biomedical Ontologies, and the Disease and Phenotype track of the 2016 Ontology Alignment Evaluation Initiative campaign demonstrates the effectiveness of FCA-Map and its competitiveness with the top-ranked systems. FCA-Map can achieve a better balance between precision and recall for large-scale domain ontologies through constructing multiple FCA structures, whereas it performs unsatisfactorily for smaller-sized ontologies with less lexical and semantic expressions. Compared with other FCA-based OM systems, the study in this paper is more comprehensive as an attempt to push the envelope of the Formal Concept Analysis formalism in ontology matching tasks. Five types of formal contexts are constructed incrementally, and their derived concept lattices are used to cluster the commonalities among classes at lexical and structural level, respectively. Experiments on large, real-world domain ontologies show promising results and reveal the power of FCA.
Putz, Mihai V.
2009-01-01
The density matrix theory, the ancestor of density functional theory, provides the immediate framework for Path Integral (PI) development, allowing the canonical density be extended for the many-electronic systems through the density functional closure relationship. Yet, the use of path integral formalism for electronic density prescription presents several advantages: assures the inner quantum mechanical description of the system by parameterized paths; averages the quantum fluctuations; behaves as the propagator for time-space evolution of quantum information; resembles Schrödinger equation; allows quantum statistical description of the system through partition function computing. In this framework, four levels of path integral formalism were presented: the Feynman quantum mechanical, the semiclassical, the Feynman-Kleinert effective classical, and the Fokker-Planck non-equilibrium ones. In each case the density matrix or/and the canonical density were rigorously defined and presented. The practical specializations for quantum free and harmonic motions, for statistical high and low temperature limits, the smearing justification for the Bohr’s quantum stability postulate with the paradigmatic Hydrogen atomic excursion, along the quantum chemical calculation of semiclassical electronegativity and hardness, of chemical action and Mulliken electronegativity, as well as by the Markovian generalizations of Becke-Edgecombe electronic focalization functions – all advocate for the reliability of assuming PI formalism of quantum mechanics as a versatile one, suited for analytically and/or computationally modeling of a variety of fundamental physical and chemical reactivity concepts characterizing the (density driving) many-electronic systems. PMID:20087467
Putz, Mihai V
2009-11-10
The density matrix theory, the ancestor of density functional theory, provides the immediate framework for Path Integral (PI) development, allowing the canonical density be extended for the many-electronic systems through the density functional closure relationship. Yet, the use of path integral formalism for electronic density prescription presents several advantages: assures the inner quantum mechanical description of the system by parameterized paths; averages the quantum fluctuations; behaves as the propagator for time-space evolution of quantum information; resembles Schrödinger equation; allows quantum statistical description of the system through partition function computing. In this framework, four levels of path integral formalism were presented: the Feynman quantum mechanical, the semiclassical, the Feynman-Kleinert effective classical, and the Fokker-Planck non-equilibrium ones. In each case the density matrix or/and the canonical density were rigorously defined and presented. The practical specializations for quantum free and harmonic motions, for statistical high and low temperature limits, the smearing justification for the Bohr's quantum stability postulate with the paradigmatic Hydrogen atomic excursion, along the quantum chemical calculation of semiclassical electronegativity and hardness, of chemical action and Mulliken electronegativity, as well as by the Markovian generalizations of Becke-Edgecombe electronic focalization functions - all advocate for the reliability of assuming PI formalism of quantum mechanics as a versatile one, suited for analytically and/or computationally modeling of a variety of fundamental physical and chemical reactivity concepts characterizing the (density driving) many-electronic systems.
Legaz-García, María Del Carmen; Dentler, Kathrin; Fernández-Breis, Jesualdo Tomás; Cornet, Ronald
2017-01-01
ArchMS is a framework that represents clinical information and knowledge using ontologies in OWL, which facilitates semantic interoperability and thereby the exploitation and secondary use of clinical data. However, it does not yet support the automated assessment of quality of care. CLIF is a stepwise method to formalize quality indicators. The method has been implemented in the CLIF tool which supports its users in generating computable queries based on a patient data model which can be based on archetypes. To enable the automated computation of quality indicators using ontologies and archetypes, we tested whether ArchMS and the CLIF tool can be integrated. We successfully automated the process of generating SPARQL queries from quality indicators that have been formalized with CLIF and integrated them into ArchMS. Hence, ontologies and archetypes can be combined for the execution of formalized quality indicators.
Mapping of polycrystalline films of biological fluids utilizing the Jones-matrix formalism
NASA Astrophysics Data System (ADS)
Ushenko, Vladimir A.; Dubolazov, Alexander V.; Pidkamin, Leonid Y.; Sakchnovsky, Michael Yu; Bodnar, Anna B.; Ushenko, Yuriy A.; Ushenko, Alexander G.; Bykov, Alexander; Meglinski, Igor
2018-02-01
Utilizing a polarized light approach, we reconstruct the spatial distribution of birefringence and optical activity in polycrystalline films of biological fluids. The Jones-matrix formalism is used for an accessible quantitative description of these types of optical anisotropy. We demonstrate that differentiation of polycrystalline films of biological fluids can be performed based on a statistical analysis of the distribution of rotation angles and phase shifts associated with the optical activity and birefringence, respectively. Finally, practical operational characteristics, such as sensitivity, specificity and accuracy of the Jones-matrix reconstruction of optical anisotropy, were identified with special emphasis on biomedical application, specifically for differentiation of bile films taken from healthy donors and from patients with cholelithiasis.
Mann, Michael P.; Rizzardo, Jule; Satkowski, Richard
2004-01-01
Accurate streamflow statistics are essential to water resource agencies involved in both science and decision-making. When long-term streamflow data are lacking at a site, estimation techniques are often employed to generate streamflow statistics. However, procedures for accurately estimating streamflow statistics often are lacking. When estimation procedures are developed, they often are not evaluated properly before being applied. Use of unevaluated or underevaluated flow-statistic estimation techniques can result in improper water-resources decision-making. The California State Water Resources Control Board (SWRCB) uses two key techniques, a modified rational equation and drainage basin area-ratio transfer, to estimate streamflow statistics at ungaged locations. These techniques have been implemented to varying degrees, but have not been formally evaluated. For estimating peak flows at the 2-, 5-, 10-, 25-, 50-, and 100-year recurrence intervals, the SWRCB uses the U.S. Geological Surveys (USGS) regional peak-flow equations. In this study, done cooperatively by the USGS and SWRCB, the SWRCB estimated several flow statistics at 40 USGS streamflow gaging stations in the north coast region of California. The SWRCB estimates were made without reference to USGS flow data. The USGS used the streamflow data provided by the 40 stations to generate flow statistics that could be compared with SWRCB estimates for accuracy. While some SWRCB estimates compared favorably with USGS statistics, results were subject to varying degrees of error over the region. Flow-based estimation techniques generally performed better than rain-based methods, especially for estimation of December 15 to March 31 mean daily flows. The USGS peak-flow equations also performed well, but tended to underestimate peak flows. The USGS equations performed within reported error bounds, but will require updating in the future as peak-flow data sets grow larger. Little correlation was discovered between estimation errors and geographic locations or various basin characteristics. However, for 25-percentile year mean-daily-flow estimates for December 15 to March 31, the greatest estimation errors were at east San Francisco Bay area stations with mean annual precipitation less than or equal to 30 inches, and estimated 2-year/24-hour rainfall intensity less than 3 inches.
Formal Assurance for Cognitive Architecture Based Autonomous Agent
NASA Technical Reports Server (NTRS)
Bhattacharyya, Siddhartha; Eskridge, Thomas; Neogi, Natasha; Carvalho, Marco
2017-01-01
Autonomous systems are designed and deployed in different modeling paradigms. These environments focus on specific concepts in designing the system. We focus our effort in the use of cognitive architectures to design autonomous agents to collaborate with humans to accomplish tasks in a mission. Our research focuses on introducing formal assurance methods to verify the behavior of agents designed in Soar, by translating the agent to the formal verification environment Uppaal.
Statistical manifestation of quantum correlations via disequilibrium
NASA Astrophysics Data System (ADS)
Pennini, F.; Plastino, A.
2017-12-01
The statistical notion of disequilibrium (D) was introduced by López-Ruiz, Mancini, and Calbet (LMC) (1995) [1] more than 20 years ago. D measures the amount of ;correlational structure; of a system. We wish to use D to analyze one of the simplest types of quantum correlations, those present in gaseous systems due to symmetry considerations. To this end we extend the LMC formalism to the grand canonical environment and show that D displays distinctive behaviors for simple gases, that allow for interesting insights into their structural properties.
NASA Astrophysics Data System (ADS)
Berliner, M.
2017-12-01
Bayesian statistical decision theory offers a natural framework for decision-policy making in the presence of uncertainty. Key advantages of the approach include efficient incorporation of information and observations. However, in complicated settings it is very difficult, perhaps essentially impossible, to formalize the mathematical inputs needed in the approach. Nevertheless, using the approach as a template is useful for decision support; that is, organizing and communicating our analyses. Bayesian hierarchical modeling is valuable in quantifying and managing uncertainty such cases. I review some aspects of the idea emphasizing statistical model development and use in the context of sea-level rise.
2018-01-01
Interpreting, performing and applying research is a key part of evidence-based medical practice, however, incorporating these within curricula is challenging. This study aimed to explore current provision of research skills training within medical school curricula, provide a student-focused needs assessment and prioritise research competencies. A international, cross-sectional survey of final year UK and Irish medical students was disseminated at each participating university. The questionnaire investigated research experience, and confidence in the Medical Education in Europe (MEDINE) 2 consensus survey research competencies. Fully completed responses were received from 521 final year medical students from 32 medical schools (43.4% male, mean age 24.3 years). Of these, 55.3% had an additional academic qualification (49.5% Bachelor's degree), and 38.8% had been a named author on an academic publication. Considering audit and research opportunities and teaching experience, 47.2% reported no formal audit training compared with 27.1% who reported no formal research training. As part of their medical school course, 53.4% had not performed an audit, compared with 29.9% who had not participated in any clinical or basic science research. Nearly a quarter of those who had participated in research reported doing so outside of their medical degree course. Low confidence areas included selecting and performing the appropriate statistical test, selecting the appropriate research method, and critical appraisal. Following adjustment, several factors were associated with increased confidence including previous clinical research experience (OR 4.21, 2.66 to 6.81, P<0.001), additional degrees (OR 2.34, 1.47 to 3.75, P<0.001), and male gender (OR 1.90, 1.25 to 2.09, P=0.003). Factors associated with an increase in perceived opportunities included formal research training in the curriculum (OR 1.66, 1.12 to 2.46, P=0.012), audit skills training in the curriculum (OR 1.52, 1.03 to 2.26, P= 0.036) and research methods taught in a student selected component (OR 1.75, 1.21 to 2.54, P=0.003). Nearly one-third of students lacked formal training on undertaking research, and half of students lacked formal audit training and opportunities to undertake audit as part of their medical school course. The presence of research training in the cirriculum was associated with an increase in perceived opportunities to participate in MEDINE2 research competencies. Female gender and a lack of previous research experience were significant factors influencing confidence and participation in research. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Olavarría, Verónica V; Arima, Hisatomi; Anderson, Craig S; Brunser, Alejandro; Muñoz-Venturelli, Paula; Billot, Laurent; Lavados, Pablo M
2017-02-01
Background The HEADPOST Pilot is a proof-of-concept, open, prospective, multicenter, international, cluster randomized, phase IIb controlled trial, with masked outcome assessment. The trial will test if lying flat head position initiated in patients within 12 h of onset of acute ischemic stroke involving the anterior circulation increases cerebral blood flow in the middle cerebral arteries, as measured by transcranial Doppler. The study will also assess the safety and feasibility of patients lying flat for ≥24 h. The trial was conducted in centers in three countries, with ability to perform early transcranial Doppler. A feature of this trial was that patients were randomized to a certain position according to the month of admission to hospital. Objective To outline in detail the predetermined statistical analysis plan for HEADPOST Pilot study. Methods All data collected by participating researchers will be reviewed and formally assessed. Information pertaining to the baseline characteristics of patients, their process of care, and the delivery of treatments will be classified, and for each item, appropriate descriptive statistical analyses are planned with comparisons made between randomized groups. For the outcomes, statistical comparisons to be made between groups are planned and described. Results This statistical analysis plan was developed for the analysis of the results of the HEADPOST Pilot study to be transparent, available, verifiable, and predetermined before data lock. Conclusions We have developed a statistical analysis plan for the HEADPOST Pilot study which is to be followed to avoid analysis bias arising from prior knowledge of the study findings. Trial registration The study is registered under HEADPOST-Pilot, ClinicalTrials.gov Identifier NCT01706094.
NASA Astrophysics Data System (ADS)
Peresan, Antonella; Gentili, Stefania
2017-04-01
Identification and statistical characterization of seismic clusters may provide useful insights about the features of seismic energy release and their relation to physical properties of the crust within a given region. Moreover, a number of studies based on spatio-temporal analysis of main-shocks occurrence require preliminary declustering of the earthquake catalogs. Since various methods, relying on different physical/statistical assumptions, may lead to diverse classifications of earthquakes into main events and related events, we aim to investigate the classification differences among different declustering techniques. Accordingly, a formal selection and comparative analysis of earthquake clusters is carried out for the most relevant earthquakes in North-Eastern Italy, as reported in the local OGS-CRS bulletins, compiled at the National Institute of Oceanography and Experimental Geophysics since 1977. The comparison is then extended to selected earthquake sequences associated with a different seismotectonic setting, namely to events that occurred in the region struck by the recent Central Italy destructive earthquakes, making use of INGV data. Various techniques, ranging from classical space-time windows methods to ad hoc manual identification of aftershocks, are applied for detection of earthquake clusters. In particular, a statistical method based on nearest-neighbor distances of events in space-time-energy domain, is considered. Results from clusters identification by the nearest-neighbor method turn out quite robust with respect to the time span of the input catalogue, as well as to minimum magnitude cutoff. The identified clusters for the largest events reported in North-Eastern Italy since 1977 are well consistent with those reported in earlier studies, which were aimed at detailed manual aftershocks identification. The study shows that the data-driven approach, based on the nearest-neighbor distances, can be satisfactorily applied to decompose the seismic catalog into background seismicity and individual sequences of earthquake clusters, also in areas characterized by moderate seismic activity, where the standard declustering techniques may turn out rather gross approximations. With these results acquired, the main statistical features of seismic clusters are explored, including complex interdependence of related events, with the aim to characterize the space-time patterns of earthquakes occurrence in North-Eastern Italy and capture their basic differences with Central Italy sequences.
Statistical nature of infrared dynamics on de Sitter background
NASA Astrophysics Data System (ADS)
Tokuda, Junsei; Tanaka, Takahiro
2018-02-01
In this study, we formulate a systematic way of deriving an effective equation of motion(EoM) for long wavelength modes of a massless scalar field with a general potential V(phi) on de Sitter background, and investigate whether or not the effective EoM can be described as a classical stochastic process. Our formulation gives an extension of the usual stochastic formalism to including sub-leading secular growth coming from the nonlinearity of short wavelength modes. Applying our formalism to λ phi4 theory, we explicitly derive an effective EoM which correctly recovers the next-to-leading secularly growing part at a late time, and show that this effective EoM can be seen as a classical stochastic process. Our extended stochastic formalism can describe all secularly growing terms which appear in all correlation functions with a specific operator ordering. The restriction of the operator ordering will not be a big drawback because the commutator of a light scalar field becomes negligible at large scales owing to the squeezing.
Evaluating the Effectiveness of a Large-Scale Professional Development Programme
ERIC Educational Resources Information Center
Main, Katherine; Pendergast, Donna
2017-01-01
An evaluation of the effectiveness of a large-scale professional development (PD) programme delivered to 258 schools in Queensland, Australia is presented. Formal evaluations were conducted at two stages during the programme using a tool developed from Desimone's five core features of effective PD. Descriptive statistics of 38 questions and…
Making Heads or Tails of Probability: An Experiment with Random Generators
ERIC Educational Resources Information Center
Morsanyi, Kinga; Handley, Simon J.; Serpell, Sylvie
2013-01-01
Background: The equiprobability bias is a tendency for individuals to think of probabilistic events as "equiprobable" by nature, and to judge outcomes that occur with different probabilities as equally likely. The equiprobability bias has been repeatedly found to be related to formal education in statistics, and it is claimed to be based…
Emergent Feature Structures: Harmony Systems in Exemplar Models of Phonology
ERIC Educational Resources Information Center
Cole, Jennifer
2009-01-01
In exemplar models of phonology, phonotactic constraints are modeled as emergent from patterns of high activation between units that co-occur with statistical regularity, or as patterns of low activation or inhibition between units that co-occur less frequently or not at all. Exemplar models posit no a "priori" formal or representational…
On Testability of Missing Data Mechanisms in Incomplete Data Sets
ERIC Educational Resources Information Center
Raykov, Tenko
2011-01-01
This article is concerned with the question of whether the missing data mechanism routinely referred to as missing completely at random (MCAR) is statistically examinable via a test for lack of distributional differences between groups with observed and missing data, and related consequences. A discussion is initially provided, from a formal logic…
Integrated Postsecondary Education Data System Data Quality Study. Methodology Report. NCES 2005-175
ERIC Educational Resources Information Center
Jackson, Kenneth W.; Peecksen, Scott; Jang, Donsig; Sukasih, Amang
2005-01-01
The Integrated Postsecondary Education Data System (IPEDS) of the National Center for Education Statistics (NCES) was initiated in 1986 to collect data about all identified institutions whose primary purpose is to provide postsecondary education. Postsecondary education is defined within IPEDS as "the provision of a formal instructional…
The SACE Review Panel's Final Report: Significant Flaws in the Analysis of Statistical Data
ERIC Educational Resources Information Center
Gregory, Kelvin
2006-01-01
The South Australian Certificate of Education (SACE) is a credential and formal qualification within the Australian Qualifications Framework. A recent review of the SACE outlined a number of recommendations for significant changes to this certificate. These recommendations were the result of a process that began with the review panel…
Bootstrapping in a Language of Thought: A Formal Model of Numerical Concept Learning
ERIC Educational Resources Information Center
Piantadosi, Steven T.; Tenenbaum, Joshua B.; Goodman, Noah D.
2012-01-01
In acquiring number words, children exhibit a qualitative leap in which they transition from understanding a few number words, to possessing a rich system of interrelated numerical concepts. We present a computational framework for understanding this inductive leap as the consequence of statistical inference over a sufficiently powerful…
Behavioral and Social Science Research: A National Resource. Part II.
ERIC Educational Resources Information Center
Adams, Robert McC., Ed.; And Others
Areas of behavioral and social science research that have achieved significant breakthroughs in knowledge or application or that show future promise of achieving such breakthroughs are discussed in 12 papers. For example, the paper on formal demography shows how mathematical or statistical techniques can be used to explain and predict change in…
Students and Courses 2002: At a Glance. Australian Vocational Education and Training Statistics.
ERIC Educational Resources Information Center
National Centre for Vocational Education Research, Leabrook (Australia).
The public vocational education and training (VET) system in Australia encompasses formal learning activities intended to develop knowledge and skills that are relevant in the workplace for those past the age of compulsory schooling, but excludes bachelor and post-graduate courses and learning for leisure, recreation or personal enrichment. Some…
1973-10-01
intensity computation are shown in Figure 17. Using the same formal procedure outlined by Winne & Wundt . a notch geometry can be chosen to induce...Nitride at Elevated Temperatures . Winne, D.H. and Wundt , B.M., "Application of the Gnffith-Irwm Theory of Crack Propagation to the Bursting Behavior
Beyond Literacy: Non-Formal Education Programmes for Adults in Mozambique
ERIC Educational Resources Information Center
van der Linden, Josje; Manuel, Alzira Munguambe
2011-01-01
Thirty-five years after independence the Mozambican illiteracy rate has been reduced from 93% to just over 50% according to official statistics. Although this indicates an enormous achievement in the area of education, the challenge of today still is to design appropriate adult basic education programmes including literacy, numeracy and life…
Introduction of Digital Storytelling in Preschool Education: A Case Study from Croatia
ERIC Educational Resources Information Center
Preradovic, Nives Mikelic; Lesin, Gordana; Boras, Damir
2016-01-01
Our case study from Croatia showed the benefits of digital storytelling in a preschool as a basis for the formal ICT education. The statistical analysis revealed significant differences between children aged 6-7 who learned mathematics by traditional storytelling compared to those learning through digital storytelling. The experimental group that…
77 FR 72715 - Informal Entry Limit and Removal of a Formal Entry Requirement
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-06
... required certifications, enforcement information, and statistical data. An agency may not conduct or..., 1623, 1624, 3314. * * * * * Sec. 10.1 [Amended] 0 2. In Sec. 10.1: 0 a. Paragraph (a) introductory text... revising``19------'' to read ``20---- --''; 0 c. Paragraph (a)(2) introductory text is amended in the last...
2013-01-01
Background To describe some sociodemographic and educational characteristics of oral health technicians (OHTs) in public primary health care teams in the state of Minas Gerais, Brazil. Methods A cross-sectional descriptive study was performed based on the telephone survey of a representative sample comprising 231 individuals. A pre-tested instrument was used for the data collection, including questions on gender, age in years, years of work as an OHT, years since graduation as an OHT, formal schooling, individual income in a month, and participation in continuing educational programmes. The descriptive statistic was developed and the formation of clusters, by the agglomerative hierarchy technique based on the furthest neighbour, was based on the age, years of work as an OHT, time since graduation as an OHT, formal schooling, individual income in a month, and participation in continuing educational programmes. Results Most interviewees (97.1%) were female. A monthly income of USD 300.00 to 600.00 was reported by 77.5% of the sample. Having educational qualifications in excess of their role was reported by approximately 20% of the participants. The median time since graduation was six years, and half of the sample had worked for four years as an OHT. Most interviewees (67.6%) reported having participated in professional continuing educational programmes. Two different clusters were identified based on the sociodemographic and educational characteristics of the sample. Conclusions The Brazilian OHTs in public primary health care teams in the state of Minas Gerais are mostly female who have had little time since graduation, working experience, and formal schooling sufficient for professional practice. PMID:24365451
Super-sample covariance approximations and partial sky coverage
NASA Astrophysics Data System (ADS)
Lacasa, Fabien; Lima, Marcos; Aguena, Michel
2018-04-01
Super-sample covariance (SSC) is the dominant source of statistical error on large scale structure (LSS) observables for both current and future galaxy surveys. In this work, we concentrate on the SSC of cluster counts, also known as sample variance, which is particularly useful for the self-calibration of the cluster observable-mass relation; our approach can similarly be applied to other observables, such as galaxy clustering and lensing shear. We first examined the accuracy of two analytical approximations proposed in the literature for the flat sky limit, finding that they are accurate at the 15% and 30-35% level, respectively, for covariances of counts in the same redshift bin. We then developed a harmonic expansion formalism that allows for the prediction of SSC in an arbitrary survey mask geometry, such as large sky areas of current and future surveys. We show analytically and numerically that this formalism recovers the full sky and flat sky limits present in the literature. We then present an efficient numerical implementation of the formalism, which allows fast and easy runs of covariance predictions when the survey mask is modified. We applied our method to a mask that is broadly similar to the Dark Energy Survey footprint, finding a non-negligible negative cross-z covariance, i.e. redshift bins are anti-correlated. We also examined the case of data removal from holes due to, for example bright stars, quality cuts, or systematic removals, and find that this does not have noticeable effects on the structure of the SSC matrix, only rescaling its amplitude by the effective survey area. These advances enable analytical covariances of LSS observables to be computed for current and future galaxy surveys, which cover large areas of the sky where the flat sky approximation fails.
The Schrödinger–Langevin equation with and without thermal fluctuations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Katz, R., E-mail: roland.katz@subatech.in2p3.fr; Gossiaux, P.B., E-mail: Pol-Bernard.Gossiaux@subatech.in2p3.fr
2016-05-15
The Schrödinger–Langevin equation (SLE) is considered as an effective open quantum system formalism suitable for phenomenological applications involving a quantum subsystem interacting with a thermal bath. We focus on two open issues relative to its solutions: the stationarity of the excited states of the non-interacting subsystem when one considers the dissipation only and the thermal relaxation toward asymptotic distributions with the additional stochastic term. We first show that a proper application of the Madelung/polar transformation of the wave function leads to a non zero damping of the excited states of the quantum subsystem. We then study analytically and numerically themore » SLE ability to bring a quantum subsystem to the thermal equilibrium of statistical mechanics. To do so, concepts about statistical mixed states and quantum noises are discussed and a detailed analysis is carried with two kinds of noise and potential. We show that within our assumptions the use of the SLE as an effective open quantum system formalism is possible and discuss some of its limitations.« less
Random dopant fluctuations and statistical variability in n-channel junctionless FETs
NASA Astrophysics Data System (ADS)
Akhavan, N. D.; Umana-Membreno, G. A.; Gu, R.; Antoszewski, J.; Faraone, L.
2018-01-01
The influence of random dopant fluctuations on the statistical variability of the electrical characteristics of n-channel silicon junctionless nanowire transistor (JNT) has been studied using three dimensional quantum simulations based on the non-equilibrium Green’s function (NEGF) formalism. Average randomly distributed body doping densities of 2 × 1019, 6 × 1019 and 1 × 1020 cm-3 have been considered employing an atomistic model for JNTs with gate lengths of 5, 10 and 15 nm. We demonstrate that by properly adjusting the doping density in the JNT, a near ideal statistical variability and electrical performance can be achieved, which can pave the way for the continuation of scaling in silicon CMOS technology.
Using the Flipped Classroom to Bridge the Gap to Generation Y
Gillispie, Veronica
2016-01-01
Background: The flipped classroom is a student-centered approach to learning that increases active learning for the student compared to traditional classroom-based instruction. In the flipped classroom model, students are first exposed to the learning material through didactics outside of the classroom, usually in the form of written material, voice-over lectures, or videos. During the formal teaching time, an instructor facilitates student-driven discussion of the material via case scenarios, allowing for complex problem solving, peer interaction, and a deep understanding of the concepts. A successful flipped classroom should have three goals: (1) allow the students to become critical thinkers, (2) fully engage students and instructors, and (3) stimulate the development of a deep understanding of the material. The flipped classroom model includes teaching and learning methods that can appeal to all four generations in the academic environment. Methods: During the 2015 academic year, we implemented the flipped classroom in the obstetrics and gynecology clerkship for the Ochsner Clinical School in New Orleans, LA. Voice-over presentations of the lectures that had been given to students in prior years were recorded and made available to the students through an online classroom. Weekly problem-based learning sessions matched to the subjects of the traditional lectures were held, and the faculty who had previously presented the information in the traditional lecture format facilitated the problem-based learning sessions. The knowledge base of students was evaluated at the end of the rotation via a multiple-choice question examination and the Objective Structured Clinical Examination (OSCE) as had been done in previous years. We compared demographic information and examination scores for traditional teaching and flipped classroom groups of students. The traditional teaching group consisted of students from Rotation 2 and Rotation 3 of the 2014 academic year who received traditional classroom-based instruction. The flipped classroom group consisted of students from Rotation 2 and Rotation 3 of the 2015 academic year who received formal didactics via voice-over presentation and had the weekly problem-based learning sessions. Results: When comparing the students taught by traditional methods to those taught in the flipped classroom model, we saw a statistically significant increase in test scores on the multiple-choice question examination in both the obstetrics and gynecology sections in Rotation 2. While the average score for the flipped classroom group increased in Rotation 3 on the obstetrics section of the multiple-choice question examination, the difference was not statistically significant. Unexpectedly, the average score on the gynecology portion of the multiple-choice question examination decreased among the flipped classroom group compared to the traditional teaching group, and this decrease was statistically significant. For both the obstetrics and the gynecology portions of the OSCE, we saw statistically significant increases in the scores for the flipped classroom group in both Rotation 2 and Rotation 3 compared to the traditional teaching group. With the exception of the gynecology portion of the multiple-choice question examination in Rotation 3, we saw improvement in scores after the implementation of the flipped classroom. Conclusion: The flipped classroom is a feasible and useful alternative to the traditional classroom. It is a method that embraces Generation Y's need for active learning in a group setting while maintaining a traditional classroom method for introducing the information. Active learning increases student engagement and can lead to improved retention of material as demonstrated on standard examinations. PMID:27046401
Le Vu, Stéphane; Ratmann, Oliver; Delpech, Valerie; Brown, Alison E; Gill, O Noel; Tostevin, Anna; Fraser, Christophe; Volz, Erik M
2018-06-01
Phylogenetic clustering of HIV sequences from a random sample of patients can reveal epidemiological transmission patterns, but interpretation is hampered by limited theoretical support and statistical properties of clustering analysis remain poorly understood. Alternatively, source attribution methods allow fitting of HIV transmission models and thereby quantify aspects of disease transmission. A simulation study was conducted to assess error rates of clustering methods for detecting transmission risk factors. We modeled HIV epidemics among men having sex with men and generated phylogenies comparable to those that can be obtained from HIV surveillance data in the UK. Clustering and source attribution approaches were applied to evaluate their ability to identify patient attributes as transmission risk factors. We find that commonly used methods show a misleading association between cluster size or odds of clustering and covariates that are correlated with time since infection, regardless of their influence on transmission. Clustering methods usually have higher error rates and lower sensitivity than source attribution method for identifying transmission risk factors. But neither methods provide robust estimates of transmission risk ratios. Source attribution method can alleviate drawbacks from phylogenetic clustering but formal population genetic modeling may be required to estimate quantitative transmission risk factors. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
Refinement for fault-tolerance: An aircraft hand-off protocol
NASA Technical Reports Server (NTRS)
Marzullo, Keith; Schneider, Fred B.; Dehn, Jon
1994-01-01
Part of the Advanced Automation System (AAS) for air-traffic control is a protocol to permit flight hand-off from one air-traffic controller to another. The protocol must be fault-tolerant and, therefore, is subtle -- an ideal candidate for the application of formal methods. This paper describes a formal method for deriving fault-tolerant protocols that is based on refinement and proof outlines. The AAS hand-off protocol was actually derived using this method; that derivation is given.
1992-06-01
system capabilities \\Jch as memory management and network communications are provided by a virtual machine-type operating environment. Various human ...thinking. The elements of this substrate include representational formality, genericity, a method of formal analysis, and augmentation of human analytical...the form of identifying: the data entity itself; its aliases (including how the data is presented th programs or human users in the form of copy
Néri, Eugenie Desirèe Rabelo; Meira, Assuero Silva; Vasconcelos, Hemerson Bruno da Silva; Woods, David John; Fonteles, Marta Maria de França
2017-01-01
This study aimed to identify the knowledge, skills and attitudes of Brazilian hospital pharmacists in the use of information technology and electronic tools to support clinical practice. A questionnaire was sent by email to clinical pharmacists working public and private hospitals in Brazil. The instrument was validated using the method of Polit and Beck to determine the content validity index. Data (n = 348) were analyzed using descriptive statistics, Pearson's Chi-square test and Gamma correlation tests. Pharmacists had 1-4 electronic devices for personal use, mainly smartphones (84.8%; n = 295) and laptops (81.6%; n = 284). At work, pharmacists had access to a computer (89.4%; n = 311), mostly connected to the internet (83.9%; n = 292). They felt competent (very capable/capable) searching for a web page/web site on a specific subject (100%; n = 348), downloading files (99.7%; n = 347), using spreadsheets (90.2%; n = 314), searching using MeSH terms in PubMed (97.4%; n = 339) and general searching for articles in bibliographic databases (such as Medline/PubMed: 93.4%; n = 325). Pharmacists did not feel competent in using statistical analysis software (somewhat capable/incapable: 78.4%; n = 273). Most pharmacists reported that they had not received formal education to perform most of these actions except searching using MeSH terms. Access to bibliographic databases was available in Brazilian hospitals, however, most pharmacists (78.7%; n = 274) reported daily use of a non-specific search engine such as Google. This result may reflect the lack of formal knowledge and training in the use of bibliographic databases and difficulty with the English language. The need to expand knowledge about information search tools was recognized by most pharmacists in clinical practice in Brazil, especially those with less time dedicated exclusively to clinical activity (Chi-square, p = 0.006). These results will assist in defining minimal competencies for the training of pharmacists in the field of information technology to support clinical practice. Knowledge and skill gaps are evident in the use of bibliographic databases, spreadsheets and statistical tools.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nunes, Rafael C.; Abreu, Everton M.C.; Neto, Jorge Ananias
Based on the relationship between thermodynamics and gravity we propose, with the aid of Verlinde's formalism, an alternative interpretation of the dynamical evolution of the Friedmann-Robertson-Walker Universe. This description takes into account the entropy and temperature intrinsic to the horizon of the universe due to the information holographically stored there through non-gaussian statistical theories proposed by Tsallis and Kaniadakis. The effect of these non-gaussian statistics in the cosmological context is to change the strength of the gravitational constant. In this paper, we consider the w CDM model modified by the non-gaussian statistics and investigate the compatibility of these non-gaussian modificationmore » with the cosmological observations. In order to analyze in which extend the cosmological data constrain these non-extensive statistics, we will use type Ia supernovae, baryon acoustic oscillations, Hubble expansion rate function and the linear growth of matter density perturbations data. We show that Tsallis' statistics is favored at 1σ confidence level.« less
Data-based Non-Markovian Model Inference
NASA Astrophysics Data System (ADS)
Ghil, Michael
2015-04-01
This talk concentrates on obtaining stable and efficient data-based models for simulation and prediction in the geosciences and life sciences. The proposed model derivation relies on using a multivariate time series of partial observations from a large-dimensional system, and the resulting low-order models are compared with the optimal closures predicted by the non-Markovian Mori-Zwanzig formalism of statistical physics. Multilayer stochastic models (MSMs) are introduced as both a very broad generalization and a time-continuous limit of existing multilevel, regression-based approaches to data-based closure, in particular of empirical model reduction (EMR). We show that the multilayer structure of MSMs can provide a natural Markov approximation to the generalized Langevin equation (GLE) of the Mori-Zwanzig formalism. A simple correlation-based stopping criterion for an EMR-MSM model is derived to assess how well it approximates the GLE solution. Sufficient conditions are given for the nonlinear cross-interactions between the constitutive layers of a given MSM to guarantee the existence of a global random attractor. This existence ensures that no blow-up can occur for a very broad class of MSM applications. The EMR-MSM methodology is first applied to a conceptual, nonlinear, stochastic climate model of coupled slow and fast variables, in which only slow variables are observed. The resulting reduced model with energy-conserving nonlinearities captures the main statistical features of the slow variables, even when there is no formal scale separation and the fast variables are quite energetic. Second, an MSM is shown to successfully reproduce the statistics of a partially observed, generalized Lokta-Volterra model of population dynamics in its chaotic regime. The positivity constraint on the solutions' components replaces here the quadratic-energy-preserving constraint of fluid-flow problems and it successfully prevents blow-up. This work is based on a close collaboration with M.D. Chekroun, D. Kondrashov, S. Kravtsov and A.W. Robertson.
Uncertainty and inference in the world of paleoecological data
NASA Astrophysics Data System (ADS)
McLachlan, J. S.; Dawson, A.; Dietze, M.; Finley, M.; Hooten, M.; Itter, M.; Jackson, S. T.; Marlon, J. R.; Raiho, A.; Tipton, J.; Williams, J.
2017-12-01
Proxy data in paleoecology and paleoclimatology share a common set of biases and uncertainties: spatiotemporal error associated with the taphonomic processes of deposition, preservation, and dating; calibration error between proxy data and the ecosystem states of interest; and error in the interpolation of calibrated estimates across space and time. Researchers often account for this daunting suite of challenges by applying qualitave expert judgment: inferring the past states of ecosystems and assessing the level of uncertainty in those states subjectively. The effectiveness of this approach can be seen by the extent to which future observations confirm previous assertions. Hierarchical Bayesian (HB) statistical approaches allow an alternative approach to accounting for multiple uncertainties in paleo data. HB estimates of ecosystem state formally account for each of the common uncertainties listed above. HB approaches can readily incorporate additional data, and data of different types into estimates of ecosystem state. And HB estimates of ecosystem state, with associated uncertainty, can be used to constrain forecasts of ecosystem dynamics based on mechanistic ecosystem models using data assimilation. Decisions about how to structure an HB model are also subjective, which creates a parallel framework for deciding how to interpret data from the deep past.Our group, the Paleoecological Observatory Network (PalEON), has applied hierarchical Bayesian statistics to formally account for uncertainties in proxy based estimates of past climate, fire, primary productivity, biomass, and vegetation composition. Our estimates often reveal new patterns of past ecosystem change, which is an unambiguously good thing, but we also often estimate a level of uncertainty that is uncomfortably high for many researchers. High levels of uncertainty are due to several features of the HB approach: spatiotemporal smoothing, the formal aggregation of multiple types of uncertainty, and a coarseness in statistical models of taphonomic process. Each of these features provides useful opportunities for statisticians and data-generating researchers to assess what we know about the signal and the noise in paleo data and to improve inference about past changes in ecosystem state.
NASA Astrophysics Data System (ADS)
Bandoro, Justin; Solomon, Susan; Santer, Benjamin D.; Kinnison, Douglas E.; Mills, Michael J.
2018-01-01
We perform a formal attribution study of upper- and lower-stratospheric ozone changes using observations together with simulations from the Whole Atmosphere Community Climate Model. Historical model simulations were used to estimate the zonal-mean response patterns (fingerprints
) to combined forcing by ozone-depleting substances (ODSs) and well-mixed greenhouse gases (GHGs), as well as to the individual forcing by each factor. Trends in the similarity between the searched-for fingerprints and homogenized observations of stratospheric ozone were compared to trends in pattern similarity between the fingerprints and the internally and naturally generated variability inferred from long control runs. This yields estimated signal-to-noise (S/N) ratios for each of the three fingerprints (ODS, GHG, and ODS + GHG). In both the upper stratosphere (defined in this paper as 1 to 10 hPa) and lower stratosphere (40 to 100 hPa), the spatial fingerprints of the ODS + GHG and ODS-only patterns were consistently detectable not only during the era of maximum ozone depletion but also throughout the observational record (1984-2016). We also develop a fingerprint attribution method to account for forcings whose time evolutions are markedly nonlinear over the observational record. When the nonlinearity of the time evolution of the ODS and ODS + GHG signals is accounted for, we find that the S/N ratios obtained with the stratospheric ODS and ODS + GHG fingerprints are enhanced relative to standard linear trend analysis. Use of the nonlinear signal detection method also reduces the detection time - the estimate of the date at which ODS and GHG impacts on ozone can be formally identified. Furthermore, by explicitly considering nonlinear signal evolution, the complete observational record can be used in the S/N analysis, without applying piecewise linear regression and introducing arbitrary break points. The GHG-driven fingerprint of ozone changes was not statistically identifiable in either the upper- or lower-stratospheric SWOOSH data, irrespective of the signal detection method used. In the WACCM simulations of future climate change, the GHG signal is statistically identifiable between 2020 and 2030. Our findings demonstrate the importance of continued stratospheric ozone monitoring to improve estimates of the contributions of ODS and GHG forcing to global changes in stratospheric ozone.
NASA Astrophysics Data System (ADS)
Chen, Youlin; Xie, Jiakang
2017-07-01
We address two fundamental issues that pertain to Q tomography using high-frequency regional waves, particularly the Lg wave. The first issue is that Q tomography uses complex 'reduced amplitude data' as input. These data are generated by taking the logarithm of the product of (1) the observed amplitudes and (2) the simplified 1D geometrical spreading correction. They are thereby subject to 'modeling errors' that are dominated by uncompensated 3D structural effects; however, no knowledge of the statistical behaviour of these errors exists to justify the widely used least-squares methods for solving Q tomography. The second issue is that Q tomography has been solved using various iterative methods such as LSQR (Least-Squares QR, where QR refers to a QR factorization of a matrix into the product of an orthogonal matrix Q and an upper triangular matrix R) and SIRT (Simultaneous Iterative Reconstruction Technique) that do not allow for the quantitative estimation of model resolution and error. In this study, we conduct the first rigorous analysis of the statistics of the reduced amplitude data and find that the data error distribution is predominantly normal, but with long-tailed outliers. This distribution is similar to that of teleseismic traveltime residuals. We develop a screening procedure to remove outliers so that data closely follow a normal distribution. Next, we develop an efficient tomographic method based on the PROPACK software package to perform singular value decomposition on a data kernel matrix, which enables us to solve for the inverse, model resolution and covariance matrices along with the optimal Q model. These matrices permit for various quantitative model appraisals, including the evaluation of the formal resolution and error. Further, they allow formal uncertainty estimates of predicted data (Q) along future paths to be made at any specified confidence level. This new capability significantly benefits the practical missions of source identification and source size estimation, for which reliable uncertainty estimates are especially important. We apply the new methodologies to data from southeastern China to obtain a 1 Hz Lg Q model, which exhibits patterns consistent with what is known about the geology and tectonics of the region. We also solve for the site response model.
A 3D generic inverse dynamic method using wrench notation and quaternion algebra.
Dumas, R; Aissaoui, R; de Guise, J A
2004-06-01
In the literature, conventional 3D inverse dynamic models are limited in three aspects related to inverse dynamic notation, body segment parameters and kinematic formalism. First, conventional notation yields separate computations of the forces and moments with successive coordinate system transformations. Secondly, the way conventional body segment parameters are defined is based on the assumption that the inertia tensor is principal and the centre of mass is located between the proximal and distal ends. Thirdly, the conventional kinematic formalism uses Euler or Cardanic angles that are sequence-dependent and suffer from singularities. In order to overcome these limitations, this paper presents a new generic method for inverse dynamics. This generic method is based on wrench notation for inverse dynamics, a general definition of body segment parameters and quaternion algebra for the kinematic formalism.
Transformation of general binary MRF minimization to the first-order case.
Ishikawa, Hiroshi
2011-06-01
We introduce a transformation of general higher-order Markov random field with binary labels into a first-order one that has the same minima as the original. Moreover, we formalize a framework for approximately minimizing higher-order multi-label MRF energies that combines the new reduction with the fusion-move and QPBO algorithms. While many computer vision problems today are formulated as energy minimization problems, they have mostly been limited to using first-order energies, which consist of unary and pairwise clique potentials, with a few exceptions that consider triples. This is because of the lack of efficient algorithms to optimize energies with higher-order interactions. Our algorithm challenges this restriction that limits the representational power of the models so that higher-order energies can be used to capture the rich statistics of natural scenes. We also show that some minimization methods can be considered special cases of the present framework, as well as comparing the new method experimentally with other such techniques.
Back to Normal! Gaussianizing posterior distributions for cosmological probes
NASA Astrophysics Data System (ADS)
Schuhmann, Robert L.; Joachimi, Benjamin; Peiris, Hiranya V.
2014-05-01
We present a method to map multivariate non-Gaussian posterior probability densities into Gaussian ones via nonlinear Box-Cox transformations, and generalizations thereof. This is analogous to the search for normal parameters in the CMB, but can in principle be applied to any probability density that is continuous and unimodal. The search for the optimally Gaussianizing transformation amongst the Box-Cox family is performed via a maximum likelihood formalism. We can judge the quality of the found transformation a posteriori: qualitatively via statistical tests of Gaussianity, and more illustratively by how well it reproduces the credible regions. The method permits an analytical reconstruction of the posterior from a sample, e.g. a Markov chain, and simplifies the subsequent joint analysis with other experiments. Furthermore, it permits the characterization of a non-Gaussian posterior in a compact and efficient way. The expression for the non-Gaussian posterior can be employed to find analytic formulae for the Bayesian evidence, and consequently be used for model comparison.
Modeling, simulation, and estimation of optical turbulence
NASA Astrophysics Data System (ADS)
Formwalt, Byron Paul
This dissertation documents three new contributions to simulation and modeling of optical turbulence. The first contribution is the formalization, optimization, and validation of a modeling technique called successively conditioned rendering (SCR). The SCR technique is empirically validated by comparing the statistical error of random phase screens generated with the technique. The second contribution is the derivation of the covariance delineation theorem, which provides theoretical bounds on the error associated with SCR. It is shown empirically that the theoretical bound may be used to predict relative algorithm performance. Therefore, the covariance delineation theorem is a powerful tool for optimizing SCR algorithms. For the third contribution, we introduce a new method for passively estimating optical turbulence parameters, and demonstrate the method using experimental data. The technique was demonstrated experimentally, using a 100 m horizontal path at 1.25 m above sun-heated tarmac on a clear afternoon. For this experiment, we estimated C2n ≈ 6.01 · 10-9 m-23 , l0 ≈ 17.9 mm, and L0 ≈ 15.5 m.
Formalizing Evaluation Procedures for Marketing Faculty Research Performance.
ERIC Educational Resources Information Center
McDermott, Dennis R.; And Others
1994-01-01
Results of a national survey of marketing department heads (n=142) indicate that few marketing departments have formalized the development and communication of research performance standards to faculty. Guidelines and methods to accomplish those procedures most efficiently were proposed. (Author/JOW)
NASA software specification and evaluation system design, part 1
NASA Technical Reports Server (NTRS)
1976-01-01
The research to develop methods for reducing the effort expended in software and verification is reported. The development of a formal software requirements methodology, a formal specifications language, a programming language, a language preprocessor, and code analysis tools are discussed.
Applying dynamic Bayesian networks to perturbed gene expression data.
Dojer, Norbert; Gambin, Anna; Mizera, Andrzej; Wilczyński, Bartek; Tiuryn, Jerzy
2006-05-08
A central goal of molecular biology is to understand the regulatory mechanisms of gene transcription and protein synthesis. Because of their solid basis in statistics, allowing to deal with the stochastic aspects of gene expressions and noisy measurements in a natural way, Bayesian networks appear attractive in the field of inferring gene interactions structure from microarray experiments data. However, the basic formalism has some disadvantages, e.g. it is sometimes hard to distinguish between the origin and the target of an interaction. Two kinds of microarray experiments yield data particularly rich in information regarding the direction of interactions: time series and perturbation experiments. In order to correctly handle them, the basic formalism must be modified. For example, dynamic Bayesian networks (DBN) apply to time series microarray data. To our knowledge the DBN technique has not been applied in the context of perturbation experiments. We extend the framework of dynamic Bayesian networks in order to incorporate perturbations. Moreover, an exact algorithm for inferring an optimal network is proposed and a discretization method specialized for time series data from perturbation experiments is introduced. We apply our procedure to realistic simulations data. The results are compared with those obtained by standard DBN learning techniques. Moreover, the advantages of using exact learning algorithm instead of heuristic methods are analyzed. We show that the quality of inferred networks dramatically improves when using data from perturbation experiments. We also conclude that the exact algorithm should be used when it is possible, i.e. when considered set of genes is small enough.
ERIC Educational Resources Information Center
Miloševic Zupancic, Vesna
2018-01-01
Research from the field of non-formal education (NFE) in youth work emphasises the central role of experiential learning and learning in groups. The present paper aims to research teaching methods and teaching forms in NFE in youth work. The research sought to answer the following research questions: 'What teaching forms can be found in NFE for…
Products of composite operators in the exact renormalization group formalism
NASA Astrophysics Data System (ADS)
Pagani, C.; Sonoda, H.
2018-02-01
We discuss a general method of constructing the products of composite operators using the exact renormalization group formalism. Considering mainly the Wilson action at a generic fixed point of the renormalization group, we give an argument for the validity of short-distance expansions of operator products. We show how to compute the expansion coefficients by solving differential equations, and test our method with some simple examples.
Fourth NASA Langley Formal Methods Workshop
NASA Technical Reports Server (NTRS)
Holloway, C. Michael (Compiler); Hayhurst, Kelly J. (Compiler)
1997-01-01
This publication consists of papers presented at NASA Langley Research Center's fourth workshop on the application of formal methods to the design and verification of life-critical systems. Topic considered include: Proving properties of accident; modeling and validating SAFER in VDM-SL; requirement analysis of real-time control systems using PVS; a tabular language for system design; automated deductive verification of parallel systems. Also included is a fundamental hardware design in PVS.
The equivalence of Darmois-Israel and distributional method for thin shells in general relativity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mansouri, R.; Khorrami, M.
1996-11-01
A distributional method to solve the Einstein{close_quote}s field equations for thin shells is formulated. The familiar field equations and jump conditions of Darmois-Israel formalism are derived. A careful analysis of the Bianchi identities shows that, for cases under consideration, they make sense as distributions and lead to jump conditions of Darmois-Israel formalism. {copyright} {ital 1996 American Institute of Physics.}
Discrete mathematics, formal methods, the Z schema and the software life cycle
NASA Technical Reports Server (NTRS)
Bown, Rodney L.
1991-01-01
The proper role and scope for the use of discrete mathematics and formal methods in support of engineering the security and integrity of components within deployed computer systems are discussed. It is proposed that the Z schema can be used as the specification language to capture the precise definition of system and component interfaces. This can be accomplished with an object oriented development paradigm.
Vested Madsen, Matias; Macario, Alex; Yamamoto, Satoshi; Tanaka, Pedro
2016-06-01
In this study, we examined the regularly scheduled, formal teaching sessions in a single anesthesiology residency program to (1) map the most common primary instructional methods, (2) map the use of 10 known teaching techniques, and (3) assess if residents scored sessions that incorporated active learning as higher quality than sessions with little or no verbal interaction between teacher and learner. A modified Delphi process was used to identify useful teaching techniques. A representative sample of each of the formal teaching session types was mapped, and residents anonymously completed a 5-question written survey rating the session. The most common primary instructional methods were computer slides-based classroom lectures (66%), workshops (15%), simulations (5%), and journal club (5%). The number of teaching techniques used per formal teaching session averaged 5.31 (SD, 1.92; median, 5; range, 0-9). Clinical applicability (85%) and attention grabbers (85%) were the 2 most common teaching techniques. Thirty-eight percent of the sessions defined learning objectives, and one-third of sessions engaged in active learning. The overall survey response rate equaled 42%, and passive sessions had a mean score of 8.44 (range, 5-10; median, 9; SD, 1.2) compared with a mean score of 8.63 (range, 5-10; median, 9; SD, 1.1) for active sessions (P = 0.63). Slides-based classroom lectures were the most common instructional method, and faculty used an average of 5 known teaching techniques per formal teaching session. The overall education scores of the sessions as rated by the residents were high.
A formal protocol test procedure for the Survivable Adaptable Fiber Optic Embedded Network (SAFENET)
NASA Astrophysics Data System (ADS)
High, Wayne
1993-03-01
This thesis focuses upon a new method for verifying the correct operation of a complex, high speed fiber optic communication network. These networks are of growing importance to the military because of their increased connectivity, survivability, and reconfigurability. With the introduction and increased dependence on sophisticated software and protocols, it is essential that their operation be correct. Because of the speed and complexity of fiber optic networks being designed today, they are becoming increasingly difficult to test. Previously, testing was accomplished by application of conformance test methods which had little connection with an implementation's specification. The major goal of conformance testing is to ensure that the implementation of a profile is consistent with its specification. Formal specification is needed to ensure that the implementation performs its intended operations while exhibiting desirable behaviors. The new conformance test method presented is based upon the System of Communicating Machine model which uses a formal protocol specification to generate a test sequence. The major contribution of this thesis is the application of the System of Communicating Machine model to formal profile specifications of the Survivable Adaptable Fiber Optic Embedded Network (SAFENET) standard which results in the derivation of test sequences for a SAFENET profile. The results applying this new method to SAFENET's OSI and Lightweight profiles are presented.
NASA Astrophysics Data System (ADS)
Uhlemann, C.; Pajer, E.; Pichon, C.; Nishimichi, T.; Codis, S.; Bernardeau, F.
2018-03-01
Non-Gaussianities of dynamical origin are disentangled from primordial ones using the formalism of large deviation statistics with spherical collapse dynamics. This is achieved by relying on accurate analytical predictions for the one-point probability distribution function and the two-point clustering of spherically averaged cosmic densities (sphere bias). Sphere bias extends the idea of halo bias to intermediate density environments and voids as underdense regions. In the presence of primordial non-Gaussianity, sphere bias displays a strong scale dependence relevant for both high- and low-density regions, which is predicted analytically. The statistics of densities in spheres are built to model primordial non-Gaussianity via an initial skewness with a scale dependence that depends on the bispectrum of the underlying model. The analytical formulas with the measured non-linear dark matter variance as input are successfully tested against numerical simulations. For local non-Gaussianity with a range from fNL = -100 to +100, they are found to agree within 2 per cent or better for densities ρ ∈ [0.5, 3] in spheres of radius 15 Mpc h-1 down to z = 0.35. The validity of the large deviation statistics formalism is thereby established for all observationally relevant local-type departures from perfectly Gaussian initial conditions. The corresponding estimators for the amplitude of the non-linear variance σ8 and primordial skewness fNL are validated using a fiducial joint maximum likelihood experiment. The influence of observational effects and the prospects for a future detection of primordial non-Gaussianity from joint one- and two-point densities-in-spheres statistics are discussed.
Statistical mechanics of shell models for two-dimensional turbulence
NASA Astrophysics Data System (ADS)
Aurell, E.; Boffetta, G.; Crisanti, A.; Frick, P.; Paladin, G.; Vulpiani, A.
1994-12-01
We study shell models that conserve the analogs of energy and enstrophy and hence are designed to mimic fluid turbulence in two-dimensions (2D). The main result is that the observed state is well described as a formal statistical equilibrium, closely analogous to the approach to two-dimensional ideal hydrodynamics of Onsager [Nuovo Cimento Suppl. 6, 279 (1949)], Hopf [J. Rat. Mech. Anal. 1, 87 (1952)], and Lee [Q. Appl. Math. 10, 69 (1952)]. In the presence of forcing and dissipation we observe a forward flux of enstrophy and a backward flux of energy. These fluxes can be understood as mean diffusive drifts from a source to two sinks in a system which is close to local equilibrium with Lagrange multipliers (``shell temperatures'') changing slowly with scale. This is clear evidence that the simplest shell models are not adequate to reproduce the main features of two-dimensional turbulence. The dimensional predictions on the power spectra from a supposed forward cascade of enstrophy and from one branch of the formal statistical equilibrium coincide in these shell models in contrast to the corresponding predictions for the Navier-Stokes and Euler equations in 2D. This coincidence has previously led to the mistaken conclusion that shell models exhibit a forward cascade of enstrophy. We also study the dynamical properties of the models and the growth of perturbations.
Acquisition of Formal Operations: The Effects of Two Training Procedures.
ERIC Educational Resources Information Center
Rosenthal, Doreen A.
1979-01-01
A study of 11- and 12-year-old girls indicates that either of two training procedures, method training or dimension training, can aid in the transition from concrete operational to formal operational thought by promoting a hypothesis-testing attitude. (BH)
Towards Formal Implementation of PUS Standard
NASA Astrophysics Data System (ADS)
Ilić, D.
2009-05-01
As an effort to promote the reuse of on-board and ground systems ESA developed a standard for packet telemetry and telecommand - PUS. It defines a set of standard service models with the corresponding structures of the associated telemetry and telecommand packets. Various missions then can choose to implement those standard PUS services that best conform to their specific requirements. In this paper we propose a formal development (based on the Event-B method) of reusable service patterns, which can be instantiated for concrete application. Our formal models allow us to formally express and verify specific service properties including various telecommand and telemetry packet structure validation.
An Introduction to Distributions Using Weighted Dice
ERIC Educational Resources Information Center
Holland, Bart K.
2011-01-01
Distributions are the basis for an enormous amount of theoretical and applied work in statistics. While there are formal definitions of distributions and many formulas to characterize them, it is important that students at first get a clear introduction to this basic concept. For many of them, neither words nor formulas can match the power of a…
Uncertainty in eddy covariance measurements and its application to physiological models
D.Y. Hollinger; A.D. Richardson; A.D. Richardson
2005-01-01
Flux data are noisy, and this uncertainty is largely due to random measurement error. Knowledge of uncertainty is essential for the statistical evaluation of modeled andmeasured fluxes, for comparison of parameters derived by fitting models to measured fluxes and in formal data-assimilation efforts. We used the difference between simultaneous measurements from two...
How Can We Enhance Enjoyment of Secondary School? The Student View
ERIC Educational Resources Information Center
Gorard, Stephen; See, Beng Huat
2011-01-01
This paper considers enjoyment of formal education for young people aged 14 to 16, largely from their own perspective, based on the view of around 3000 students in England. The data include documentary analysis, official statistics, interviews and surveys with staff and students. Enjoyment of school tends to be promoted by factors such as…
Empirical and Genealogical Analysis of Non-Vocational Adult Education in Europe
ERIC Educational Resources Information Center
Manninen, Jyri
2017-01-01
Non-formal, non-vocational adult education (NFNVAE) is a low-cost, low-threshold learning activity that generates many benefits for individuals and society, and it should play a more central role in educational policy. NFNVAE's challenge is that it lacks clear concepts and definitions and is, therefore, less systematically covered in statistics,…
Reliability Considerations for the Operation of Large Accelerator User Facilities
Willeke, F. J.
2016-01-29
The lecture provides an overview of considerations relevant for achieving highly reliable operation of accelerator based user facilities. The article starts with an overview of statistical reliability formalism which is followed by high reliability design considerations with examples. Finally, the article closes with operational aspects of high reliability such as preventive maintenance and spares inventory.
ERIC Educational Resources Information Center
Santillan, M.; Zeron, E. S.; Del Rio-Correa, J. L.
2008-01-01
In the traditional statistical mechanics textbooks, the entropy concept is first introduced for the microcanonical ensemble and then extended to the canonical and grand-canonical cases. However, in the authors' experience, this procedure makes it difficult for the student to see the bigger picture and, although quite ingenuous, the subtleness of…
ERIC Educational Resources Information Center
Aderibigbe, Semiyu Adejare; Ajasa, Folorunso Adekemi
2013-01-01
Purpose: The purpose of this paper is to explore the perceptions of college tutors on peer coaching as a tool for professional development to determine its formal institutionalisation. Design/methodology/approach: A survey questionnaire was used for data collection, while analysis of data was done using descriptive statistics. Findings: The…
Open Educational Resources: A Faculty Author's Perspective
ERIC Educational Resources Information Center
Illowsky, Barbara
2012-01-01
As the coauthor (with Susan Dean) of a formally for-profit and now open (i.e., free on the web) textbook, "Collaborative Statistics," this author has received many questions about open educational resources (OER), which can be summarized as follows: (1) What are OER?; (2) Why do you support, actively promote, and speak about OER?; (3) If a book is…
76 FR 30306 - New England Fishery Management Council; Public Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-25
...The New England Fishery Management Council (Council) is scheduling a public meeting of its Scientific and Statistical Committee on June 14-15, 2011 to consider actions affecting New England fisheries in the exclusive economic zone (EEZ). Recommendations from this group will be brought to the full Council for formal consideration and action, if appropriate.
76 FR 43266 - New England Fishery Management Council; Public Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-20
...The New England Fishery Management Council (Council) is scheduling a public meeting of its Scientific and Statistical Committee, on August 9-10, 2011, to consider actions affecting New England fisheries in the exclusive economic zone (EEZ). Recommendations from this group will be brought to the full Council for formal consideration and action, if appropriate.
Standing by Their Principles: Two Librarians Who Faced Challenges
ERIC Educational Resources Information Center
Adams, Helen; Leu, DaNae; Venuto, Dee Ann
2015-01-01
What do school librarians fear most? Hands down, their biggest fear is a formal challenge to a resource in the school library. There are no accurate statistics about the number of challenges to school library resources. The staff of ALA's Office for Intellectual Freedom estimates that only about 20 percent are reported to ALA annually. For the…