Non-equilibrium dog-flea model
NASA Astrophysics Data System (ADS)
Ackerson, Bruce J.
2017-11-01
We develop the open dog-flea model to serve as a check of proposed non-equilibrium theories of statistical mechanics. The model is developed in detail and then applied to four recent models for non-equilibrium statistical mechanics. Comparing the dog-flea solution with these different models allows their claims to be checked and provides a concrete example of each theoretical model.
Philosophy and the practice of Bayesian statistics
Gelman, Andrew; Shalizi, Cosma Rohilla
2015-01-01
A substantial school in the philosophy of science identifies Bayesian inference with inductive inference and even rationality as such, and seems to be strengthened by the rise and practical success of Bayesian statistics. We argue that the most successful forms of Bayesian statistics do not actually support that particular philosophy but rather accord much better with sophisticated forms of hypothetico-deductivism. We examine the actual role played by prior distributions in Bayesian models, and the crucial aspects of model checking and model revision, which fall outside the scope of Bayesian confirmation theory. We draw on the literature on the consistency of Bayesian updating and also on our experience of applied work in social science. Clarity about these matters should benefit not just philosophy of science, but also statistical practice. At best, the inductivist view has encouraged researchers to fit and compare models without checking them; at worst, theorists have actively discouraged practitioners from performing model checking because it does not fit into their framework. PMID:22364575
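The model checking the authors emphasize is commonly operationalized as a posterior predictive check: simulate replicate data sets from posterior draws and compare a discrepancy statistic with its observed value. A minimal Python sketch using a toy conjugate normal model (all data and numbers below are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
y = rng.normal(2.0, 1.0, size=50)           # "observed" data (simulated here)

# Conjugate posterior for the mean of a normal with known sigma=1 and a flat prior:
# mu | y ~ Normal(ybar, 1/n)
n, ybar = len(y), y.mean()
mu_draws = rng.normal(ybar, 1.0 / np.sqrt(n), size=4000)

def skew(x):
    """Discrepancy statistic: sample skewness."""
    return np.mean((x - x.mean()) ** 3) / x.std() ** 3

# Posterior predictive replicates and the posterior predictive p-value
t_obs = skew(y)
t_rep = np.array([skew(rng.normal(mu, 1.0, size=n)) for mu in mu_draws])
ppp = np.mean(t_rep >= t_obs)               # extreme values flag model misfit
print(f"observed skewness {t_obs:.3f}, posterior predictive p-value {ppp:.3f}")
```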
Diagnosis checking of statistical analysis in RCTs indexed in PubMed.
Lee, Paul H; Tse, Andy C Y
2017-11-01
Statistical analysis is essential for reporting the results of randomized controlled trials (RCTs), as well as for evaluating their effectiveness. However, the validity of a statistical analysis also depends on whether the assumptions of that analysis are valid. Our objective was to review all RCTs published in journals indexed in PubMed during December 2014 to provide a complete picture of how RCTs handle assumptions of statistical analysis. We reviewed all RCTs published in December 2014 that appeared in journals indexed in PubMed, using the Cochrane highly sensitive search strategy. The 2014 impact factors of the journals were used as proxies for their quality. The type of statistical analysis used and whether the assumptions of the analysis were tested were reviewed. In total, 451 papers were included. Of the 278 papers that reported a crude analysis for the primary outcomes, 31 (27·2%) reported whether the outcome was normally distributed. Of the 172 papers that reported an adjusted analysis for the primary outcomes, diagnosis checking was rarely conducted, with only 20%, 8·6% and 7% checked for the generalized linear model, Cox proportional hazard model and multilevel model, respectively. Study characteristics (study type, drug trial, funding sources, journal type and endorsement of CONSORT guidelines) were not associated with the reporting of diagnosis checking. Diagnosis checking of statistical analyses in RCTs published in PubMed-indexed journals was usually absent. Journals should provide guidelines about the reporting of diagnosis checking of assumptions. © 2017 Stichting European Society for Clinical Investigation Journal Foundation.
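The kind of assumption reporting the review looks for can be done in a few lines. A hedged sketch for a two-arm trial with a continuous primary outcome (the data are simulated stand-ins): report a normality check for each arm alongside the crude analysis.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
control = rng.normal(50, 10, size=60)    # hypothetical primary outcome, control arm
treated = rng.normal(46, 10, size=60)    # hypothetical primary outcome, treated arm

# Normality check for each arm, reported with the analysis
for name, arm in [("control", control), ("treated", treated)]:
    w, p = stats.shapiro(arm)
    print(f"{name}: Shapiro-Wilk W = {w:.3f}, p = {p:.3f}")

# Crude analysis: Welch two-sample t-test, which assumes approximate normality
t, p = stats.ttest_ind(treated, control, equal_var=False)
print(f"Welch t = {t:.2f}, p = {p:.4f}")
```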
Design of experiments enhanced statistical process control for wind tunnel check standard testing
NASA Astrophysics Data System (ADS)
Phillips, Ben D.
The current wind tunnel check standard testing program at NASA Langley Research Center is focused on increasing data quality, uncertainty quantification and overall control and improvement of wind tunnel measurement processes. The statistical process control (SPC) methodology employed in the check standard testing program allows for the tracking of variations in measurements over time as well as an overall assessment of facility health. While the SPC approach can and does provide researchers with valuable information, it has certain limitations in the areas of process improvement and uncertainty quantification. It is thought that, by utilizing design of experiments methodology in conjunction with the current SPC practices, one can efficiently and more robustly characterize uncertainties and develop enhanced process improvement procedures. In this research, methodologies were developed to generate regression models for wind tunnel calibration coefficients, balance force coefficients and wind tunnel flow angularities. The coefficients of these regression models were then tracked in statistical process control charts, giving a higher level of understanding of the processes. The methodology outlined is sufficiently generic that this research is applicable to any wind tunnel check standard testing program.
Hussain, Faraz; Jha, Sumit K; Jha, Susmit; Langmead, Christopher J
2014-01-01
Stochastic models are increasingly used to study the behaviour of biochemical systems. While the structure of such models is often readily available from first principles, unknown quantitative features of the model are incorporated into it as parameters. Algorithmic discovery of parameter values from experimentally observed facts remains a challenge for the computational systems biology community. We present a new parameter discovery algorithm that uses simulated annealing, sequential hypothesis testing, and statistical model checking to learn the parameters in a stochastic model. We apply our technique to a model of glucose and insulin metabolism used for in-silico validation of artificial pancreata and demonstrate its effectiveness by developing a parallel CUDA-based implementation for parameter synthesis in this model.
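The parameter-discovery loop can be sketched generically: propose parameters, estimate by Monte Carlo how well stochastic simulations reproduce an observed property (the statistical-model-checking step), and accept or reject proposals on a simulated-annealing schedule. The toy Poisson count model below is an assumption for illustration only, not the glucose-insulin model of the paper.

```python
import math
import numpy as np

rng = np.random.default_rng(2)

def simulate(rate, n_steps=200):
    """Toy stochastic model: Poisson counts with an unknown rate parameter."""
    return rng.poisson(rate, size=n_steps)

def property_prob(rate, n_runs=200):
    """Monte Carlo estimate of P(mean count stays below 5), a stand-in for an SMC query."""
    hits = sum(simulate(rate).mean() < 5.0 for _ in range(n_runs))
    return hits / n_runs

target = 0.80                                # observed/required probability of the property
theta = 10.0                                 # initial guess for the unknown parameter
cur_err = abs(property_prob(theta) - target)
best = (theta, cur_err)

for k in range(150):                         # simulated-annealing loop
    temp = 1.0 / (1 + k)
    proposal = abs(theta + rng.normal(0, 1.0))
    err = abs(property_prob(proposal) - target)
    if err < cur_err or rng.random() < math.exp(-(err - cur_err) / max(temp, 1e-6)):
        theta, cur_err = proposal, err
        if err < best[1]:
            best = (theta, err)

print(f"estimated parameter ~ {best[0]:.2f} (|error| = {best[1]:.3f})")
```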
Ernst, Anja F; Albers, Casper J
2017-01-01
Misconceptions about the assumptions behind the standard linear regression model are widespread and dangerous. These lead to using linear regression when inappropriate, and to employing alternative procedures with less statistical power when unnecessary. Our systematic literature review investigated the employment and reporting of assumption checks in twelve clinical psychology journals. Findings indicate that normality of the variables themselves, rather than of the errors, was wrongly held to be a necessary assumption in 4% of papers that use regression. Furthermore, 92% of all papers using linear regression were unclear about their assumption checks, violating APA recommendations. This paper appeals for a heightened awareness of, and increased transparency in, the reporting of statistical assumption checking.
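The misconception flagged here (checking normality of the variables instead of the errors) is easy to demonstrate: with a skewed predictor, the raw outcome looks non-normal even when the regression errors are exactly Gaussian. A small simulated sketch:

```python
import numpy as np
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(3)
x = rng.exponential(1.0, size=200)              # skewed predictor
y = 1.0 + 2.0 * x + rng.normal(0, 1, 200)       # errors are normal; y itself is skewed

fit = sm.OLS(y, sm.add_constant(x)).fit()

# The assumption concerns the residuals; the raw outcome may legitimately be non-normal.
for name, vals in [("raw outcome y", y), ("OLS residuals", fit.resid)]:
    w, p = stats.shapiro(vals)
    print(f"Shapiro-Wilk p, {name}: {p:.4f}")
```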
A generalized statistical model for the size distribution of wealth
NASA Astrophysics Data System (ADS)
Clementi, F.; Gallegati, M.; Kaniadakis, G.
2012-12-01
In a recent paper in this journal (Clementi et al 2009 J. Stat. Mech. P02037), we proposed a new, physically motivated, distribution function for modeling individual incomes, having its roots in the framework of κ-generalized statistical mechanics. The performance of the κ-generalized distribution was checked against real data on personal income for the United States in 2003. In this paper we extend our previous model so as to be able to account for the distribution of wealth. Probabilistic functions and inequality measures of this generalized model for wealth distribution are obtained in closed form. In order to check the validity of the proposed model, we analyze US household wealth distributions from 1984 to 2009 and find an excellent agreement with the data that is superior to any other model already known in the literature.
Use of check lists in assessing the statistical content of medical studies.
Gardner, M J; Machin, D; Campbell, M J
1986-01-01
Two check lists are used routinely in the statistical assessment of manuscripts submitted to the "BMJ." One is for papers of a general nature and the other specifically for reports on clinical trials. Each check list includes questions on the design, conduct, analysis, and presentation of studies, and answers to these contribute to the overall statistical evaluation. Only a small proportion of submitted papers are assessed statistically, and these are selected at the refereeing or editorial stage. Examination of the use of the check lists showed that most papers contained statistical failings, many of which could easily be remedied. It is recommended that the check lists should be used by statistical referees, editorial staff, and authors and also during the design stage of studies. PMID:3082452
Approximate Model Checking of PCTL Involving Unbounded Path Properties
NASA Astrophysics Data System (ADS)
Basu, Samik; Ghosh, Arka P.; He, Ru
We study the problem of applying statistical methods for approximate model checking of probabilistic systems against properties encoded as Probabilistic Computation Tree Logic (PCTL) formulas.
NASA Astrophysics Data System (ADS)
Melendez, Jordan; Wesolowski, Sarah; Furnstahl, Dick
2017-09-01
Chiral effective field theory (EFT) predictions are necessarily truncated at some order in the EFT expansion, which induces an error that must be quantified for robust statistical comparisons to experiment. A Bayesian model yields posterior probability distribution functions for these errors based on expectations of naturalness encoded in Bayesian priors and the observed order-by-order convergence pattern of the EFT. As a general example of a statistical approach to truncation errors, the model was applied to chiral EFT for neutron-proton scattering using various semi-local potentials of Epelbaum, Krebs, and Meißner (EKM). Here we discuss how our model can learn correlation information from the data and how to perform Bayesian model checking to validate that the EFT is working as advertised. Supported in part by NSF PHY-1614460 and DOE NUCLEI SciDAC DE-SC0008533.
Teaching "Instant Experience" with Graphical Model Validation Techniques
ERIC Educational Resources Information Center
Ekstrøm, Claus Thorn
2014-01-01
Graphical model validation techniques for linear normal models are often used to check the assumptions underlying a statistical model. We describe an approach to provide "instant experience" in looking at a graphical model validation plot, so it becomes easier to validate if any of the underlying assumptions are violated.
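One way to provide such "instant experience" is a lineup plot: the observed residual plot is hidden among panels of residuals simulated under the fitted model, and the assumption is suspect only if the real panel stands out. A minimal sketch (simulated data; the layout and model are assumptions for illustration):

```python
import numpy as np
import matplotlib.pyplot as plt
import statsmodels.api as sm

rng = np.random.default_rng(4)
x = rng.uniform(0, 10, 100)
y = 1 + 0.5 * x + rng.normal(0, 1 + 0.2 * x)        # mildly heteroscedastic data

fit = sm.OLS(y, sm.add_constant(x)).fit()
sigma = np.sqrt(fit.scale)                          # estimated residual standard deviation

fig, axes = plt.subplots(3, 3, figsize=(9, 9), sharex=True, sharey=True)
true_panel = rng.integers(9)                        # where the real residuals are placed
for i, ax in enumerate(axes.ravel()):
    if i == true_panel:
        resid = fit.resid                           # observed residuals
    else:
        y_sim = fit.fittedvalues + rng.normal(0, sigma, len(x))   # null data from the model
        resid = sm.OLS(y_sim, sm.add_constant(x)).fit().resid
    ax.scatter(fit.fittedvalues, resid, s=8)
    ax.axhline(0, lw=0.8)
plt.tight_layout()
plt.show()
print("real residuals are in panel", true_panel)
```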
Model selection and assessment for multi-species occupancy models
Broms, Kristin M.; Hooten, Mevin B.; Fitzpatrick, Ryan M.
2016-01-01
While multi-species occupancy models (MSOMs) are emerging as a popular method for analyzing biodiversity data, formal checking and validation approaches for this class of models have lagged behind. Concurrent with the rise in application of MSOMs among ecologists, a quiet regime shift is occurring in Bayesian statistics where predictive model comparison approaches are experiencing a resurgence. Unlike single-species occupancy models that use integrated likelihoods, MSOMs are usually couched in a Bayesian framework and contain multiple levels. Standard model checking and selection methods are often unreliable in this setting and there is only limited guidance in the ecological literature for this class of models. We examined several different contemporary Bayesian hierarchical approaches for checking and validating MSOMs and applied these methods to a freshwater aquatic study system in Colorado, USA, to better understand the diversity and distributions of plains fishes. Our findings indicated distinct differences among model selection approaches, with cross-validation techniques performing the best in terms of prediction.
Statistical mechanics of broadcast channels using low-density parity-check codes.
Nakamura, Kazutaka; Kabashima, Yoshiyuki; Morelos-Zaragoza, Robert; Saad, David
2003-03-01
We investigate the use of Gallager's low-density parity-check (LDPC) codes in a degraded broadcast channel, one of the fundamental models in network information theory. Combining linear codes is a standard technique in practical network communication schemes and is known to provide better performance than simple time sharing methods when algebraic codes are used. The statistical physics based analysis shows that the practical performance of the suggested method, achieved by employing the belief propagation algorithm, is superior to that of LDPC based time sharing codes while the best performance, when received transmissions are optimally decoded, is bounded by the time sharing limit.
León, Larry F; Cai, Tianxi
2012-04-01
In this paper we develop model checking techniques for assessing functional form specifications of covariates in censored linear regression models. These procedures are based on a censored data analog to taking cumulative sums of "robust" residuals over the space of the covariate under investigation. These cumulative sums are formed by integrating certain Kaplan-Meier estimators and may be viewed as "robust" censored data analogs to the processes considered by Lin, Wei & Ying (2002). The null distributions of these stochastic processes can be approximated by the distributions of certain zero-mean Gaussian processes whose realizations can be generated by computer simulation. Each observed process can then be graphically compared with a few realizations from the Gaussian process. We also develop formal test statistics for numerical comparison. Such comparisons enable one to assess objectively whether an apparent trend seen in a residual plot reflects model misspecification or natural variation. We illustrate the methods with a well-known dataset. In addition, we examine the finite sample performance of the proposed test statistics in simulation experiments. In our simulation experiments, the proposed test statistics have good power of detecting misspecification while at the same time controlling the size of the test.
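The cumulative-sum idea can be sketched for the simpler uncensored case (the Kaplan-Meier integration of the paper is omitted): order residuals by the covariate, cumulate them, and generate null realizations by multiplying the residuals with independent standard normal weights.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 200
x = rng.uniform(0, 4, n)
y = 1 + np.sin(x) + rng.normal(0, 0.3, n)        # true covariate effect is nonlinear

fit = sm.OLS(y, sm.add_constant(x)).fit()        # deliberately misspecified linear fit
order = np.argsort(x)
r = fit.resid[order]

W_obs = np.cumsum(r) / np.sqrt(n)                # observed cumulative-residual process

# Null realizations: multiply residuals by iid N(0,1) weights (multiplier resampling)
W_null = np.array([np.cumsum(r * rng.normal(size=n)) / np.sqrt(n) for _ in range(1000)])

# Supremum-type statistic and its resampling p-value
s_obs = np.abs(W_obs).max()
p_value = np.mean(np.abs(W_null).max(axis=1) >= s_obs)
print(f"sup|W| = {s_obs:.2f}, resampling p-value = {p_value:.3f}")
```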
Validation of a heteroscedastic hazards regression model.
Wu, Hong-Dar Isaac; Hsieh, Fushing; Chen, Chen-Hsin
2002-03-01
A Cox-type regression model accommodating heteroscedasticity, with a power factor of the baseline cumulative hazard, is investigated for analyzing data with crossing hazards behavior. Since the approach of partial likelihood cannot eliminate the baseline hazard, an overidentified estimating equation (OEE) approach is introduced in the estimation procedure. Its by-product, a model checking statistic, is presented to test for the overall adequacy of the heteroscedastic model. Further, under the heteroscedastic model setting, we propose two statistics to test the proportional hazards assumption. Implementation of this model is illustrated in a data analysis of a cancer clinical trial.
Are Assumptions of Well-Known Statistical Techniques Checked, and Why (Not)?
Hoekstra, Rink; Kiers, Henk A. L.; Johnson, Addie
2012-01-01
A valid interpretation of most statistical techniques requires that one or more assumptions be met. In published articles, however, little information tends to be reported on whether the data satisfy the assumptions underlying the statistical techniques used. This could be due to self-selection: Only manuscripts with data fulfilling the assumptions are submitted. Another explanation could be that violations of assumptions are rarely checked for in the first place. We studied whether and how 30 researchers checked fictitious data for violations of assumptions in their own working environment. Participants were asked to analyze the data as they would their own data, for which often-used and well-known techniques such as the t-procedure, ANOVA and regression (or non-parametric alternatives) were required. It was found that the assumptions of the techniques were rarely checked, and that if they were, it was regularly by means of a statistical test. Interviews afterward revealed a general lack of knowledge about assumptions, the robustness of the techniques with regard to the assumptions, and how (or whether) assumptions should be checked. These data suggest that checking for violations of assumptions is not a well-considered choice, and that the use of statistics can be described as opportunistic. PMID:22593746
Bayesian models based on test statistics for multiple hypothesis testing problems.
Ji, Yuan; Lu, Yiling; Mills, Gordon B
2008-04-01
We propose a Bayesian method for the problem of multiple hypothesis testing that is routinely encountered in bioinformatics research, such as the differential gene expression analysis. Our algorithm is based on modeling the distributions of test statistics under both null and alternative hypotheses. We substantially reduce the complexity of the process of defining posterior model probabilities by modeling the test statistics directly instead of modeling the full data. Computationally, we apply a Bayesian FDR approach to control the number of rejections of null hypotheses. To check if our model assumptions for the test statistics are valid for various bioinformatics experiments, we also propose a simple graphical model-assessment tool. Using extensive simulations, we demonstrate the performance of our models and the utility of the model-assessment tool. In the end, we apply the proposed methodology to an siRNA screening and a gene expression experiment.
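As a generic stand-in for modeling test statistics under null and alternative hypotheses, the sketch below fits a two-component normal mixture to z-statistics by EM, computes each statistic's posterior null probability, and rejects the set whose average posterior null probability (a Bayesian FDR estimate) stays below a threshold. Details differ from the authors' model; this is illustrative only.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
z = np.concatenate([rng.normal(0, 1, 900),           # null statistics
                    rng.normal(3, 1, 100)])           # alternative statistics (hypothetical)

# EM for a two-component mixture: N(0,1) null plus N(mu1, s1) alternative
pi0, mu1, s1 = 0.9, 2.0, 1.0
for _ in range(200):
    f0 = pi0 * stats.norm.pdf(z, 0, 1)
    f1 = (1 - pi0) * stats.norm.pdf(z, mu1, s1)
    w = f1 / (f0 + f1)                                # P(alternative | z)
    pi0 = 1 - w.mean()
    mu1 = np.sum(w * z) / w.sum()
    s1 = np.sqrt(np.sum(w * (z - mu1) ** 2) / w.sum())

post_null = 1 - w
order = np.argsort(post_null)
cum_fdr = np.cumsum(post_null[order]) / np.arange(1, len(z) + 1)   # Bayesian FDR of top-k set
k = np.max(np.nonzero(cum_fdr <= 0.05)[0]) + 1 if np.any(cum_fdr <= 0.05) else 0
print(f"estimated pi0 = {pi0:.2f}; reject {k} hypotheses at Bayesian FDR 0.05")
```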
Code of Federal Regulations, 2012 CFR
2012-07-01
... documentation, if not completed in any previous investigation; a check of Bureau of Vital Statistics records when any discrepancy is found to exist. (5) Local Agency Checks: As a minimum, all investigations will...
Code of Federal Regulations, 2011 CFR
2011-07-01
... documentation, if not completed in any previous investigation; a check of Bureau of Vital Statistics records when any discrepancy is found to exist. (5) Local Agency Checks: As a minimum, all investigations will...
Code of Federal Regulations, 2010 CFR
2010-07-01
... documentation, if not completed in any previous investigation; a check of Bureau of Vital Statistics records when any discrepancy is found to exist. (5) Local Agency Checks: As a minimum, all investigations will...
Code of Federal Regulations, 2014 CFR
2014-07-01
... documentation, if not completed in any previous investigation; a check of Bureau of Vital Statistics records when any discrepancy is found to exist. (5) Local Agency Checks: As a minimum, all investigations will...
Code of Federal Regulations, 2013 CFR
2013-07-01
... documentation, if not completed in any previous investigation; a check of Bureau of Vital Statistics records when any discrepancy is found to exist. (5) Local Agency Checks: As a minimum, all investigations will...
Normality Tests for Statistical Analysis: A Guide for Non-Statisticians
Ghasemi, Asghar; Zahediasl, Saleh
2012-01-01
Statistical errors are common in scientific literature and about 50% of the published articles have at least one error. The assumption of normality needs to be checked for many statistical procedures, namely parametric tests, because their validity depends on it. The aim of this commentary is to overview checking for normality in statistical analysis using SPSS. PMID:23843808
Ren, Y Y; Zhou, L C; Yang, L; Liu, P Y; Zhao, B W; Liu, H X
2016-09-01
The paper highlights the use of the logistic regression (LR) method in the construction of acceptable statistically significant, robust and predictive models for the classification of chemicals according to their aquatic toxic modes of action. Essentials accounting for a reliable model were all considered carefully. The model predictors were selected by stepwise forward discriminant analysis (LDA) from a combined pool of experimental data and chemical structure-based descriptors calculated by the CODESSA and DRAGON software packages. Model predictive ability was validated both internally and externally. The applicability domain was checked by the leverage approach to verify prediction reliability. The obtained models are simple and easy to interpret. In general, LR performs much better than LDA and seems to be more attractive for the prediction of the more toxic compounds, i.e. compounds that exhibit excess toxicity versus non-polar narcotic compounds and more reactive compounds versus less reactive compounds. In addition, model fit and regression diagnostics were assessed through the influence plot, which reflects the hat-values, studentized residuals, and Cook's distance statistics of each sample. Overdispersion was also checked for the LR model. The relationships between the descriptors and the aquatic toxic behaviour of compounds are also discussed.
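The leverage approach to the applicability domain can be sketched directly from the descriptor matrix: hat values above the conventional cut-off h* = 3(p+1)/n flag predictions outside the training domain. The descriptors below are simulated, not CODESSA/DRAGON outputs.

```python
import numpy as np

rng = np.random.default_rng(7)
X_train = rng.normal(0, 1, size=(100, 4))             # hypothetical descriptor matrix, n x p
X_new = rng.normal(0, 2, size=(10, 4))                # new chemicals to be predicted

n, p = X_train.shape
Xc = np.hstack([np.ones((n, 1)), X_train])            # include an intercept column
XtX_inv = np.linalg.inv(Xc.T @ Xc)
h_star = 3 * (p + 1) / n                              # conventional leverage threshold

def leverage(x_row):
    """Hat value of a query point with respect to the training descriptor space."""
    v = np.concatenate([[1.0], x_row])
    return float(v @ XtX_inv @ v)

for i, x in enumerate(X_new):
    h = leverage(x)
    flag = "outside AD" if h > h_star else "inside AD"
    print(f"chemical {i}: leverage {h:.3f} ({flag}, h* = {h_star:.3f})")
```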
Kachani, Adriana Trejger; Barroso, Lucia Pereira; Brasiliano, Silvia; Cordás, Táki Athanássios; Hochgraf, Patrícia Brunfentrinker
2015-12-01
To compare inadequate eating behaviors and their relationship to body checking in three groups: patients with anorexia nervosa (AN), patients with bulimia nervosa (BN) and a control group (C). Eighty-three outpatients with eating disorders (ED) and 40 controls completed eating attitudes and body checking questionnaires. The overall relationship between eating attitude and body checking was statistically significant in all three groups: the worse the eating attitude, the greater the body checking behavior. However, when we looked at each group individually, the relationship was only statistically significant in the AN group (r=.354, p=0.020). The lower the desired weight and the worse the eating attitude, the more people check themselves, although in the presence of an ED the relationship between body checking and food restrictions is greater. In patients in the AN subgroup, body checking is also related to continued dietary control. Copyright © 2015 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Sinharay, Sandip; Almond, Russell; Yan, Duanli
2004-01-01
Model checking is a crucial part of any statistical analysis. As educators tie models for testing to cognitive theory of the domains, there is a natural tendency to represent participant proficiencies with latent variables representing the presence or absence of the knowledge, skills, and proficiencies to be tested (Mislevy, Almond, Yan, &…
Antimicrobial activity of root canal irrigants against biofilm forming pathogens- An in vitro study
Ghivari, Sheetal Basavraj; Bhattacharya, Haimanti; Bhat, Kishore G.; Pujar, Madhu A.
2017-01-01
Aims: The aim of the study was to check the antimicrobial activity of 5% sodium hypochlorite, 2% chlorhexidine, 0.10% octenidine (OCT), and 2% silver zeolite (SZ) at different time intervals against single-species biofilms of Enterococcus faecalis, Staphylococcus aureus, and Candida albicans prepared on a nitrocellulose membrane. Settings and Design: An in vitro nitrocellulose biofilm model was used to check the antibacterial efficacy of root canal irrigants. Materials and Methods: Single-species biofilms were suspended in a 96-well microtiter plate and treated with the root canal irrigants for 1, 5, 10, 15, 30, and 60 s, respectively. The remaining microbial load (colony-forming units/ml) after antimicrobial treatment was tabulated and the data were statistically analyzed. Statistical Analysis: SPSS version 17, Kruskal–Wallis ANOVA, Mann–Whitney U-test, and Wilcoxon matched-pairs test (P < 0.05) were used. Results: All tested microorganisms were eliminated within 30 s by all the antimicrobial substances tested except normal saline; 2% chlorhexidine and 0.10% OCT were equally effective against C. albicans at 30 s. Conclusion: The newly tested irrigants showed considerable antibacterial activity against the selected single-species biofilms. OCT (0.10%) can be used as an alternative endodontic irrigant. PMID:29279615
Model-checking techniques based on cumulative residuals.
Lin, D Y; Wei, L J; Ying, Z
2002-03-01
Residuals have long been used for graphical and numerical examinations of the adequacy of regression models. Conventional residual analysis based on the plots of raw residuals or their smoothed curves is highly subjective, whereas most numerical goodness-of-fit tests provide little information about the nature of model misspecification. In this paper, we develop objective and informative model-checking techniques by taking the cumulative sums of residuals over certain coordinates (e.g., covariates or fitted values) or by considering some related aggregates of residuals, such as moving sums and moving averages. For a variety of statistical models and data structures, including generalized linear models with independent or dependent observations, the distributions of these stochastic processes under the assumed model can be approximated by the distributions of certain zero-mean Gaussian processes whose realizations can be easily generated by computer simulation. Each observed process can then be compared, both graphically and numerically, with a number of realizations from the Gaussian process. Such comparisons enable one to assess objectively whether a trend seen in a residual plot reflects model misspecification or natural variation. The proposed techniques are particularly useful in checking the functional form of a covariate and the link function. Illustrations with several medical studies are provided.
A Categorization of Dynamic Analyzers
NASA Technical Reports Server (NTRS)
Lujan, Michelle R.
1997-01-01
Program analysis techniques and tools are essential to the development process because of the support they provide in detecting errors and deficiencies at different phases of development. The types of information rendered through analysis include the following: statistical measurements of code, type checks, dataflow analysis, consistency checks, test data, verification of code, and debugging information. Analyzers can be broken into two major categories: dynamic and static. Static analyzers examine programs with respect to syntax errors and structural properties. This includes gathering statistical information on program content, such as the number of lines of executable code, source lines, and cyclomatic complexity. In addition, static analyzers provide the ability to check for the consistency of programs with respect to variables. Dynamic analyzers, in contrast, are dependent on input and the execution of a program, providing the ability to find errors that cannot be detected through the use of static analysis alone. Dynamic analysis provides information on the behavior of a program rather than on the syntax. Both types of analysis detect errors in a program, but dynamic analyzers accomplish this through run-time behavior. This paper focuses on the following broad classification of dynamic analyzers: 1) Metrics; 2) Models; and 3) Monitors. Metrics are those analyzers that provide measurement. The next category, models, captures those analyzers that present the state of the program to the user at specified points in time. The last category, monitors, checks specified code based on some criteria. The paper discusses each classification and the techniques that are included under them. In addition, the role of each technique in the software life cycle is discussed. Familiarization with the tools that measure, model and monitor programs provides a framework for understanding the program's dynamic behavior from different perspectives through analysis of the input/output data.
Pitfalls in statistical landslide susceptibility modelling
NASA Astrophysics Data System (ADS)
Schröder, Boris; Vorpahl, Peter; Märker, Michael; Elsenbeer, Helmut
2010-05-01
The use of statistical methods is a well-established approach to predict landslide occurrence probabilities and to assess landslide susceptibility. This is achieved by applying statistical methods relating historical landslide inventories to topographic indices as predictor variables. In our contribution, we compare several new and powerful methods developed in machine learning and well-established in landscape ecology and macroecology for predicting the distribution of shallow landslides in tropical mountain rainforests in southern Ecuador (among others: boosted regression trees, multivariate adaptive regression splines, maximum entropy). Although these methods are powerful, we think it is necessary to follow a basic set of guidelines to avoid some pitfalls regarding data sampling, predictor selection, and model quality assessment, especially if a comparison of different models is contemplated. We therefore suggest applying a novel toolbox to evaluate approaches to the statistical modelling of landslide susceptibility. Additionally, we propose some methods to open the "black box" inherent in machine learning methods in order to achieve further explanatory insights into the preparatory factors that control landslides. Sampling of training data should be guided by hypotheses regarding the processes that lead to slope failure, taking into account their respective spatial scales. This approach leads to the selection of a set of candidate predictor variables considered on adequate spatial scales. This set should be checked for multicollinearity in order to facilitate the interpretation of model response curves. Model quality assessment evaluates how well a model is able to reproduce independent observations of its response variable. This includes criteria to evaluate different aspects of model performance, i.e. model discrimination, model calibration, and model refinement. In order to assess a possible violation of the assumption of independence in the training samples or a possible lack of explanatory information in the chosen set of predictor variables, the model residuals need to be checked for spatial autocorrelation. Therefore, we calculate spline correlograms. In addition, we investigate partial dependency plots and bivariate interaction plots, considering possible interactions between predictors, to improve model interpretation. In presenting this toolbox for model quality assessment, we also investigate the influence of strategies for constructing training datasets on the quality of statistical models.
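The multicollinearity check recommended for the candidate predictor set is commonly done with variance inflation factors; a sketch with simulated topographic predictors (the names and the built-in collinearity are assumptions for illustration):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(8)
n = 500
slope = rng.normal(25, 8, n)                         # hypothetical topographic predictors
curvature = rng.normal(0, 1, n)
wetness = -0.8 * slope + rng.normal(0, 3, n)         # deliberately collinear with slope

X = sm.add_constant(pd.DataFrame({"slope": slope, "curvature": curvature, "wetness": wetness}))

for i, name in enumerate(X.columns):
    if name == "const":
        continue
    print(f"VIF({name}) = {variance_inflation_factor(X.values, i):.1f}")
# A common rule of thumb drops or combines predictors with VIF well above roughly 5-10.
```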
Fall 2014 SEI Research Review Probabilistic Analysis of Time Sensitive Systems
2014-10-28
Osmosis is a tool for Statistical Model Checking (SMC) with Semantic Importance Sampling. The input model is written in a subset of C, with ASSERT() statements indicating conditions that must hold; input probability distributions are defined by the user. Osmosis returns its estimate after terminating on either a target relative error or a set number of simulations. (dReal: http://dreal.cs.cmu.edu/)
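A bare-bones version of such an SMC loop, stopping on a target relative error, might look as follows (the toy random-walk model and the thresholds are assumptions, not the osmosis implementation):

```python
import numpy as np

rng = np.random.default_rng(9)

def model_run():
    """One stochastic simulation; returns True if the asserted condition is violated."""
    x = rng.normal(0, 1, size=50).cumsum()            # toy random-walk model
    return x.max() > 15.0                              # stand-in for an ASSERT() violation

target_rel_err, batch = 0.1, 1000
hits, n = 0, 0
while True:
    hits += sum(model_run() for _ in range(batch))
    n += batch
    p_hat = hits / n
    if hits > 0:
        rel_err = np.sqrt((1 - p_hat) / (n * p_hat))   # relative standard error of p_hat
        if rel_err < target_rel_err:
            break

print(f"P(property violated) ~ {p_hat:.4g} after {n} runs (relative error {rel_err:.2f})")
```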
Ng'andu, N H
1997-03-30
In the analysis of survival data using the Cox proportional hazard (PH) model, it is important to verify that the explanatory variables analysed satisfy the proportional hazard assumption of the model. This paper presents results of a simulation study that compares five test statistics to check the proportional hazard assumption of Cox's model. The test statistics were evaluated under proportional hazards and the following types of departures from the proportional hazard assumption: increasing relative hazards; decreasing relative hazards; crossing hazards; diverging hazards, and non-monotonic hazards. The test statistics compared include those based on partitioning of failure time and those that do not require partitioning of failure time. The simulation results demonstrate that the time-dependent covariate test, the weighted residuals score test and the linear correlation test have equally good power for detection of non-proportionality in the varieties of non-proportional hazards studied. Using illustrative data from the literature, these test statistics performed similarly.
Vannucci, Jacopo; Bellezza, Guido; Matricardi, Alberto; Moretti, Giulia; Bufalari, Antonello; Cagini, Lucio; Puma, Francesco; Daddi, Niccolò
2018-01-01
Talc pleurodesis has been associated with pleuropulmonary damage, particularly long-term damage due to its inert nature. The present model series review aimed to assess the safety of this procedure by examining inflammatory stimulus, biocompatibility and tissue reaction following talc pleurodesis. Talc slurry was performed in rabbits: 200 mg/kg checked at postoperative day 14 (five models), 200 mg/kg checked at postoperative day 28 (five models), 40 mg/kg checked at postoperative day 14 (five models), and 40 mg/kg checked at postoperative day 28 (five models). Talc poudrage was performed in pigs: 55 mg/kg checked at postoperative day 60 (18 models). Tissue inspection and data collection followed the surgical pathology approach currently used in clinical practice. As this was an observational study, no statistical analysis was performed. Regarding the rabbit model (Oryctolagus cuniculus), the extent of adhesions ranged between 0 and 30%, and between 0 and 10%, following 14 and 28 days, respectively. No intraparenchymal granuloma was observed, whereas pleural granulomas were extensively encountered following both talc dosages, with more evidence of visceral pleural granulomas following 200 mg/kg compared with 40 mg/kg. Severe florid inflammation was observed in 2/10 cases following 40 mg/kg. Parathymic and pericardial granulomas and mediastinal lymphadenopathy were evidenced at 28 days. At 60 days, findings ranging from rare adhesions to extended pleurodesis were observed in the pig model (Sus scrofa domesticus). Pleural granulomas were ubiquitous on the visceral and parietal pleurae. Severe spotted inflammation among the adhesions was recorded in 15/18 pigs. Intraparenchymal granulomas were observed in 9/18 lungs. Talc produced unpredictable pleurodesis in both animal models, with enduring pleural inflammation, whether it was performed via slurry or poudrage. Furthermore, talc appeared to have triggered extended pleural damage, intraparenchymal nodules (porcine poudrage) and mediastinal migration (rabbit slurry). PMID:29403549
NASA Technical Reports Server (NTRS)
1981-01-01
The application of statistical methods to recorded ozone measurements is described. A long-term depletion of ozone at the magnitudes predicted by the NAS would be harmful to most forms of life. Empirical prewhitening filters, whose derivation is independent of the underlying physical mechanisms, were analyzed. Statistical analysis performs a checks-and-balances role: time series modeling separates variations into systematic and random parts, ensures that errors are uncorrelated, and identifies significant phase-lag dependencies. The use of time series modeling to enhance the capability of detecting trends is discussed.
Physical concepts in the development of constitutive equations
NASA Technical Reports Server (NTRS)
Cassenti, B. N.
1985-01-01
Proposed viscoplastic material models include in their formulation observed material response but do not generally incorporate principles from thermodynamics, statistical mechanics, and quantum mechanics. Numerous hypotheses about material response have been made based on first principles, and many of these hypotheses have been tested experimentally. The proposed viscoplastic theories must be checked against these hypotheses and their experimental basis. The physics of thermodynamics, statistical mechanics and quantum mechanics, and the effects of defects, are reviewed for their application to the development of constitutive laws.
An Adaptive Buddy Check for Observational Quality Control
NASA Technical Reports Server (NTRS)
Dee, Dick P.; Rukhovets, Leonid; Todling, Ricardo; DaSilva, Arlindo M.; Larson, Jay W.; Einaudi, Franco (Technical Monitor)
2000-01-01
An adaptive buddy check algorithm is presented that adjusts tolerances for outlier observations based on the variability of surrounding data. The algorithm derives from a statistical hypothesis test combined with maximum-likelihood covariance estimation. Its stability is shown to depend on the initial identification of outliers by a simple background check. The adaptive feature ensures that the final quality control decisions are not very sensitive to prescribed statistics of first-guess and observation errors, nor to other approximations introduced into the algorithm. The implementation of the algorithm in a global atmospheric data assimilation system is described. Its performance is contrasted with that of a non-adaptive buddy check for the surface analysis of an extreme storm that took place in Europe on 27 December 1999. The adaptive algorithm allowed the inclusion of many important observations that differed greatly from the first guess and that would have been excluded on the basis of prescribed statistics. The analysis of the storm development was much improved as a result of these additional observations.
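The core of a buddy check can be sketched in one dimension: flag observations with a simple background check first, then reject a suspect only if its departure disagrees with its neighbours by more than a tolerance scaled by the local variability (the adaptive element). This is a simplified illustration, not the assimilation-system implementation.

```python
import numpy as np

rng = np.random.default_rng(10)
n = 200
x = np.linspace(0, 100, n)                        # observation locations
truth = np.sin(x / 10.0)
background = truth + rng.normal(0, 0.1, n)        # first guess
obs = truth + rng.normal(0, 0.2, n)
obs[[40, 120]] += 3.0                             # inject two gross errors

departure = obs - background                      # observation-minus-background
suspect = np.abs(departure) > 4 * departure.std() # simple background check first

rejected = []
for i in np.where(suspect)[0]:
    buddies = [j for j in range(n) if j != i and abs(x[j] - x[i]) < 5.0 and not suspect[j]]
    if not buddies:
        rejected.append(i)
        continue
    local = departure[buddies]
    tol = 3.0 * max(local.std(), 0.05)            # tolerance adapts to local variability
    if abs(departure[i] - local.mean()) > tol:
        rejected.append(i)

print("rejected observation indices:", rejected)
```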
NASA Technical Reports Server (NTRS)
Tijidjian, Raffi P.
2010-01-01
The TEAMS model analyzer is a supporting tool developed to work with models created with TEAMS (Testability, Engineering, and Maintenance System), which was developed by QSI. In an effort to reduce the time spent in the manual process that each TEAMS modeler must perform in the preparation of reporting for model reviews, a new tool has been developed as an aid for models developed in TEAMS. The software allows for the viewing, reporting, and checking of TEAMS models that are checked into the TEAMS model database. The software allows the user to selectively view the model in a hierarchical tree outline that displays the components, failure modes, and ports. The reporting features allow the user to quickly gather statistics about the model and generate an input/output report pertaining to all of the components. Rules can be automatically validated against the model, with a report generated containing any resulting inconsistencies. In addition to reducing manual effort, this software also provides an automated process framework for the Verification and Validation (V&V) effort that will follow development of these models. The aid of such an automated tool would have a significant impact on the V&V process.
Langley Wind Tunnel Data Quality Assurance-Check Standard Results
NASA Technical Reports Server (NTRS)
Hemsch, Michael J.; Grubb, John P.; Krieger, William B.; Cler, Daniel L.
2000-01-01
A framework for statistical evaluation, control and improvement of wind tunnel measurement processes is presented. The methodology is adapted from elements of the Measurement Assurance Plans developed by the National Bureau of Standards (now the National Institute of Standards and Technology) for standards and calibration laboratories. The present methodology is based on the notions of statistical quality control (SQC) together with check standard testing and a small number of customer repeat-run sets. The results of check standard and customer repeat-run sets are analyzed using the statistical control chart methods of Walter A. Shewhart, long familiar to the SQC community. Control chart results are presented for various measurement processes in five facilities at Langley Research Center. The processes include test section calibration, force and moment measurements with a balance, and instrument calibration.
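Check-standard results of this kind are typically tracked on Shewhart individuals/moving-range charts. A sketch with hypothetical drag-coefficient results (baseline limits from the first 20 runs, then monitoring):

```python
import numpy as np

rng = np.random.default_rng(11)
# Hypothetical check-standard drag-coefficient results from repeated tunnel entries
cd = rng.normal(0.0250, 0.0004, size=30)
cd[22:] += 0.0015                                      # a deliberate process shift in later runs

baseline = cd[:20]                                     # limits estimated from an in-control period
sigma_hat = np.abs(np.diff(baseline)).mean() / 1.128   # d2 constant for moving ranges of size 2
center = baseline.mean()
ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

for i, v in enumerate(cd):
    flag = "  <-- out of control" if (v > ucl or v < lcl) else ""
    print(f"run {i:2d}: CD = {v:.5f}{flag}")
print(f"center {center:.5f}, LCL {lcl:.5f}, UCL {ucl:.5f}")
```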
Commentary: Gene by Environment Interplay and Psychopathology--In Search of a Paradigm
ERIC Educational Resources Information Center
Nigg, Joel T.
2013-01-01
The articles in this Special Issue (SI) extend research on G×E in multiple ways, showing the growing importance of specifying kinds of G×E models (e.g., bioecological, susceptibility, stress-diathesis), incorporation of sophisticated ways of measuring types of G×E correlations (rGE), checking effects of statistical artifact, exemplifying an…
Colombini, D; Occhipinti, E; Cairoli, S; Baracco, A
2000-01-01
Over the last few years the Authors developed and implemented a specific check-list for a "rapid" assessment of occupational exposure to repetitive movements and exertion of the upper limbs, after verifying the lack of such a tool that was also coherent with the latest data in the specialized literature. The check-list model and the relevant application procedures are presented and discussed. The check-list was applied by trained factory technicians in 46 different working tasks where the OCRA method previously proposed by the Authors was also applied by independent observers. Since 46 pairs of observations were available (OCRA index and check-list score), it was possible to verify, via parametric and nonparametric statistical tests, the level of association between the two variables and to find the best simple regression function (exponential in this case) of the OCRA index on the check-list score. By means of this function, which was highly significant (R2 = 0.98, p < 0.0001), the check-list scores that best corresponded to the critical values (for exposure assessment) of the OCRA index were identified. The following correspondence values between the OCRA index and the check-list were then established with a view to classifying exposure levels. The "critical" check-list scores were established considering the need to obtain, in borderline cases, a potential overestimation of the exposure level. On the basis of practical application experience and the preliminary validation results, recommendations are made and the caution needed in the use of the check-list is suggested.
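The exponential regression of the OCRA index on the check-list score can be sketched with a standard curve fit; the 46 data pairs and the cut-off values below are simulated and illustrative, not those of the study.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(12)
checklist = np.sort(rng.uniform(5, 25, 46))                 # hypothetical check-list scores
ocra = 0.6 * np.exp(0.14 * checklist) * np.exp(rng.normal(0, 0.08, 46))   # assumed relation

def expo(x, a, b):
    """Exponential regression function: OCRA = a * exp(b * score)."""
    return a * np.exp(b * x)

(a, b), _ = curve_fit(expo, checklist, ocra, p0=(1.0, 0.1))
print(f"fitted OCRA = {a:.2f} * exp({b:.3f} * score)")

# Invert the fitted curve to find the check-list score matching an OCRA critical value
for ocra_critical in (2.2, 3.5, 9.0):                        # illustrative cut-offs only
    score = np.log(ocra_critical / a) / b
    print(f"OCRA {ocra_critical} corresponds to a check-list score of about {score:.1f}")
```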
Use of a remote computer terminal during field checking of Landsat digital maps
Robinove, Charles J.; Hutchinson, C.F.
1978-01-01
Field checking of small-scale land classification maps made digitally from Landsat data is facilitated by use of a remote portable teletypewriter terminal linked by telephone to the IDIMS (Interactive Digital Image Manipulation System) at the EDC (EROS Data Center), Sioux Falls, S. Dak. When field checking of maps 20 miles northeast of Baker, Calif., during the day showed that changes in classification were needed, the terminal was used at night to combine image statistical files, remap portions of images, and produce new alphanumeric maps for field checking during the next day. The alphanumeric maps can be used without serious difficulty in location in the field even though the scale is distorted, and statistical files created during the field check can be used for full image classification and map output at the EDC. This process makes field checking faster than normal, provides interaction with the statistical data while in the field, and reduces to a minimum the number of trips needed to work interactively with the IDIMS at the EDC, thus saving significant amounts of time and money. The only significant problem is using telephone lines which at times create spurious characters in the printout or prevent the line feed (paper advance) signal from reaching the terminal, thus overprinting lines which should be sequential. We recommend that maps for field checking be made with more spectral classes than are expected because in the field it is much easier to group classes than to reclassify or separate classes when only the remote terminal is available for display.
Loop models, modular invariance, and three-dimensional bosonization
NASA Astrophysics Data System (ADS)
Goldman, Hart; Fradkin, Eduardo
2018-05-01
We consider a family of quantum loop models in 2+1 spacetime dimensions with marginally long-ranged and statistical interactions mediated by a U (1 ) gauge field, both purely in 2+1 dimensions and on a surface in a (3+1)-dimensional bulk system. In the absence of fractional spin, these theories have been shown to be self-dual under particle-vortex duality and shifts of the statistical angle of the loops by 2 π , which form a subgroup of the modular group, PSL (2 ,Z ) . We show that careful consideration of fractional spin in these theories completely breaks their statistical periodicity and describe how this occurs, resolving a disagreement with the conformal field theories they appear to approach at criticality. We show explicitly that incorporation of fractional spin leads to loop model dualities which parallel the recent web of (2+1)-dimensional field theory dualities, providing a nontrivial check on its validity.
Implication of correlations among some common stability statistics - a Monte Carlo simulations.
Piepho, H P
1995-03-01
Stability analysis of multilocation trials is often based on a mixed two-way model. Two stability measures in frequent use are the environmental variance (S_i^2) and the ecovalence (W_i). Under the two-way model the rank orders of the expected values of these two statistics are identical for a given set of genotypes. By contrast, empirical rank correlations among these measures are consistently low. This suggests that the two-way mixed model may not be appropriate for describing real data. To check this hypothesis, a Monte Carlo simulation was conducted. It revealed that the low empirical rank correlation among S_i^2 and W_i is most likely due to sampling errors. It is concluded that the observed low rank correlation does not invalidate the two-way model. The paper also discusses tests for homogeneity of S_i^2 as well as implications of the two-way model for the classification of stability statistics.
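The simulation can be reproduced in outline: generate two-way tables under the mixed model, compute the environmental variance S_i^2 and the ecovalence W_i for each genotype, and record the rank correlation between them across Monte Carlo runs (the effect variances below are assumed for illustration).

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(13)
G, E, runs = 20, 10, 500                               # genotypes, environments, MC runs
cors = []
for _ in range(runs):
    g = rng.normal(0, 1, (G, 1))                       # genotype main effects
    e = rng.normal(0, 1, (1, E))                       # environment main effects
    ge = rng.normal(0, 0.5, (G, E))                    # genotype-by-environment interaction
    y = 10 + g + e + ge + rng.normal(0, 0.5, (G, E))   # two-way table of observed values

    S2 = y.var(axis=1, ddof=1)                         # environmental variance per genotype
    resid = y - y.mean(axis=1, keepdims=True) - y.mean(axis=0, keepdims=True) + y.mean()
    W = (resid ** 2).sum(axis=1)                       # ecovalence per genotype

    cors.append(spearmanr(S2, W).correlation)

print(f"mean rank correlation between S_i^2 and W_i over {runs} runs: {np.mean(cors):.2f}")
```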
Principles of continuous quality improvement applied to intravenous therapy.
Dunavin, M K; Lane, C; Parker, P E
1994-01-01
Documentation of the application of the principles of continuous quality improvement (CQI) to the health care setting is crucial for understanding the transition from traditional management models to CQI models. A CQI project was designed and implemented by the IV Therapy Department at Lawrence Memorial Hospital to test the application of these principles to intravenous therapy and as a learning tool for the entire organization. Through a prototype inventory project, significant savings in cost and time were demonstrated using check sheets, flow diagrams, control charts, and other statistical tools, as well as using the Plan-Do-Check-Act cycle. As a result, a primary goal, increased time for direct patient care, was achieved. Eight hours per week in nursing time was saved, relationships between two work areas were improved, and $6,000 in personnel costs, storage space, and inventory were saved.
Statistical methods for quantitative mass spectrometry proteomic experiments with labeling.
Oberg, Ann L; Mahoney, Douglas W
2012-01-01
Mass Spectrometry utilizing labeling allows multiple specimens to be subjected to mass spectrometry simultaneously. As a result, between-experiment variability is reduced. Here we describe use of fundamental concepts of statistical experimental design in the labeling framework in order to minimize variability and avoid biases. We demonstrate how to export data in the format that is most efficient for statistical analysis. We demonstrate how to assess the need for normalization, perform normalization, and check whether it worked. We describe how to build a model explaining the observed values and test for differential protein abundance along with descriptive statistics and measures of reliability of the findings. Concepts are illustrated through the use of three case studies utilizing the iTRAQ 4-plex labeling protocol.
Fast and global authenticity screening of honey using ¹H-NMR profiling.
Spiteri, Marc; Jamin, Eric; Thomas, Freddy; Rebours, Agathe; Lees, Michèle; Rogers, Karyne M; Rutledge, Douglas N
2015-12-15
An innovative analytical approach was developed to tackle the most common adulterations and quality deviations in honey. Using proton-NMR profiling coupled to suitable quantification procedures and statistical models, analytical criteria were defined to check the authenticity of both mono- and multi-floral honey. The reference data set used was a worldwide collection of more than 800 honeys, covering most of the economically significant botanical and geographical origins. Typical plant nectar markers can be used to check monofloral honey labeling. Spectral patterns and natural variability were established for multifloral honeys, and marker signals for sugar syrups were identified by statistical comparison with a commercial dataset of ca. 200 honeys. Although the results are qualitative, spiking experiments have confirmed the ability of the method to detect sugar addition down to 10% levels in favorable cases. Within the same NMR experiments, quantification of glucose, fructose, sucrose and 5-HMF (regulated parameters) was performed. Finally markers showing the onset of fermentation are described. Copyright © 2014 Elsevier Ltd. All rights reserved.
Statistical analysis and modeling of intermittent transport events in the tokamak scrape-off layer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, Johan, E-mail: anderson.johan@gmail.com; Halpern, Federico D.; Ricci, Paolo
The turbulence observed in the scrape-off-layer of a tokamak is often characterized by intermittent events of bursty nature, a feature which raises concerns about the prediction of heat loads on the physical boundaries of the device. It appears thus necessary to delve into the statistical properties of turbulent physical fields such as density, electrostatic potential, and temperature, focusing on the mathematical expression of tails of the probability distribution functions. The method followed here is to generate statistical information from time-traces of the plasma density stemming from Braginskii-type fluid simulations and check this against a first-principles theoretical model. The analysis of the numerical simulations indicates that the probability distribution function of the intermittent process contains strong exponential tails, as predicted by the analytical theory.
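A common way to quantify such tails is to fit an exponential form to the upper tail of the empirical PDF of the normalized fluctuations; a sketch on a synthetic bursty signal (not simulation output from the paper):

```python
import numpy as np

rng = np.random.default_rng(14)
# Synthetic intermittent signal: Gaussian background plus occasional exponential bursts
signal = rng.normal(0, 1, 20000)
bursts = rng.random(20000) < 0.02
signal[bursts] += rng.exponential(3.0, bursts.sum())

x = (signal - signal.mean()) / signal.std()            # normalized density fluctuations
counts, edges = np.histogram(x, bins=80, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

tail = (centers > 2) & (counts > 0)                    # fit only the far positive tail
slope, intercept = np.polyfit(centers[tail], np.log(counts[tail]), 1)
print(f"tail ~ exp({slope:.2f} * x): e-folding scale {-1/slope:.2f} standard deviations")
```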
Nowcasting and Forecasting the Monthly Food Stamps Data in the US Using Online Search Data
Fantazzini, Dean
2014-01-01
We propose the use of Google online search data for nowcasting and forecasting the number of food stamps recipients. We perform a large out-of-sample forecasting exercise with almost 3000 competing models with forecast horizons up to 2 years ahead, and we show that models including Google search data statistically outperform the competing models at all considered horizons. These results hold also with several robustness checks, considering alternative keywords, a falsification test, different out-of-samples, directional accuracy and forecasts at the state-level. PMID:25369315
Nonlinear multi-analysis of agent-based financial market dynamics by epidemic system
NASA Astrophysics Data System (ADS)
Lu, Yunfan; Wang, Jun; Niu, Hongli
2015-10-01
Based on the epidemic dynamical system, we construct a new agent-based financial time series model. In order to check and test its rationality, we compare the statistical properties of the time series model with those of the real stock market indices, the Shanghai Stock Exchange Composite Index and the Shenzhen Stock Exchange Component Index. For analyzing the statistical properties, we combine the multi-parameter analysis with the tail distribution analysis, the modified rescaled range analysis, and the multifractal detrended fluctuation analysis. For a better perspective, three-dimensional diagrams are used to present the analysis results. The empirical research in this paper indicates that the long-range dependence property and the multifractal phenomenon exist in the real returns and the proposed model. Therefore, the new agent-based financial model can reproduce some important features of real stock markets.
Meta-analysis of diagnostic test data: a bivariate Bayesian modeling approach.
Verde, Pablo E
2010-12-30
In the last decades, the amount of published results on clinical diagnostic tests has expanded very rapidly. The counterpart to this development has been the formal evaluation and synthesis of diagnostic results. However, published results present substantial heterogeneity and they can be regarded as so far removed from the classical domain of meta-analysis that they provide a rather severe test of classical statistical methods. Recently, bivariate random effects meta-analytic methods, which model the pairs of sensitivities and specificities, have been presented from the classical point of view. In this work a bivariate Bayesian modeling approach is presented. This approach substantially extends the scope of classical bivariate methods by allowing the structural distribution of the random effects to depend on multiple sources of variability. Meta-analysis is summarized by the predictive posterior distributions for sensitivity and specificity. This new approach also allows one to perform substantial model checking, model diagnostics and model selection. Statistical computations are implemented in public domain statistical software (WinBUGS and R) and illustrated with real data examples. Copyright © 2010 John Wiley & Sons, Ltd.
Development of a Comprehensive Digital Avionics Curriculum for the Aeronautical Engineer
2006-03-01
... able to analyze and design aircraft and missile guidance and control systems, including feedback stabilization schemes and stochastic processes, using ... uncertainty modeling for robust control; robust closed-loop stability and performance; robust H-infinity control; robustness checks using mu-analysis ... controlled feedback (reduces noise); statistical group response (reduces pressure toward conformity) when used as a tool to study a complex problem.
Semantic Importance Sampling for Statistical Model Checking
2014-10-18
We implement SIS in a tool called osmosis and use it to verify a number of stochastic systems with rare events. Our results indicate that SIS reduces ... background definitions and concepts. Section 4 presents SIS, and Section 5 presents our tool osmosis. In Section 6, we present our experiments and results ... (Fig. 5: architecture of osmosis, comprising syntactic extraction, dReal-based refinement, and Monte Carlo estimation.)
Kerns, James R; Followill, David S; Lowenstein, Jessica; Molineu, Andrea; Alvarez, Paola; Taylor, Paige A; Stingo, Francesco C; Kry, Stephen F
2016-05-01
Accurate data regarding linear accelerator (Linac) radiation characteristics are important for treatment planning system modeling as well as regular quality assurance of the machine. The Imaging and Radiation Oncology Core-Houston (IROC-H) has measured the dosimetric characteristics of numerous machines through their on-site dosimetry review protocols. Photon data are presented and can be used as a secondary check of acquired values, as a means to verify commissioning a new machine, or in preparation for an IROC-H site visit. Photon data from IROC-H on-site reviews from 2000 to 2014 were compiled and analyzed. Specifically, data from approximately 500 Varian machines were analyzed. Each dataset consisted of point measurements of several dosimetric parameters at various locations in a water phantom to assess the percentage depth dose, jaw output factors, multileaf collimator small field output factors, off-axis factors, and wedge factors. The data were analyzed by energy and parameter, with similarly performing machine models being assimilated into classes. Common statistical metrics are presented for each machine class. Measurement data were compared against other reference data where applicable. Distributions of the parameter data were shown to be robust and derive from a student's t distribution. Based on statistical and clinical criteria, all machine models were able to be classified into two or three classes for each energy, except for 6 MV for which there were eight classes. Quantitative analysis of the measurements for 6, 10, 15, and 18 MV photon beams is presented for each parameter; supplementary material has also been made available which contains further statistical information. IROC-H has collected numerous data on Varian Linacs and the results of photon measurements from the past 15 years are presented. The data can be used as a comparison check of a physicist's acquired values. Acquired values that are well outside the expected distribution should be verified by the physicist to identify whether the measurements are valid. Comparison of values to this reference data provides a redundant check to help prevent gross dosimetric treatment errors.
Robust Linear Models for Cis-eQTL Analysis.
Rantalainen, Mattias; Lindgren, Cecilia M; Holmes, Christopher C
2015-01-01
Expression Quantitative Trait Loci (eQTL) analysis enables characterisation of functional genetic variation influencing expression levels of individual genes. In outbred populations, including humans, eQTLs are commonly analysed using the conventional linear model, adjusting for relevant covariates, assuming an allelic dosage model and a Gaussian error term. However, gene expression data generally have noise that induces heavy-tailed errors relative to the Gaussian distribution and often include atypical observations, or outliers. Such departures from modelling assumptions can lead to an increased rate of type II errors (false negatives), and to some extent also type I errors (false positives). Careful model checking can reduce the risk of type I errors but often not type II errors, since it is generally too time-consuming to carefully check all models with a non-significant effect in large-scale and genome-wide studies. Here we propose the application of a robust linear model for eQTL analysis to reduce adverse effects of deviations from the assumption of Gaussian residuals. We present results from a simulation study as well as results from the analysis of real eQTL data sets. Our findings suggest that in many situations robust models have the potential to provide more reliable eQTL results compared to conventional linear models, particularly in respect to reducing type II errors due to non-Gaussian noise. Post-genomic data, such as that generated in genome-wide eQTL studies, are often noisy and frequently contain atypical observations. Robust statistical models have the potential to provide more reliable results and increased statistical power under non-Gaussian conditions. The results presented here suggest that robust models should be considered routinely alongside other commonly used methodologies for eQTL analysis.
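For readers who want a concrete starting point, the sketch below contrasts ordinary least squares with a Huber M-estimator on synthetic single-gene data; it uses statsmodels rather than the authors' pipeline, and the effect size, sample size and error distribution are invented.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 300
    dosage = rng.integers(0, 3, size=n).astype(float)   # SNP genotype coded 0/1/2
    noise = rng.standard_t(df=3, size=n)                 # heavy-tailed errors
    expression = 0.25 * dosage + noise                   # true eQTL effect = 0.25

    X = sm.add_constant(dosage)
    ols_fit = sm.OLS(expression, X).fit()
    rlm_fit = sm.RLM(expression, X, M=sm.robust.norms.HuberT()).fit()
    print("OLS effect:   %.3f (SE %.3f)" % (ols_fit.params[1], ols_fit.bse[1]))
    print("Huber effect: %.3f (SE %.3f)" % (rlm_fit.params[1], rlm_fit.bse[1]))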
Bender, Anne Mette; Kawachi, Ichiro; Jørgensen, Torben; Pisinger, Charlotta
2015-01-01
We sought to examine whether neighborhood deprivation is associated with participation in a large population-based health check. Such analyses will help answer the question whether health checks, which are designed to meet the needs of residents in deprived neighborhoods, may increase participation and prove to be more effective in preventing disease. In Europe, no study has previously looked at the association between neighborhood deprivation and participation in a population-based health check. The study population comprised 12,768 persons invited for a health check including screening for ischemic heart disease and lifestyle counseling. The study population was randomly drawn from a population of 179,097 persons living in 73 neighborhoods in Denmark. Data on neighborhood deprivation (percentage with basic education, with low income and not in work) and individual socioeconomic position were retrieved from national administrative registers. Multilevel regression analyses with log links and binary distributions were conducted to obtain relative risks, intraclass correlation coefficients and proportional change in variance. Large differences between neighborhoods existed in both deprivation levels and neighborhood health check participation rate (mean 53%; range 35-84%). In multilevel analyses adjusted for age and sex, higher levels of all three indicators of neighborhood deprivation and a deprivation score were associated with lower participation in a dose-response fashion. Persons living in the most deprived neighborhoods had up to 37% decreased probability of participating compared to those living in the least deprived neighborhoods. Inclusion of individual socioeconomic position in the model attenuated the neighborhood deprivation coefficients, but all except for income deprivation remained statistically significant. Neighborhood deprivation was associated with participation in a population-based health check in a dose-response manner, in which increasing neighborhood deprivation was associated with decreasing participation. This suggests the need to develop preventive health checks tailored to deprived neighborhoods.
Baqué, Michèle; Amendt, Jens
2013-01-01
Developmental data of juvenile blow flies (Diptera: Calliphoridae) are typically used to calculate the age of immature stages found on or around a corpse and thus to estimate a minimum post-mortem interval (PMI(min)). However, many of those data sets do not take into account that immature blow flies grow in a non-linear fashion. Linear models do not provide sufficiently reliable age estimates and may even lead to an erroneous determination of the PMI(min). According to the Daubert standard and the need for improvements in forensic science, new statistical tools such as smoothing methods and mixed models allow the modelling of non-linear relationships and expand the field of statistical analyses. The present study introduces the background and application of these statistical techniques by analysing a model which describes the development of the forensically important blow fly Calliphora vicina at different temperatures. The comparison of three statistical methods (linear regression, generalised additive modelling and generalised additive mixed modelling) clearly demonstrates that only the latter provided regression parameters that reflect the data adequately. We focus explicitly on both the exploration of the data--to assure their quality and to show the importance of checking it carefully prior to conducting the statistical tests--and the validation of the resulting models. Hence, we present a common method for evaluating and testing forensic entomological data sets by using for the first time generalised additive mixed models.
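A minimal sketch of the contrast drawn above, assuming the third-party pygam package and an invented growth curve (the study itself used generalised additive mixed models in R):

    import numpy as np
    from pygam import LinearGAM, s

    rng = np.random.default_rng(2)
    age_h = rng.uniform(0, 120, 200)                      # larval age in hours
    length = 2 + 15 / (1 + np.exp(-(age_h - 60) / 12))    # non-linear growth curve
    length += rng.normal(0, 0.5, age_h.size)

    gam = LinearGAM(s(0)).fit(age_h.reshape(-1, 1), length)   # smooth, non-linear fit
    slope, intercept = np.polyfit(age_h, length, 1)           # naive linear fit
    grid = np.linspace(0, 120, 5).reshape(-1, 1)
    print("GAM predictions:", np.round(gam.predict(grid), 2))
    print("linear fit slope %.3f, intercept %.3f" % (slope, intercept))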
Mezzenga, Emilio; D'Errico, Vincenzo; Sarnelli, Anna; Strigari, Lidia; Menghi, Enrico; Marcocci, Francesco; Bianchini, David; Benassi, Marcello
2016-01-01
The purpose of this study was to retrospectively evaluate the results from a Helical TomoTherapy Hi-Art treatment system relating to quality controls based on daily static and dynamic output checks using statistical process control methods. Individual value X-charts, exponentially weighted moving average charts, and process capability and acceptability indices were used to monitor the treatment system performance. Daily output values measured from January 2014 to January 2015 were considered. The results obtained showed that, although the process was in control, an out-of-control situation coincided with the principal maintenance intervention on the treatment system. In particular, process capability indices showed a decreasing percentage of points in control, which was, however, acceptable according to AAPM TG148 guidelines. Our findings underline the importance of restricting the acceptable range of daily output checks and suggest a future line of investigation for a detailed process control of daily output checks for the Helical TomoTherapy Hi-Art treatment system.
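To make the tools named above concrete, here is a minimal sketch of an individuals X-chart, an EWMA statistic and a capability index for hypothetical daily output deviations; the tolerance of plus/minus 3% and all data are illustrative, not the clinic's values.

    import numpy as np

    rng = np.random.default_rng(3)
    output = rng.normal(0.2, 0.6, 120)       # daily output deviation from baseline (%)

    # Individuals chart: centre line +/- 2.66 * mean moving range.
    mr_bar = np.mean(np.abs(np.diff(output)))
    cl = output.mean()
    ucl, lcl = cl + 2.66 * mr_bar, cl - 2.66 * mr_bar

    # EWMA with smoothing constant lambda = 0.2.
    lam, ewma = 0.2, [output[0]]
    for x in output[1:]:
        ewma.append(lam * x + (1 - lam) * ewma[-1])

    # Process capability against an illustrative +/-3% tolerance.
    usl, lsl = 3.0, -3.0
    cp = (usl - lsl) / (6 * output.std(ddof=1))

    print("X-chart limits (%.2f, %.2f); points out of control: %d"
          % (lcl, ucl, np.sum((output > ucl) | (output < lcl))))
    print("final EWMA value %.2f, Cp = %.2f" % (ewma[-1], cp))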
The impact of joint responses of devices in an airport security system.
Nie, Xiaofeng; Batta, Rajan; Drury, Colin G; Lin, Li
2009-02-01
In this article, we consider a model for an airport security system in which the declaration of a threat is based on the joint responses of inspection devices. This is in contrast to the typical system in which each check station independently declares a passenger as having a threat or not having a threat. In our framework the declaration of threat/no-threat is based upon the passenger scores at the check stations he/she goes through. To do this we use concepts from classification theory in the field of multivariate statistical analysis and focus on the main objective of minimizing the expected cost of misclassification. The corresponding correct classification and misclassification probabilities can be obtained by using a simulation-based method. After computing the overall false alarm and false clear probabilities, we compare our joint response system with two other independently operated systems. A model that groups passengers in a manner that minimizes the false alarm probability while maintaining the false clear probability within specifications set by a security authority is considered. We also analyze the staffing needs at each check station for such an inspection scheme. An illustrative example is provided along with sensitivity analysis on key model parameters. A discussion is provided on some implementation issues, on the various assumptions made in the analysis, and on potential drawbacks of the approach.
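A toy sketch of the joint-response rule, under assumptions that are ours rather than the article's: two device scores are bivariate normal for the threat and no-threat populations, and a passenger is flagged when the cost-weighted likelihood ratio favours the threat class; every parameter value is hypothetical.

    import numpy as np
    from scipy.stats import multivariate_normal

    rng = np.random.default_rng(4)
    f_threat = multivariate_normal(mean=[2.0, 2.0], cov=[[1.0, 0.3], [0.3, 1.0]])
    f_clear = multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, 0.3], [0.3, 1.0]])
    p_threat = 0.001                            # prior probability of a threat
    c_false_clear, c_false_alarm = 1000.0, 1.0  # relative misclassification costs

    def declare_threat(scores):
        # Bayes rule minimising the expected cost of misclassification.
        lr = f_threat.pdf(scores) / f_clear.pdf(scores)
        return lr > (c_false_alarm * (1 - p_threat)) / (c_false_clear * p_threat)

    clear_scores = f_clear.rvs(size=100000, random_state=rng)
    threat_scores = f_threat.rvs(size=100000, random_state=rng)
    print("false alarm probability: %.4f" % np.mean(declare_threat(clear_scores)))
    print("false clear probability: %.4f" % np.mean(~declare_threat(threat_scores)))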
Statistical Quality Control of Moisture Data in GEOS DAS
NASA Technical Reports Server (NTRS)
Dee, D. P.; Rukhovets, L.; Todling, R.
1999-01-01
A new statistical quality control algorithm was recently implemented in the Goddard Earth Observing System Data Assimilation System (GEOS DAS). The final step in the algorithm consists of an adaptive buddy check that either accepts or rejects outlier observations based on a local statistical analysis of nearby data. A basic assumption in any such test is that the observed field is spatially coherent, in the sense that nearby data can be expected to confirm each other. However, the buddy check resulted in excessive rejection of moisture data, especially during the Northern Hemisphere summer. The analysis moisture variable in GEOS DAS is water vapor mixing ratio. Observational evidence shows that the distribution of mixing ratio errors is far from normal. Furthermore, spatial correlations among mixing ratio errors are highly anisotropic and difficult to identify. Both factors contribute to the poor performance of the statistical quality control algorithm. To alleviate the problem, we applied the buddy check to relative humidity data instead. This variable explicitly depends on temperature and therefore exhibits a much greater spatial coherence. As a result, reject rates of moisture data are much more reasonable and homogeneous in time and space.
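The following is a deliberately simplified buddy check, not the GEOS DAS implementation: each observation is compared with the mean of its neighbours and rejected when the discrepancy exceeds a tolerance scaled by the local spread; the radius and tolerance values are illustrative.

    import numpy as np

    def buddy_check(values, positions, radius=1.0, tol=3.0):
        """Return a boolean mask of accepted observations."""
        values, positions = np.asarray(values, float), np.asarray(positions, float)
        accept = np.ones(values.size, dtype=bool)
        for i in range(values.size):
            d = np.abs(positions - positions[i])
            buddies = (d > 0) & (d <= radius)
            if buddies.sum() < 2:
                continue                      # too few neighbours to judge
            spread = values[buddies].std(ddof=1)
            if spread > 0 and abs(values[i] - values[buddies].mean()) > tol * spread:
                accept[i] = False
        return accept

    rh = np.array([62.0, 65.0, 64.0, 95.0, 63.0, 61.0])   # relative humidity (%)
    x = np.array([0.0, 0.3, 0.6, 0.9, 1.2, 1.5])          # station positions
    print(buddy_check(rh, x))                             # the 95% value is rejected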
Load Model Verification, Validation and Calibration Framework by Statistical Analysis on Field Data
NASA Astrophysics Data System (ADS)
Jiao, Xiangqing; Liao, Yuan; Nguyen, Thai
2017-11-01
Accurate load models are critical for power system analysis and operation. A large amount of research work has been done on load modeling. Most of the existing research focuses on developing load models, while little has been done on developing formal load model verification and validation (V&V) methodologies or procedures. Most of the existing load model validation is based on qualitative rather than quantitative analysis. In addition, not all aspects of the model V&V problem have been addressed by the existing approaches. To complement the existing methods, this paper proposes a novel load model verification and validation framework that can systematically and more comprehensively examine a load model's effectiveness and accuracy. Statistical analysis, instead of a visual check, quantifies the load model's accuracy and provides a confidence level of the developed load model for model users. The analysis results can also be used to calibrate load models. The proposed framework can be used as guidance for utility engineers and researchers to systematically examine load models. The proposed method is demonstrated through analysis of field measurements collected from a utility system.
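As a small illustration of replacing a visual check with a quantitative one (not the proposed framework itself), the sketch below scores the match between measured and simulated load response with an error metric and a confidence interval; the data and the MW unit are hypothetical.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    measured = 100 + 5 * np.sin(np.linspace(0, 6, 200)) + rng.normal(0, 1.0, 200)
    simulated = measured + rng.normal(0.4, 1.2, 200)    # model output with a small bias

    err = simulated - measured
    rmse = np.sqrt(np.mean(err**2))
    ci = stats.t.interval(0.95, df=err.size - 1, loc=err.mean(), scale=stats.sem(err))
    print("RMSE = %.2f MW, mean error = %.2f MW, 95%% CI (%.2f, %.2f)"
          % (rmse, err.mean(), *ci))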
Truth, models, model sets, AIC, and multimodel inference: a Bayesian perspective
Barker, Richard J.; Link, William A.
2015-01-01
Statistical inference begins with viewing data as realizations of stochastic processes. Mathematical models provide partial descriptions of these processes; inference is the process of using the data to obtain a more complete description of the stochastic processes. Wildlife and ecological scientists have become increasingly concerned with the conditional nature of model-based inference: what if the model is wrong? Over the last 2 decades, Akaike's Information Criterion (AIC) has been widely and increasingly used in wildlife statistics for 2 related purposes, first for model choice and second to quantify model uncertainty. We argue that for the second of these purposes, the Bayesian paradigm provides the natural framework for describing uncertainty associated with model choice and provides the most easily communicated basis for model weighting. Moreover, Bayesian arguments provide the sole justification for interpreting model weights (including AIC weights) as coherent (mathematically self-consistent) model probabilities. This interpretation requires treating the model as an exact description of the data-generating mechanism. We discuss the implications of this assumption, and conclude that more emphasis is needed on model checking to provide confidence in the quality of inference.
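For reference, the AIC-based model weights mentioned above are computed as in the short example below, with hypothetical AIC values for three competing models.

    import numpy as np

    aic = np.array([210.3, 212.1, 218.9])
    delta = aic - aic.min()
    weights = np.exp(-0.5 * delta) / np.sum(np.exp(-0.5 * delta))
    # Interpretable as model probabilities only under the assumptions
    # discussed in the article.
    print(np.round(weights, 3))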
Implementation and Impact of the Check & Connect Mentoring Program
ERIC Educational Resources Information Center
Heppen, Jessica; O'Cummings, Mindee; Poland, Lindsay; Zeiser, Krissy; Mills, Nicholas
2015-01-01
High school graduation rates remain unacceptably low in the U.S., especially among disadvantaged youth (Chapman, Laird, Ifill, & KewalRamani, 2011; Stillwell, 2010), with troubling implications for future earnings and employment status (Bureau of Labor Statistics, 2012). Check & Connect (C&C) is an individualized program that pairs…
Statistical inference in comparing DInSAR and GPS data in fault areas
NASA Astrophysics Data System (ADS)
Barzaghi, R.; Borghi, A.; Kunzle, A.
2012-04-01
DInSAR and GPS data are nowadays routinely used in geophysical investigation, e.g. for estimating slip rate over the fault plane in seismogenic areas. This analysis is usually done by mapping the surface deformation rates as estimated by GPS and DInSAR over the fault plane using suitable geophysical models (e.g. the Okada model). Usually, DInSAR vertical velocities and GPS horizontal velocities are used to obtain an integrated slip estimate. However, it is sometimes critical to merge the two kinds of information since they may reflect a common undergoing geophysical signal plus different disturbing signals that are not related to the fault dynamic. In GPS and DInSAR data analysis, these artifacts are mainly connected to signal propagation in the atmosphere and to hydrological phenomena (e.g. variation in the water table). Thus, a coherence test between the two kinds of information must be carried out in order to properly merge the GPS and DInSAR velocities in the inversion procedure. To this aim, statistical tests have been studied to check for the compatibility of the two deformation rate estimates coming from GPS and DInSAR data analysis. This has been done according to both standard and Bayesian testing methodologies. The effectiveness of the proposed inference methods has been checked with numerical simulations in the case of a normal fault. The fault structure is defined following the Pollino fault model and both GPS and DInSAR data are simulated according to real data acquired in this area.
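A minimal classical compatibility check of the kind alluded to (not the Bayesian test developed in the work) could look like the following; the velocity values and standard errors are invented.

    import numpy as np
    from scipy import stats

    v_gps, s_gps = 2.1, 0.4        # deformation rate (mm/yr) and standard error, GPS
    v_insar, s_insar = 3.0, 0.5    # deformation rate (mm/yr) and standard error, DInSAR

    z = (v_gps - v_insar) / np.sqrt(s_gps**2 + s_insar**2)
    p = 2 * stats.norm.sf(abs(z))
    print("z = %.2f, two-sided p = %.3f" % (z, p))
    # A small p-value would argue against merging the two rates in a joint inversion.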
Singh, Kunwar P; Gupta, Shikha; Ojha, Priyanka; Rai, Premanjali
2013-04-01
The research aims to develop artificial intelligence (AI)-based models to predict the adsorptive removal of 2-chlorophenol (CP) in aqueous solution by coconut shell carbon (CSC) using four operational variables (pH of solution, adsorbate concentration, temperature, and contact time), and to investigate their effects on the adsorption process. Accordingly, based on a factorial design, 640 batch experiments were conducted. Nonlinearities in experimental data were checked using Brock-Dechert-Scheinkman (BDS) statistics. Five nonlinear models were constructed to predict the adsorptive removal of CP in aqueous solution by CSC using four variables as input. Performances of the constructed models were evaluated and compared using statistical criteria. BDS statistics revealed strong nonlinearity in experimental data. Performance of all the models constructed here was satisfactory. Radial basis function network (RBFN) and multilayer perceptron network (MLPN) models performed better than generalized regression neural network, support vector machines, and gene expression programming models. Sensitivity analysis revealed that the contact time had the highest effect on adsorption, followed by the solution pH, temperature, and CP concentration. The study concluded that all the models constructed here were capable of capturing the nonlinearity in the data. The better generalization and predictive performance of the RBFN and MLPN models suggested that these can be used to predict the adsorption of CP in aqueous solution using CSC.
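A rough scikit-learn sketch of the kind of nonlinear regression described (not the authors' software, and with synthetic data that only mimic the shape of such a data set):

    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(6)
    n = 640
    X = np.column_stack([rng.uniform(2, 10, n),      # pH
                         rng.uniform(50, 200, n),    # CP concentration (mg/L)
                         rng.uniform(15, 45, n),     # temperature (deg C)
                         rng.uniform(10, 240, n)])   # contact time (min)
    y = 40 + 4 * np.log(X[:, 3]) - 2 * np.abs(X[:, 0] - 6) + rng.normal(0, 2, n)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    mlp = make_pipeline(StandardScaler(),
                        MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0))
    mlp.fit(X_tr, y_tr)
    print("test R^2 = %.2f" % mlp.score(X_te, y_te))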
Statistical Anomalies of Bitflips in SRAMs to Discriminate SBUs From MCUs
NASA Astrophysics Data System (ADS)
Clemente, Juan Antonio; Franco, Francisco J.; Villa, Francesca; Baylac, Maud; Rey, Solenne; Mecha, Hortensia; Agapito, Juan A.; Puchner, Helmut; Hubert, Guillaume; Velazco, Raoul
2016-08-01
Recently, the occurrence of multiple events in static tests has been investigated by checking the statistical distribution of the difference between the addresses of the words containing bitflips. That method has been successfully applied to Field Programmable Gate Arrays (FPGAs) and the original authors indicate that it is also valid for SRAMs. This paper presents a modified methodology that is based on checking the XORed addresses with bitflips, rather than on the difference. Irradiation tests on CMOS 130 & 90 nm SRAMs with 14-MeV neutrons have been performed to validate this methodology. Results in high-altitude environments are also presented and cross-checked with theoretical predictions. In addition, this methodology has also been used to detect modifications in the organization of said memories. Theoretical predictions have been validated with actual data provided by the manufacturer.
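The core of the described methodology can be illustrated in a few lines: XOR the addresses of every pair of words containing bitflips and look for over-represented XOR values; the addresses below are made up.

    from itertools import combinations
    from collections import Counter

    flipped_addresses = [0x1A20, 0x1A21, 0x1A24, 0x0C03, 0x73F0, 0x73F1]

    xor_counts = Counter(a ^ b for a, b in combinations(flipped_addresses, 2))
    for value, count in xor_counts.most_common(5):
        print("XOR value 0x%04X seen %d time(s)" % (value, count))
    # XOR values that occur more often than a uniform SBU-only model would
    # predict point to multiple-cell upsets or to the memory organisation.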
Addressing Dynamic Issues of Program Model Checking
NASA Technical Reports Server (NTRS)
Lerda, Flavio; Visser, Willem
2001-01-01
Model checking real programs has recently become an active research area. Programs however exhibit two characteristics that make model checking difficult: the complexity of their state and the dynamic nature of many programs. Here we address both these issues within the context of the Java PathFinder (JPF) model checker. Firstly, we will show how the state of a Java program can be encoded efficiently and how this encoding can be exploited to improve model checking. Next we show how to use symmetry reductions to alleviate some of the problems introduced by the dynamic nature of Java programs. Lastly, we show how distributed model checking of a dynamic program can be achieved, and furthermore, how dynamic partitions of the state space can improve model checking. We support all our findings with results from applying these techniques within the JPF model checker.
Evaluation of the implementation of an integrated program for musculoskeletal system care.
Larrañaga, Igor; Soto-Gordoa, Myriam; Arrospide, Arantzazu; Jauregi, María Luz; Millas, Jesús; San Vicente, Ricardo; Aguirrebeña, Jabier; Mar, Javier
The chronic nature of musculoskeletal diseases requires integrated care involving Primary Care and the specialities of Rheumatology, Traumatology and Rehabilitation. The aim of this study was to assess the implementation of an integrated organizational model in osteoporosis, low back pain, shoulder disease and knee disease using Deming's continuous improvement process and considering referrals and resource consumption. A simulation model was used in the planning stage to predict the evolution of musculoskeletal disease resource consumption and to carry out a Budget Impact Analysis from 2012 to 2020 in the Goierri-Alto Urola region. In the checking stage the status of the process in 2014 was evaluated using statistical analysis to check the degree of achievement of the objectives for each speciality. Simulation models showed that the population with musculoskeletal disease in Goierri-Alto Urola will increase by 4.4% by 2020. Because of that, the expenses of a conventional healthcare system will have increased by 5.9%. However, if the intervention reaches its objectives the budget would decrease by 8.5%. The statistical analysis showed a decline in referrals to the Traumatology service and a reduction of successive consultations in all specialities. The implementation of the integrated organizational model in osteoporosis, low back pain, shoulder disease and knee disease is still at an early stage. However, the empowerment of Primary Care improved patient referrals and reduced costs. Copyright © 2016 Elsevier España, S.L.U. and Sociedad Española de Reumatología y Colegio Mexicano de Reumatología. All rights reserved.
Effective connectivity: Influence, causality and biophysical modeling
Valdes-Sosa, Pedro A.; Roebroeck, Alard; Daunizeau, Jean; Friston, Karl
2011-01-01
This is the final paper in a Comments and Controversies series dedicated to “The identification of interacting networks in the brain using fMRI: Model selection, causality and deconvolution”. We argue that discovering effective connectivity depends critically on state-space models with biophysically informed observation and state equations. These models have to be endowed with priors on unknown parameters and afford checks for model identifiability. We consider the similarities and differences among Dynamic Causal Modeling, Granger Causal Modeling and other approaches. We establish links between past and current statistical causal modeling, in terms of Bayesian dependency graphs and Wiener–Akaike–Granger–Schweder influence measures. We show that some of the challenges faced in this field have promising solutions and speculate on future developments. PMID:21477655
NASA Technical Reports Server (NTRS)
Lee, Sangsan; Lele, Sanjiva K.; Moin, Parviz
1992-01-01
For the numerical simulation of inhomogeneous turbulent flows, a method is developed for generating stochastic inflow boundary conditions with a prescribed power spectrum. Turbulence statistics from spatial simulations using this method with a low fluctuation Mach number are in excellent agreement with the experimental data, which validates the procedure. Turbulence statistics from spatial simulations are also compared to those from temporal simulations using Taylor's hypothesis. Statistics such as turbulence intensity, vorticity, and velocity derivative skewness compare favorably with the temporal simulation. However, the statistics of dilatation show a significant departure from those obtained in the temporal simulation. To directly check the applicability of Taylor's hypothesis, space-time correlations of fluctuations in velocity, vorticity, and dilatation are investigated. Convection velocities based on vorticity and velocity fluctuations are computed as functions of the spatial and temporal separations. The profile of the space-time correlation of dilatation fluctuations is explained via a wave propagation model.
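One standard way to generate a stochastic signal with a prescribed power spectrum (assumed here for illustration; the paper's exact procedure may differ) is to draw random Fourier coefficients with the target variance and invert:

    import numpy as np

    rng = np.random.default_rng(7)
    n, dt = 4096, 1.0
    freqs = np.fft.rfftfreq(n, dt)
    target = np.zeros_like(freqs)
    target[1:] = freqs[1:]**4 * np.exp(-2 * (freqs[1:] / 0.05)**2)   # model spectrum

    coeffs = np.sqrt(target / 2) * (rng.normal(size=freqs.size)
                                    + 1j * rng.normal(size=freqs.size))
    signal = np.fft.irfft(coeffs, n)

    # The periodogram of the realisation scatters around the target spectrum.
    periodogram = np.abs(np.fft.rfft(signal))**2
    print("correlation with target spectrum: %.3f"
          % np.corrcoef(periodogram[1:], target[1:])[0, 1])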
SOCR Analyses - an Instructional Java Web-based Statistical Analysis Toolkit.
Chu, Annie; Cui, Jenny; Dinov, Ivo D
2009-03-01
The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include commonly used models in undergraduate statistics courses like linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons, such as t-test in the parametric category; and Wilcoxon rank sum test, Kruskal-Wallis test, Friedman's test, in the non-parametric category. SOCR Analyses also include several hypothesis test models, such as Contingency tables, Friedman's test and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), hoping to contribute to the efforts of the statistical computing community. The code includes functionality for each specific analysis model and it has general utilities that can be applied in various statistical computing tasks. For example, concrete methods with API (Application Programming Interface) have been implemented in statistical summary, least square solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of the SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is on-going and more functions and tools are being added to it, these resources are constantly improved. The reader is strongly encouraged to check the SOCR site for most updated information and newly added models.
Program Model Checking: A Practitioner's Guide
NASA Technical Reports Server (NTRS)
Pressburger, Thomas T.; Mansouri-Samani, Masoud; Mehlitz, Peter C.; Pasareanu, Corina S.; Markosian, Lawrence Z.; Penix, John J.; Brat, Guillaume P.; Visser, Willem C.
2008-01-01
Program model checking is a verification technology that uses state-space exploration to evaluate large numbers of potential program executions. Program model checking provides improved coverage over testing by systematically evaluating all possible test inputs and all possible interleavings of threads in a multithreaded system. Model-checking algorithms use several classes of optimizations to reduce the time and memory requirements for analysis, as well as heuristics for meaningful analysis of partial areas of the state space. Our goal in this guidebook is to assemble, distill, and demonstrate emerging best practices for applying program model checking. We offer it as a starting point and introduction for those who want to apply model checking to software verification and validation. The guidebook will not discuss any specific tool in great detail, but we provide references for specific tools.
Statistical analysis of the count and profitability of air conditioners.
Rady, El Houssainy A; Mohamed, Salah M; Abd Elmegaly, Alaa A
2018-08-01
This article presents a statistical analysis of the number and profitability of air conditioners in an Egyptian company. Whether the distribution is the same across the levels of each categorical variable was checked using the Kruskal-Wallis test.
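For completeness, a minimal example of the Kruskal-Wallis test on hypothetical profitability figures for three product categories:

    from scipy import stats

    category_a = [12.1, 14.3, 11.8, 15.2, 13.9]
    category_b = [10.4, 11.0, 12.2, 9.8, 10.9]
    category_c = [16.0, 15.4, 17.1, 14.8, 16.5]

    stat, p = stats.kruskal(category_a, category_b, category_c)
    print("H = %.2f, p = %.4f" % (stat, p))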
Verifying Multi-Agent Systems via Unbounded Model Checking
NASA Technical Reports Server (NTRS)
Kacprzak, M.; Lomuscio, A.; Lasica, T.; Penczek, W.; Szreter, M.
2004-01-01
We present an approach to the problem of verification of epistemic properties in multi-agent systems by means of symbolic model checking. In particular, it is shown how to extend the technique of unbounded model checking from a purely temporal setting to a temporal-epistemic one. In order to achieve this, we base our discussion on interpreted systems semantics, a popular semantics used in multi-agent systems literature. We give details of the technique and show how it can be applied to the well known train, gate and controller problem. Keywords: model checking, unbounded model checking, multi-agent systems
Semantic Importance Sampling for Statistical Model Checking
2015-01-16
SMT calls while maintaining correctness. Finally, we implement SIS in a tool called osmosis and use it to verify a number of stochastic systems with ... Section 2 surveys related work. Section 3 presents background definitions and concepts. Section 4 presents SIS, and Section 5 presents our tool osmosis. In ... for which I∗M⊨Φ(x) = 1. We do this by first randomly selecting a cube c from C∗ with uniform probability, since each cube has equal probability ...
Qin, Heng; Zuo, Yong; Zhang, Dong; Li, Yinghui; Wu, Jian
2017-03-06
Through a slight modification of typical photomultiplier tube (PMT) receiver output statistics, a generalized received response model considering both scattered propagation and random detection is presented to investigate the impact of inter-symbol interference (ISI) on the link data rate of short-range non-line-of-sight (NLOS) ultraviolet communication. Good agreement with the experimental results by numerical simulation is shown. Based on the received response characteristics, a heuristic check matrix construction algorithm for low-density parity-check (LDPC) codes is further proposed to approach the data rate bound derived in a delayed sampling (DS) binary pulse position modulation (PPM) system. Compared to conventional LDPC coding methods, a better bit error ratio (BER) below 1E-05 is achieved for short-range NLOS UVC systems operating at a data rate of 2 Mbps.
1977-07-01
C          ... OF SIZE XN1.
C UE2    - UTILITY OF EXPERIMENT OF SIZE XN2.
C ICHECK - VARIABLE USED TO CHECK FOR TERMINATION.
C
      DIMENSION SUBLIM(20),UPLIM(20),UE1(20) ...
      ... = UPLIM(K+1)-XN1(K+1)+SUBLIM(K+1)
C     CHECK FOR TERMINATION.
  944 ICHECK = SUBLIM(K)+2
      IF(ICHECK.GE.UPLIM(K).OR.K.GT.20) GO TO 930
      GO TO 920
  930 ...
Toward improved design of check dam systems: A case study in the Loess Plateau, China
NASA Astrophysics Data System (ADS)
Pal, Debasish; Galelli, Stefano; Tang, Honglei; Ran, Qihua
2018-04-01
Check dams are one of the most common strategies for controlling sediment transport in erosion-prone areas, along with soil and water conservation measures. However, existing mathematical models that simulate sediment production and delivery are often unable to simulate how the storage capacity of check dams varies with time. To explicitly account for this process, and to support the design of check dam systems, we developed a modelling framework consisting of two components, namely (1) the spatially distributed Soil Erosion and Sediment Delivery Model (WaTEM/SEDEM), and (2) a network-based model of check dam storage dynamics. The two models are run sequentially, with the second model receiving the initial sediment input to check dams from WaTEM/SEDEM. The framework is first applied to Shejiagou catchment, a 4.26 km2 area located in the Loess Plateau, China, where we study the effect of the existing check dam system on sediment dynamics. Results show that the deployment of check dams significantly altered the sediment delivery ratio of the catchment. Furthermore, the network-based model reveals a large variability in the life expectancy of check dams and abrupt changes in their filling rates. The application of the framework to six alternative check dam deployment scenarios is then used to illustrate its usefulness for planning purposes, and to derive some insights on the effect of key decision variables, such as the number, size, and site location of check dams. Simulation results suggest that better performance, in terms of life expectancy and sediment delivery ratio, could have been achieved with an alternative deployment strategy.
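A toy version of the second component (the storage dynamics of dams arranged in series) is sketched below under simplifying assumptions that are ours, not the paper's: each dam traps a fixed fraction of the incoming sediment until its capacity is exhausted, and the remainder is routed downstream.

    def route_sediment(capacities_m3, annual_inflows_m3, trap_efficiency=0.8, years=30):
        """Crude year-by-year filling of check dams ordered from upstream to downstream."""
        remaining = list(capacities_m3)
        for year in range(1, years + 1):
            passed = 0.0
            for i, inflow in enumerate(annual_inflows_m3):
                load = inflow + passed                     # local erosion + upstream spill
                trapped = min(trap_efficiency * load, remaining[i])
                remaining[i] -= trapped
                passed = load - trapped
            if any(r <= 0 for r in remaining):
                print("year %d: at least one dam is full; remaining capacities %s"
                      % (year, [round(r) for r in remaining]))
                break
        return remaining

    route_sediment([50000.0, 80000.0, 120000.0], [3000.0, 2500.0, 4000.0])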
Implementing Model-Check for Employee and Management Satisfaction
NASA Technical Reports Server (NTRS)
Jones, Corey; LaPha, Steven
2013-01-01
This presentation will discuss methods by which ModelCheck can be implemented not only to improve model quality but also to satisfy both employees and management through different sets of quality checks. This approach allows a standard set of modeling practices to be upheld throughout a company, with minimal interaction required by the end user. The presenter will demonstrate how to create multiple ModelCheck standards, preventing users from evading the system, and how it can improve the quality of drawings and models.
Survival analysis in hematologic malignancies: recommendations for clinicians
Delgado, Julio; Pereira, Arturo; Villamor, Neus; López-Guillermo, Armando; Rozman, Ciril
2014-01-01
The widespread availability of statistical packages has undoubtedly helped hematologists worldwide in the analysis of their data, but has also led to the inappropriate use of statistical methods. In this article, we review some basic concepts of survival analysis and also make recommendations about how and when to perform each particular test using SPSS, Stata and R. In particular, we describe a simple way of defining cut-off points for continuous variables and the appropriate and inappropriate uses of the Kaplan-Meier method and Cox proportional hazard regression models. We also provide practical advice on how to check the proportional hazards assumption and briefly review the role of relative survival and multiple imputation. PMID:25176982
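A brief sketch of the workflow discussed, assuming the Python lifelines package (rather than the SPSS, Stata and R routes covered in the article) and its bundled example data set:

    from lifelines import KaplanMeierFitter, CoxPHFitter
    from lifelines.datasets import load_rossi

    df = load_rossi()                              # example data shipped with lifelines

    km = KaplanMeierFitter().fit(df["week"], event_observed=df["arrest"])
    print("Kaplan-Meier survival at week 52: %.3f" % km.predict(52))

    cph = CoxPHFitter().fit(df, duration_col="week", event_col="arrest")
    cph.print_summary()
    cph.check_assumptions(df, p_value_threshold=0.05)   # flags covariates that appear
                                                        # to violate proportional hazards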
Symbolic LTL Compilation for Model Checking: Extended Abstract
NASA Technical Reports Server (NTRS)
Rozier, Kristin Y.; Vardi, Moshe Y.
2007-01-01
In Linear Temporal Logic (LTL) model checking, we check LTL formulas representing desired behaviors against a formal model of the system designed to exhibit these behaviors. To accomplish this task, the LTL formulas must be translated into automata [21]. We focus on LTL compilation by investigating LTL satisfiability checking via a reduction to model checking. Having shown that symbolic LTL compilation algorithms are superior to explicit automata construction algorithms for this task [16], we concentrate here on seeking a better symbolic algorithm. We present experimental data comparing algorithmic variations such as normal forms, encoding methods, and variable ordering and examine their effects on performance metrics including processing time and scalability. Safety critical systems, such as air traffic control, life support systems, hazardous environment controls, and automotive control systems, pervade our daily lives, yet testing and simulation alone cannot adequately verify their reliability [3]. Model checking is a promising approach to formal verification for safety critical systems which involves creating a formal mathematical model of the system and translating desired safety properties into a formal specification for this model. The complement of the specification is then checked against the system model. When the model does not satisfy the specification, model-checking tools accompany this negative answer with a counterexample, which points to an inconsistency between the system and the desired behaviors and aids debugging efforts.
Reducing Check-in Errors at Brigham Young University through Statistical Process Control
ERIC Educational Resources Information Center
Spackman, N. Andrew
2005-01-01
The relationship between the library and its patrons is damaged and the library's reputation suffers when returned items are not checked in. An informal survey reveals librarians' concern for this problem and their efforts to combat it, although few libraries collect objective measurements of errors or the effects of improvement efforts. Brigham…
This SOP describes the methods and procedures for two types of QA procedures: spot checks of hand entered data, and QA procedures for co-located and split samples. The spot checks were used to determine whether the error rate goal for the input of hand entered data was being att...
Finding Feasible Abstract Counter-Examples
NASA Technical Reports Server (NTRS)
Pasareanu, Corina S.; Dwyer, Matthew B.; Visser, Willem; Clancy, Daniel (Technical Monitor)
2002-01-01
A strength of model checking is its ability to automate the detection of subtle system errors and produce traces that exhibit those errors. Given the high computational cost of model checking most researchers advocate the use of aggressive property-preserving abstractions. Unfortunately, the more aggressively a system is abstracted the more infeasible behavior it will have. Thus, while abstraction enables efficient model checking it also threatens the usefulness of model checking as a defect detection tool, since it may be difficult to determine whether a counter-example is feasible and hence worth developer time to analyze. We have explored several strategies for addressing this problem by extending an explicit-state model checker, Java PathFinder (JPF), to search for and analyze counter-examples in the presence of abstractions. We demonstrate that these techniques effectively preserve the defect detection ability of model checking in the presence of aggressive abstraction by applying them to check properties of several abstracted multi-threaded Java programs. These new capabilities are not specific to JPF and can be easily adapted to other model checking frameworks; we describe how this was done for the Bandera toolset.
Code of Federal Regulations, 2014 CFR
2014-01-01
... If you make the deposit in person to one of our employees, funds from the following ... [deposits made in different states or check processing regions] ... Substitute Checks and Your Rights: What Is a Substitute Check? To make check processing faster, federal law ...
NASA Astrophysics Data System (ADS)
Narasimha Murthy, K. V.; Saravana, R.; Vijaya Kumar, K.
2018-04-01
The paper investigates the stochastic modelling and forecasting of monthly average maximum and minimum temperature patterns through a suitable seasonal auto-regressive integrated moving average (SARIMA) model for the period 1981-2015 in India. The variations and distributions of monthly maximum and minimum temperatures are analyzed through box plots and cumulative distribution functions. The time series plot indicates that the maximum temperature series contains sharp peaks in almost all the years, while this is not true for the minimum temperature series, so the two series are modelled separately. The candidate SARIMA model has been chosen by examining the autocorrelation function (ACF), partial autocorrelation function (PACF), and inverse autocorrelation function (IACF) of the logarithmically transformed temperature series. The SARIMA (1, 0, 0) × (0, 1, 1)12 model is selected for the monthly average maximum and minimum temperature series based on the minimum Bayesian information criterion. The model parameters are obtained using the maximum-likelihood method with the help of the standard error of residuals. The adequacy of the selected model is determined using correlation diagnostic checking through the ACF, PACF, IACF, and p values of the Ljung-Box test statistic of the residuals, and using normal diagnostic checking through the kernel and normal density curves of the histogram and the Q-Q plot. Finally, forecasts of the monthly maximum and minimum temperature patterns of India for the next 3 years have been produced with the help of the selected model.
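As a concrete sketch of the model structure named above (statsmodels syntax, fitted to a made-up monthly temperature series rather than the Indian data):

    import numpy as np
    import pandas as pd
    from statsmodels.tsa.statespace.sarimax import SARIMAX
    from statsmodels.stats.diagnostic import acorr_ljungbox

    rng = np.random.default_rng(8)
    months = pd.date_range("1981-01", periods=420, freq="MS")
    seasonal = 30 + 8 * np.sin(2 * np.pi * (months.month - 4) / 12)
    temps = pd.Series(seasonal + rng.normal(0, 0.8, months.size), index=months)

    model = SARIMAX(np.log(temps), order=(1, 0, 0), seasonal_order=(0, 1, 1, 12))
    result = model.fit(disp=False)
    print("BIC = %.1f" % result.bic)
    print(acorr_ljungbox(result.resid, lags=[12]))            # residual autocorrelation check
    forecast = result.get_forecast(steps=36).predicted_mean   # next 3 years, log scale
    print(np.exp(forecast).round(1).head())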
Optimal designs for copula models
Perrone, E.; Müller, W.G.
2016-01-01
Copula modelling has in the past decade become a standard tool in many areas of applied statistics. However, a largely neglected aspect concerns the design of related experiments, particularly the issue of whether the estimation of copula parameters can be enhanced by optimizing experimental conditions and how robust the parameter estimates for the model are with respect to the type of copula employed. In this paper an equivalence theorem for (bivariate) copula models is provided that allows formulation of efficient design algorithms and quick checks of whether designs are optimal or at least efficient. Some examples illustrate that in practical situations considerable gains in design efficiency can be achieved. A natural comparison between different copula models with respect to design efficiency is provided as well. PMID:27453616
Application of conditional moment tests to model checking for generalized linear models.
Pan, Wei
2002-06-01
Generalized linear models (GLMs) are increasingly being used in daily data analysis. However, model checking for GLMs with correlated discrete response data remains difficult. In this paper, through a case study on marginal logistic regression using a real data set, we illustrate the flexibility and effectiveness of using conditional moment tests (CMTs), along with other graphical methods, to do model checking for generalized estimating equation (GEE) analyses. Although CMTs provide an array of powerful diagnostic tests for model checking, they were originally proposed in the econometrics literature and, to our knowledge, have never been applied to GEE analyses. CMTs cover many existing tests, including the (generalized) score test for an omitted covariate, as special cases. In summary, we believe that CMTs provide a class of useful model checking tools.
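A hedged illustration of one such test (an outer-product-of-gradient form of the conditional moment test for an omitted covariate in an ordinary logistic regression, which is our choice of example rather than the paper's GEE case study); the data are synthetic.

    import numpy as np
    import statsmodels.api as sm
    from scipy import stats

    rng = np.random.default_rng(9)
    n = 500
    x = rng.normal(size=n)
    z = rng.normal(size=n)
    y = rng.binomial(1, 1 / (1 + np.exp(-(0.5 + x + 0.8 * z))))   # z truly matters

    X = sm.add_constant(x)
    fit = sm.Logit(y, X).fit(disp=False)
    resid = y - fit.predict(X)

    # Moment conditions: the fitted model's score contributions plus the extra
    # moment resid * z; n*R^2 from regressing a vector of ones on them is
    # asymptotically chi-square(1) under the null that z can be omitted.
    W = np.column_stack([resid[:, None] * X, resid * z])
    ssr = np.sum((1.0 - W @ np.linalg.lstsq(W, np.ones(n), rcond=None)[0]) ** 2)
    cmt = n - ssr
    print("CMT statistic = %.2f, p = %.4f" % (cmt, stats.chi2.sf(cmt, df=1)))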
Take the Reins on Model Quality with ModelCHECK and Gatekeeper
NASA Technical Reports Server (NTRS)
Jones, Corey
2012-01-01
Model quality and consistency has been an issue for us due to the diverse experience level and imaginative modeling techniques of our users. Fortunately, setting up ModelCHECK and Gatekeeper to enforce our best practices has helped greatly, but it wasn't easy. There were many challenges associated with setting up ModelCHECK and Gatekeeper including: limited documentation, restrictions within ModelCHECK, and resistance from end users. However, we consider ours a success story. In this presentation we will describe how we overcame these obstacles and present some of the details of how we configured them to work for us.
Statistical Literacy in Public Debate--Examples from the UK 2015 General Election
ERIC Educational Resources Information Center
Arnold, Phoebe
2017-01-01
Full Fact is an independent, non-partisan fact-checking charity. A particular focus is the analysis of factual claims in political debate in the UK; for example, fact-checking claims and counterclaims made during Prime Minister's questions. Facts do not appear in a vacuum as they are often used as key elements in an effort to make a coherent…
Sasaki, Kei; Sasaki, Hiroto; Takahashi, Atsuki; Kang, Siu; Yuasa, Tetsuya; Kato, Ryuji
2016-02-01
In recent years, cell and tissue therapies in regenerative medicine have advanced rapidly towards commercialization. However, conventional invasive cell quality assessment is incompatible with direct evaluation of the cells produced for such therapies, especially in the case of regenerative medicine products. Our group has demonstrated the potential of quantitative assessment of cell quality, using information obtained from cell images, for non-invasive real-time evaluation of regenerative medicine products. However, images of cells in the confluent state are often difficult to evaluate, because accurate recognition of cells is technically difficult and the morphological features of confluent cells are non-characteristic. To overcome these challenges, we developed a new image-processing algorithm, heterogeneity of orientation (H-Orient) processing, to describe the heterogeneous density of cells in the confluent state. In this algorithm, we introduced a Hessian calculation that converts pixel intensity data to orientation data and a statistical profiling calculation that evaluates the heterogeneity of orientations within an image, generating novel parameters that yield a quantitative profile of an image. Using such parameters, we tested the algorithm's performance in discriminating different qualities of cellular images with three types of clinically important cell quality check (QC) models: remaining lifespan check (QC1), manipulation error check (QC2), and differentiation potential check (QC3). Our results show that our orientation analysis algorithm could predict with high accuracy the outcomes of all types of cellular quality checks (>84% average accuracy with cross-validation). Copyright © 2015 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.
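A rough sketch in the spirit of the described processing (not the authors' implementation): estimate a local orientation at each pixel from Hessian entries obtained with Gaussian derivative filters, then summarise the heterogeneity of orientations with the entropy of their histogram; the test image is synthetic.

    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(10)
    image = np.sin(0.3 * np.arange(128)) + 0.2 * rng.normal(size=(128, 128))  # stripes + noise

    # Hessian entries via Gaussian derivative filters.
    hxx = ndimage.gaussian_filter(image, sigma=2, order=(0, 2))
    hyy = ndimage.gaussian_filter(image, sigma=2, order=(2, 0))
    hxy = ndimage.gaussian_filter(image, sigma=2, order=(1, 1))

    # Local orientation of the principal axis of the Hessian at each pixel.
    theta = 0.5 * np.arctan2(2 * hxy, hxx - hyy)

    hist, _ = np.histogram(theta, bins=36, range=(-np.pi / 2, np.pi / 2))
    prob = hist / hist.sum()
    entropy = -np.sum(prob[prob > 0] * np.log(prob[prob > 0]))
    print("orientation heterogeneity (entropy) = %.2f" % entropy)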
Schmidt, Paul; Schmid, Volker J; Gaser, Christian; Buck, Dorothea; Bührlen, Susanne; Förschler, Annette; Mühlau, Mark
2013-01-01
Aiming at iron-related T2-hypointensity, which is related to normal aging and neurodegenerative processes, we here present two practicable approaches, based on Bayesian inference, for preprocessing and statistical analysis of a complex set of structural MRI data. In particular, Markov Chain Monte Carlo methods were used to simulate posterior distributions. First, we rendered a segmentation algorithm that uses outlier detection based on model checking techniques within a Bayesian mixture model. Second, we rendered an analytical tool comprising a Bayesian regression model with smoothness priors (in the form of Gaussian Markov random fields) mitigating the necessity to smooth data prior to statistical analysis. For validation, we used simulated data and MRI data of 27 healthy controls (age: [Formula: see text]; range, [Formula: see text]). We first observed robust segmentation of both simulated T2-hypointensities and gray-matter regions known to be T2-hypointense. Second, simulated data and images of segmented T2-hypointensity were analyzed. We found not only robust identification of simulated effects but also a biologically plausible age-related increase of T2-hypointensity primarily within the dentate nucleus but also within the globus pallidus, substantia nigra, and red nucleus. Our results indicate that fully Bayesian inference can successfully be applied for preprocessing and statistical analysis of structural MRI data.
Occurrence analysis of daily rainfalls through non-homogeneous Poissonian processes
NASA Astrophysics Data System (ADS)
Sirangelo, B.; Ferrari, E.; de Luca, D. L.
2011-06-01
A stochastic model based on a non-homogeneous Poisson process, characterised by a time-dependent intensity of rainfall occurrence, is employed to explain seasonal effects of daily rainfalls exceeding prefixed threshold values. The data modelling has been performed with a partition of the observed daily rainfall data into a calibration period for parameter estimation and a validation period for checking for changes in the occurrence process. The model has been applied to a set of rain gauges located in different geographical areas of Southern Italy. The results show a good fit of the time-varying intensity of the rainfall occurrence process by a 2-harmonic Fourier law and no statistically significant evidence of changes in the validation period for different threshold values.
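To make the construction concrete, the sketch below simulates such a process with the standard thinning algorithm and a 2-harmonic Fourier intensity; the coefficient values are invented and chosen so that the intensity stays strictly positive and below the thinning bound.

    import numpy as np

    rng = np.random.default_rng(11)

    def intensity(t_days):
        w = 2 * np.pi / 365.0
        return (0.12 + 0.06 * np.cos(w * t_days) + 0.03 * np.sin(w * t_days)
                + 0.02 * np.cos(2 * w * t_days))        # exceedances per day

    lam_max = 0.25                                      # upper bound on the intensity
    t, events = 0.0, []
    while t < 365:
        t += rng.exponential(1.0 / lam_max)
        if t < 365 and rng.uniform() < intensity(t) / lam_max:
            events.append(t)

    print("%d exceedance days simulated in one year" % len(events))
    print("first occurrence times (days):", np.round(events[:5], 1))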
NASA Astrophysics Data System (ADS)
Trojková, Darina; Judas, Libor; Trojek, Tomáš
2014-11-01
Minimizing the late rectal toxicity of prostate cancer patients is a very important and widely-discussed topic. Normal tissue complication probability (NTCP) models can be used to evaluate competing treatment plans. In our work, the parameters of the Lyman-Kutcher-Burman (LKB), Källman, and Logit+EUD models are optimized by minimizing the Brier score for a group of 302 prostate cancer patients. The NTCP values are calculated and compared with the values obtained using previously published parameter values. χ2 statistics were calculated as a check of the goodness of the optimization.
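For reference, the LKB form can be written out directly; the parameter values below are illustrative rectal-type values, not the optimised estimates of the study, and the DVH bins are hypothetical.

    import numpy as np
    from scipy.stats import norm

    def lkb_ntcp(doses_gy, volumes, n=0.09, m=0.13, td50=76.9):
        """NTCP from a differential DVH via the generalised EUD."""
        v = np.asarray(volumes, float) / np.sum(volumes)          # fractional volumes
        geud = np.sum(v * np.asarray(doses_gy, float) ** (1.0 / n)) ** n
        return norm.cdf((geud - td50) / (m * td50))

    doses = [20, 40, 60, 70, 75]              # Gy, bin doses of a rectal DVH
    volumes = [0.35, 0.30, 0.20, 0.10, 0.05]  # relative volume in each bin
    print("NTCP = %.3f" % lkb_ntcp(doses, volumes))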
Model checking for linear temporal logic: An efficient implementation
NASA Technical Reports Server (NTRS)
Sherman, Rivi; Pnueli, Amir
1990-01-01
This report provides evidence to support the claim that model checking for linear temporal logic (LTL) is practically efficient. Two implementations of a linear temporal logic model checker are described. One is based on transforming the model checking problem into a satisfiability problem; the other checks an LTL formula for a finite model by computing the cross-product of the finite state transition graph of the program with a structure containing all possible models for the property. An experiment was carried out with a set of mutual exclusion algorithms, testing safety and liveness under fairness for these algorithms.
Code of Federal Regulations, 2010 CFR
2010-01-01
12 CFR Part 229, Availability of Funds and Collection of Checks (Regulation CC), Appendix C: Model Availability Policy Disclosures, Clauses, and Notices; Model Substitute Check Policy Disclosure and Notices.
Harvesting river water through small dams promote positive environmental impact.
Agoramoorthy, Govindasamy; Chaudhary, Sunita; Chinnasamy, Pennan; Hsu, Minna J
2016-11-01
While deliberations relating to the negative consequences of large dams on the environment continue to dominate world attention, the positive benefits provided by small dams, also known as check dams, go largely unnoticed. Moreover, little is known about the potential of check dams in mitigating global warming impacts because of limited data availability. Small dams are usually commissioned to private contractors who do not have a clear mandate from their employers to post their work online for public scrutiny. As a result, statistics on the design, cost, and materials used to build check dams are not available in the public domain. However, this review paper presents data for the first time on the often ignored potential of check dams in mitigating climate-induced hydrological threats. We hope that the scientific analysis presented in this paper will promote further research on check dams worldwide to better comprehend their eco-friendly significance to society.
Current Use of Underage Alcohol Compliance Checks by Enforcement Agencies in the U.S.
Erickson, Darin J.; Lenk, Kathleen M.; Sanem, Julia R.; Nelson, Toben F.; Jones-Webb, Rhonda; Toomey, Traci L.
2014-01-01
Background: Compliance checks conducted by law enforcement agents can significantly reduce the likelihood of illegal alcohol sales to underage individuals, but these checks need to be conducted using optimal methods to maintain effectiveness. Materials and Methods: We conducted a national survey of local and state enforcement agencies in 2010–2011 to assess: (1) how many agencies are currently conducting underage alcohol compliance checks, (2) how many agencies that conduct compliance checks use optimal methods—including checking all establishments in the jurisdiction, conducting checks at least 3–4 times per year, conducting follow-up checks within 3 months, and penalizing the licensee (not only the server/clerk) for failing a compliance check, and (3) characteristics of the agencies that conduct compliance checks. Results: Just over one third of local law enforcement agencies and over two thirds of state agencies reported conducting compliance checks. However, only a small percentage of the agencies (4–6%) reported using all of the optimal methods to maximize effectiveness of these compliance checks. Local law enforcement agencies with an alcohol-related division, those with at least one full-time officer assigned to work on alcohol, and those in larger communities were significantly more likely to conduct compliance checks. State agencies with more full-time agents and those located in states where the state agency or both state and local enforcement agencies have primary responsibility (vs. only the local law agency) for enforcing alcohol retail laws were also more likely to conduct compliance checks; however, these agency characteristics did not remain statistically significant in the multivariate analyses. Conclusions: Continued effort is needed to increase the number of local and state agencies conducting compliance checks using optimal methods to reduce youth access to alcohol. PMID:24716443
Current use of underage alcohol compliance checks by enforcement agencies in the United States.
Erickson, Darin J; Lenk, Kathleen M; Sanem, Julia R; Nelson, Toben F; Jones-Webb, Rhonda; Toomey, Traci L
2014-06-01
Compliance checks conducted by law enforcement agents can significantly reduce the likelihood of illegal alcohol sales to underage individuals, but these checks need to be conducted using optimal methods to maintain effectiveness. We conducted a national survey of local and state enforcement agencies from 2010 to 2011 to assess: (i) how many agencies are currently conducting underage alcohol compliance checks, (ii) how many agencies that conduct compliance checks use optimal methods-including checking all establishments in the jurisdiction, conducting checks at least 3 to 4 times per year, conducting follow-up checks within 3 months, and penalizing the licensee (not only the server/clerk) for failing a compliance check, and (iii) characteristics of the agencies that conduct compliance checks. Just over one-third of local law enforcement agencies and over two-thirds of state agencies reported conducting compliance checks. However, only a small percentage of the agencies (4 to 6%) reported using all of the optimal methods to maximize effectiveness of these compliance checks. Local law enforcement agencies with an alcohol-related division, those with at least 1 full-time officer assigned to work on alcohol, and those in larger communities were significantly more likely to conduct compliance checks. State agencies with more full-time agents and those located in states where the state agency or both state and local enforcement agencies have primary responsibility (vs. only the local law agency) for enforcing alcohol retail laws were also more likely to conduct compliance checks; however, these agency characteristics did not remain statistically significant in the multivariate analyses. Continued effort is needed to increase the number of local and state agencies conducting compliance checks using optimal methods to reduce youth access to alcohol. Copyright © 2014 by the Research Society on Alcoholism.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pursley, J; Gueorguiev, G; Prichard, H
Purpose: To demonstrate the commissioning of constant dose rate volumetric modulated arc therapy (VMAT) in the Raystation treatment planning system for a Varian Clinac iX with Exact couch. Methods: Constant dose rate (CDR) VMAT is an option in the Raystation treatment planning system, enabling VMAT delivery on Varian linacs without a RapidArc upgrade. Raystation 4.7 was used to commission CDR-VMAT for a Varian Clinac iX. Raystation arc model parameters were selected to match machine deliverability characteristics. A Varian Exact couch model was added to Raystation 4.7 and commissioned for use in VMAT optimization. CDR-VMAT commissioning checks were performed on the linac, including patient-specific QA measurements for 10 test patients using both the ArcCHECK from Sun Nuclear Corporation and COMPASS from IBA Dosimetry. Multi-criteria optimization (MCO) in Raystation was used for CDR-VMAT planning. Results: Raystation 4.7 generated clinically acceptable and deliverable CDR-VMAT plans for the Varian Clinac. VMAT plans were optimized including a model of the Exact couch with both rails in the out positions. CDR-VMAT plans generated with MCO in Raystation were dosimetrically comparable to Raystation MCO-generated IMRT plans. Patient-specific QA measurements with the ArcCHECK on the couch showed good agreement with the treatment planning system prediction. Patient-specific, structure-specific, multi-statistical parameter 3D QA measurements with gantry-mounted COMPASS also showed good agreement. Conclusion: Constant dose rate VMAT was successfully modeled in Raystation 4.7 for a Varian Clinac iX, and Raystation’s multicriteria optimization generated constant dose rate VMAT plans which were deliverable and dosimetrically comparable to IMRT plans.
Implementation of a GPS-RO data processing system for the KIAPS-LETKF data assimilation system
NASA Astrophysics Data System (ADS)
Kwon, H.; Kang, J.-S.; Jo, Y.; Kang, J. H.
2014-11-01
The Korea Institute of Atmospheric Prediction Systems (KIAPS) has been developing a new global numerical weather prediction model and an advanced data assimilation system. As part of the KIAPS Package for Observation Processing (KPOP) system for data assimilation, preprocessing and quality control modules for bending angle measurements of global positioning system radio occultation (GPS-RO) data have been implemented and examined. The GPS-RO data processing system is composed of several steps for checking observation locations, missing values, physical values for the Earth radius of curvature, and geoid undulation. An observation-minus-background check is implemented by use of a one-dimensional observational bending angle operator, and tangent point drift is also considered in the quality control process. We have tested GPS-RO observations utilized by the Korean Meteorological Administration (KMA) within KPOP, based on both the KMA global model and the National Center for Atmospheric Research (NCAR) Community Atmosphere Model-Spectral Element (CAM-SE) as a model background. Background fields from the CAM-SE model are incorporated for the preparation of assimilation experiments with the KIAPS-LETKF data assimilation system, which has been successfully implemented for a cubed-sphere model with fully unstructured quadrilateral meshes. As a result of data processing, the bending angle departure statistics between observation and background show significant improvement. Also, the first experiment in assimilating GPS-RO bending angles processed by KPOP within KIAPS-LETKF shows encouraging results.
Safe and effective nursing shift handover with NURSEPASS: An interrupted time series.
Smeulers, Marian; Dolman, Christine D; Atema, Danielle; van Dieren, Susan; Maaskant, Jolanda M; Vermeulen, Hester
2016-11-01
Implementation of a locally developed, evidence-based nursing shift handover blueprint with a bedside-safety-check, to determine the effect size on quality of handover. A mixed methods design with: (1) an interrupted time series analysis to determine the effect on handover quality in six domains; (2) descriptive statistics to analyze the discrepancies intercepted by the bedside-safety-check; (3) evaluation sessions to gather experiences with the new handover process. We observed a continued trend of improvement in handover quality and a significant improvement in two domains of handover: organization/efficiency and contents. The bedside-safety-check successfully identified discrepancies on drains, intravenous medications, bandages or general condition and was highly appreciated. Use of the nursing shift handover blueprint showed promising results on effectiveness as well as on feasibility and acceptability. However, to enable long-term measurement of effectiveness, evaluation with large-scale interrupted time series or statistical process control is needed. Copyright © 2016 Elsevier Inc. All rights reserved.
The influence of social anxiety on the body checking behaviors of female college students.
White, Emily K; Warren, Cortney S
2014-09-01
Social anxiety and eating pathology frequently co-occur. However, there is limited research examining the relationship between anxiety and body checking, aside from one study in which social physique anxiety partially mediated the relationship between body checking cognitions and body checking behavior (Haase, Mountford, & Waller, 2007). In an independent sample of 567 college women, we tested the fit of Haase and colleagues' foundational model but did not find evidence of mediation. Thus we tested the fit of an expanded path model that included eating pathology and clinical impairment. In the best-fitting path model (CFI=.991; RMSEA=.083) eating pathology and social physique anxiety positively predicted body checking, and body checking positively predicted clinical impairment. Therefore, women who endorse social physique anxiety may be more likely to engage in body checking behaviors and experience impaired psychosocial functioning. Published by Elsevier Ltd.
ERIC Educational Resources Information Center
Bellera, Carine A.; Julien, Marilyse; Hanley, James A.
2010-01-01
The Wilcoxon statistics are usually taught as nonparametric alternatives for the 1- and 2-sample Student-"t" statistics in situations where the data appear to arise from non-normal distributions, or where sample sizes are so small that we cannot check whether they do. In the past, critical values, based on exact tail areas, were…
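As an illustration of the comparison the abstract describes, the following sketch (hypothetical lognormal samples; SciPy assumed available) runs Student's t alongside the Wilcoxon rank-sum and signed-rank tests in Python:

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.lognormal(mean=0.0, sigma=0.6, size=12)   # small, skewed samples where
y = rng.lognormal(mean=0.3, sigma=0.6, size=12)   # normality is doubtful

# Two-sample case: Student's t versus the Wilcoxon rank-sum (Mann-Whitney U) test.
t_stat, t_p = stats.ttest_ind(x, y)
u_stat, u_p = stats.mannwhitneyu(x, y, alternative="two-sided")

# One-sample / paired case: Wilcoxon signed-rank test on the differences
# (the two samples are treated as paired purely for illustration).
w_stat, w_p = stats.wilcoxon(x - y)

print(f"t-test p={t_p:.3f}, rank-sum p={u_p:.3f}, signed-rank p={w_p:.3f}")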
Safta, C.; Ricciuto, Daniel M.; Sargsyan, Khachik; ...
2015-07-01
In this paper we propose a probabilistic framework for an uncertainty quantification (UQ) study of a carbon cycle model and focus on the comparison between steady-state and transient simulation setups. A global sensitivity analysis (GSA) study indicates the parameters and parameter couplings that are important at different times of the year for quantities of interest (QoIs) obtained with the data assimilation linked ecosystem carbon (DALEC) model. We then employ a Bayesian approach and a statistical model error term to calibrate the parameters of DALEC using net ecosystem exchange (NEE) observations at the Harvard Forest site. The calibration results are employed in the second part of the paper to assess the predictive skill of the model via posterior predictive checks.
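The following sketch illustrates the kind of Bayesian calibration with an additive statistical model-error term and a simple posterior predictive check described above; a toy exponential-decay model and synthetic observations stand in for DALEC and the Harvard Forest NEE data, and the flat priors and hand-rolled Metropolis sampler are simplifying assumptions, not the authors' implementation:

import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 50)
obs = 2.0 * np.exp(-3.0 * t) + rng.normal(0, 0.1, t.size)   # synthetic "observations"

def log_post(theta):
    # Gaussian likelihood whose standard deviation (the model-error term) is
    # itself a calibrated parameter; flat priors are used for brevity.
    a, b, log_sigma = theta
    sigma = np.exp(log_sigma)
    resid = obs - a * np.exp(-b * t)
    return -0.5 * np.sum(resid ** 2) / sigma ** 2 - t.size * np.log(sigma)

# Random-walk Metropolis sampler.
theta = np.array([1.0, 1.0, np.log(0.5)])
lp = log_post(theta)
samples = []
for _ in range(20000):
    prop = theta + rng.normal(0.0, [0.05, 0.1, 0.05])
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta)
samples = np.array(samples[5000:])          # discard burn-in

# Posterior predictive check: replicate data from posterior draws and compare a
# simple summary (the mean) of the replicates with the observed value.
draws = samples[rng.integers(0, len(samples), 500)]
a, b, log_sigma = draws.T
reps = (a[:, None] * np.exp(-b[:, None] * t)
        + rng.normal(0.0, np.exp(log_sigma)[:, None], (500, t.size)))
print("observed mean:", round(float(obs.mean()), 3),
      "replicated mean 95% range:", np.percentile(reps.mean(axis=1), [2.5, 97.5]).round(3))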
SOCR Analyses – an Instructional Java Web-based Statistical Analysis Toolkit
Chu, Annie; Cui, Jenny; Dinov, Ivo D.
2011-01-01
The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include commonly used models in undergraduate statistics courses like linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons, such as t-test in the parametric category; and Wilcoxon rank sum test, Kruskal-Wallis test, Friedman's test, in the non-parametric category. SOCR Analyses also include several hypothesis test models, such as Contingency tables, Friedman's test and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), hoping to contribute to the efforts of the statistical computing community. The code includes functionality for each specific analysis model and it has general utilities that can be applied in various statistical computing tasks. For example, concrete methods with API (Application Programming Interface) have been implemented in statistical summary, least square solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of the SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is on-going and more functions and tools are being added to it, these resources are constantly improved. The reader is strongly encouraged to check the SOCR site for most updated information and newly added models. PMID:21546994
Code of Federal Regulations, 2012 CFR
2012-01-01
... in different states or check processing regions)]. If you make the deposit in person to one of our... processing regions)]. If you make the deposit in person to one of our employees, funds from the following... Your Rights What Is a Substitute Check? To make check processing faster, federal law permits banks to...
Code of Federal Regulations, 2011 CFR
2011-01-01
... in different states or check processing regions)]. If you make the deposit in person to one of our... processing regions)]. If you make the deposit in person to one of our employees, funds from the following... Your Rights What Is a Substitute Check? To make check processing faster, federal law permits banks to...
Code of Federal Regulations, 2013 CFR
2013-01-01
... in different states or check processing regions)]. If you make the deposit in person to one of our... processing regions)]. If you make the deposit in person to one of our employees, funds from the following... Your Rights What Is a Substitute Check? To make check processing faster, federal law permits banks to...
Determination of MLC model parameters for Monaco using commercial diode arrays.
Kinsella, Paul; Shields, Laura; McCavana, Patrick; McClean, Brendan; Langan, Brian
2016-07-08
Multileaf collimators (MLCs) need to be characterized accurately in treatment planning systems to facilitate accurate intensity-modulated radiation therapy (IMRT) and volumetric-modulated arc therapy (VMAT). The aim of this study was to examine the use of MapCHECK 2 and ArcCHECK diode arrays for optimizing MLC parameters in Monaco X-ray voxel Monte Carlo (XVMC) dose calculation algorithm. A series of radiation test beams designed to evaluate MLC model parameters were delivered to MapCHECK 2, ArcCHECK, and EBT3 Gafchromic film for comparison. Initial comparison of the calculated and ArcCHECK-measured dose distributions revealed it was unclear how to change the MLC parameters to gain agreement. This ambiguity arose due to an insufficient sampling of the test field dose distributions and unexpected discrepancies in the open parts of some test fields. Consequently, the XVMC MLC parameters were optimized based on MapCHECK 2 measurements. Gafchromic EBT3 film was used to verify the accuracy of MapCHECK 2 measured dose distributions. It was found that adjustment of the MLC parameters from their default values resulted in improved global gamma analysis pass rates for MapCHECK 2 measurements versus calculated dose. The lowest pass rate of any MLC-modulated test beam improved from 68.5% to 93.5% with 3% and 2 mm gamma criteria. Given the close agreement of the optimized model to both MapCHECK 2 and film, the optimized model was used as a benchmark to highlight the relatively large discrepancies in some of the test field dose distributions found with ArcCHECK. Comparison between the optimized model-calculated dose and ArcCHECK-measured dose resulted in global gamma pass rates which ranged from 70.0%-97.9% for gamma criteria of 3% and 2 mm. The simple square fields yielded high pass rates. The lower gamma pass rates were attributed to the ArcCHECK overestimating the dose in-field for the rectangular test fields whose long axis was parallel to the long axis of the ArcCHECK. Considering ArcCHECK measurement issues and the lower gamma pass rates for the MLC-modulated test beams, it was concluded that MapCHECK 2 was a more suitable detector than ArcCHECK for the optimization process. © 2016 The Authors
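For reference, a brute-force global gamma comparison of the kind used above (3% dose difference, 2 mm distance-to-agreement) can be sketched as follows; the dose grids, 5 mm spacing and 10% low-dose threshold are hypothetical, and no sub-grid interpolation is performed, unlike commercial analysis software:

import numpy as np

def gamma_pass_rate(measured, calculated, spacing_mm, dose_pct=3.0, dta_mm=2.0,
                    threshold_pct=10.0):
    # Fraction of measured points above a low-dose threshold with gamma <= 1,
    # using global dose normalization and a brute-force search over the grid.
    norm_dose = dose_pct / 100.0 * calculated.max()
    cutoff = threshold_pct / 100.0 * calculated.max()
    ny, nx = measured.shape
    yy, xx = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")
    passed = evaluated = 0
    for iy in range(ny):
        for ix in range(nx):
            d_m = measured[iy, ix]
            if d_m < cutoff:
                continue
            dist2 = ((yy - iy) ** 2 + (xx - ix) ** 2) * spacing_mm ** 2
            gamma2 = dist2 / dta_mm ** 2 + (calculated - d_m) ** 2 / norm_dose ** 2
            evaluated += 1
            passed += gamma2.min() <= 1.0
    return passed / evaluated

# Hypothetical 40 x 40 grids (5 mm spacing) standing in for diode-array
# measurements versus the dose calculated by the planning system.
rng = np.random.default_rng(1)
calc = np.clip(rng.normal(2.0, 0.1, (40, 40)), 0.0, None)
meas = calc * (1.0 + rng.normal(0.0, 0.01, calc.shape))
print(f"global gamma (3%/2 mm) pass rate: {100 * gamma_pass_rate(meas, calc, 5.0):.1f}%")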
Propel: Tools and Methods for Practical Source Code Model Checking
NASA Technical Reports Server (NTRS)
Mansouri-Samani, Massoud; Mehlitz, Peter; Markosian, Lawrence; OMalley, Owen; Martin, Dale; Moore, Lantz; Penix, John; Visser, Willem
2003-01-01
The work reported here is an overview and snapshot of a project to develop practical model checking tools for in-the-loop verification of NASA's mission-critical, multithreaded programs in Java and C++. Our strategy is to develop and evaluate both a design concept that enables the application of model checking technology to C++ and Java, and a model checking toolset for C++ and Java. The design concept and the associated model checking toolset is called Propel. It builds upon the Java PathFinder (JPF) tool, an explicit state model checker for Java applications developed by the Automated Software Engineering group at NASA Ames Research Center. The design concept that we are developing is Design for Verification (D4V). This is an adaptation of existing best design practices that has the desired side-effect of enhancing verifiability by improving modularity and decreasing accidental complexity. D4V, we believe, enhances the applicability of a variety of V&V approaches; we are developing the concept in the context of model checking. The model checking toolset, Propel, is based on extending JPF to handle C++. Our principal tasks in developing the toolset are to build a translator from C++ to Java, productize JPF, and evaluate the toolset in the context of D4V. Through all these tasks we are testing Propel capabilities on customer applications.
Gratton, D G; Kwon, S R; Blanchette, D R; Aquilino, S A
2017-11-01
Proper integration of newly emerging digital assessment tools is a central issue in dental education in an effort to provide more accurate and objective feedback to students. The study examined how the outcomes of students' tooth preparation were correlated when evaluated using traditional faculty assessment and two types of digital assessment approaches. Specifically, incorporation of the Romexis Compare 2.0 (Compare) and Sirona prepCheck 1.1 (prepCheck) systems was evaluated. Additionally, satisfaction of students based on the type of software was evaluated through a survey. Students in a second-year pre-clinical prosthodontics course were allocated to either Compare (n = 42) or prepCheck (n = 37) systems. All students received conventional instruction and used their assigned digital system as an additional evaluation tool to aid in assessing their work. Examinations assessed crown preparations of the maxillary right central incisor (#8) and the mandibular left first molar (#19). All submissions were graded by faculty, Compare and prepCheck. Technical scores did not differ between student groups for any of the assessment approaches. Compare and prepCheck had modest, statistically significant correlations with faculty scores with a minimum correlation of 0.3944 (P = 0.0011) and strong, statistically significant correlations with each other with a minimum correlation of 0.8203 (P < 0.0001). A post-course student survey found that 55.26% of the students felt unfavourably about learning the digital evaluation protocols. A total of 62.31% felt favourably about the integration of these digital tools into the curriculum. Comparison of Compare and prepCheck showed no evidence of significant difference in students' prosthodontics technical performance and perception. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Ecker, Willi; Kupfer, Jochen; Gönner, Sascha
2014-01-01
This paper examines the contribution of incompleteness/'not just right experiences' (NJREs) to an understanding of the relationship between obsessive-compulsive disorder (OCD) and obsessive-compulsive personality traits (OCPTs). It investigates the association of specific OCD symptom dimensions with OCPTs, conceptualized as continuous phenomena that are also observable below the diagnostic threshold. As empirical findings and clinical observation suggest that incompleteness feelings/NJREs may play a significant affective and motivational role for certain OCD subtypes, but also for patients with accentuated OCPTs, we hypothesized that OCPTs are selectively linked with incompleteness-associated OCD symptom dimensions (ordering, checking, hoarding and counting). Moreover, we assumed that this selective relationship cannot be demonstrated any more after statistical control of incompleteness, whereas it is preserved after statistical control of anxiety, depression, pathological worry and harm avoidance. Results from a study with a large clinical sample (n = 185) partially support these hypotheses and suggest that NJREs may be an important connecting link between specific OCD symptom dimensions, in particular ordering and checking, and accentuated OCPTs. Obsessive-compulsive personality traits (OCPTs) are positively related to obsessive-compulsive disorder symptom dimensions (ordering, checking, hoarding and counting) hypothesized or found to be associated with incompleteness/'not just right experiences' (NJREs), but not to washing and obsessions. This positive relationship, which is strongest for ordering and checking, is eliminated when NJREs are statistically controlled. Ordering, checking and accentuated OCPTs may share NJREs as a common affective-motivational underpinning. Dysfunctional behaviour patterns of people with accentuated OCPTs or obsessive-compulsive personality disorder (OCPD) may be viewed as efforts to avoid or reduce subjectively intolerable NJREs. On the basis of such a conceptualization of OCPD as an emotional disorder, a novel treatment approach for OCPD focusing on habituation to NJREs could be developed. Copyright © 2013 John Wiley & Sons, Ltd.
Program Model Checking as a New Trend
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Visser, Willem; Clancy, Daniel (Technical Monitor)
2002-01-01
This paper introduces a special section of STTT (International Journal on Software Tools for Technology Transfer) containing a selection of papers that were presented at the 7th International SPIN workshop, Stanford, August 30 - September 1, 2000. The workshop was named SPIN Model Checking and Software Verification, with an emphasis on model checking of programs. The paper outlines the motivation for stressing software verification, rather than only design and model verification, by presenting the work done in the Automated Software Engineering group at NASA Ames Research Center within the last 5 years. This includes work in software model checking, testing-like technologies and static analysis.
UTP and Temporal Logic Model Checking
NASA Astrophysics Data System (ADS)
Anderson, Hugh; Ciobanu, Gabriel; Freitas, Leo
In this paper we give an additional perspective to the formal verification of programs through temporal logic model checking, which uses Hoare and He's Unifying Theories of Programming (UTP). Our perspective emphasizes the use of UTP designs, an alphabetised relational calculus expressed as a pre/post condition pair of relations, to verify state or temporal assertions about programs. The temporal model checking relation is derived from a satisfaction relation between the model and its properties. The contribution of this paper is that it shows a UTP perspective on temporal logic model checking. The approach includes the notion of efficiency found in traditional model checkers, which reduce the state explosion problem through the use of efficient data structures.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-21
... pressurise the hydraulic reservoirs, due to leakage of the Crissair reservoir air pressurisation check valves. * * * The leakage of the check valves was caused by an incorrect spring material. The affected Crissair check valves * * * were then replaced with improved check valves P/N [part number] 2S2794-1 * * *. More...
Teaching Statistics with Minitab II.
ERIC Educational Resources Information Center
Ryan, T. A., Jr.; And Others
Minitab is a statistical computing system which uses simple language, produces clear output, and keeps track of bookkeeping automatically. Error checking with English diagnostics and inclusion of several default options help to facilitate use of the system by students. Minitab II is an improved and expanded version of the original Minitab which…
ERIC Educational Resources Information Center
Ellis, Barbara G.; Dick, Steven J.
1996-01-01
Employs the statistics-documentation portion of a word-processing program's grammar-check feature together with qualitative analyses to determine that Henry Watterson, long-time editor of the "Louisville Courier-Journal," was probably the South's famed Civil War correspondent "Shadow." (TB)
The Relationship between Selected Personality Traits and Self-Esteem among Female Nursing Students.
ERIC Educational Resources Information Center
Lewis, John; And Others
1980-01-01
For a sample of 75 female senior nursing students the Tennessee Self-Concept Scale exhibited statistically significant correlations with the Adjective Check List (ACL) scales of Endurance, Nurturance, and Affiliation. Statistically significant negative correlations were found for the ACL scales of Aggression and Succorance. (Author/BW)
Cosmological velocity correlations - Observations and model predictions
NASA Technical Reports Server (NTRS)
Gorski, Krzysztof M.; Davis, Marc; Strauss, Michael A.; White, Simon D. M.; Yahil, Amos
1989-01-01
By applying the simple statistics presented here for two-point cosmological peculiar velocity-correlation measurements to the actual data sets of the Local Supercluster spiral galaxy sample of Aaronson et al. (1982) and the elliptical galaxy sample of Burstein et al. (1987), as well as to the velocity field predicted by the distribution of IRAS galaxies, a coherence length of 1100-1600 km/sec is obtained. Coherence length is defined as that separation at which the correlations drop to half their zero-lag value. These results are compared with predictions from two models of large-scale structure formation: that of cold dark matter and that of baryon isocurvature proposed by Peebles (1980). N-body simulations of these models are performed to check the linear theory predictions and measure sampling fluctuations.
Assessment of check-dam groundwater recharge with water-balance calculations
NASA Astrophysics Data System (ADS)
Djuma, Hakan; Bruggeman, Adriana; Camera, Corrado; Eliades, Marinos
2017-04-01
Studies on the enhancement of groundwater recharge by check-dams in arid and semi-arid environments mainly focus on deriving water infiltration rates from the check-dam ponding areas. This is usually achieved by applying simple water balance models, more advanced models (e.g., two dimensional groundwater models) and field tests (e.g., infiltrometer test or soil pit tests). Recharge behind the check-dam can be affected by the build-up of sediment as a result of erosion in the upstream watershed area. This natural process can increase the uncertainty in the estimates of the recharged water volume, especially for water balance calculations. Few water balance field studies of individual check-dams have been presented in the literature and none of them presented associated uncertainties of their estimates. The objectives of this study are i) to assess the effect of a check-dam on groundwater recharge from an ephemeral river; and ii) to assess annual sedimentation at the check-dam during a 4-year period. The study was conducted on a check-dam in the semi-arid island of Cyprus. Field campaigns were carried out to measure water flow, water depth and check-dam topography in order to establish check-dam water height, volume, evaporation, outflow and recharge relations. Topographic surveys were repeated at the end of consecutive hydrological years to estimate the sediment built up in the reservoir area of the check dam. Also, sediment samples were collected from the check-dam reservoir area for bulk-density analyses. To quantify the groundwater recharge, a water balance model was applied at two locations: at the check-dam and corresponding reservoir area, and at a 4-km stretch of the river bed without check-dam. Results showed that a check-dam with a storage capacity of 25,000 m3 was able to recharge the aquifer, in four years, with a total of 12 million m3 out of the 42 million m3 of measured (or modelled) streamflow. Recharge from the analyzed 4-km long river section without check-dam was estimated to be 1 million m3. Upper and lower limits of prediction intervals were computed to assess the uncertainties of the results. The model was rerun with these values and resulted in recharge values of 0.4 m3 as lower and 38 million m3 as upper limit. The sediment survey in the check-dam reservoir area showed that the reservoir area was filled with 2,000 to 3,000 tons of sediment after one rainfall season. This amount of sediment corresponds to 0.2 to 2 t ha-1 y-1 sediment yield at the watershed level and reduces the check-dam storage capacity by approximately 10%. Results indicate that check-dams are valuable structures for increasing groundwater resources, but special attention should be given to soil erosion occurring in the upstream area and the resulting sediment build-up in the check-dam reservoir area. This study has received funding from the EU FP7 RECARE Project (GA 603498).
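A minimal daily water-balance sketch in the spirit of the method described above is shown below; the storage, inflow, spill and evaporation numbers are hypothetical, and recharge is simply inferred as the residual of the balance, which glosses over the stage-volume-area relations and uncertainty analysis used in the study:

import numpy as np

def daily_recharge(volume, inflow, spill, evap_depth, area):
    # Recharge (m3/day) as the residual of the reservoir water balance:
    # recharge = inflow - spill - evaporation - change in storage.
    # volume has length n+1 (start-of-day storage, m3); the others have length n.
    storage_change = np.diff(volume)
    return inflow - spill - evap_depth * area - storage_change

# Hypothetical week of data for a small check-dam reservoir.
volume = np.array([5000, 5200, 5100, 4900, 4600, 4300, 4000, 3800], float)  # m3
inflow = np.array([600, 150, 0, 0, 0, 0, 0], float)                         # m3/day
spill = np.zeros(7)                                                         # m3/day
evap = np.full(7, 0.004)                                                    # m/day
area = np.full(7, 2500.0)                                                   # m2
print("daily recharge (m3):", daily_recharge(volume, inflow, spill, evap, area))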
Soni, Kirti; Parmar, Kulwinder Singh; Kapoor, Sangeeta; Kumar, Nishant
2016-05-15
Many studies of Aerosol Optical Depth (AOD) have been carried out using Moderate Resolution Imaging Spectroradiometer (MODIS) derived data, but the accuracy of satellite data in comparison to ground data derived from the AErosol RObotic NETwork (AERONET) has always been questionable. To overcome this, a comparative study of comprehensive ground-based and satellite data for the period 2001-2012 is modeled. The time series model is used for the accurate prediction of AOD, and statistical variability is compared to assess the performance of the model in both cases. Root mean square error (RMSE), mean absolute percentage error (MAPE), stationary R-squared, R-squared, maximum absolute percentage error (MaxAPE), normalized Bayesian information criterion (NBIC) and Ljung-Box methods are used to check the applicability and validity of the developed ARIMA models, revealing significant precision in the model performance. It was found that it is possible to predict AOD by statistical modeling using time series obtained from past MODIS and AERONET data as input. Moreover, the result shows that MODIS data can be formed from AERONET data by adding 0.251627 ± 0.133589 and vice-versa by subtracting. From the forecast of AOD for the next four years (2013-2017) obtained with the developed ARIMA model, it is concluded that the forecasted ground AOD shows an increasing trend. Copyright © 2016 Elsevier B.V. All rights reserved.
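A minimal sketch of the ARIMA workflow described above, with a synthetic monthly AOD series standing in for the MODIS/AERONET data and statsmodels assumed available; the (1,1,1)x(1,0,1,12) order is an illustrative choice, not the study's fitted model:

import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.stats.diagnostic import acorr_ljungbox

rng = np.random.default_rng(0)
months = pd.date_range("2001-01", periods=144, freq="MS")
# Synthetic AOD: seasonal cycle + slow trend + noise, standing in for real data.
aod = (0.4 + 0.001 * np.arange(144)
       + 0.15 * np.sin(2 * np.pi * np.arange(144) / 12)
       + rng.normal(0, 0.05, 144))
series = pd.Series(aod, index=months)

fit = ARIMA(series, order=(1, 1, 1), seasonal_order=(1, 0, 1, 12)).fit()
print("AIC:", round(fit.aic, 1))
print(acorr_ljungbox(fit.resid, lags=[12]))   # residual whiteness (Ljung-Box) check
print(fit.forecast(steps=48).tail())          # four-year-ahead forecast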
Efficient model checking of network authentication protocol based on SPIN
NASA Astrophysics Data System (ADS)
Tan, Zhi-hua; Zhang, Da-fang; Miao, Li; Zhao, Dan
2013-03-01
Model checking is a very useful technique for verifying network authentication protocols. In order to improve the efficiency of modeling and verifying such protocols with model checking technology, this paper first proposes a universal formal description method for protocols. Combined with the model checker SPIN, the method can conveniently verify the properties of a protocol. Using several simplified modeling strategies, this paper models several protocols efficiently and reduces the state space of the model. Compared with the previous literature, this paper achieves a higher degree of automation and better verification efficiency. Finally, based on the method described in the paper, we model and verify the Privacy and Key Management (PKM) authentication protocol. The experimental results show that the model checking method is effective and can be applied to other authentication protocols.
Analyzing the causation of a railway accident based on a complex network
NASA Astrophysics Data System (ADS)
Ma, Xin; Li, Ke-Ping; Luo, Zi-Yan; Zhou, Jin
2014-02-01
In this paper, a new model is constructed for the causation analysis of railway accidents based on complex network theory. In the model, the nodes are defined as various manifest or latent accident causal factors. By employing complex network theory, especially its statistical indicators, a railway accident as well as its key causations can be analyzed from an overall perspective. As a case study, the “7.23” China-Yongwen railway accident is analyzed with this model. The results show that the inspection of signals and the checking of line conditions before trains run played an important role in this railway accident. In conclusion, the constructed model provides a theoretical basis for railway accident prediction and, hence, for reducing the occurrence of railway accidents.
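A minimal sketch of the approach, using networkx and a small set of hypothetical causal factors rather than the authors' accident data, ranks nodes with the kind of statistical indicators the abstract refers to (degree and betweenness centrality):

import networkx as nx

# Hypothetical manifest/latent causal factors and directed cause -> effect links.
edges = [
    ("lightning strike", "signal equipment failure"),
    ("signal equipment failure", "wrong signal display"),
    ("inadequate signal inspection", "wrong signal display"),
    ("inadequate line-condition check", "dispatcher unaware of stopped train"),
    ("wrong signal display", "following train not stopped"),
    ("dispatcher unaware of stopped train", "following train not stopped"),
    ("following train not stopped", "rear-end collision"),
]
G = nx.DiGraph(edges)

degree = dict(G.degree())
betweenness = nx.betweenness_centrality(G)
for node in sorted(G, key=betweenness.get, reverse=True):
    print(f"{node:40s} degree={degree[node]} betweenness={betweenness[node]:.2f}")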
An Integrated Environment for Efficient Formal Design and Verification
NASA Technical Reports Server (NTRS)
1998-01-01
The general goal of this project was to improve the practicality of formal methods by combining techniques from model checking and theorem proving. At the time the project was proposed, the model checking and theorem proving communities were applying different tools to similar problems, but there was not much cross-fertilization. This project involved a group from SRI that had substantial experience in the development and application of theorem-proving technology, and a group at Stanford that specialized in model checking techniques. Now, over five years after the proposal was submitted, there are many research groups working on combining theorem-proving and model checking techniques, and much more communication between the model checking and theorem proving research communities. This project contributed significantly to this research trend. The research work under this project covered a variety of topics: new theory and algorithms; prototype tools; verification methodology; and applications to problems in particular domains.
Testing for significance of phase synchronisation dynamics in the EEG.
Daly, Ian; Sweeney-Reed, Catherine M; Nasuto, Slawomir J
2013-06-01
A number of tests exist to check for statistical significance of phase synchronisation within the Electroencephalogram (EEG); however, the majority suffer from a lack of generality and applicability. They may also fail to account for temporal dynamics in the phase synchronisation, regarding synchronisation as a constant state instead of a dynamical process. Therefore, a novel test is developed for identifying the statistical significance of phase synchronisation based upon a combination of work characterising temporal dynamics of multivariate time-series and Markov modelling. We show how this method is better able to assess the significance of phase synchronisation than a range of commonly used significance tests. We also show how the method may be applied to identify and classify significantly different phase synchronisation dynamics in both univariate and multivariate datasets.
Testing for nonlinearity in time series: The method of surrogate data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Theiler, J.; Galdrikian, B.; Longtin, A.
1991-01-01
We describe a statistical approach for identifying nonlinearity in time series; in particular, we want to avoid claims of chaos when simpler models (such as linearly correlated noise) can explain the data. The method requires a careful statement of the null hypothesis which characterizes a candidate linear process, the generation of an ensemble of "surrogate" data sets which are similar to the original time series but consistent with the null hypothesis, and the computation of a discriminating statistic for the original and for each of the surrogate data sets. The idea is to test the original time series against the null hypothesis by checking whether the discriminating statistic computed for the original time series differs significantly from the statistics computed for each of the surrogate sets. We present algorithms for generating surrogate data under various null hypotheses, and we show the results of numerical experiments on artificial data using correlation dimension, Lyapunov exponent, and forecasting error as discriminating statistics. Finally, we consider a number of experimental time series -- including sunspots, electroencephalogram (EEG) signals, and fluid convection -- and evaluate the statistical significance of the evidence for nonlinear structure in each case. 56 refs., 8 figs.
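The surrogate-data test described above can be sketched as follows; phase-randomized surrogates preserve the linear (spectral) structure of the series, a simple time-reversal asymmetry statistic stands in for the correlation dimension or Lyapunov exponent used in the paper, and the Henon map serves as a hypothetical nonlinear test series:

import numpy as np

def phase_randomized_surrogate(x, rng):
    # Surrogate with the same power spectrum as x but randomized Fourier phases.
    spectrum = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, len(spectrum))
    phases[0] = 0.0                      # keep the zero-frequency term real
    return np.fft.irfft(np.abs(spectrum) * np.exp(1j * phases), n=len(x))

def time_reversal_asymmetry(x):
    # A simple discriminating statistic that vanishes for time-reversible
    # (e.g. Gaussian linear) processes.
    return np.mean((x[2:] - x[:-2]) ** 3)

# Hypothetical nonlinear test series: the Henon map (transient discarded).
x = np.zeros(2248)
for t in range(2, len(x)):
    x[t] = 1.0 - 1.4 * x[t - 1] ** 2 + 0.3 * x[t - 2]
x = x[200:]

rng = np.random.default_rng(0)
original = time_reversal_asymmetry(x)
surrogates = [time_reversal_asymmetry(phase_randomized_surrogate(x, rng))
              for _ in range(99)]
count = np.sum(np.abs(original) > np.abs(surrogates))
print(f"original statistic larger in magnitude than {count}/99 surrogates")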
A voice-actuated wind tunnel model leak checking system
NASA Technical Reports Server (NTRS)
Larson, William E.
1989-01-01
A computer program has been developed that improves the efficiency of wind tunnel model leak checking. The program uses a voice recognition unit to relay a technician's commands to the computer. The computer, after receiving a command, can respond to the technician via a voice response unit. Information about the model pressure orifice being checked is displayed on a gas-plasma terminal. On command, the program records up to 30 seconds of pressure data. After the recording is complete, the raw data and a straight line fit of the data are plotted on the terminal. This allows the technician to make a decision on the integrity of the orifice being checked. All results of the leak check program are stored in a database file that can be listed on the line printer for record keeping purposes or displayed on the terminal to help the technician find unchecked orifices. This program allows one technician to check a model for leaks instead of the two or three previously required.
Chambert, Thierry; Rotella, Jay J; Higgs, Megan D
2014-01-01
The investigation of individual heterogeneity in vital rates has recently received growing attention among population ecologists. Individual heterogeneity in wild animal populations has been accounted for and quantified by including individually varying effects in models for mark–recapture data, but the real need for underlying individual effects to account for observed levels of individual variation has recently been questioned by the work of Tuljapurkar et al. (Ecology Letters, 12, 93, 2009) on dynamic heterogeneity. Model-selection approaches based on information criteria or Bayes factors have been used to address this question. Here, we suggest that, in addition to model-selection, model-checking methods can provide additional important insights to tackle this issue, as they allow one to evaluate a model's misfit in terms of ecologically meaningful measures. Specifically, we propose the use of posterior predictive checks to explicitly assess discrepancies between a model and the data, and we explain how to incorporate model checking into the inferential process used to assess the practical implications of ignoring individual heterogeneity. Posterior predictive checking is a straightforward and flexible approach for performing model checks in a Bayesian framework that is based on comparisons of observed data to model-generated replications of the data, where parameter uncertainty is incorporated through use of the posterior distribution. If discrepancy measures are chosen carefully and are relevant to the scientific context, posterior predictive checks can provide important information allowing for more efficient model refinement. We illustrate this approach using analyses of vital rates with long-term mark–recapture data for Weddell seals and emphasize its utility for identifying shortfalls or successes of a model at representing a biological process or pattern of interest. We show how posterior predictive checks can be used to strengthen inferences in ecological studies. We demonstrate the application of this method on analyses dealing with the question of individual reproductive heterogeneity in a population of Antarctic pinnipeds. PMID:24834335
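A minimal sketch of a posterior predictive check in the spirit described above; a conjugate Beta-Binomial model of yearly breeding counts with a between-year variance discrepancy stands in for the actual Weddell seal mark-recapture analysis, and the data are simulated:

import numpy as np

rng = np.random.default_rng(0)
n_years, n_females = 10, 25
# Hypothetical counts of breeders per year, generated with year-to-year
# heterogeneity in breeding probability.
observed = rng.binomial(n_females, rng.beta(8, 4, n_years))
trials = np.full(n_years, n_females)

# Posterior for a single homogeneous breeding probability (conjugate Beta).
post_a = 1 + observed.sum()
post_b = 1 + (trials - observed).sum()

def discrepancy(counts):
    # Between-year variance of the breeding proportion: a measure that is
    # sensitive to heterogeneity the homogeneous model cannot reproduce.
    return np.var(counts / trials)

obs_disc = discrepancy(observed)
reps = []
for _ in range(2000):
    p = rng.beta(post_a, post_b)                  # posterior draw
    reps.append(discrepancy(rng.binomial(trials, p)))
p_value = np.mean(np.array(reps) >= obs_disc)     # posterior predictive p-value
print(f"observed discrepancy {obs_disc:.4f}, posterior predictive p = {p_value:.2f}")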
Gilbert, David
2016-01-01
Insights gained from multilevel computational models of biological systems can be translated into real-life applications only if the model correctness has been verified first. One of the most frequently employed in silico techniques for computational model verification is model checking. Traditional model checking approaches only consider the evolution of numeric values, such as concentrations, over time and are appropriate for computational models of small scale systems (e.g. intracellular networks). However for gaining a systems level understanding of how biological organisms function it is essential to consider more complex large scale biological systems (e.g. organs). Verifying computational models of such systems requires capturing both how numeric values and properties of (emergent) spatial structures (e.g. area of multicellular population) change over time and across multiple levels of organization, which are not considered by existing model checking approaches. To address this limitation we have developed a novel approximate probabilistic multiscale spatio-temporal meta model checking methodology for verifying multilevel computational models relative to specifications describing the desired/expected system behaviour. The methodology is generic and supports computational models encoded using various high-level modelling formalisms because it is defined relative to time series data and not the models used to generate it. In addition, the methodology can be automatically adapted to case study specific types of spatial structures and properties using the spatio-temporal meta model checking concept. To automate the computational model verification process we have implemented the model checking approach in the software tool Mule (http://mule.modelchecking.org). Its applicability is illustrated against four systems biology computational models previously published in the literature encoding the rat cardiovascular system dynamics, the uterine contractions of labour, the Xenopus laevis cell cycle and the acute inflammation of the gut and lung. Our methodology and software will enable computational biologists to efficiently develop reliable multilevel computational models of biological systems. PMID:27187178
Pârvu, Ovidiu; Gilbert, David
2016-01-01
Insights gained from multilevel computational models of biological systems can be translated into real-life applications only if the model correctness has been verified first. One of the most frequently employed in silico techniques for computational model verification is model checking. Traditional model checking approaches only consider the evolution of numeric values, such as concentrations, over time and are appropriate for computational models of small scale systems (e.g. intracellular networks). However for gaining a systems level understanding of how biological organisms function it is essential to consider more complex large scale biological systems (e.g. organs). Verifying computational models of such systems requires capturing both how numeric values and properties of (emergent) spatial structures (e.g. area of multicellular population) change over time and across multiple levels of organization, which are not considered by existing model checking approaches. To address this limitation we have developed a novel approximate probabilistic multiscale spatio-temporal meta model checking methodology for verifying multilevel computational models relative to specifications describing the desired/expected system behaviour. The methodology is generic and supports computational models encoded using various high-level modelling formalisms because it is defined relative to time series data and not the models used to generate it. In addition, the methodology can be automatically adapted to case study specific types of spatial structures and properties using the spatio-temporal meta model checking concept. To automate the computational model verification process we have implemented the model checking approach in the software tool Mule (http://mule.modelchecking.org). Its applicability is illustrated against four systems biology computational models previously published in the literature encoding the rat cardiovascular system dynamics, the uterine contractions of labour, the Xenopus laevis cell cycle and the acute inflammation of the gut and lung. Our methodology and software will enable computational biologists to efficiently develop reliable multilevel computational models of biological systems.
Model Checking Temporal Logic Formulas Using Sticker Automata
Feng, Changwei; Wu, Huanmei
2017-01-01
Temporal logic model checking is an important and complex problem that is still far from being fully resolved in the setting of DNA computing, especially for Computation Tree Logic (CTL), Interval Temporal Logic (ITL), and Projection Temporal Logic (PTL), because there is still a lack of approaches for DNA-based model checking. To address this challenge, a model checking method is proposed for checking the basic formulas of the above three temporal logics with DNA molecules. First, one type of single-stranded DNA molecule is employed to encode the Finite State Automaton (FSA) model of the given basic formula so that a sticker automaton is obtained. Then, other single-stranded DNA molecules are employed to encode the given system model so that the input strings of the sticker automaton are obtained. Next, a series of biochemical reactions is conducted between the above two types of single-stranded DNA molecules, after which it can be decided whether the system satisfies the formula or not. As a result, we have developed a DNA-based approach for checking all the basic formulas of CTL, ITL, and PTL. The simulated results demonstrate the effectiveness of the new method. PMID:29119114
Foundations of the Bandera Abstraction Tools
NASA Technical Reports Server (NTRS)
Hatcliff, John; Dwyer, Matthew B.; Pasareanu, Corina S.; Robby
2003-01-01
Current research is demonstrating that model-checking and other forms of automated finite-state verification can be effective for checking properties of software systems. Due to the exponential costs associated with model-checking, multiple forms of abstraction are often necessary to obtain system models that are tractable for automated checking. The Bandera Tool Set provides multiple forms of automated support for compiling concurrent Java software systems to models that can be supplied to several different model-checking tools. In this paper, we describe the foundations of Bandera's data abstraction mechanism which is used to reduce the cardinality (and the program's state-space) of data domains in software to be model-checked. From a technical standpoint, the form of data abstraction used in Bandera is simple, and it is based on classical presentations of abstract interpretation. We describe the mechanisms that Bandera provides for declaring abstractions, for attaching abstractions to programs, and for generating abstracted programs and properties. The contributions of this work are the design and implementation of various forms of tool support required for effective application of data abstraction to software components written in a programming language like Java which has a rich set of linguistic features.
2013-01-01
Background The UK Department of Health introduced the National Health Service (NHS) Health Check Programme in April 2009 in an attempt to improve primary and secondary prevention of cardiovascular disease in the UK population and to reduce health inequalities. Healthcare professionals' attitudes towards giving lifestyle advice will influence how they interact with patients during consultations. We therefore sought to identify the attitudes of primary care healthcare professionals towards the delivery of lifestyle advice in the context of the NHS Health Check Programme. Methods Fifty-two primary care healthcare professionals undertook a Q sort with 36 statements that represented a range of viewpoints about the importance of lifestyle change, medication, giving lifestyle advice in the primary care setting, and the individual, social and material factors that might impact on lifestyle-related behaviour change. Sorts were analysed by-person using principal components analysis and varimax rotation. Results Five statistically independent factors (accounts) reflected distinct views on the topic. Account 1 was supportive of initiatives like the NHS Health Check, and emphasised the importance of professionals working collaboratively with patients to facilitate lifestyle change. Account 2 expressed views on the potential overuse of statin medication and placed responsibility for lifestyle change with the patient. Account 3 viewed the healthcare professional's role as one of educator, emphasising the provision of information. Account 4 perceived lifestyle change to be difficult for patients and emphasised the need for healthcare professionals to be role models. Account 5 was inconsistent about the value of lifestyle change, or the role of healthcare professionals in promoting it, a finding that may be due to ambivalence about the health check or to lack of engagement with the Q sort task. We found no strong associations between any of the factors and gender, role, age or ethnicity. Conclusions Our findings suggest that healthcare professionals hold viewpoints that may influence how they interact with patients during health checks. When implementing programmes like the NHS Health Check, it would be useful to take healthcare professionals’ views into account. Attitudes and beliefs could be explored during training sessions, for example. PMID:24229342
Check Calibration of the NASA Glenn 10- by 10-Foot Supersonic Wind Tunnel (2014 Test Entry)
NASA Technical Reports Server (NTRS)
Johnson, Aaron; Pastor-Barsi, Christine; Arrington, E. Allen
2016-01-01
A check calibration of the 10- by 10-Foot Supersonic Wind Tunnel (SWT) was conducted in May/June 2014 using an array of five supersonic wedge probes to verify the 1999 Calibration. This check calibration was necessary following a control systems upgrade and an integrated systems test (IST), and was required to verify that the tunnel flow quality was unchanged by the control systems upgrade before the next test customer began their test entry. The previous check calibration of the tunnel occurred in 2007, prior to the Mars Science Laboratory test program. Secondary objectives of this test entry included the validation of the new Cobra data acquisition system (DAS) against the current Escort DAS and the creation of statistical process control (SPC) charts through the collection of a series of repeated test points at certain predetermined tunnel parameters. The SPC chart secondary objective was not completed due to schedule constraints. It is hoped that this effort will be readdressed and completed in the near future.
Full implementation of a distributed hydrological model based on check dam trapped sediment volumes
NASA Astrophysics Data System (ADS)
Bussi, Gianbattista; Francés, Félix
2014-05-01
Lack of hydrometeorological data is one of the most compelling limitations to the implementation of distributed environmental models. Mediterranean catchments, in particular, are characterised by high spatial variability of meteorological phenomena and soil characteristics, which may prevent the transfer of model calibrations from a fully gauged catchment to a totally or partially ungauged one. For this reason, new sources of data are required in order to extend the use of distributed models to non-monitored or low-monitored areas. An important source of information regarding the hydrological and sediment cycle is represented by sediment deposits accumulated at the bottom of reservoirs. Since the 1960s, reservoir sedimentation volumes have been used as proxy data for the estimation of inter-annual total sediment yield rates, or, in more recent years, as a reference measure of the sediment transport for sediment model calibration and validation. Nevertheless, the possibility of using such data for constraining the calibration of a hydrological model has not been exhaustively investigated so far. In this study, the use of nine check dam reservoir sedimentation volumes for hydrological and sedimentological model calibration and spatio-temporal validation was examined. Check dams are common structures in Mediterranean areas, and are a potential source of spatially distributed information regarding both the hydrological and the sediment cycle. In this case study, the TETIS hydrological and sediment model was implemented in a medium-size Mediterranean catchment (Rambla del Poyo, Spain) by taking advantage of sediment deposits accumulated behind the check dams located in the catchment headwaters. Reservoir trap efficiency was taken into account by coupling the TETIS model with a pond trap efficiency model. The model was calibrated by adjusting some of its parameters in order to reproduce the total sediment volume accumulated behind a check dam. Then, the model was spatially validated by obtaining the simulated sedimentation volume at the other eight check dams and comparing it to the observed sedimentation volumes. Lastly, the simulated water discharge at the catchment outlet was compared with observed water discharge records in order to check the hydrological sub-model behaviour. Model results provided highly valuable information concerning the spatial distribution of soil erosion and sediment transport. Spatial validation of the sediment sub-model provided very good results at seven check dams out of nine. This study shows that check dams can also be a useful tool for constraining hydrological model calibration, as model results agree with water discharge observations. In fact, the hydrological model validation at a downstream water flow gauge obtained a Nash-Sutcliffe efficiency of 0.8. This technique is applicable to all catchments with check dams, and only requires rainfall and temperature data and soil characteristics maps.
Creating Near-Term Climate Scenarios for AgMIP
NASA Astrophysics Data System (ADS)
Goddard, L.; Greene, A. M.; Baethgen, W.
2012-12-01
For the next assessment report of the IPCC (AR5), attention is being given to development of climate information that is appropriate for adaptation, such as decadal-scale and near-term predictions intended to capture the combined effects of natural climate variability and the emerging climate change signal. While the science and practice evolve for the production and use of dynamic decadal prediction, information relevant to agricultural decision-makers can be gained from analysis of past decadal-scale trends and variability. Statistical approaches that mimic the characteristics of observed year-to-year variability can indicate the range of possibilities and their likelihood. In this talk we present work towards development of near-term climate scenarios, which are needed to engage decision-makers and stakeholders in the regions in current decision-making. The work includes analyses of decadal-scale variability and trends in the AgMIP regions, and statistical approaches that capture year-to-year variability and the associated persistence of wet and dry years. We will outline the general methodology and some of the specific considerations in the regional application of the methodology for different AgMIP regions, such as those for Western Africa versus southern Africa. We will also show some examples of quality checks and informational summaries of the generated data, including (1) metrics of information quality such as probabilistic reliability for a suite of relevant climate variables and indices important for agriculture; (2) quality checks relative to the use of this climate data in crop models; and (3) summary statistics (e.g., for 5-10-year periods or across given spatial scales).
ERIC Educational Resources Information Center
Duy, Joanna; Vaughan, Liwen
2003-01-01
Vendor-provided electronic resource usage statistics are not currently standardized across vendors. This study investigates the feasibility of using locally collected data to check the reliability of vendor-provided data. Vendor-provided data were compared with local data collected from North Carolina State University (NCSU) Libraries' Web…
Generalized Hurst exponent estimates differentiate EEG signals of healthy and epileptic patients
NASA Astrophysics Data System (ADS)
Lahmiri, Salim
2018-01-01
The aim of our current study is to check whether multifractal patterns of the electroencephalographic (EEG) signals of normal and epileptic patients are statistically similar or different. In this regard, the generalized Hurst exponent (GHE) method is used for robust estimation of the multifractals in each type of EEG signal, and three powerful statistical tests are performed to check for the existence of differences between estimated GHEs from healthy control subjects and epileptic patients. The obtained results show that multifractals exist in both types of EEG signals. In particular, it was found that the degree of fractality is more pronounced in short variations of normal EEG signals than in short variations of EEG signals with seizure-free intervals. In contrast, it is more pronounced in long variations of EEG signals with seizure-free intervals than in normal EEG signals. Importantly, both parametric and nonparametric statistical tests show strong evidence that estimated GHEs of normal EEG signals are statistically and significantly different from those with seizure-free intervals. Therefore, GHEs can be efficiently used to distinguish between healthy subjects and patients suffering from epilepsy.
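A minimal sketch of generalized Hurst exponent estimation from the scaling of q-th order moments of increments, applied to a synthetic random walk rather than the EEG records used in the study:

import numpy as np

def generalized_hurst(x, q=2, tau_max=19):
    # Estimate H(q) from the scaling <|x(t+tau) - x(t)|^q> ~ tau^(q*H(q)).
    taus = np.arange(1, tau_max + 1)
    moments = [np.mean(np.abs(x[tau:] - x[:-tau]) ** q) for tau in taus]
    slope, _ = np.polyfit(np.log(taus), np.log(moments), 1)
    return slope / q

rng = np.random.default_rng(0)
random_walk = np.cumsum(rng.normal(size=5000))   # H(q) close to 0.5 expected
print("H(1) =", round(generalized_hurst(random_walk, q=1), 2))
print("H(2) =", round(generalized_hurst(random_walk, q=2), 2))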
Towards Symbolic Model Checking for Multi-Agent Systems via OBDDs
NASA Technical Reports Server (NTRS)
Raimondi, Franco; Lomunscio, Alessio
2004-01-01
We present an algorithm for model checking temporal-epistemic properties of multi-agent systems, expressed in the formalism of interpreted systems. We first introduce a technique for the translation of interpreted systems into boolean formulae, and then present a model-checking algorithm based on this translation. The algorithm is based on OBDDs, as they offer a compact and efficient representation for boolean formulae.
Implementation of a GPS-RO data processing system for the KIAPS-LETKF data assimilation system
NASA Astrophysics Data System (ADS)
Kwon, H.; Kang, J.-S.; Jo, Y.; Kang, J. H.
2015-03-01
The Korea Institute of Atmospheric Prediction Systems (KIAPS) has been developing a new global numerical weather prediction model and an advanced data assimilation system. As part of the KIAPS package for observation processing (KPOP) system for data assimilation, preprocessing, and quality control modules for bending-angle measurements of global positioning system radio occultation (GPS-RO) data have been implemented and examined. The GPS-RO data processing system is composed of several steps for checking observation locations, missing values, physical values for Earth radius of curvature, and geoid undulation. An observation-minus-background check is implemented by use of a one-dimensional observational bending-angle operator, and tangent point drift is also considered in the quality control process. We have tested GPS-RO observations utilized by the Korean Meteorological Administration (KMA) within KPOP, based on both the KMA global model and the National Center for Atmospheric Research Community Atmosphere Model with Spectral Element dynamical core (CAM-SE) as a model background. Background fields from the CAM-SE model are incorporated for the preparation of assimilation experiments with the KIAPS local ensemble transform Kalman filter (LETKF) data assimilation system, which has been successfully implemented to a cubed-sphere model with unstructured quadrilateral meshes. As a result of data processing, the bending-angle departure statistics between observation and background show significant improvement. Also, the first experiment in assimilating GPS-RO bending angle from KPOP within KIAPS-LETKF shows encouraging results.
Chirico, Peter G.
2005-01-01
EXPLANATION The purpose of developing a new 10m resolution digital elevation model (DEM) of the Charleston Region was to more accurately depict geologic structure, surficial geology, and landforms of the Charleston County Region. Previously, many areas northeast and southwest of Charleston were mapped with a 20 foot contour interval. As a result, large areas within the National Elevation Dataset (NED) depict flat terraced topography where there was a lack of higher resolution elevation data. To overcome these data voids, the new DEM is supplemented with additional elevation data and break-lines derived from aerial photography and topographic maps. The resultant DEM is stored as a raster grid at uniform 10m horizontal resolution. The elevation model contained in this publication was produced utilizing the ANUDEM algorithm. ANUDEM allows for the inclusion of contours, streams, rivers, lake and water body polygons as well as spot height data to control the development of the elevation model. A preliminary statistical analysis using over 788 vertical elevation check points, primarily located in the northeastern part of the study area and derived from USGS 7.5 Minute Topographic maps, reveals that the final DEM has a vertical accuracy of ±3.27 meters. A table listing the elevation comparison between the elevation check points and the final DEM is provided.
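The vertical-accuracy check described above amounts to comparing DEM-sampled elevations with surveyed check-point elevations; a minimal sketch with hypothetical values follows (the published ±3.27 m figure may be defined differently, e.g. as an RMSE or a 95% bound):

import numpy as np

rng = np.random.default_rng(0)
reference = rng.uniform(0.0, 20.0, 788)                   # surveyed elevations (m)
dem = reference + rng.normal(0.0, 1.7, reference.size)    # DEM-sampled elevations (m)

errors = dem - reference
rmse = np.sqrt(np.mean(errors ** 2))
print(f"n={errors.size}, mean error={errors.mean():.2f} m, RMSE={rmse:.2f} m")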
Jet Noise Diagnostics Supporting Statistical Noise Prediction Methods
NASA Technical Reports Server (NTRS)
Bridges, James E.
2006-01-01
The primary focus of my presentation is the development of the jet noise prediction code JeNo with most examples coming from the experimental work that drove the theoretical development and validation. JeNo is a statistical jet noise prediction code, based upon the Lilley acoustic analogy. Our approach uses time-average 2-D or 3-D mean and turbulent statistics of the flow as input. The output is source distributions and spectral directivity. NASA has been investing in development of statistical jet noise prediction tools because these seem to fit the middle ground that allows enough flexibility and fidelity for jet noise source diagnostics while having reasonable computational requirements. These tools rely on Reynolds-averaged Navier-Stokes (RANS) computational fluid dynamics (CFD) solutions as input for computing far-field spectral directivity using an acoustic analogy. There are many ways acoustic analogies can be created, each with a series of assumptions and models, many often taken unknowingly. And the resulting prediction can be easily reverse-engineered by altering the models contained within. However, only an approach which is mathematically sound, with assumptions validated and modeled quantities checked against direct measurement will give consistently correct answers. Many quantities are modeled in acoustic analogies precisely because they have been impossible to measure or calculate, making this requirement a difficult task. The NASA team has spent considerable effort identifying all the assumptions and models used to take the Navier-Stokes equations to the point of a statistical calculation via an acoustic analogy very similar to that proposed by Lilley. Assumptions have been identified and experiments have been developed to test these assumptions. In some cases this has resulted in assumptions being changed. Beginning with the CFD used as input to the acoustic analogy, models for turbulence closure used in RANS CFD codes have been explored and compared against measurements of mean and rms velocity statistics over a range of jet speeds and temperatures. Models for flow parameters used in the acoustic analogy, most notably the space-time correlations of velocity, have been compared against direct measurements, and modified to better fit the observed data. These measurements have been extremely challenging for hot, high speed jets, and represent a sizeable investment in instrumentation development. As an intermediate check that the analysis is predicting the physics intended, phased arrays have been employed to measure source distributions for a wide range of jet cases. And finally, careful far-field spectral directivity measurements have been taken for final validation of the prediction code. Examples of each of these experimental efforts will be presented. The main result of these efforts is a noise prediction code, named JeNo, which is in middevelopment. JeNo is able to consistently predict spectral directivity, including aft angle directivity, for subsonic cold jets of most geometries. Current development on JeNo is focused on extending its capability to hot jets, requiring inclusion of a previously neglected second source associated with thermal fluctuations. A secondary result of the intensive experimentation is the archiving of various flow statistics applicable to other acoustic analogies and to development of time-resolved prediction methods. These will be of lasting value as we look ahead at future challenges to the aeroacoustic experimentalist.
Harrison, Jay M; Breeze, Matthew L; Harrigan, George G
2011-08-01
Statistical comparisons of compositional data generated on genetically modified (GM) crops and their near-isogenic conventional (non-GM) counterparts typically rely on classical significance testing. This manuscript presents an introduction to Bayesian methods for compositional analysis along with recommendations for model validation. The approach is illustrated using protein and fat data from two herbicide tolerant GM soybeans (MON87708 and MON87708×MON89788) and a conventional comparator grown in the US in 2008 and 2009. Guidelines recommended by the US Food and Drug Administration (FDA) in conducting Bayesian analyses of clinical studies on medical devices were followed. This study is the first Bayesian approach to GM and non-GM compositional comparisons. The evaluation presented here supports a conclusion that a Bayesian approach to analyzing compositional data can provide meaningful and interpretable results. We further describe the importance of method validation and approaches to model checking if Bayesian approaches to compositional data analysis are to be considered viable by scientists involved in GM research and regulation. Copyright © 2011 Elsevier Inc. All rights reserved.
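For readers less familiar with the mechanics of Bayesian model checking, the sketch below illustrates a posterior predictive check in Python. It is not the authors' model: it assumes a simple normal model for a single synthetic analyte and uses the sample range as the discrepancy measure; posterior predictive p-values near 0 or 1 would flag model misfit.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical protein measurements for one soybean line (not the study's data).
y = rng.normal(37.0, 1.2, size=30)

# Conjugate posterior for the mean under a normal model with known variance
# (flat prior on the mean; sigma fixed at the sample SD for simplicity).
sigma = y.std(ddof=1)
post_mean, post_sd = y.mean(), sigma / np.sqrt(len(y))

# Posterior predictive check: simulate replicated data sets and compare a
# discrepancy statistic (here, the sample range) with the observed value.
n_rep = 5000
mu_draws = rng.normal(post_mean, post_sd, size=n_rep)
y_rep = rng.normal(mu_draws[:, None], sigma, size=(n_rep, len(y)))
disc_rep = y_rep.max(axis=1) - y_rep.min(axis=1)
disc_obs = y.max() - y.min()

ppp = (disc_rep >= disc_obs).mean()  # posterior predictive p-value
print(f"posterior predictive p-value for the range: {ppp:.2f}")
```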
Compositional schedulability analysis of real-time actor-based systems.
Jaghoori, Mohammad Mahdi; de Boer, Frank; Longuet, Delphine; Chothia, Tom; Sirjani, Marjan
2017-01-01
We present an extension of the actor model with real-time, including deadlines associated with messages, and explicit application-level scheduling policies, e.g.,"earliest deadline first" which can be associated with individual actors. Schedulability analysis in this setting amounts to checking whether, given a scheduling policy for each actor, every task is processed within its designated deadline. To check schedulability, we introduce a compositional automata-theoretic approach, based on maximal use of model checking combined with testing. Behavioral interfaces define what an actor expects from the environment, and the deadlines for messages given these assumptions. We use model checking to verify that actors match their behavioral interfaces. We extend timed automata refinement with the notion of deadlines and use it to define compatibility of actor environments with the behavioral interfaces. Model checking of compatibility is computationally hard, so we propose a special testing process. We show that the analyses are decidable and automate the process using the Uppaal model checker.
NASA Astrophysics Data System (ADS)
Mori, Kaya; Chonko, James C.; Hailey, Charles J.
2005-10-01
We have reanalyzed the 260 ks XMM-Newton observation of 1E 1207.4-5209. There are several significant improvements over previous work. First, a much broader range of physically plausible spectral models was used. Second, we have used a more rigorous statistical analysis. The standard F-distribution was not employed, but rather the exact finite statistics F-distribution was determined by Monte Carlo simulations. This approach was motivated by the recent work of Protassov and coworkers and Freeman and coworkers. They demonstrated that the standard F-distribution is not even asymptotically correct when applied to assess the significance of additional absorption features in a spectrum. With our improved analysis we do not find a third and fourth spectral feature in 1E 1207.4-5209 but only the two broad absorption features previously reported. Two additional statistical tests, one line model dependent and the other line model independent, confirmed our modified F-test analysis. For all physically plausible continuum models in which the weak residuals are strong enough to fit, the residuals occur at the instrument Au M edge. As a sanity check we confirmed that the residuals are consistent in strength and position with the instrument Au M residuals observed in 3C 273.
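The Monte Carlo calibration advocated above can be sketched in Python as follows. This toy compares nested polynomial fits to a simulated continuum rather than the actual X-ray spectral models, so the standard F-distribution happens to approximate the simulated null distribution reasonably well here; the point of Protassov and coworkers is that for line components added to a spectrum this agreement cannot be assumed, which is exactly what the Monte Carlo step checks.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def fit_chi2(x, y, sigma, deg):
    """Chi-square of a weighted polynomial fit of the given degree."""
    coeffs = np.polyfit(x, y, deg, w=1.0 / sigma)
    resid = (y - np.polyval(coeffs, x)) / sigma
    return np.sum(resid**2)

x = np.linspace(0.5, 3.0, 60)       # hypothetical energy grid (keV)
true_continuum = 100.0 * x**-1.5    # hypothetical power-law continuum
sigma = np.sqrt(true_continuum)     # Gaussian approximation to counting errors

def f_stat(y):
    """F-like statistic comparing nested fits (simple vs. more complex model)."""
    chi2_simple = fit_chi2(x, y, sigma, deg=2)
    chi2_complex = fit_chi2(x, y, sigma, deg=4)
    dof = len(x) - 5
    return ((chi2_simple - chi2_complex) / 2.0) / (chi2_complex / dof)

# Monte Carlo calibration: simulate many data sets from the null (no extra
# component) model and build the empirical distribution of the statistic.
f_null = np.array([f_stat(true_continuum + rng.normal(0, sigma))
                   for _ in range(2000)])

f_obs = f_stat(true_continuum + rng.normal(0, sigma))   # stand-in "observed" value
p_mc = np.mean(f_null >= f_obs)                         # Monte Carlo p-value
p_std = stats.f.sf(f_obs, 2, len(x) - 5)                # standard F-distribution p-value
print(f"Monte Carlo p = {p_mc:.3f}, standard F-test p = {p_std:.3f}")
```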
ANSYS duplicate finite-element checker routine
NASA Technical Reports Server (NTRS)
Ortega, R.
1995-01-01
An ANSYS finite-element code routine to check for duplicated elements within the volume of a three-dimensional (3D) finite-element mesh was developed. The routine developed is used for checking floating elements within a mesh, identically duplicated elements, and intersecting elements with a common face. A space shuttle main engine alternate turbopump development high pressure oxidizer turbopump finite-element model check using the developed subroutine is discussed. Finally, recommendations are provided for duplicate element checking of 3D finite-element models.
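A minimal Python sketch of the kinds of connectivity checks described (duplicated elements, floating elements, and faces shared by too many elements), using hypothetical tetrahedral elements rather than the actual ANSYS routine or its data structures:

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical 4-node tetrahedral connectivity (element id -> node ids);
# a real check would read these from the ANSYS element and node tables.
elements = {
    1: (1, 2, 3, 4),
    2: (2, 3, 4, 5),
    3: (4, 3, 2, 1),      # same node set as element 1 in a different order
    4: (10, 11, 12, 13),  # shares no face with any other element ("floating")
}

# Identically duplicated elements: identical node sets regardless of ordering.
by_nodes = defaultdict(list)
for eid, nodes in elements.items():
    by_nodes[frozenset(nodes)].append(eid)
duplicates = [eids for eids in by_nodes.values() if len(eids) > 1]

# Face bookkeeping: every 3-node subset of a tetrahedron is a face. In a sound
# mesh an interior face belongs to at most two elements; a face shared by more
# than two elements indicates duplicated or intersecting elements.
by_face = defaultdict(list)
for eid, nodes in elements.items():
    for face in combinations(sorted(nodes), 3):
        by_face[face].append(eid)
over_shared = {f: e for f, e in by_face.items() if len(e) > 2}

# "Floating" elements: connected to no other element through any face.
floating = [eid for eid, nodes in elements.items()
            if all(len(by_face[f]) == 1 for f in combinations(sorted(nodes), 3))]

print("duplicate element groups:", duplicates)
print("faces shared by >2 elements:", over_shared)
print("floating elements:", floating)
```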
A Possible Tool for Checking Errors in the INAA Results, Based on Neutron Data and Method Validation
NASA Astrophysics Data System (ADS)
Cincu, Em.; Grigore, Ioana Manea; Barbos, D.; Cazan, I. L.; Manu, V.
2008-08-01
This work presents preliminary results for a new type of application of INAA elemental analysis, useful for checking errors that occur during the investigation of unknown samples; it relies on INAA method validation experiments and on the accuracy of neutron data from the literature. The paper comprises two sections. The first briefly presents the steps of the experimental tests carried out for INAA method validation and for establishing the performance of the `ACTIVA-N' laboratory, which also illustrates the laboratory's progress toward that performance. Section 2 presents our recent INAA results on CRMs, whose interpretation opens a discussion about the usefulness of a tool for checking possible errors that differs from the usual statistical procedures. The questionable aspects and the requirements for developing a practical checking tool are discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Montano, Joshua Daniel
2015-03-23
Coordinate Measuring Machines (CMM) are widely used in industry, throughout the Nuclear Weapons Complex and at Los Alamos National Laboratory (LANL) to verify part conformance to design definition. Calibration cycles for CMMs at LANL are predominantly one year in length. Unfortunately, several nonconformance reports have been generated to document the discovery of a certified machine found out of tolerance during a calibration closeout. In an effort to reduce risk to product quality, two solutions were proposed – shorten the calibration cycle, which could be costly, or perform an interim check to monitor the machine's performance between cycles. The CMM interim check discussed makes use of Renishaw's Machine Checking Gauge. This off-the-shelf product simulates a large sphere within a CMM's measurement volume and allows for error estimation. Data were gathered, analyzed, and simulated from seven machines in seventeen different configurations to create statistical process control run charts for on-the-floor monitoring.
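A minimal sketch of the individuals control chart that such run charts are typically based on, using hypothetical gauge-error values rather than the LANL measurement data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical interim-check errors (micrometres) for one CMM configuration;
# real charts would use Machine Checking Gauge measurements.
baseline = rng.normal(0.0, 0.8, size=30)     # data used to establish the chart
new_checks = rng.normal(0.3, 0.8, size=10)   # subsequent interim checks

# Individuals control chart: sigma estimated from the average moving range
# (divided by the d2 constant for subgroups of size 2, i.e. 1.128).
center = baseline.mean()
sigma_hat = np.mean(np.abs(np.diff(baseline))) / 1.128
ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

for i, x in enumerate(new_checks, start=1):
    status = "OUT OF CONTROL" if not (lcl <= x <= ucl) else "ok"
    print(f"interim check {i:2d}: error = {x:+.2f} um  [{status}]")
```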
Modelling of End Milling of AA6061-TiCp Metal Matrix Composite
NASA Astrophysics Data System (ADS)
Vijay Kumar, S.; Cheepu, Muralimohan; Venkateswarlu, D.; Asohan, P.; Senthil Kumar, V.
2018-03-01
Metal-matrix composites (MMCs) are used in various applications, and hence a great deal of research has been carried out on them. To improve the properties of Al-based MMCs, many ceramic reinforcements have been identified, among which TiC plays a vital role because of its high hardness, stiffness and wear resistance. In the present work, a neural network and statistical modelling approach is used to predict surface roughness (Ra) and cutting forces in computerised numerical control (CNC) milling. Experiments were conducted on a CNC milling machine according to a full factorial design, and the resulting data were used to train the network and check its performance. The samples were prepared by an in-situ technique and heat treated to obtain uniform properties. The ANN model showed comparatively satisfactory performance.
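To illustrate the modelling approach, the sketch below trains a small neural network on a synthetic three-factor full factorial data set; the factor levels, the response surface and the network size are all assumptions for illustration, not the paper's experimental values.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)

# Hypothetical full factorial design: spindle speed (rpm), feed (mm/tooth),
# depth of cut (mm). The roughness response below is synthetic.
speed, feed, depth = np.meshgrid([1000, 2000, 3000],
                                 [0.05, 0.10, 0.15],
                                 [0.5, 1.0, 1.5])
X = np.column_stack([speed.ravel(), feed.ravel(), depth.ravel()])
Ra = (0.2 + 8.0 * X[:, 1] + 0.3 * X[:, 2] - 1e-4 * X[:, 0]
      + rng.normal(0, 0.05, len(X)))

X_train, X_test, y_train, y_test = train_test_split(X, Ra, test_size=0.3,
                                                    random_state=0)

# Small multilayer perceptron with input scaling.
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000,
                                   random_state=0))
model.fit(X_train, y_train)
print(f"R^2 on held-out runs: {model.score(X_test, y_test):.2f}")
```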
The DD Check App for prevention and control of digital dermatitis in dairy herds.
Tremblay, Marlène; Bennett, Tom; Döpfer, Dörte
2016-09-15
Digital dermatitis (DD) is the most important infectious claw disease in the cattle industry causing outbreaks of lameness. The clinical course of disease can be classified using 5 clinical stages. M-stages represent not only different disease severities but also unique clinical characteristics and outcomes. Monitoring the proportions of cows per M-stage is needed to better understand and address DD and factors influencing risks of DD in a herd. Changes in the proportion of cows per M-stage over time or between groups may be attributed to differences in management, environment, or treatment and can have impact on the future claw health of the herd. Yet trends in claw health regarding DD are not intuitively noticed without statistical analysis of detailed records. Our specific aim was to develop a mobile application (app) for persons with less statistical training, experience or supporting programs that would standardize M-stage records, automate data analysis including trends of M-stages over time, the calculation of predictions and assignments of Cow Types (i.e., Cow Types I-III are assigned to cows without active lesions, single and repeated cases of active DD lesions, respectively). The predictions were the stationary distributions of transitions between DD states (i.e., M-stages or signs of chronicity) in a class-structured multi-state Markov chain population model commonly used to model endemic diseases. We hypothesized that the app can be used at different levels of record detail to discover significant trends in the prevalence of M-stages that help to make informed decisions to prevent and control DD on-farm. Four data sets were used to test the flexibility and value of the DD Check App. The app allows easy recording of M-stages in different environments and is flexible in terms of the users' goals and the level of detail used. Results show that this tool discovers trends in M-stage proportions, predicts potential outbreaks of DD, and makes comparisons among Cow Types, signs of chronicity, scorers or pens. The DD Check App also provides a list of cows that should be treated augmented by individual Cow Types to help guide treatment and determine prognoses. Producers can be proactive instead of reactive in controlling DD in a herd by using this app. The DD Check App serves as an example of how technology makes knowledge and advice of veterinary epidemiology widely available to monitor, control and prevent this complex disease. Copyright © 2016 Elsevier B.V. All rights reserved.
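The stationary-distribution predictions described above can be illustrated with a short Python sketch; the three-state transition matrix below is hypothetical and much simpler than the app's class-structured M-stage model.

```python
import numpy as np

# Hypothetical monthly transition probabilities between simplified DD states
# (healthy, active lesion, chronic); not the app's fitted values.
P = np.array([
    [0.85, 0.10, 0.05],   # from healthy
    [0.30, 0.50, 0.20],   # from active lesion
    [0.05, 0.15, 0.80],   # from chronic
])

# The stationary distribution pi solves pi P = pi with sum(pi) = 1, i.e. it is
# the left eigenvector of P associated with eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
pi = pi / pi.sum()
print("predicted long-run proportions (healthy, active, chronic):", np.round(pi, 3))
```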
Pharmacist and Technician Perceptions of Tech-Check-Tech in Community Pharmacy Practice Settings.
Frost, Timothy P; Adams, Alex J
2018-04-01
Tech-check-tech (TCT) is a practice model in which pharmacy technicians with advanced training can perform final verification of prescriptions that have been previously reviewed for appropriateness by a pharmacist. Few states have adopted TCT in part because of the common view that this model is controversial among members of the profession. This article aims to summarize the existing research on pharmacist and technician perceptions of community pharmacy-based TCT. A literature review was conducted using MEDLINE (January 1990 to August 2016) and Google Scholar (January 1990 to August 2016) using the terms "tech* and check," "tech-check-tech," "checking technician," and "accuracy checking tech*." Of the 7 studies identified we found general agreement among both pharmacists and technicians that TCT in community pharmacy settings can be safely performed. This agreement persisted in studies of theoretical TCT models and in studies assessing participants in actual community-based TCT models. Pharmacists who had previously worked with a checking technician were generally more favorable toward TCT. Both pharmacists and technicians in community pharmacy settings generally perceived TCT to be safe, in both theoretical surveys and in surveys following actual TCT demonstration projects. These perceptions of safety align well with the actual outcomes achieved from community pharmacy TCT studies.
Derivation and Applicability of Asymptotic Results for Multiple Subtests Person-Fit Statistics
Albers, Casper J.; Meijer, Rob R.; Tendeiro, Jorge N.
2016-01-01
In high-stakes testing, it is important to check the validity of individual test scores. Although a test may, in general, result in valid test scores for most test takers, for some test takers the scores may not provide a good description of their proficiency level. Person-fit statistics have been proposed to check the validity of individual test scores. In this study, the theoretical asymptotic sampling distribution of two person-fit statistics that can be used for tests that consist of multiple subtests is first discussed. Second, a simulation study was conducted to investigate the applicability of this asymptotic theory for tests of finite length, in which the correlation between subtests and the number of items in the subtests were varied. The authors showed that these distributions provide reasonable approximations, even for tests consisting of subtests of only 10 items each. These results have practical value because researchers do not have to rely on extensive simulation studies to simulate sampling distributions. PMID:29881053
Checking Safety in Technology Education
ERIC Educational Resources Information Center
Gunter, Robert E.
2007-01-01
The Bureau of Labor Statistics (United States Department of Labor, Bureau of Labor Statistics [BLS], n.d.) has shown that workers involved in accidents have little, if any, instruction on the equipment they were using while injured. Keep in mind that instruction on the safe operation of a piece of equipment may take place early in the school year,…
Adding Statistical Machine Translation Adaptation to Computer-Assisted Translation
2013-09-01
Features of the computer-assisted translation environment described include: (1) translation memories (TMs) that are automatically searched and used to suggest possible translations; (2) spell-checkers; (3) glossaries; (4) dictionaries; and (5) alignment and matching against TMs to propose translations, along with spell-checking, glossary and dictionary look-up, support for multiple file formats, and regular expressions.
Content, Affective, and Behavioral Challenges to Learning: Students' Experiences Learning Statistics
ERIC Educational Resources Information Center
McGrath, April L.
2014-01-01
This study examined the experiences of and challenges faced by students when completing a statistics course. As part of the requirement for this course, students completed a learning check-in, which consisted of an individual meeting with the instructor to discuss questions and the completion of a learning reflection and study plan. Forty…
ASCS online fault detection and isolation based on an improved MPCA
NASA Astrophysics Data System (ADS)
Peng, Jianxin; Liu, Haiou; Hu, Yuhui; Xi, Junqiang; Chen, Huiyan
2014-09-01
Multi-way principal component analysis (MPCA) has received considerable attention and been widely used in process monitoring. A traditional MPCA algorithm unfolds multiple batches of historical data into a two-dimensional matrix and cuts the matrix along the time axis to form subspaces. However, low efficiency of the subspaces and difficult fault isolation are common disadvantages of the principal component model. This paper presents a new subspace construction method based on a kernel density estimation function that can effectively reduce the storage required for the subspace information. The MPCA model and the knowledge base are built on the new subspace. Then, fault detection and isolation with the squared prediction error (SPE) statistic and the Hotelling T2 statistic are realized in process monitoring. When a fault occurs, fault isolation based on the SPE statistic is achieved by residual contribution analysis of the different variables. For fault isolation based on the T2 statistic, the relationship between the statistic indicator and the state variables is constructed, and constraint conditions are presented to check the validity of the fault isolation. Then, to improve the robustness of fault isolation to unexpected disturbances, a statistical method is adopted to relate single subspaces to multiple subspaces and increase the correct rate of fault isolation. Finally, fault detection and isolation based on the improved MPCA is used to monitor the automatic shift control system (ASCS) to prove the correctness and effectiveness of the algorithm. The research proposes a new subspace construction method that reduces the required storage capacity and improves the robustness of the principal component model, and it establishes the relationship between the state variables and the fault detection indicators for fault isolation.
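A minimal sketch of T2/SPE monitoring with a principal component model, using synthetic data and omitting the batch unfolding and the kernel-density subspace construction that are the paper's actual contribution:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical training data: 200 normal-operation samples of 6 process variables.
X = rng.normal(size=(200, 6)) @ rng.normal(size=(6, 6))   # correlated variables
mu, sd = X.mean(axis=0), X.std(axis=0)
Xs = (X - mu) / sd

# Principal component model retaining k components.
U, S, Vt = np.linalg.svd(Xs, full_matrices=False)
k = 3
P = Vt[:k].T                        # loadings
lam = (S[:k] ** 2) / (len(Xs) - 1)  # variances of the retained scores

def t2_spe(x):
    """Hotelling T2 and SPE (Q) statistics for one sample."""
    xs = (x - mu) / sd
    t = xs @ P                      # scores in the model subspace
    t2 = np.sum(t**2 / lam)         # Hotelling T2
    resid = xs - t @ P.T            # residual in the orthogonal subspace
    return t2, resid @ resid        # (T2, SPE)

def spe_contributions(x):
    """Residual contribution of each variable, used for fault isolation."""
    xs = (x - mu) / sd
    resid = xs - (xs @ P) @ P.T
    return resid**2

fault = X[0] + np.array([0, 0, 5 * sd[2], 0, 0, 0])   # inject a fault on variable 3
print("T2, SPE:", t2_spe(fault))
print("variable with largest SPE contribution:",
      int(np.argmax(spe_contributions(fault))) + 1)
```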
Benchmarking statistical averaging of spectra with HULLAC
NASA Astrophysics Data System (ADS)
Klapisch, Marcel; Busquet, Michel
2008-11-01
Knowledge of radiative properties of hot plasmas is important for ICF, astrophysics, etc. When mid-Z or high-Z elements are present, the spectra are so complex that one commonly uses a statistically averaged description of atomic systems [1]. In a recent experiment on Fe [2], performed under controlled conditions, high resolution transmission spectra were obtained. The new version of HULLAC [3] allows the use of the same model with different levels of detail/averaging. We will take advantage of this feature to check the effect of averaging by comparison with experiment. [1] A. Bar-Shalom, J. Oreg, and M. Klapisch, J. Quant. Spectrosc. Radiat. Transf. 65, 43 (2000). [2] J. E. Bailey, G. A. Rochau, C. A. Iglesias et al., Phys. Rev. Lett. 99, 265002 (2007). [3] M. Klapisch, M. Busquet, and A. Bar-Shalom, AIP Conference Proceedings 926, 206-15 (2007).
Identifying fMRI Model Violations with Lagrange Multiplier Tests
Cassidy, Ben; Long, Christopher J; Rae, Caroline; Solo, Victor
2013-01-01
The standard modeling framework in Functional Magnetic Resonance Imaging (fMRI) is predicated on assumptions of linearity, time invariance and stationarity. These assumptions are rarely checked because doing so requires specialised software, although failure to do so can lead to bias and mistaken inference. Identifying model violations is an essential but largely neglected step in standard fMRI data analysis. Using Lagrange Multiplier testing methods we have developed simple and efficient procedures for detecting model violations such as non-linearity, non-stationarity and validity of the common Double Gamma specification for hemodynamic response. These procedures are computationally cheap and can easily be added to a conventional analysis. The test statistic is calculated at each voxel and displayed as a spatial anomaly map which shows regions where a model is violated. The methodology is illustrated with a large number of real data examples. PMID:22542665
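The flavour of such a test can be conveyed with the classical auxiliary-regression form of the Lagrange multiplier (score) statistic; the design matrix, the omitted quadratic term and all numbers below are illustrative assumptions, not the authors' fMRI-specific procedures.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Hypothetical single-voxel time series regressed on a design matrix X
# (constant + one stimulus regressor); not an actual fMRI analysis.
n = 200
stim = rng.normal(size=n)
X = np.column_stack([np.ones(n), stim])
y = 1.0 + 0.8 * stim + 0.4 * stim**2 + rng.normal(0, 1, n)   # mild non-linearity

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta

# Lagrange multiplier (score) test for an omitted quadratic term:
# regress the residuals on the augmented design; LM = n * R^2 ~ chi2(q).
Z = np.column_stack([X, stim**2])
gamma, *_ = np.linalg.lstsq(Z, resid, rcond=None)
fitted = Z @ gamma
r2 = 1 - np.sum((resid - fitted) ** 2) / np.sum((resid - resid.mean()) ** 2)
lm = n * r2
p = stats.chi2.sf(lm, df=1)
print(f"LM statistic = {lm:.1f}, p = {p:.3g}")
```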
Alsulami, Zayed; Choonara, Imti; Conroy, Sharon
2014-06-01
To evaluate how closely double-checking policies are followed by nurses in paediatric areas and also to identify the types, frequency and rates of medication administration errors that occur despite the double-checking process. Double-checking by two nurses is an intervention used in many UK hospitals to prevent or reduce medication administration errors. There is, however, insufficient evidence to either support or refute the practice of double-checking in terms of medication error risk reduction. Prospective observational study. This was a prospective observational study of paediatric nurses' adherence to the double-checking process for medication administration from April-July 2012. Drug dose administration events (n = 2000) were observed. Independent drug dose calculation, rate of administering intravenous bolus drugs and labelling of flush syringes were the steps with lowest adherence rates. Drug dose calculation was only double-checked independently in 591 (30%) drug administrations. There was a statistically significant difference in nurses' adherence rate to the double-checking steps between weekdays and weekends in nine of the 15 evaluated steps. Medication administration errors (n = 191) or deviations from policy were observed, at a rate of 9·6% of drug administrations. These included 64 drug doses, which were left for parents to administer without nurse observation. There was variation between paediatric nurses' adherence to double-checking steps during medication administration. The most frequent type of administration errors or deviation from policy involved the medicine being given to the parents to administer to the child when the nurse was not present. © 2013 John Wiley & Sons Ltd.
Model Checking the Remote Agent Planner
NASA Technical Reports Server (NTRS)
Khatib, Lina; Muscettola, Nicola; Havelund, Klaus; Norvig, Peter (Technical Monitor)
2001-01-01
This work tackles the problem of using Model Checking for the purpose of verifying the HSTS (Scheduling Testbed System) planning system. HSTS is the planner and scheduler of the remote agent autonomous control system deployed in Deep Space One (DS1). Model Checking allows for the verification of domain models as well as planning entries. We have chosen the real-time model checker UPPAAL for this work. We start by motivating our work in the introduction. Then we give a brief description of HSTS and UPPAAL. After that, we give a sketch for the mapping of HSTS models into UPPAAL and we present samples of plan model properties one may want to verify.
Random Testing and Model Checking: Building a Common Framework for Nondeterministic Exploration
NASA Technical Reports Server (NTRS)
Groce, Alex; Joshi, Rajeev
2008-01-01
Two popular forms of dynamic analysis, random testing and explicit-state software model checking, are perhaps best viewed as search strategies for exploring the state spaces introduced by nondeterminism in program inputs. We present an approach that enables this nondeterminism to be expressed in the SPIN model checker's PROMELA language, and then lets users generate either model checkers or random testers from a single harness for a tested C program. Our approach makes it easy to compare model checking and random testing for models with precisely the same input ranges and probabilities and allows us to mix random testing with model checking's exhaustive exploration of non-determinism. The PROMELA language, as intended in its design, serves as a convenient notation for expressing nondeterminism and mixing random choices with nondeterministic choices. We present and discuss a comparison of random testing and model checking. The results derive from using our framework to test a C program with an effectively infinite state space, a module in JPL's next Mars rover mission. More generally, we show how the ability of the SPIN model checker to call C code can be used to extend SPIN's features, and hope to inspire others to use the same methods to implement dynamic analyses that can make use of efficient state storage, matching, and backtracking.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, S; Wu, Y; Chang, X
Purpose: A novel computer software system, namely APDV (Automatic Pre-Delivery Verification), has been developed for verifying patient treatment plan parameters right prior to treatment deliveries in order to automatically detect and prevent catastrophic errors. Methods: APDV is designed to continuously monitor new DICOM plan files on the TMS computer at the treatment console. When new plans to be delivered are detected, APDV checks the consistency of plan parameters and high-level plan statistics using underlying rules and statistical properties based on the given treatment site, technique and modality. These rules were quantitatively derived by retrospectively analyzing all the EBRT treatment plans of the past 8 years at the authors' institution. Therapists and physicists will be notified with a warning message displayed on the TMS computer if any critical errors are detected, and check results, confirmations and dismissal actions will be saved into a database for further review. Results: APDV was implemented as a stand-alone program using C# to ensure the required real-time performance. Mean values and standard deviations were quantitatively derived for various plan parameters including MLC usage, MU/cGy ratio, beam SSD, beam weighting, and the beam gantry angles (only for lateral targets) per treatment site, technique and modality. 2D rules for the combined MU/cGy ratio and averaged SSD values were also derived using joint probabilities of confidence error ellipses. The statistics of these major treatment plan parameters quantitatively evaluate the consistency of any treatment plan, which facilitates the automatic APDV checking procedures. Conclusion: APDV could be useful in detecting and preventing catastrophic errors immediately before treatment deliveries. Future plans include automatic patient identity and patient setup checks after patient daily images are acquired by the machine and become available on the TMS computer. This project is supported by the Agency for Healthcare Research and Quality (AHRQ) under award 1R01HS0222888. The senior author received research grants from ViewRay Inc. and Varian Medical System.
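A minimal sketch of the kind of two-dimensional consistency rule described, flagging a plan whose (MU/cGy, mean SSD) pair falls outside a confidence ellipse derived from historical plans; the numbers and the confidence level are hypothetical, not APDV's derived values.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

# Hypothetical historical plans for one site/technique/modality:
# columns are MU/cGy ratio and mean beam SSD (cm). Not clinical data.
hist = np.column_stack([rng.normal(1.2, 0.1, 500), rng.normal(92.0, 2.0, 500)])
mu = hist.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(hist, rowvar=False))

def inside_confidence_ellipse(x, level=0.999):
    """True if the plan's (MU/cGy, mean SSD) pair lies inside the joint
    confidence ellipse implied by the historical mean and covariance."""
    d2 = (x - mu) @ cov_inv @ (x - mu)        # squared Mahalanobis distance
    return d2 <= stats.chi2.ppf(level, df=2)

new_plan = np.array([1.9, 85.0])              # hypothetical new plan to verify
if not inside_confidence_ellipse(new_plan):
    print("WARNING: plan parameters outside historical confidence ellipse")
```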
CheckMATE 2: From the model to the limit
NASA Astrophysics Data System (ADS)
Dercks, Daniel; Desai, Nishita; Kim, Jong Soo; Rolbiecki, Krzysztof; Tattersall, Jamie; Weber, Torsten
2017-12-01
We present the latest developments to the CheckMATE program that allows models of new physics to be easily tested against the recent LHC data. To achieve this goal, the core of CheckMATE now contains over 60 LHC analyses of which 12 are from the 13 TeV run. The main new feature is that CheckMATE 2 now integrates the Monte Carlo event generation via MadGraph5_aMC@NLO and Pythia 8. This allows users to go directly from a SLHA file or UFO model to the result of whether a model is allowed or not. In addition, the integration of the event generation leads to a significant increase in the speed of the program. Many other improvements have also been made, including the possibility to now combine signal regions to give a total likelihood for a model.
Coverage Metrics for Model Checking
NASA Technical Reports Server (NTRS)
Penix, John; Visser, Willem; Norvig, Peter (Technical Monitor)
2001-01-01
When using model checking to verify programs in practice, it is not usually possible to achieve complete coverage of the system. In this position paper we describe ongoing research within the Automated Software Engineering group at NASA Ames on the use of test coverage metrics to measure partial coverage and provide heuristic guidance for program model checking. We are specifically interested in applying and developing coverage metrics for concurrent programs that might be used to support certification of next generation avionics software.
Haslinger, Robert; Pipa, Gordon; Brown, Emery
2010-10-01
One approach for understanding the encoding of information by spike trains is to fit statistical models and then test their goodness of fit. The time-rescaling theorem provides a goodness-of-fit test consistent with the point process nature of spike trains. The interspike intervals (ISIs) are rescaled (as a function of the model's spike probability) to be independent and exponentially distributed if the model is accurate. A Kolmogorov-Smirnov (KS) test between the rescaled ISIs and the exponential distribution is then used to check goodness of fit. This rescaling relies on assumptions of continuously defined time and instantaneous events. However, spikes have finite width, and statistical models of spike trains almost always discretize time into bins. Here we demonstrate that finite temporal resolution of discrete time models prevents their rescaled ISIs from being exponentially distributed. Poor goodness of fit may be erroneously indicated even if the model is exactly correct. We present two adaptations of the time-rescaling theorem to discrete time models. In the first we propose that instead of assuming the rescaled times to be exponential, the reference distribution be estimated through direct simulation by the fitted model. In the second, we prove a discrete time version of the time-rescaling theorem that analytically corrects for the effects of finite resolution. This allows us to define a rescaled time that is exponentially distributed, even at arbitrary temporal discretizations. We demonstrate the efficacy of both techniques by fitting generalized linear models to both simulated spike trains and spike trains recorded experimentally in monkey V1 cortex. Both techniques give nearly identical results, reducing the false-positive rate of the KS test and greatly increasing the reliability of model evaluation based on the time-rescaling theorem.
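The classical continuous-time procedure that this work builds on can be sketched as follows; the discrete-time corrections that are the paper's contribution are not reproduced here, and the sinusoidal rate is purely illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Simulate an inhomogeneous Poisson spike train by thinning, with a known
# sinusoidal rate (hypothetical numbers, not the paper's recordings).
T, lam_max = 100.0, 30.0
lam = lambda t: 15.0 + 10.0 * np.sin(2 * np.pi * t / 5.0)
cand = np.cumsum(rng.exponential(1.0 / lam_max, size=int(3 * lam_max * T)))
cand = cand[cand < T]
spikes = cand[rng.uniform(size=cand.size) < lam(cand) / lam_max]

# Time rescaling with the true intensity: integrate lambda between successive
# spikes; under a correct model the rescaled ISIs are Exp(1).
grid = np.linspace(0, T, 200001)
Lambda = np.concatenate([[0.0], np.cumsum(lam(grid[:-1]) * np.diff(grid))])
rescaled_isis = np.diff(np.interp(spikes, grid, Lambda))

# Kolmogorov-Smirnov test of the rescaled ISIs against the unit exponential.
ks_stat, p = stats.kstest(rescaled_isis, "expon")
print(f"KS statistic = {ks_stat:.3f}, p = {p:.2f}")
```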
Posterior Predictive Model Checking in Bayesian Networks
ERIC Educational Resources Information Center
Crawford, Aaron
2014-01-01
This simulation study compared the utility of various discrepancy measures within a posterior predictive model checking (PPMC) framework for detecting different types of data-model misfit in multidimensional Bayesian network (BN) models. The investigated conditions were motivated by an applied research program utilizing an operational complex…
Analyzing the cost of screening selectee and non-selectee baggage.
Virta, Julie L; Jacobson, Sheldon H; Kobza, John E
2003-10-01
Determining how to effectively operate security devices is as important to overall system performance as developing more sensitive security devices. In light of recent federal mandates for 100% screening of all checked baggage, this research studies the trade-offs between screening only selectee checked baggage and screening both selectee and non-selectee checked baggage for a single baggage screening security device deployed at an airport. This trade-off is represented using a cost model that incorporates the cost of the baggage screening security device, the volume of checked baggage processed through the device, and the outcomes that occur when the device is used. The cost model captures the cost of deploying, maintaining, and operating a single baggage screening security device over a one-year period. The study concludes that as excess baggage screening capacity is used to screen non-selectee checked bags, the expected annual cost increases, the expected annual cost per checked bag screened decreases, and the expected annual cost per expected number of threats detected in the checked bags screened increases. These results indicate that the marginal increase in security per dollar spent is significantly lower when non-selectee checked bags are screened than when only selectee checked bags are screened.
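The qualitative trade-off can be reproduced with a toy version of such a cost model; every parameter below is an illustrative assumption, not a value from the study.

```python
# Illustrative-only numbers; the paper's parameter values are not reproduced here.
device_cost_per_year = 400_000      # annualised deployment + maintenance + operation ($)
cost_per_bag = 1.0                  # marginal cost per bag screened ($)
p_detect = 0.9                      # assumed device detection probability
p_threat_sel = 1e-5                 # assumed threat probability, selectee bags
p_threat_non = 1e-7                 # assumed threat probability, non-selectee bags
n_selectee = 500_000                # annual selectee checked bags routed to the device

def annual_metrics(n_non_selectee):
    """Total cost, cost per bag, and cost per expected detected threat."""
    bags = n_selectee + n_non_selectee
    cost = device_cost_per_year + cost_per_bag * bags
    detected = p_detect * (n_selectee * p_threat_sel + n_non_selectee * p_threat_non)
    return cost, cost / bags, cost / detected

for n_non in (0, 1_000_000, 2_000_000):
    cost, per_bag, per_threat = annual_metrics(n_non)
    print(f"non-selectee bags {n_non:>9,}: total ${cost:,.0f}, "
          f"per bag ${per_bag:.2f}, per expected detected threat ${per_threat:,.0f}")
```

With these assumed numbers the sketch reproduces the reported directions of change: total cost and cost per expected detected threat rise as non-selectee volume is added, while cost per bag falls.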
Organ Donation and Transplantation Statistics
40 CFR 86.327-79 - Quench checks; NOX analyzer.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Quench checks; NOX analyzer. (a) Perform the reaction chamber quench check for each model of high vacuum reaction chamber analyzer prior to initial use. (b) Perform the reaction chamber quench check for each new analyzer that has an ambient pressure or “soft vacuum” reaction chamber prior to initial use. Additionally...
40 CFR 86.327-79 - Quench checks; NOX analyzer.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Quench checks; NOX analyzer. (a) Perform the reaction chamber quench check for each model of high vacuum reaction chamber analyzer prior to initial use. (b) Perform the reaction chamber quench check for each new analyzer that has an ambient pressure or “soft vacuum” reaction chamber prior to initial use. Additionally...
Prediction of dimethyl disulfide levels from biosolids using statistical modeling.
Gabriel, Steven A; Vilalai, Sirapong; Arispe, Susanna; Kim, Hyunook; McConnell, Laura L; Torrents, Alba; Peot, Christopher; Ramirez, Mark
2005-01-01
Two statistical models were used to predict the concentration of dimethyl disulfide (DMDS) released from biosolids produced by an advanced wastewater treatment plant (WWTP) located in Washington, DC, USA. The plant concentrates sludge from primary sedimentation basins in gravity thickeners (GT) and sludge from secondary sedimentation basins in dissolved air flotation (DAF) thickeners. The thickened sludge is pumped into blending tanks and then fed into centrifuges for dewatering. The dewatered sludge is then conditioned with lime before trucking out from the plant. DMDS, along with other volatile sulfur and nitrogen-containing chemicals, is known to contribute to biosolids odors. These models identified oxidation/reduction potential (ORP) values of a GT and DAF, the amount of sludge dewatered by centrifuges, and the blend ratio between GT thickened sludge and DAF thickened sludge in blending tanks as control variables. The accuracy of the developed regression models was evaluated by checking the adjusted R2 of the regression as well as the signs of coefficients associated with each variable. In general, both models explained observed DMDS levels in sludge headspace samples. The adjusted R2 value of the regression models 1 and 2 were 0.79 and 0.77, respectively. Coefficients for each regression model also had the correct sign. Using the developed models, plant operators can adjust the controllable variables to proactively decrease this odorant. Therefore, these models are a useful tool in biosolids management at WWTPs.
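A minimal sketch of this kind of regression check, with synthetic operating records standing in for the plant data; the variables mirror those named above, but their effects and magnitudes are assumptions used only to make the example run.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(8)

# Hypothetical plant-operation records (not the WWTP's data): ORP of the gravity
# thickener and of the DAF (mV), centrifuge throughput (t/d), GT:DAF blend ratio.
n = 120
orp_gt = rng.normal(-250, 40, n)
orp_daf = rng.normal(-150, 30, n)
throughput = rng.normal(300, 50, n)
blend = rng.uniform(0.3, 0.7, n)
dmds = (50 - 0.1 * orp_gt - 0.05 * orp_daf + 0.08 * throughput
        + 30 * blend + rng.normal(0, 5, n))

X = sm.add_constant(np.column_stack([orp_gt, orp_daf, throughput, blend]))
fit = sm.OLS(dmds, X).fit()

# Model checks of the kind described: adjusted R^2 and the coefficient signs,
# to be compared against the expected direction of each effect.
print(f"adjusted R^2 = {fit.rsquared_adj:.2f}")
print("coefficient signs (ORP_GT, ORP_DAF, throughput, blend):",
      np.sign(fit.params[1:]))
```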
Haslinger, Robert; Pipa, Gordon; Brown, Emery
2010-01-01
One approach for understanding the encoding of information by spike trains is to fit statistical models and then test their goodness of fit. The time rescaling theorem provides a goodness of fit test consistent with the point process nature of spike trains. The interspike intervals (ISIs) are rescaled (as a function of the model’s spike probability) to be independent and exponentially distributed if the model is accurate. A Kolmogorov Smirnov (KS) test between the rescaled ISIs and the exponential distribution is then used to check goodness of fit. This rescaling relies upon assumptions of continuously defined time and instantaneous events. However spikes have finite width and statistical models of spike trains almost always discretize time into bins. Here we demonstrate that finite temporal resolution of discrete time models prevents their rescaled ISIs from being exponentially distributed. Poor goodness of fit may be erroneously indicated even if the model is exactly correct. We present two adaptations of the time rescaling theorem to discrete time models. In the first we propose that instead of assuming the rescaled times to be exponential, the reference distribution be estimated through direct simulation by the fitted model. In the second, we prove a discrete time version of the time rescaling theorem which analytically corrects for the effects of finite resolution. This allows us to define a rescaled time which is exponentially distributed, even at arbitrary temporal discretizations. We demonstrate the efficacy of both techniques by fitting Generalized Linear Models (GLMs) to both simulated spike trains and spike trains recorded experimentally in monkey V1 cortex. Both techniques give nearly identical results, reducing the false positive rate of the KS test and greatly increasing the reliability of model evaluation based upon the time rescaling theorem. PMID:20608868
Ophthalmologist in patients' eyes.
Derk, Biljana Andrijević; Dapić, Natasa Kovac; Milinković, Branko; Loncar, Valentina Lacmanović; Mijić, Vesna
2005-01-01
It seems that patients' knowledge about ophthalmologists' work is very insufficient, especially about what type of examination should be undertaken for refraction assessment during a "simple" eye check-up and which serious systemic diseases could be discovered through eye examinations. The aim of the study was to determine patients' knowledge about ophthalmologist examinations during check-ups for refraction, their knowledge about the differences between ophthalmologists and opticians, the main sources of patients' ophthalmologic knowledge and the main reasons for coming to particular locations. Patients (311) were examined by applying a questionnaire immediately before the eye check-up at three different refraction units. Statistical analysis used the Chi-square test and a test of significance between proportions, except for patients' age, where Student's t-test was used. Differences were statistically significant at p = 0.05. The findings show that patients' knowledge about eye examinations during check-ups for refraction abnormalities was not satisfactory. Twenty-two percent (22%) of the examined patients did not know the differences between ophthalmologists and opticians, and 16% believed that after computer testing of refraction further ophthalmologic examination was not necessary. The main sources of medical knowledge were the mass media, while twenty percent (20%) of the participating patients named doctors' lectures as their source. This study revealed that much work needs to be done to improve patient education and, indirectly, the screening for very serious systemic diseases and sight-threatening diseases that could be discovered during a first visit for spectacle prescription.
Rotolo, Federico; Paoletti, Xavier; Michiels, Stefan
2018-03-01
Surrogate endpoints are attractive for use in clinical trials instead of well-established endpoints because of practical convenience. To validate a surrogate endpoint, two important measures can be estimated in a meta-analytic context when individual patient data are available: the R²indiv or Kendall's τ at the individual level, and the R²trial at the trial level. We aimed to provide an R implementation of classical and well-established as well as more recent statistical methods for surrogacy assessment with failure time endpoints. We also intended to incorporate utilities for model checking and visualization, and the data-generating methods described in the literature to date. In the case of failure time endpoints, the classical approach is based on two steps. First, Kendall's τ is estimated as a measure of individual-level surrogacy using a copula model. Then, the R²trial is computed via a linear regression of the estimated treatment effects; at this second step, the estimation uncertainty can be accounted for via a measurement-error model or via weights. In addition to the classical approach, we recently developed an approach based on bivariate auxiliary Poisson models with individual random effects to measure Kendall's τ and treatment-by-trial interactions to measure the R²trial. The most common data simulation models described in the literature are based on copula models, mixed proportional hazard models, and mixtures of half-normal and exponential random variables. The R package surrosurv implements the classical two-step method with Clayton, Plackett, and Hougaard copulas. It also allows the second-step linear regression to be optionally adjusted for measurement error. The mixed Poisson approach is implemented with different reduced models in addition to the full model. We present the package functions for estimating the surrogacy models, checking their convergence, performing leave-one-trial-out cross-validation, and plotting the results. We illustrate their use in practice on individual patient data from a meta-analysis of 4069 patients with advanced gastric cancer from 20 trials of chemotherapy. The surrosurv package provides an R implementation of classical and recent statistical methods for surrogacy assessment of failure time endpoints. Flexible simulation functions are available to generate data according to the methods described in the literature. Copyright © 2017 Elsevier B.V. All rights reserved.
Valley Fever (Coccidioidomycosis) Statistics
Valley fever may be under-recognized. Valley fever is a reportable condition for public health surveillance; check with your local, state, or territorial public health department for more information about disease reporting requirements.
Phase Two Feasibility Study for Software Safety Requirements Analysis Using Model Checking
NASA Technical Reports Server (NTRS)
Turgeon, Gregory; Price, Petra
2010-01-01
A feasibility study was performed on a representative aerospace system to determine the following: (1) the benefits and limitations of using SCADE, a commercially available tool for model checking, in comparison to using a proprietary tool that was studied previously [1]; and (2) metrics for performing the model checking and for assessing the findings. This study was performed independently of the development task by a group unfamiliar with the system, providing a fresh, external perspective free from development bias.
NASA Technical Reports Server (NTRS)
Call, Jared A.; Kwok, John H.; Fisher, Forest W.
2013-01-01
This innovation is a tool used to verify and validate spacecraft sequences at the predicted events file (PEF) level for the GRAIL (Gravity Recovery and Interior Laboratory, see http://www.nasa.gov/mission_pages/grail/main/index.html) mission as part of the Multi-Mission Planning and Sequencing Team (MPST) operations process, to reduce the possibility for errors. This tool is used to catch any sequence-related errors or issues immediately after the seqgen modeling to streamline downstream processes. This script verifies and validates the seqgen modeling for the GRAIL MPST process. A PEF is provided as input, and dozens of checks are performed on it to verify and validate the command products, including command content, command ordering, flight-rule violations, modeling boundary consistency, resource limits, and ground commanding consistency. By performing as many checks as early in the process as possible, grl_pef_check streamlines the MPST task of generating GRAIL command and modeled products on an aggressive schedule. By enumerating each check being performed, and clearly stating the criteria and assumptions made at each step, grl_pef_check can be used as a manual checklist as well as an automated tool. This helper script was written with a focus on giving users the information they need to evaluate a sequence quickly and efficiently, while still keeping them informed and active in the overall sequencing process. grl_pef_check verifies and validates the modeling and sequence content prior to investing any more effort into the build. There are dozens of items in the modeling run that need to be checked, which is a time-consuming and error-prone task. Currently, no other software provides this functionality. Compared to a manual process, this script reduces human error and saves considerable man-hours by automating and streamlining the mission planning and sequencing task for the GRAIL mission.
Practical Formal Verification of Diagnosability of Large Models via Symbolic Model Checking
NASA Technical Reports Server (NTRS)
Cavada, Roberto; Pecheur, Charles
2003-01-01
This document reports on the activities carried out during a four-week visit of Roberto Cavada at the NASA Ames Research Center. The main goal was to test the practical applicability of the framework proposed, where a diagnosability problem is reduced to a Symbolic Model Checking problem. Section 2 contains a brief explanation of major techniques currently used in Symbolic Model Checking, and how these techniques can be tuned in order to obtain good performances when using Model Checking tools. Diagnosability is performed on large and structured models of real plants. Section 3 describes how these plants are modeled, and how models can be simplified to improve the performance of Symbolic Model Checkers. Section 4 reports scalability results. Three test cases are briefly presented, and several parameters and techniques have been applied on those test cases in order to produce comparison tables. Furthermore, comparison between several Model Checkers is reported. Section 5 summarizes the application of diagnosability verification to a real application. Several properties have been tested, and results have been highlighted. Finally, section 6 draws some conclusions, and outlines future lines of research.
Steinfeld, Beate; Bauer, Anika; Waldorf, Manuel; Engel, Nicole; Braks, Karsten; Huber, Thomas J; Vocks, Silja
2017-01-01
Body-related checking behavior, as a behavioral manifestation of a disturbed body image, fosters the development and maintenance of eating disorders. The Body Checking Questionnaire (BCQ) is the most commonly used questionnaire for measuring body-related checking behavior internationally. To date, validation studies are only available for adult populations. Therefore, the aim of this study was to statistically test the German-language version of the BCQ in adolescents. A total of N=129 female adolescents were examined, comprising n=57 with Anorexia Nervosa, n=24 with Bulimia Nervosa, and n=48 healthy female adolescents. A confirmatory factor analysis supports the subdivision of the BCQ into a general factor and the subfactors "overall appearance", "specific body parts" and "idiosyncratic checking", which was also found in the original version. The internal consistencies are good (α≥0.81), and the BCQ is able to differentiate well between adolescents with and without eating disorders. Significant correlations between the BCQ and other body image questionnaires point to a good convergent validity. The German-language BCQ thus constitutes a valid and reliable instrument for measuring body-related checking behavior among adolescents in clinical research and practice. © Georg Thieme Verlag KG Stuttgart · New York.
Long-term behaviour of timber structures in torrent control
NASA Astrophysics Data System (ADS)
Rickli, Christian; Graf, Frank
2014-05-01
Timber is widely used for protection measures in torrent control. The life span of wooden constructions such as timber check dams is, however, limited due to fungal decay, and only sparse scientific information is available on the long-term behaviour of timber structures and their colonisation by decay fungi. Related to this, a controversial discussion has been going on in practice as to whether Norway Spruce (Picea abies) or Silver Fir (Abies alba) is more durable and whether bark removal increases resistance against fungal decay. In order to look into this matter, a series of 15 timber check dams built in 1996 has been monitored. The constructions were alternately realised with Norway Spruce and Silver Fir, half of them each with the bark remaining and removed, respectively. The scientific investigations included documentation of the colonisation by rot fungi and identification of decayed zones, both with a simple practical approach and based on drilling resistance. Colonisation by decay fungi started three years after construction (e.g. Gloeophyllum sepiarium), and parts with reduced wood resistance were first detected two years later. Sixteen years after construction, decay was found on all check dams but two. Wood quality was markedly better in the watered sections than in the occasionally dry lateral abutment sections. Taking the whole check dams into consideration, slightly more decay was detected in Norway Spruce than in Silver Fir logs, and both the practical approach and the drilling resistance measurements yielded more defects on logs without bark. However, due to the limited number of replications and the limited fungal data, it was not possible to verify these results statistically. Statistical analysis was therefore restricted to the drilling resistance data and the fruit-bodies of decay fungi on the uppermost log of each check dam. Based on this limited analysis, significant differences in the effect on drilling resistance were found for watered sections versus lateral abutments, brown versus white rot, and fir with versus without bark. Taking further into account that brown rot reduces wood strength faster than white rot, it may be speculated that spruce logs without bark and fir logs with bark are more resistant against fungal decay than spruce logs with and fir logs without bark, respectively. However, this has to be treated with caution, as only the uppermost logs were considered, the observation period was only 15 years, and the relative abundance of the most important decay fungi varied considerably between as well as within the check dams. Consequently, for statistically sound and well-founded recommendations, further investigations over a longer period are indispensable.
On the validity of time-dependent AUC estimators.
Schmid, Matthias; Kestler, Hans A; Potapov, Sergej
2015-01-01
Recent developments in molecular biology have led to the massive discovery of new marker candidates for the prediction of patient survival. To evaluate the predictive value of these markers, statistical tools for measuring the performance of survival models are needed. We consider estimators of discrimination measures, which are a popular approach to evaluate survival predictions in biomarker studies. Estimators of discrimination measures are usually based on regularity assumptions such as the proportional hazards assumption. Based on two sets of molecular data and a simulation study, we show that violations of the regularity assumptions may lead to over-optimistic estimates of prediction accuracy and may therefore result in biased conclusions regarding the clinical utility of new biomarkers. In particular, we demonstrate that biased medical decision making is possible even if statistical checks indicate that all regularity assumptions are satisfied. © The Author 2013. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.
Abstraction and Assume-Guarantee Reasoning for Automated Software Verification
NASA Technical Reports Server (NTRS)
Chaki, S.; Clarke, E.; Giannakopoulou, D.; Pasareanu, C. S.
2004-01-01
Compositional verification and abstraction are the key techniques to address the state explosion problem associated with model checking of concurrent software. A promising compositional approach is to prove properties of a system by checking properties of its components in an assume-guarantee style. This article proposes a framework for performing abstraction and assume-guarantee reasoning of concurrent C code in an incremental and fully automated fashion. The framework uses predicate abstraction to extract and refine finite state models of software and it uses an automata learning algorithm to incrementally construct assumptions for the compositional verification of the abstract models. The framework can be instantiated with different assume-guarantee rules. We have implemented our approach in the COMFORT reasoning framework and we show how COMFORT out-performs several previous software model checking approaches when checking safety properties of non-trivial concurrent programs.
Incremental checking of Master Data Management model based on contextual graphs
NASA Astrophysics Data System (ADS)
Lamolle, Myriam; Menet, Ludovic; Le Duc, Chan
2015-10-01
The validation of models is a crucial step in distributed heterogeneous systems. In this paper, an incremental validation method is proposed in the scope of a Model Driven Engineering (MDE) approach, which is used to develop a Master Data Management (MDM) field represented by XML Schema models. The MDE approach presented in this paper is based on the definition of an abstraction layer using UML class diagrams. The validation method aims to minimise model errors and to optimise the process of model checking. Therefore, the notion of validation contexts is introduced, allowing the verification of data model views. Description logics specify constraints that the models have to satisfy. An experiment with the approach is presented through an application developed in the ArgoUML IDE.
Model Checking - My 27-Year Quest to Overcome the State Explosion Problem
NASA Technical Reports Server (NTRS)
Clarke, Ed
2009-01-01
Model Checking is an automatic verification technique for state-transition systems that are finite-state or that have finite-state abstractions. In the early 1980s, in a series of joint papers with my graduate students E.A. Emerson and A.P. Sistla, we proposed that Model Checking could be used for verifying concurrent systems and gave algorithms for this purpose. At roughly the same time, Joseph Sifakis and his student J.P. Queille at the University of Grenoble independently developed a similar technique. Model Checking has been used successfully to reason about computer hardware and communication protocols and is beginning to be used for verifying computer software. Specifications are written in temporal logic, which is particularly valuable for expressing concurrency properties. An intelligent, exhaustive search is used to determine if the specification is true or not. If the specification is not true, the Model Checker will produce a counterexample execution trace that shows why the specification does not hold. This feature is extremely useful for finding obscure errors in complex systems. The main disadvantage of Model Checking is the state-explosion problem, which can occur if the system under verification has many processes or complex data structures. Although the state-explosion problem is inevitable in the worst case, over the past 27 years considerable progress has been made on the problem for certain classes of state-transition systems that occur often in practice. In this talk, I will describe what Model Checking is, how it works, and the main techniques that have been developed for combating the state-explosion problem.
Anani, Nadim; Mazya, Michael V; Chen, Rong; Prazeres Moreira, Tiago; Bill, Olivier; Ahmed, Niaz; Wahlgren, Nils; Koch, Sabine
2017-01-10
Interoperability standards intend to standardise health information, clinical practice guidelines intend to standardise care procedures, and patient data registries are vital for monitoring quality of care and for clinical research. This study combines all three: it uses interoperability specifications to model guideline knowledge and applies the result to registry data. We applied the openEHR Guideline Definition Language (GDL) to data from 18,400 European patients in the Safe Implementation of Treatments in Stroke (SITS) registry to retrospectively check their compliance with European recommendations for acute stroke treatment. Comparing compliance rates obtained with GDL to those obtained by conventional statistical data analysis yielded a complete match, suggesting that GDL technology is reliable for guideline compliance checking. The successful application of a standard guideline formalism to a large patient registry dataset is an important step toward widespread implementation of computer-interpretable guidelines in clinical practice and registry-based research. Application of the methodology gave important results on the evolution of stroke care in Europe, important both for quality of care monitoring and clinical research.
Reliability considerations for the total strain range version of strainrange partitioning
NASA Technical Reports Server (NTRS)
Wirsching, P. H.; Wu, Y. T.
1984-01-01
A proposed total strainrange version of strainrange partitioning (SRP) to enhance the manner in which SRP is applied to life prediction is considered, with emphasis on how advanced reliability technology can be applied to perform risk analysis and to derive safety check expressions. Uncertainties existing in the design factors associated with life prediction of a component which experiences the combined effects of creep and fatigue can be identified. Examples illustrate how reliability analyses of such a component can be performed when all design factors in the SRP model are random variables reflecting these uncertainties. The Rackwitz-Fiessler and Wu algorithms are used, and estimates of the safety index and the probability of failure are demonstrated for an SRP problem. Methods of analysis of creep-fatigue data, with emphasis on procedures for producing synoptic statistics, are presented. An attempt to demonstrate the importance of the contribution of the uncertainties associated with small sample sizes (fatigue data) to risk estimates is discussed. The procedure for deriving a safety check expression for possible use in a design criteria document is presented.
CONSISTENCY UNDER SAMPLING OF EXPONENTIAL RANDOM GRAPH MODELS.
Shalizi, Cosma Rohilla; Rinaldo, Alessandro
2013-04-01
The growing availability of network data and of scientific interest in distributed systems has led to the rapid development of statistical models of network structure. Typically, however, these are models for the entire network, while the data consists only of a sampled sub-network. Parameters for the whole network, which is what is of interest, are estimated by applying the model to the sub-network. This assumes that the model is consistent under sampling, or, in terms of the theory of stochastic processes, that it defines a projective family. Focusing on the popular class of exponential random graph models (ERGMs), we show that this apparently trivial condition is in fact violated by many popular and scientifically appealing models, and that satisfying it drastically limits ERGM's expressive power. These results are actually special cases of more general results about exponential families of dependent random variables, which we also prove. Using such results, we offer easily checked conditions for the consistency of maximum likelihood estimation in ERGMs, and discuss some possible constructive responses.
CONSISTENCY UNDER SAMPLING OF EXPONENTIAL RANDOM GRAPH MODELS
Shalizi, Cosma Rohilla; Rinaldo, Alessandro
2015-01-01
The growing availability of network data and of scientific interest in distributed systems has led to the rapid development of statistical models of network structure. Typically, however, these are models for the entire network, while the data consists only of a sampled sub-network. Parameters for the whole network, which is what is of interest, are estimated by applying the model to the sub-network. This assumes that the model is consistent under sampling, or, in terms of the theory of stochastic processes, that it defines a projective family. Focusing on the popular class of exponential random graph models (ERGMs), we show that this apparently trivial condition is in fact violated by many popular and scientifically appealing models, and that satisfying it drastically limits ERGM’s expressive power. These results are actually special cases of more general results about exponential families of dependent random variables, which we also prove. Using such results, we offer easily checked conditions for the consistency of maximum likelihood estimation in ERGMs, and discuss some possible constructive responses. PMID:26166910
Marginal evidence for cosmic acceleration from Type Ia supernovae
NASA Astrophysics Data System (ADS)
Nielsen, J. T.; Guffanti, A.; Sarkar, S.
2016-10-01
The ‘standard’ model of cosmology is founded on the basis that the expansion rate of the universe is accelerating at present — as was inferred originally from the Hubble diagram of Type Ia supernovae. There exists now a much bigger database of supernovae so we can perform rigorous statistical tests to check whether these ‘standardisable candles’ indeed indicate cosmic acceleration. Taking account of the empirical procedure by which corrections are made to their absolute magnitudes to allow for the varying shape of the light curve and extinction by dust, we find, rather surprisingly, that the data are still quite consistent with a constant rate of expansion.
Marginal evidence for cosmic acceleration from Type Ia supernovae
Nielsen, J. T.; Guffanti, A.; Sarkar, S.
2016-01-01
The ‘standard’ model of cosmology is founded on the basis that the expansion rate of the universe is accelerating at present — as was inferred originally from the Hubble diagram of Type Ia supernovae. There exists now a much bigger database of supernovae so we can perform rigorous statistical tests to check whether these ‘standardisable candles’ indeed indicate cosmic acceleration. Taking account of the empirical procedure by which corrections are made to their absolute magnitudes to allow for the varying shape of the light curve and extinction by dust, we find, rather surprisingly, that the data are still quite consistent with a constant rate of expansion. PMID:27767125
ERIC Educational Resources Information Center
Hoijtink, Herbert; Molenaar, Ivo W.
1997-01-01
This paper shows that a certain class of constrained latent class models may be interpreted as a special case of nonparametric multidimensional item response models. Parameters of this latent class model are estimated using an application of the Gibbs sampler, and model fit is investigated using posterior predictive checks. (SLD)
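A posterior predictive check of the kind mentioned above can be illustrated with a minimal sketch: draw from a posterior, simulate replicated data for each draw, and compare a test quantity on replicated versus observed data. The model and data below are generic illustrations, not the constrained latent class model of the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Observed binary item responses (illustrative data only).
y_obs = rng.binomial(1, 0.35, size=200)

# Posterior draws of the success probability; here faked with a conjugate
# Beta posterior under a simple Bernoulli model standing in for MCMC output.
posterior = rng.beta(1 + y_obs.sum(), 1 + (1 - y_obs).sum(), size=1000)

def discrepancy(y):
    return y.mean()          # any test quantity of interest

# Posterior predictive check: simulate replicated data for each draw and
# compare the discrepancy on replicated vs. observed data.
T_rep = np.array([discrepancy(rng.binomial(1, p, size=y_obs.size)) for p in posterior])
ppp = np.mean(T_rep >= discrepancy(y_obs))   # posterior predictive p-value
print(f"posterior predictive p-value: {ppp:.2f}")
```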
NASA Astrophysics Data System (ADS)
Kamaruddin, Ainur Amira; Ali, Zalila; Noor, Norlida Mohd.; Baharum, Adam; Ahmad, Wan Muhamad Amir W.
2014-07-01
Logistic regression analysis examines the influence of various factors on a dichotomous outcome by estimating the probability of the event's occurrence. Logistic regression, also called a logit model, is a statistical procedure used to model dichotomous outcomes. In the logit model, the log odds of the dichotomous outcome is modeled as a linear combination of the predictor variables. The log odds ratio in logistic regression provides a description of the probabilistic relationship between the variables and the outcome. In conducting logistic regression, selection procedures are used to select important predictor variables; diagnostics are used to check that the assumptions are valid, including independence of errors, linearity in the logit for continuous variables, absence of multicollinearity, and lack of strongly influential outliers; and a test statistic is calculated to determine the aptness of the model. This study used the binary logistic regression model to investigate overweight and obesity among rural secondary school students on the basis of their demographic profile, medical history, diet and lifestyle. The results indicate that overweight and obesity of students are influenced by obesity in the family and the interaction between a student's ethnicity and routine meal intake. The odds of a student being overweight or obese are higher for a student with a family history of obesity and for a non-Malay student who frequently takes routine meals, as compared to a Malay student.
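For reference, the logit model described above can be written out in its standard general form (generic predictors, not this study's specific covariates):

```latex
% Standard logit model for a binary outcome Y_i with predictors x_{i1},...,x_{ik}:
\operatorname{logit}(p_i)
  \;=\; \log\frac{P(Y_i=1)}{1-P(Y_i=1)}
  \;=\; \beta_0 + \beta_1 x_{i1} + \cdots + \beta_k x_{ik},
\qquad e^{\beta_j} = \text{odds ratio for a one-unit increase in } x_{ij}.
```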
Construct validity and reliability of the Single Checking Administration of Medications Scale.
O'Connell, Beverly; Hawkins, Mary; Ockerby, Cherene
2013-06-01
Research indicates that single checking of medications is as safe as double checking; however, many nurses are averse to independently checking medications. To assist with the introduction and use of single checking, a measure of nurses' attitudes, the thirteen-item Single Checking Administration of Medications Scale (SCAMS), was developed. We examined the psychometric properties of the SCAMS. Secondary analyses were conducted on data collected from 503 nurses across a large Australian health-care service. Exploratory and confirmatory factor analyses, supported by structural equation modelling, resulted in a valid twelve-item SCAMS containing two reliable subscales: the nine-item 'Attitudes towards single checking' subscale and the three-item 'Advantages of single checking' subscale. The SCAMS is recommended as a valid and reliable measure for monitoring nurses' attitudes to single checking prior to introducing single checking of medications and after its implementation. © 2013 Wiley Publishing Asia Pty Ltd.
Li, Chen; Nagasaki, Masao; Ueno, Kazuko; Miyano, Satoru
2009-04-27
Model checking approaches were first applied to biological pathway validation around 2003. Recently, Fisher et al. demonstrated the importance of the model checking approach by inferring new regulation of signaling crosstalk in C. elegans and confirming the regulation with biological experiments. They took a discrete and state-based approach to explore all possible states of the system underlying vulval precursor cell (VPC) fate specification for desired properties. However, since both discrete and continuous features appear to be indispensable parts of biological processes, it is more appropriate to use quantitative models to capture the dynamics of biological systems. The key motivation of this paper is to establish a quantitative methodology to model and analyze in silico models incorporating the model checking approach. A novel method of modeling and simulating biological systems with model checking is proposed, based on hybrid functional Petri nets with extension (HFPNe) as a framework dealing with both discrete and continuous events. First, we construct a quantitative VPC fate model with 1761 components using HFPNe. Second, we apply two major biological fate determination rules - Rule I and Rule II - to the VPC fate model. We then conduct 10,000 simulations for each of 48 sets of different genotypes, investigate variations of cell fate patterns under each genotype, and validate the two rules by comparing three simulation targets consisting of fate patterns obtained from in silico and in vivo experiments. In particular, an evaluation was successfully performed by using our VPC fate model to investigate one target derived from biological experiments involving hybrid lineage observations. Hybrid lineages are hard to capture with a discrete model because they occur when the system comes close to certain thresholds, as discussed by Sternberg and Horvitz in 1986. Our simulation results suggest that Rule I, which cannot be applied with qualitative model checking, is more reasonable than Rule II owing to the high coverage of predicted fate patterns (except for the genotype of lin-15ko; lin-12ko double mutants). Further insights are also suggested. The quantitative simulation-based model checking approach is a useful means of providing valuable biological insights and a better understanding of biological systems and observation data that may be hard to capture with the qualitative approach.
The Priority Inversion Problem and Real-Time Symbolic Model Checking
1993-04-23
real time systems unpredictable in subtle ways. This makes it more difficult to implement and debug such systems. Our work discusses this problem and presents one possible solution. The solution is formalized and verified using temporal logic model checking techniques. In order to perform the verification, the BDD-based symbolic model checking algorithm given in previous works was extended to handle real-time properties using the bounded until operator. We believe that this algorithm, which is based on discrete time, is able to handle many real-time properties
Design of Installing Check Dam Using RAMMS Model in Seorak National Park of South Korea
NASA Astrophysics Data System (ADS)
Jun, K.; Tak, W.; JUN, B. H.; Lee, H. J.; KIM, S. D.
2016-12-01
Design of Installing Check Dam Using RAMMS Model in Seorak National Park of South Korea. Kye-Won Jun*, Won-Jun Tak*, Byong-Hee Jun**, Ho-Jin Lee***, Soung-Doug Kim*. *Graduate School of Disaster Prevention, Kangwon National University, 346 Joogang-ro, Samcheok-si, Gangwon-do, Korea; **School of Fire and Disaster Protection, Kangwon National University, 346 Joogang-ro, Samcheok-si, Gangwon-do, Korea; ***School of Civil Engineering, Chungbuk National University, 1 Chungdae-ro, Seowon-gu, Cheongju, Korea. Abstract: As more than 64% of the land in South Korea is mountainous, many regions are exposed to the danger of landslides and debris flow. Because it is important to understand the behavior of debris flow in mountainous terrain, various methods and models based on mathematical concepts are being presented and developed. The purpose of this study is to investigate regions that experienced debris flow due to the typhoon Ewiniar and to perform numerical modeling for the design and layout of a check dam to reduce damage from debris flow. For the numerical modeling, on-site measurement of the research area was conducted, including topographic investigation, a survey of bridges in the downstream reach, and precision LiDAR 3D scanning to compose the basic data for numerical modeling. The numerical simulation in this study was performed using the RAMMS (Rapid Mass Movements Simulation) model for the analysis of the debris flow. The model was applied to conditions with the check dam installed in the upstream, midstream, and downstream reaches. Considering the reduction effect on debris flow, the expansion of debris flow, and the influence on the bridges downstream, a proper location for the check dam was designated. The numerical results showed that when the check dam was installed in the downstream section, 50 m above the bridge, the reduction of the debris flow was greater than when the check dam was installed in other sections. Key words: Debris flow, LiDAR, Check dam, RAMMS. Acknowledgements: This research was supported by a grant [MPSS-NH-2014-74] through the Disaster and Safety Management Institute funded by the Ministry of Public Safety and Security of the Korean government.
Model building strategy for logistic regression: purposeful selection.
Zhang, Zhongheng
2016-03-01
Logistic regression is one of the most commonly used models to account for confounders in the medical literature. The article introduces how to perform the purposeful selection model-building strategy with R. I stress the use of the likelihood ratio test to see whether deleting a variable will have a significant impact on model fit. A deleted variable should also be checked for whether it is an important adjustment of the remaining covariates. Interactions should be checked to disentangle complex relationships between covariates and their synergistic effects on the response variable. The model should be checked for goodness-of-fit (GOF); in other words, how well the fitted model reflects the real data. The Hosmer-Lemeshow GOF test is the most widely used for logistic regression models.
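A minimal Python sketch of the likelihood ratio test and the confounding check described above (the article itself works in R; the data and variable names here are synthetic illustrations) is:

```python
import numpy as np
import statsmodels.api as sm
from scipy import stats

# Synthetic data: binary outcome y, candidate covariates x1 (real effect) and x2 (noise).
rng = np.random.default_rng(1)
n = 500
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = rng.binomial(1, 1 / (1 + np.exp(-(0.5 + 0.8 * x1))))

X_full = sm.add_constant(np.column_stack([x1, x2]))
X_red = sm.add_constant(x1)

fit_full = sm.Logit(y, X_full).fit(disp=0)
fit_red = sm.Logit(y, X_red).fit(disp=0)

# Likelihood ratio test for dropping x2, as in the purposeful-selection step.
lr_stat = 2 * (fit_full.llf - fit_red.llf)
p_value = stats.chi2.sf(lr_stat, df=X_full.shape[1] - X_red.shape[1])
print(f"LR statistic = {lr_stat:.2f}, p = {p_value:.3f}")

# One should also check whether removing x2 materially changes the remaining
# coefficient (confounding check) before finalising the deletion.
delta = (fit_red.params[1] - fit_full.params[1]) / fit_full.params[1]
print(f"relative change in x1 coefficient: {delta:+.1%}")
```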
A note on the misuses of the variance test in meteorological studies
NASA Astrophysics Data System (ADS)
Hazra, Arnab; Bhattacharya, Sourabh; Banik, Pabitra; Bhattacharya, Sabyasachi
2017-12-01
Stochastic modeling of rainfall data is an important area in meteorology. The gamma distribution is a widely used probability model for non-zero rainfall. Typically the choice of the distribution for such meteorological studies is based on two goodness-of-fit tests—Pearson's Chi-square test and the Kolmogorov-Smirnov test. Inspired by the index of dispersion introduced by Fisher (Statistical methods for research workers. Hafner Publishing Company Inc., New York, 1925), Mooley (Mon Weather Rev 101:160-176, 1973) proposed the variance test as a goodness-of-fit measure in this context, and a number of researchers have implemented it since then. We show that the asymptotic distribution of the test statistic for the variance test is generally not comparable to any central Chi-square distribution and hence the test is erroneous. We also describe a method for checking the validity of the asymptotic distribution for a class of distributions. We implement the erroneous test on some simulated, as well as real, datasets and demonstrate how it leads to some wrong conclusions.
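For orientation, the classical index-of-dispersion statistic that motivates the variance test can be written as follows; the paper's point is that the chi-square reference distribution shown here is not generally justified outside Poisson-type settings (the exact statistic analysed by the authors may differ in detail):

```latex
% Classical index-of-dispersion (variance) statistic for observations x_1,...,x_n:
D \;=\; \frac{(n-1)\,s^{2}}{\bar{x}} \;=\; \frac{\sum_{i=1}^{n}\bigl(x_i-\bar{x}\bigr)^{2}}{\bar{x}},
\qquad D \;\sim\; \chi^{2}_{\,n-1}\ \text{(approximately, under a Poisson null)}.
% For other models, such as gamma-distributed rainfall amounts, the asymptotic
% distribution of D is generally not a central chi-square, so using the
% chi-square cut-off as a goodness-of-fit check is invalid.
```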
Dynamic scaling in natural swarms
NASA Astrophysics Data System (ADS)
Cavagna, Andrea; Conti, Daniele; Creato, Chiara; Del Castello, Lorenzo; Giardina, Irene; Grigera, Tomas S.; Melillo, Stefania; Parisi, Leonardo; Viale, Massimiliano
2017-09-01
Collective behaviour in biological systems presents theoretical challenges beyond the borders of classical statistical physics. The lack of concepts such as scaling and renormalization is particularly problematic, as it forces us to negotiate details whose relevance is often hard to assess. In an attempt to improve this situation, we present here experimental evidence of the emergence of dynamic scaling laws in natural swarms of midges. We find that spatio-temporal correlation functions in different swarms can be rescaled by using a single characteristic time, which grows with the correlation length with a dynamical critical exponent z ~ 1, a value not found in any other standard statistical model. To check whether out-of-equilibrium effects may be responsible for this anomalous exponent, we run simulations of the simplest model of self-propelled particles and find z ~ 2, suggesting that natural swarms belong to a novel dynamic universality class. This conclusion is strengthened by experimental evidence of the presence of non-dissipative modes in the relaxation, indicating that previously overlooked inertial effects are needed to describe swarm dynamics. The absence of a purely dissipative regime suggests that natural swarms undergo a near-critical censorship of hydrodynamics.
HiVy automated translation of stateflow designs for model checking verification
NASA Technical Reports Server (NTRS)
Pingree, Paula
2003-01-01
The HiVy tool set enables model checking of finite-state machine designs. This is achieved by translating state-chart specifications into the input language of the Spin model checker. An abstract syntax of hierarchical sequential automata (HSA) is provided as an intermediate format for the tool set.
Fast maximum likelihood estimation using continuous-time neural point process models.
Lepage, Kyle Q; MacDonald, Christopher J
2015-06-01
A recent report estimates that the number of simultaneously recorded neurons is growing exponentially. A commonly employed statistical paradigm using discrete-time point process models of neural activity involves the computation of a maximum-likelihood estimate. The time to compute this estimate, per neuron, is proportional to the number of bins in a finely spaced discretization of time. By using continuous-time models of neural activity and optimally efficient Gaussian quadrature, memory requirements and computation times are dramatically decreased in the commonly encountered situation where the number of parameters p is much less than the number of time-bins n. In this regime, with q equal to the quadrature order, memory requirements are decreased from O(np) to O(qp), and the number of floating-point operations is decreased from O(np^2) to O(qp^2). Accuracy of the proposed estimates is assessed based upon physiological considerations, error bounds, and mathematical results describing the relation between numerical integration error and the numerical error affecting both parameter estimates and the observed Fisher information. A check is provided which is used to adapt the order of numerical integration. The procedure is verified in simulation and for hippocampal recordings. It is found that in 95% of hippocampal recordings a q of 60 yields numerical error negligible with respect to parameter estimate standard error. Statistical inference using the proposed methodology is a fast and convenient alternative to statistical inference performed using a discrete-time point process model of neural activity. It enables the employment of the statistical methodology available with discrete-time inference, but is faster, uses less memory, and avoids any error due to discretization.
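A minimal Python sketch of the quadrature idea described above (an illustrative intensity function and synthetic spike times, not the authors' implementation): the integral term of the continuous-time point process log-likelihood is replaced by a Gauss-Legendre sum of order q, so the cost scales with q rather than with a fine time discretization.

```python
import numpy as np

# Log-likelihood of an inhomogeneous Poisson process with intensity lambda(t | theta)
# on [0, T]:  sum_k log lambda(t_k | theta)  -  integral_0^T lambda(t | theta) dt,
# with the integral approximated by Gauss-Legendre quadrature of order q.

def log_likelihood(theta, spike_times, T, q=60):
    b0, b1 = theta
    lam = lambda t: np.exp(b0 + b1 * np.cos(2 * np.pi * t / T))  # example intensity model

    # Quadrature nodes/weights on [-1, 1], rescaled to [0, T].
    nodes, weights = np.polynomial.legendre.leggauss(q)
    t_q = 0.5 * T * (nodes + 1.0)
    integral = 0.5 * T * np.sum(weights * lam(t_q))

    return np.sum(np.log(lam(np.asarray(spike_times)))) - integral

rng = np.random.default_rng(2)
spikes = np.sort(rng.uniform(0, 10.0, size=40))   # synthetic spike times
print(log_likelihood((1.0, 0.5), spikes, T=10.0))
```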
Razzaq, Misbah; Ahmad, Jamil
2015-01-01
Internet worms are analogous to biological viruses since they can infect a host and have the ability to propagate through a chosen medium. To prevent the spread of a worm or to grasp how to regulate a prevailing worm, compartmental models are commonly used as a means to examine and understand the patterns and mechanisms of a worm spread. However, one of the greatest challenges is to produce methods to verify and validate the behavioural properties of a compartmental model. This is why, in this study, we suggest a framework based on Petri Nets and Model Checking through which we can meticulously examine and validate these models. We investigate the Susceptible-Exposed-Infectious-Recovered (SEIR) model and propose a new model, Susceptible-Exposed-Infectious-Recovered-Delayed-Quarantined (Susceptible/Recovered) (SEIDQR(S/I)), along with a hybrid quarantine strategy, which is then constructed and analysed using Stochastic Petri Nets and Continuous Time Markov Chains. The analysis shows that the hybrid quarantine strategy is extremely effective in reducing the risk of propagating the worm. Through Model Checking, we gained insight into the functionality of compartmental models. The Model Checking results agree well with the simulation results, which fully supports the proposed framework. PMID:26713449
Razzaq, Misbah; Ahmad, Jamil
2015-01-01
Internet worms are analogous to biological viruses since they can infect a host and have the ability to propagate through a chosen medium. To prevent the spread of a worm or to grasp how to regulate a prevailing worm, compartmental models are commonly used as a means to examine and understand the patterns and mechanisms of a worm spread. However, one of the greatest challenges is to produce methods to verify and validate the behavioural properties of a compartmental model. This is why, in this study, we suggest a framework based on Petri Nets and Model Checking through which we can meticulously examine and validate these models. We investigate the Susceptible-Exposed-Infectious-Recovered (SEIR) model and propose a new model, Susceptible-Exposed-Infectious-Recovered-Delayed-Quarantined (Susceptible/Recovered) (SEIDQR(S/I)), along with a hybrid quarantine strategy, which is then constructed and analysed using Stochastic Petri Nets and Continuous Time Markov Chains. The analysis shows that the hybrid quarantine strategy is extremely effective in reducing the risk of propagating the worm. Through Model Checking, we gained insight into the functionality of compartmental models. The Model Checking results agree well with the simulation results, which fully supports the proposed framework.
Kraus, Nicole; Lindenberg, Julia; Zeeck, Almut; Kosfelder, Joachim; Vocks, Silja
2015-09-01
Cognitive-behavioural models of eating disorders state that body checking arises in response to negative emotions in order to reduce the aversive emotional state and is therefore negatively reinforced. This study empirically tests this assumption. For a seven-day period, women with eating disorders (n = 26) and healthy controls (n = 29) were provided with a handheld computer for assessing occurring body checking strategies as well as negative and positive emotions. Serving as a control condition, randomized computer-emitted acoustic signals prompted reports on body checking and emotions. There was no difference in the intensity of negative emotions before body checking and in control situations across groups. However, from pre- to post-body checking, an increase in negative emotions was found. This effect was more pronounced in women with eating disorders compared with healthy controls. The results contradict the assumptions of the cognitive-behavioural model, as body checking does not seem to reduce negative emotions. Copyright © 2015 John Wiley & Sons, Ltd and Eating Disorders Association.
Experimental study of the γ p → π 0 η p reaction with the A2 setup at the Mainz Microtron
Sokhoyan, V.; Prakhov, S.; Fix, A.; ...
2018-05-01
The data available from the A2 Collaboration at MAMI were analyzed to select the γp → π0ηp reaction on an event-by-event basis, which allows for partial-wave analyses of three-body final states to obtain more reliable results, compared to fits to measured distributions. These data provide the world's best statistical accuracy in the energy range from threshold to Eγ = 1.45 GeV, allowing a finer energy binning in the measurement of all observables needed for understanding the reaction dynamics. The results obtained for the measured observables are compared to existing models, and the impact from the new data is checked by the fit with the revised Mainz model.
Microscopic calculations of liquid and solid neutron star matter
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chakravarty, Sudip; Miller, Michael D.; Chia-Wei, Woo
1974-02-01
As the first step to a microscopic determination of the solidification density of neutron star matter, variational calculations are performed for both liquid and solid phases using a very simple model potential. The potential, containing only the repulsive part of the Reid ¹S₀ interaction, together with Boltzmann statistics defines a "homework problem" which several groups involved in solidification calculations have agreed to solve. The results were to be compared for the purpose of checking calculational techniques. For the solid energy, good agreement with Canuto and Chitre was found. Both the liquid and solid energies are much lower than those of Pandharipande. It is shown that for this oversimplified model, neutron star matter will remain solid down to ordinary nuclear matter density.
Experimental study of the γ p →π0η p reaction with the A2 setup at the Mainz Microtron
NASA Astrophysics Data System (ADS)
Sokhoyan, V.; Prakhov, S.; Fix, A.; Abt, S.; Achenbach, P.; Adlarson, P.; Afzal, F.; Aguar-Bartolomé, P.; Ahmed, Z.; Ahrens, J.; Annand, J. R. M.; Arends, H. J.; Bantawa, K.; Bashkanov, M.; Beck, R.; Biroth, M.; Borisov, N. S.; Braghieri, A.; Briscoe, W. J.; Cherepnya, S.; Cividini, F.; Collicott, C.; Costanza, S.; Denig, A.; Dieterle, M.; Downie, E. J.; Drexler, P.; Ferretti Bondy, M. I.; Fil'kov, L. V.; Gardner, S.; Garni, S.; Glazier, D. I.; Gorodnov, I.; Gradl, W.; Günther, M.; Gurevich, G. M.; Hamill, C. B.; Heijkenskjöld, L.; Hornidge, D.; Huber, G. M.; Käser, A.; Kashevarov, V. L.; Kay, S.; Keshelashvili, I.; Kondratiev, R.; Korolija, M.; Krusche, B.; Lazarev, A.; Lisin, V.; Livingston, K.; Lutterer, S.; MacGregor, I. J. D.; Manley, D. M.; Martel, P. P.; McGeorge, J. C.; Middleton, D. G.; Miskimen, R.; Mornacchi, E.; Mushkarenkov, A.; Neganov, A.; Neiser, A.; Oberle, M.; Ostrick, M.; Otte, P. B.; Paudyal, D.; Pedroni, P.; Polonski, A.; Ron, G.; Rostomyan, T.; Sarty, A.; Sfienti, C.; Spieker, K.; Steffen, O.; Strakovsky, I. I.; Strandberg, B.; Strub, Th.; Supek, I.; Thiel, A.; Thiel, M.; Thomas, A.; Unverzagt, M.; Usov, Yu. A.; Wagner, S.; Walford, N. K.; Watts, D. P.; Werthmüller, D.; Wettig, J.; Witthauer, L.; Wolfes, M.; Zana, L. A.; A2 Collaboration at MAMI
2018-05-01
The data available from the A2 Collaboration at MAMI were analyzed to select the γ p →π0η p reaction on an event-by-event basis, which allows for partial-wave analyses of three-body final states to obtain more reliable results, compared to fits to measured distributions. These data provide the world's best statistical accuracy in the energy range from threshold to Eγ=1.45 GeV, allowing a finer energy binning in the measurement of all observables needed for understanding the reaction dynamics. The results obtained for the measured observables are compared to existing models, and the impact from the new data is checked by the fit with the revised Mainz model.
Spatial and Temporal scales of time-averaged 700 MB height anomalies
NASA Technical Reports Server (NTRS)
Gutzler, D.
1981-01-01
The monthly and seasonal forecasting technique is based to a large extent on the extrapolation of trends in the positions of the centers of time-averaged geopotential height anomalies. The complete forecasted height pattern is subsequently drawn around the forecasted anomaly centers. To test the efficacy of this technique, time series of observed monthly mean and 5-day mean 700 mb geopotential heights were examined. Autocorrelation statistics are generated to document the tendency for persistence of anomalies. These statistics are compared to a red noise hypothesis to check for evidence of possible preferred time scales of persistence. Space-time spectral analyses at middle latitudes are checked for evidence of periodicities which could be associated with predictable month-to-month trends. A local measure of the average spatial scale of anomalies is devised for guidance in the completion of the anomaly pattern around the forecasted centers.
Stochastic approaches for time series forecasting of boron: a case study of Western Turkey.
Durdu, Omer Faruk
2010-10-01
In the present study, seasonal and non-seasonal predictions of boron concentration time series data for the period 1996-2004 from the Büyük Menderes river in western Turkey are addressed by means of linear stochastic models. The methodology presented here is to develop adequate linear stochastic models, known as autoregressive integrated moving average (ARIMA) and multiplicative seasonal autoregressive integrated moving average (SARIMA) models, to predict boron content in the Büyük Menderes catchment. Initially, Box-Whisker plots and Kendall's tau test are used to identify trends during the study period. The measurement locations do not show a significant overall trend in boron concentrations, though marginal increasing and decreasing trends are observed for certain periods at some locations. The ARIMA modeling approach involves the following three steps: model identification, parameter estimation, and diagnostic checking. In the model identification step, considering the autocorrelation function (ACF) and partial autocorrelation function (PACF) results of the boron data series, different ARIMA models are identified. The model giving the minimum Akaike information criterion (AIC) is selected as the best-fit model. The parameter estimation step indicates that the estimated model parameters are significantly different from zero. The diagnostic check step is applied to the residuals of the selected ARIMA models, and the results indicate that the residuals are independent, normally distributed, and homoscedastic. For model validation purposes, the predicted results using the best ARIMA models are compared to the observed data. The predicted data show reasonably good agreement with the actual data. The comparison of the mean and variance of 3 years (2002-2004) of observed data versus data predicted from the selected best models shows that the boron models from the ARIMA modeling approach could be used in a safe manner, since the predicted values from these models preserve the basic statistics of the observed data in terms of the mean. The ARIMA modeling approach is recommended for predicting the boron concentration series of a river.
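A minimal Python sketch of the identification-estimation-diagnostic cycle described above, using statsmodels on a synthetic series standing in for the boron data (the study's own series and selected orders will differ):

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.stats.diagnostic import acorr_ljungbox

# Synthetic monthly series used only to illustrate the workflow.
rng = np.random.default_rng(3)
y = pd.Series(5 + np.cumsum(rng.normal(scale=0.2, size=108)))

# Identification + estimation: fit a small grid of candidate ARIMA orders
# and keep the one with the minimum AIC.
candidates = [(p, d, q) for p in range(3) for d in range(2) for q in range(3)]
fits = {order: ARIMA(y, order=order).fit() for order in candidates}
best_order, best_fit = min(fits.items(), key=lambda kv: kv[1].aic)
print("selected order:", best_order, "AIC:", round(best_fit.aic, 1))

# Diagnostic checking: residuals should be uncorrelated (Ljung-Box test).
print(acorr_ljungbox(best_fit.resid, lags=[12]))
```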
Updraft Model for Development of Autonomous Soaring Uninhabited Air Vehicles
NASA Technical Reports Server (NTRS)
Allen, Michael J.
2006-01-01
Large birds and glider pilots commonly use updrafts caused by convection in the lower atmosphere to extend flight duration, increase cross-country speed, improve range, or simply to conserve energy. Uninhabited air vehicles may also have the ability to exploit updrafts to improve performance. An updraft model was developed at NASA Dryden Flight Research Center (Edwards, California) to investigate the use of convective lift for uninhabited air vehicles in desert regions. Balloon and surface measurements obtained at the National Oceanic and Atmospheric Administration Surface Radiation station (Desert Rock, Nevada) enabled the model development. The data were used to create a statistical representation of the convective velocity scale, w*, and the convective mixing-layer thickness, zi. These parameters were then used to determine updraft size, vertical velocity profile, spacing, and maximum height. This paper gives a complete description of the updraft model and its derivation. Computer code for running the model is also given in conjunction with a check case for model verification.
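The convective velocity scale mentioned above is commonly defined by a Deardorff-type expression; the sketch below is a generic implementation of that textbook formula, not the NASA model code, and the example numbers are illustrative only.

```python
def convective_velocity_scale(z_i, wtheta_v_surface, theta_v, g=9.81):
    """Generic convective velocity scale w* (Deardorff-type definition):

        w* = (g * z_i * <w'theta_v'>_s / theta_v) ** (1/3)

    z_i               : convective mixing-layer depth (m)
    wtheta_v_surface  : surface kinematic virtual heat flux (K m/s)
    theta_v           : mean virtual potential temperature of the layer (K)
    """
    return (g * z_i * wtheta_v_surface / theta_v) ** (1.0 / 3.0)

# Example: a 1500 m deep desert boundary layer with a 0.2 K m/s surface flux.
print(round(convective_velocity_scale(1500.0, 0.2, 300.0), 2), "m/s")
```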
Model Diagnostics for Bayesian Networks
ERIC Educational Resources Information Center
Sinharay, Sandip
2006-01-01
Bayesian networks are frequently used in educational assessments primarily for learning about students' knowledge and skills. There is a lack of works on assessing fit of Bayesian networks. This article employs the posterior predictive model checking method, a popular Bayesian model checking tool, to assess fit of simple Bayesian networks. A…
Role of the check dam in land development on the Loess Plateau, China
NASA Astrophysics Data System (ADS)
Xu, Xiang-Zhou; Zhang, Luo-Hao; Zhu, Tongxin; Dang, Tian-Min; Zhang, Hong-Wu; Xu, Shi-Guo
2017-04-01
A check dam is one of the most effective measures to reduce flow connectivity; it can retain soil and water and increase land productivity. More than 100,000 check dams have been built on the Loess Plateau since the 1950s. However, quantifying the effect of check dams on water resources and water environments remains a challenge. In this study, an in-depth field investigation together with a credible statistical analysis was carried out in two representative catchments on the Loess Plateau, the Nanxiaohegou Catchment and the Jiuyuangou Catchment, to assess the effectiveness of check dams in soil, water and nutrient conservation. The results show: (1) Check dams play an important role in conserving water, soil, and nutrients on the Loess Plateau. About half of the total transported water and more than 80% of the total transported soil and nutrients had been locally retained in the selected catchments. Hence check dams significantly improved soil fertility in the small watersheds and reduced water pollution downstream of the dams. (2) Compared to terrace farmlands, forest lands and grasslands, check-dam lands were much more important in conserving water, soil and nutrients in the catchments. Nearly 50% of the reduced water and more than 70% of the stored soil and nutrients in the study catchments were retained solely by the check dams, whereas the area of the dam lands was less than 7% of the total conservation land area. (3) Check dams are still effective in large storms, even if the dams are damaged by floods. It is often assumed that check dams can only retain sediment in small flood events, and that most of the stored soil may be washed out if the dams are destroyed in a disastrous flood. Furthermore, if a major check dam, namely a key project dam, is built at the gully outlet, floods can be controlled and dam breaks can also be avoided. We suggest that a compensation and incentive policy be implemented for dam building to realize the sustainable development of the local economy and ecological environment.
Sullivan, Kristynn J; Shadish, William R; Steiner, Peter M
2015-03-01
Single-case designs (SCDs) are short time series that assess intervention effects by measuring units repeatedly over time in both the presence and absence of treatment. This article introduces a statistical technique for analyzing SCD data that has not been much used in psychological and educational research: generalized additive models (GAMs). In parametric regression, the researcher must choose a functional form to impose on the data, for example, that the trend over time is linear. GAMs reverse this process by letting the data inform the choice of functional form. In this article we review the problem that trend poses in SCDs, discuss how current SCD analytic methods approach trend, describe GAMs as a possible solution, suggest a GAM model testing procedure for examining the presence of trend in SCDs, present a small simulation to show the statistical properties of GAMs, and illustrate the procedure on three examples of different lengths. Results suggest that GAMs may be very useful both as a form of sensitivity analysis for checking the plausibility of assumptions about trend and as a primary data analysis strategy for testing treatment effects. We conclude with a discussion of some problems with GAMs and some future directions for research on the application of GAMs to SCDs. (c) 2015 APA, all rights reserved.
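A minimal Python sketch of the idea of letting the data choose the trend's functional form, here using the pygam package (an assumed dependency) on a synthetic single-case series rather than the article's examples, and comparing a smooth trend against a straight-line trend by residual sum of squares:

```python
import numpy as np
from pygam import LinearGAM, s, l   # pygam is an assumed external dependency

# Toy single-case design series: repeated measurements with a treatment phase.
rng = np.random.default_rng(4)
time = np.arange(40)
phase = (time >= 20).astype(float)            # 0 = baseline, 1 = treatment
y = 2 + 0.5 * np.sqrt(time) + 1.5 * phase + rng.normal(scale=0.5, size=time.size)
X = np.column_stack([time, phase])

# Smooth term for time lets the data determine the trend shape; the phase
# indicator enters as a linear (level-shift) term.
gam = LinearGAM(s(0) + l(1)).fit(X, y)
rss_gam = np.sum((y - gam.predict(X)) ** 2)

# Straight-line alternative fitted by ordinary least squares for comparison.
A = np.column_stack([np.ones_like(time, dtype=float), time, phase])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
rss_lin = np.sum((y - A @ beta) ** 2)

print(f"residual SS: smooth trend {rss_gam:.2f} vs linear trend {rss_lin:.2f}")
```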
Eagle, Dawn M.; Noschang, Cristie; d’Angelo, Laure-Sophie Camilla; Noble, Christie A.; Day, Jacob O.; Dongelmans, Marie Louise; Theobald, David E.; Mar, Adam C.; Urcelay, Gonzalo P.; Morein-Zamir, Sharon; Robbins, Trevor W.
2014-01-01
Excessive checking is a common, debilitating symptom of obsessive-compulsive disorder (OCD). In an established rodent model of OCD checking behaviour, quinpirole (dopamine D2/3-receptor agonist) increased checking in open-field tests, indicating dopaminergic modulation of checking-like behaviours. We designed a novel operant paradigm for rats (observing response task (ORT)) to further examine cognitive processes underpinning checking behaviour and clarify how and why checking develops. We investigated i) how quinpirole increases checking, ii) dependence of these effects on D2/3 receptor function (following treatment with D2/3 receptor antagonist sulpiride) and iii) effects of reward uncertainty. In the ORT, rats pressed an ‘observing’ lever for information about the location of an ‘active’ lever that provided food reinforcement. High- and low-checkers (defined from baseline observing) received quinpirole (0.5 mg/kg, 10 treatments) or vehicle. Parametric task manipulations assessed observing/checking under increasing task demands relating to reinforcement uncertainty (variable response requirement and active-lever location switching). Treatment with sulpiride further probed the pharmacological basis of long-term behavioural changes. Quinpirole selectively increased checking, both functional observing lever presses (OLPs) and non-functional extra OLPs (EOLPs). The increase in OLPs and EOLPs was long-lasting, without further quinpirole administration. Quinpirole did not affect the immediate ability to use information from checking. Vehicle and quinpirole-treated rats (VEH and QNP respectively) were selectively sensitive to different forms of uncertainty. Sulpiride reduced non-functional EOLPs in QNP rats but had no effect on functional OLPs. These data have implications for treatment of compulsive checking in OCD, particularly for serotonin-reuptake-inhibitor treatment-refractory cases, where supplementation with dopamine receptor antagonists may be beneficial. PMID:24406720
2017-03-01
...considered using indirect models of software execution, for example memory access patterns, to check for security intrusions. Additional research was performed to tackle the ... deterioration, for example, no longer corresponds to the model used during verification time. Finally, the research looked at ways to combine hybrid systems ...
Review of Statistical Methods for Analysing Healthcare Resources and Costs
Mihaylova, Borislava; Briggs, Andrew; O'Hagan, Anthony; Thompson, Simon G
2011-01-01
We review statistical methods for analysing healthcare resource use and costs, their ability to address skewness, excess zeros, multimodality and heavy right tails, and their ease for general use. We aim to provide guidance on analysing resource use and costs focusing on randomised trials, although methods often have wider applicability. Twelve broad categories of methods were identified: (I) methods based on the normal distribution, (II) methods following transformation of data, (III) single-distribution generalized linear models (GLMs), (IV) parametric models based on skewed distributions outside the GLM family, (V) models based on mixtures of parametric distributions, (VI) two (or multi)-part and Tobit models, (VII) survival methods, (VIII) non-parametric methods, (IX) methods based on truncation or trimming of data, (X) data components models, (XI) methods based on averaging across models, and (XII) Markov chain methods. Based on this review, our recommendations are that, first, simple methods are preferred in large samples where the near-normality of sample means is assured. Second, in somewhat smaller samples, relatively simple methods, able to deal with one or two of above data characteristics, may be preferable but checking sensitivity to assumptions is necessary. Finally, some more complex methods hold promise, but are relatively untried; their implementation requires substantial expertise and they are not currently recommended for wider applied work. Copyright © 2010 John Wiley & Sons, Ltd. PMID:20799344
Counting the peaks in the excitation function for precompound processes
NASA Astrophysics Data System (ADS)
Bonetti, R.; Hussein, M. S.; Mello, P. A.
1983-08-01
The "counting of maxima" method of Brink and Stephen, conventionally used for the extraction of the correlation width of statistical (compound nucleus) reactions, is generalized to include precompound processes as well. It is found that this method supplies an important independent check of the results obtained from autocorrelation studies. An application is made to the reaction 25Mg(3He,p). NUCLEAR REACTIONS Statistical multistep compound processes discussed.
Proceedings of the Second NASA Formal Methods Symposium
NASA Technical Reports Server (NTRS)
Munoz, Cesar (Editor)
2010-01-01
This publication contains the proceedings of the Second NASA Formal Methods Symposium sponsored by the National Aeronautics and Space Administration and held in Washington D.C. April 13-15, 2010. Topics covered include: Decision Engines for Software Analysis using Satisfiability Modulo Theories Solvers; Verification and Validation of Flight-Critical Systems; Formal Methods at Intel -- An Overview; Automatic Review of Abstract State Machines by Meta Property Verification; Hardware-independent Proofs of Numerical Programs; Slice-based Formal Specification Measures -- Mapping Coupling and Cohesion Measures to Formal Z; How Formal Methods Impels Discovery: A Short History of an Air Traffic Management Project; A Machine-Checked Proof of A State-Space Construction Algorithm; Automated Assume-Guarantee Reasoning for Omega-Regular Systems and Specifications; Modeling Regular Replacement for String Constraint Solving; Using Integer Clocks to Verify the Timing-Sync Sensor Network Protocol; Can Regulatory Bodies Expect Efficient Help from Formal Methods?; Synthesis of Greedy Algorithms Using Dominance Relations; A New Method for Incremental Testing of Finite State Machines; Verification of Faulty Message Passing Systems with Continuous State Space in PVS; Phase Two Feasibility Study for Software Safety Requirements Analysis Using Model Checking; A Prototype Embedding of Bluespec System Verilog in the PVS Theorem Prover; SimCheck: An Expressive Type System for Simulink; Coverage Metrics for Requirements-Based Testing: Evaluation of Effectiveness; Software Model Checking of ARINC-653 Flight Code with MCP; Evaluation of a Guideline by Formal Modelling of Cruise Control System in Event-B; Formal Verification of Large Software Systems; Symbolic Computation of Strongly Connected Components Using Saturation; Towards the Formal Verification of a Distributed Real-Time Automotive System; Slicing AADL Specifications for Model Checking; Model Checking with Edge-valued Decision Diagrams; and Data-flow based Model Analysis.
Hematocrit levels as cardiovascular risk among taxi drivers in Bangkok, Thailand
ISHIMARU, Tomohiro; ARPHORN, Sara; JIRAPONGSUWAN, Ann
2016-01-01
In Thailand, taxi drivers employed in the informal sector often experience hazardous working conditions. Previous studies revealed that elevated Hematocrit (HCT) is a predictor of cardiovascular disease (CVD) risk. This study assessed factors associated with HCT in taxi drivers to predict their occupational CVD risk factors. A cross-sectional study was conducted on 298 male taxi drivers who joined a health check-up campaign in Bangkok, Thailand. HCT and body mass index were retrieved from participant health check-up files. Self-administered questionnaires assessed demographics, driving mileage, working hours, and lifestyle. Statistical associations were analyzed using stepwise linear regression. Our results showed that obesity (p=0.007), daily alcohol drinking (p=0.003), and current or past smoking (p=0.016) were associated with higher HCT levels. While working hours were not directly associated with HCT levels in the current study, the effect on overworking is statistically arguable because most participants worked substantially longer hours. Our findings suggest that taxi drivers’ CVD risk may be increased by their unhealthy work styles. Initiatives to improve general working conditions for taxi drivers should take into account health promotion and CVD prevention. The policy of providing periodic health check-ups is important to make workers in the informal sector aware of their health status. PMID:27151439
Hematocrit levels as cardiovascular risk among taxi drivers in Bangkok, Thailand.
Ishimaru, Tomohiro; Arphorn, Sara; Jirapongsuwan, Ann
2016-10-08
In Thailand, taxi drivers employed in the informal sector often experience hazardous working conditions. Previous studies revealed that elevated Hematocrit (HCT) is a predictor of cardiovascular disease (CVD) risk. This study assessed factors associated with HCT in taxi drivers to predict their occupational CVD risk factors. A cross-sectional study was conducted on 298 male taxi drivers who joined a health check-up campaign in Bangkok, Thailand. HCT and body mass index were retrieved from participant health check-up files. Self-administered questionnaires assessed demographics, driving mileage, working hours, and lifestyle. Statistical associations were analyzed using stepwise linear regression. Our results showed that obesity (p=0.007), daily alcohol drinking (p=0.003), and current or past smoking (p=0.016) were associated with higher HCT levels. While working hours were not directly associated with HCT levels in the current study, the effect on overworking is statistically arguable because most participants worked substantially longer hours. Our findings suggest that taxi drivers' CVD risk may be increased by their unhealthy work styles. Initiatives to improve general working conditions for taxi drivers should take into account health promotion and CVD prevention. The policy of providing periodic health check-ups is important to make workers in the informal sector aware of their health status.
On the validation of seismic imaging methods: Finite frequency or ray theory?
Maceira, Monica; Larmat, Carene; Porritt, Robert W.; ...
2015-01-23
We investigate the merits of the more recently developed finite-frequency approach to tomography against the more traditional and approximate ray theoretical approach for state of the art seismic models developed for western North America. To this end, we employ the spectral element method to assess the agreement between observations on real data and measurements made on synthetic seismograms predicted by the models under consideration. We check for phase delay agreement as well as waveform cross-correlation values. Based on statistical analyses on S wave phase delay measurements, finite frequency shows an improvement over ray theory. Random sampling using cross-correlation values identifies regions where synthetic seismograms computed with ray theory and finite-frequency models differ the most. Our study suggests that finite-frequency approaches to seismic imaging exhibit measurable improvement for pronounced low-velocity anomalies such as mantle plumes.
Empirical flow parameters : a tool for hydraulic model validity
Asquith, William H.; Burley, Thomas E.; Cleveland, Theodore G.
2013-01-01
The objectives of this project were (1) To determine and present from existing data in Texas, relations between observed stream flow, topographic slope, mean section velocity, and other hydraulic factors, to produce charts such as Figure 1 and to produce empirical distributions of the various flow parameters to provide a methodology to "check if model results are way off!"; (2) To produce a statistical regional tool to estimate mean velocity or other selected parameters for storm flows or other conditional discharges at ungauged locations (most bridge crossings) in Texas to provide a secondary way to compare such values to a conventional hydraulic modeling approach; and (3) To present ancillary values such as Froude number, stream power, Rosgen channel classification, sinuosity, and other selected characteristics (readily determinable from existing data) to provide additional information to engineers concerned with the hydraulic-soil-foundation component of transportation infrastructure.
Reliability of excess-flow check-valves in turbine lubrication systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dundas, R.E.
1996-12-31
Reliability studies on excess-flow check valves installed in a gas turbine lubrication system for prevention of spray fires subsequent to fracture or separation of lube lines were conducted. Fault-tree analyses are presented for the case of failure of a valve to close when called upon by separation of a downstream line, as well as for the case of accidental closure during normal operation, leading to interruption of lubricating oil flow to a bearing. The probabilities of either of these occurrences are evaluated. The results of a statistical analysis of accidental closure of excess-flow check valves in commercial airplanes in the period 1986-91 are also given, as well as a summary of reliability studies on the use of these valves in residential gas installations, conducted under the sponsorship of the Gas Research Institute.
Iraguha, Blaise; Hamudikuwanda, Humphrey; Mushonga, Borden; Kandiwa, Erick; Mpatswenumugabo, Jean P
2017-06-21
Four subclinical mastitis diagnostic tests (the UdderCheck® test [a lactate dehydrogenase-based test], the California Mastitis Test [CMT], the Draminski® test [a conductivity-based test] and the PortaSCC® test [a portable somatic cell count-based test]) were compared in a study of crossbred dairy cows (n = 30) during September and October 2015. Sensitivity and specificity of the CMT, Draminski® and UdderCheck® tests were compared with the PortaSCC® as reference. The CMT, Draminski® and UdderCheck® test results were compared with the results of the PortaSCC® test using kappa statistics. Duplicate quarter milk samples (n = 120) were concurrently subjected to the four tests. Sensitivity and specificity were 88.46% and 86.17% (CMT), 78.5% and 81.4% (Draminski®) and 64.00% and 78.95% (UdderCheck®). The CMT showed substantial agreement (k = 0.66), the Draminski® test showed moderate agreement (k = 0.48) and the UdderCheck® test showed fair agreement (k = 0.37) with the PortaSCC® test, and positive likelihood ratios were 6.40, 4.15 and 3.04, respectively. The cow-level subclinical mastitis prevalence was 70%, 60%, 60% and 56.7% for the PortaSCC®, CMT, Draminski® and UdderCheck® tests, respectively. At the udder quarter level, subclinical mastitis prevalence was 20%, 21.67% and 20.83% for PortaSCC®, CMT and UdderCheck®, respectively. A correlation (P < 0.05) and moderate strength of association were found between the four tests used. The study showed that, compared to the PortaSCC® test, the CMT was the most preferable option, followed by the Draminski® test, while the UdderCheck® test was the least preferable option for subclinical mastitis screening.
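For readers unfamiliar with the reported quantities, the screening-test statistics above can be computed from a 2x2 table of an index test against the reference test; the counts in this Python sketch are invented for illustration, not the study's data.

```python
# 2x2 table of an index test against the reference result, per quarter (made-up counts).
tp, fp, fn, tn = 23, 13, 3, 81

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
lr_positive = sensitivity / (1 - specificity)      # positive likelihood ratio

# Cohen's kappa: observed agreement corrected for chance agreement.
n = tp + fp + fn + tn
p_obs = (tp + tn) / n
p_chance = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
kappa = (p_obs - p_chance) / (1 - p_chance)

print(f"Se={sensitivity:.2%}  Sp={specificity:.2%}  LR+={lr_positive:.2f}  kappa={kappa:.2f}")
```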
Meijer, Rob R; Niessen, A Susan M; Tendeiro, Jorge N
2016-02-01
Although there are many studies devoted to person-fit statistics for detecting inconsistent item score patterns, most are difficult to understand for nonspecialists. The aim of this tutorial is to explain the principles of these statistics for researchers and clinicians who are interested in applying them. In particular, we first explain how invalid test scores can be detected using person-fit statistics; second, we provide the reader with practical examples of existing studies that used person-fit statistics to detect and interpret inconsistent item score patterns; and third, we discuss a new R package that can be used to identify and interpret inconsistent score patterns. © The Author(s) 2015.
Proceedings of the First NASA Formal Methods Symposium
NASA Technical Reports Server (NTRS)
Denney, Ewen (Editor); Giannakopoulou, Dimitra (Editor); Pasareanu, Corina S. (Editor)
2009-01-01
Topics covered include: Model Checking - My 27-Year Quest to Overcome the State Explosion Problem; Applying Formal Methods to NASA Projects: Transition from Research to Practice; TLA+: Whence, Wherefore, and Whither; Formal Methods Applications in Air Transportation; Theorem Proving in Intel Hardware Design; Building a Formal Model of a Human-Interactive System: Insights into the Integration of Formal Methods and Human Factors Engineering; Model Checking for Autonomic Systems Specified with ASSL; A Game-Theoretic Approach to Branching Time Abstract-Check-Refine Process; Software Model Checking Without Source Code; Generalized Abstract Symbolic Summaries; A Comparative Study of Randomized Constraint Solvers for Random-Symbolic Testing; Component-Oriented Behavior Extraction for Autonomic System Design; Automated Verification of Design Patterns with LePUS3; A Module Language for Typing by Contracts; From Goal-Oriented Requirements to Event-B Specifications; Introduction of Virtualization Technology to Multi-Process Model Checking; Comparing Techniques for Certified Static Analysis; Towards a Framework for Generating Tests to Satisfy Complex Code Coverage in Java Pathfinder; jFuzz: A Concolic Whitebox Fuzzer for Java; Machine-Checkable Timed CSP; Stochastic Formal Correctness of Numerical Algorithms; Deductive Verification of Cryptographic Software; Coloured Petri Net Refinement Specification and Correctness Proof with Coq; Modeling Guidelines for Code Generation in the Railway Signaling Context; Tactical Synthesis Of Efficient Global Search Algorithms; Towards Co-Engineering Communicating Autonomous Cyber-Physical Systems; and Formal Methods for Automated Diagnosis of Autosub 6000.
Norman, Laura M.; Niraula, Rewati
2016-01-01
The objective of this study was to evaluate the effect of check dam infrastructure on soil and water conservation at the catchment scale using the Soil and Water Assessment Tool (SWAT). This paired-watershed study includes a watershed treated with over 2000 check dams and a Control watershed, which has none, in the West Turkey Creek watershed, Southeast Arizona, USA. SWAT was calibrated for streamflow using discharge documented during the summer of 2013 at the Control site. Model results depict the necessity of eliminating lateral flow from SWAT models of aridland environments, the urgency of standardizing geospatial soils data, and the care with which modelers must document parameter alterations when presenting findings. Performance was assessed using the percent bias (PBIAS), with values of ±2.34%. The calibrated model was then used to examine the impacts of check dams at the Treated watershed. Approximately 630 tons of sediment is estimated to be stored behind check dams in the Treated watershed over the 3-year simulation, increasing water quality for fish habitat. A minimum precipitation event of 15 mm was necessary to instigate the detachment of soil, sediments, or rock from the study area, which occurred 2% of the time. The resulting watershed model is useful as a predictive framework and decision-support tool to consider long-term impacts of restoration and the potential for future restoration.
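The percent bias statistic used above is straightforward to compute; a minimal Python sketch, assuming the common SWAT convention in which positive PBIAS indicates underestimation, is shown below with illustrative values rather than the study's data.

```python
import numpy as np

def pbias(observed, simulated):
    """Percent bias, defined here as 100 * sum(obs - sim) / sum(obs); under this
    common SWAT convention, positive values indicate model underestimation."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 100.0 * np.sum(observed - simulated) / np.sum(observed)

# Illustrative daily discharge values (m^3/s); not the study's data.
obs = [0.0, 0.4, 1.2, 0.8, 0.1]
sim = [0.0, 0.5, 1.1, 0.7, 0.2]
print(f"PBIAS = {pbias(obs, sim):+.2f}%")
```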
Calculating system reliability with SRFYDO
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morzinski, Jerome; Anderson - Cook, Christine M; Klamann, Richard M
2010-01-01
SRFYDO is a process for estimating reliability of complex systems. Using information from all applicable sources, including full-system (flight) data, component test data, and expert (engineering) judgment, SRFYDO produces reliability estimates and predictions. It is appropriate for series systems with possibly several versions of the system which share some common components. It models reliability as a function of age and up to 2 other lifecycle (usage) covariates. Initial output from its Exploratory Data Analysis mode consists of plots and numerical summaries so that the user can check data entry and model assumptions, and help determine a final form for the system model. The System Reliability mode runs a complete reliability calculation using Bayesian methodology. This mode produces results that estimate reliability at the component, sub-system, and system level. The results include estimates of uncertainty, and can predict reliability at some not-too-distant time in the future. This paper presents an overview of the underlying statistical model for the analysis, discusses model assumptions, and demonstrates usage of SRFYDO.
Spatiotemporal Patterns of Urban Human Mobility
NASA Astrophysics Data System (ADS)
Hasan, Samiul; Schneider, Christian M.; Ukkusuri, Satish V.; González, Marta C.
2013-04-01
The modeling of human mobility is adopting new directions due to the increasing availability of big data sources from human activity. These sources contain digital information about the daily visited locations of a large number of individuals. Examples of these data include mobile phone calls, credit card transactions, bank note dispersal, and check-ins in internet applications, among several others. In this study, we consider data obtained from smart subway fare card transactions to characterize and model urban mobility patterns. We present a simple mobility model for predicting people's visited locations, using the popularity of places in the city as an interaction parameter between different individuals. This ingredient is sufficient to reproduce several characteristics of the observed travel behavior, such as the number of trips between different locations in the city, the exploration of new places, and the frequency of individual visits to a particular location. Moreover, we indicate the limitations of the proposed model and discuss open questions in the current state of the art of statistical models of human mobility.
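A toy Python illustration of a popularity-driven location-choice rule in the spirit of the model described above (the actual model specification is in the paper; the popularity counts and exploration probability here are invented):

```python
import numpy as np

# A traveller either returns to a previously visited station or explores, and
# exploration targets are chosen with probability proportional to each
# station's city-wide popularity (total visits by everyone).
rng = np.random.default_rng(5)
popularity = np.array([500, 300, 120, 60, 20], dtype=float)   # hypothetical visit counts
p_explore = 0.3
probs = popularity / popularity.sum()

visited = [int(rng.choice(len(popularity), p=probs))]
for _ in range(20):
    if rng.random() < p_explore:
        visited.append(int(rng.choice(len(popularity), p=probs)))   # explore by popularity
    else:
        visited.append(int(rng.choice(visited)))                    # return to a past location
print(visited)
```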
Lyons-Weiler, James; Pelikan, Richard; Zeh, Herbert J; Whitcomb, David C; Malehorn, David E; Bigbee, William L; Hauskrecht, Milos
2005-01-01
Peptide profiles generated using SELDI/MALDI time of flight mass spectrometry provide a promising source of patient-specific information with high potential impact on the early detection and classification of cancer and other diseases. The new profiling technology comes, however, with numerous challenges and concerns. Particularly important are concerns of reproducibility of classification results and their significance. In this work we describe a computational validation framework, called PACE (Permutation-Achieved Classification Error), that lets us assess, for a given classification model, the significance of the Achieved Classification Error (ACE) on the profile data. The framework compares the performance statistic of the classifier on true data samples and checks if these are consistent with the behavior of the classifier on the same data with randomly reassigned class labels. A statistically significant ACE increases our belief that a discriminative signal was found in the data. The advantage of PACE analysis is that it can be easily combined with any classification model and is relatively easy to interpret. PACE analysis does not protect researchers against confounding in the experimental design, or other sources of systematic or random error. We use PACE analysis to assess significance of classification results we have achieved on a number of published data sets. The results show that many of these datasets indeed possess a signal that leads to a statistically significant ACE.
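A minimal sketch of the permutation idea behind PACE: compare a classifier's cross-validated error on the true labels with its error distribution under randomly permuted labels. The classifier, synthetic data, and number of permutations are illustrative choices, not the authors' exact setup.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
X, y = make_classification(n_samples=120, n_features=50, n_informative=5, random_state=0)
clf = LogisticRegression(max_iter=1000)

# Achieved Classification Error (ACE) on the true labels
ace = 1.0 - cross_val_score(clf, X, y, cv=5).mean()

# Null distribution: same pipeline, class labels randomly reassigned
n_perm = 200
null_errors = np.empty(n_perm)
for i in range(n_perm):
    y_perm = rng.permutation(y)
    null_errors[i] = 1.0 - cross_val_score(clf, X, y_perm, cv=5).mean()

# Permutation p-value: fraction of permuted runs at least as good as the real one
p_value = (np.sum(null_errors <= ace) + 1) / (n_perm + 1)
print(f"ACE = {ace:.3f}, permutation p-value = {p_value:.3f}")
```

(scikit-learn's permutation_test_score provides a built-in version of essentially this procedure.)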
Guillaume, Bryan; Wang, Changqing; Poh, Joann; Shen, Mo Jun; Ong, Mei Lyn; Tan, Pei Fang; Karnani, Neerja; Meaney, Michael; Qiu, Anqi
2018-06-01
Statistical inference on neuroimaging data is often conducted using a mass-univariate model, equivalent to fitting a linear model at every voxel with a known set of covariates. Due to the large number of linear models, it is challenging to check if the selection of covariates is appropriate and to modify this selection adequately. The use of standard diagnostics, such as residual plotting, is clearly not practical for neuroimaging data. However, the selection of covariates is crucial for linear regression to ensure valid statistical inference. In particular, the mean model of regression needs to be reasonably well specified. Unfortunately, this issue is often overlooked in the field of neuroimaging. This study aims to adopt the existing Confounder Adjusted Testing and Estimation (CATE) approach and to extend it for use with neuroimaging data. We propose a modification of CATE that can yield valid statistical inferences using Principal Component Analysis (PCA) estimators instead of Maximum Likelihood (ML) estimators. We then propose a non-parametric hypothesis testing procedure that can improve upon parametric testing. Monte Carlo simulations show that the modification of CATE allows for more accurate modelling of neuroimaging data and can in turn yield a better control of False Positive Rate (FPR) and Family-Wise Error Rate (FWER). We demonstrate its application to an Epigenome-Wide Association Study (EWAS) on neonatal brain imaging and umbilical cord DNA methylation data obtained as part of a longitudinal cohort study. Software for this CATE study is freely available at http://www.bioeng.nus.edu.sg/cfa/Imaging_Genetics2.html. Copyright © 2018 The Author(s). Published by Elsevier Inc. All rights reserved.
New Results in Software Model Checking and Analysis
NASA Technical Reports Server (NTRS)
Pasareanu, Corina S.
2010-01-01
This introductory article surveys new techniques, supported by automated tools, for the analysis of software to ensure reliability and safety. Special focus is on model checking techniques. The article also introduces the five papers that are enclosed in this special journal volume.
Bayesian modelling of lung function data from multiple-breath washout tests.
Mahar, Robert K; Carlin, John B; Ranganathan, Sarath; Ponsonby, Anne-Louise; Vuillermin, Peter; Vukcevic, Damjan
2018-05-30
Paediatric respiratory researchers have widely adopted the multiple-breath washout (MBW) test because it allows assessment of lung function in unsedated infants and is well suited to longitudinal studies of lung development and disease. However, a substantial proportion of MBW tests in infants fail current acceptability criteria. We hypothesised that a model-based approach to analysing the data, in place of traditional simple empirical summaries, would enable more efficient use of these tests. We therefore developed a novel statistical model for infant MBW data and applied it to 1197 tests from 432 individuals from a large birth cohort study. We focus on Bayesian estimation of the lung clearance index, the most commonly used summary of lung function from MBW tests. Our results show that the model provides an excellent fit to the data and shed further light on statistical properties of the standard empirical approach. Furthermore, the modelling approach enables the lung clearance index to be estimated by using tests with different degrees of completeness, something not possible with the standard approach. Our model therefore allows previously unused data to be used rather than discarded, as well as routine use of shorter tests without significant loss of precision. Beyond our specific application, our work illustrates a number of important aspects of Bayesian modelling in practice, such as the importance of hierarchical specifications to account for repeated measurements and the value of model checking via posterior predictive distributions. Copyright © 2018 John Wiley & Sons, Ltd.
Weigh-in-Motion Sensor and Controller Operation and Performance Comparison
DOT National Transportation Integrated Search
2018-01-01
This research project utilized statistical inference and comparison techniques to compare the performance of different Weigh-in-Motion (WIM) sensors. First, we analyzed test-vehicle data to perform an accuracy check of the results reported by the sen...
NASA Astrophysics Data System (ADS)
Robichaud, A.; Ménard, R.
2013-05-01
We present multi-year objective analyses (OA) on a high spatio-temporal resolution (15 or 21 km, every hour) for the warm season period (1 May-31 October) for ground-level ozone (2002-2012) and for fine particulate matter (diameter less than 2.5 microns (PM2.5)) (2004-2012). The OA used here combines the Canadian Air Quality forecast suite with US and Canadian surface air quality monitoring sites. The analysis is based on an optimal interpolation with capabilities for adaptive error statistics for ozone and PM2.5 and an explicit bias correction scheme for the PM2.5 analyses. The error statistics have been estimated using a modified version of the Hollingsworth-Lönnberg (H-L) method. Various quality controls (gross error check, sudden jump test and background check) have been applied to the observations to remove outliers. An additional quality control is applied to check the consistency of the error statistics estimation model at each observing station and for each hour. The error statistics are further tuned "on the fly" using a χ2 (chi-square) diagnostic, a procedure which yields significantly better verification scores than without tuning. Successful cross-validation experiments were performed with an OA set-up using 90% of observations to build the objective analysis and with the remainder left out as an independent set of data for verification purposes. Furthermore, comparisons with other external sources of information (global models and PM2.5 satellite surface derived measurements) show reasonable agreement. The multi-year analyses obtained provide relatively high precision with an absolute yearly averaged systematic error of less than 0.6 ppbv (parts per billion by volume) and 0.7 μg m-3 (micrograms per cubic meter) for ozone and PM2.5 respectively and a random error generally less than 9 ppbv for ozone and under 12 μg m-3 for PM2.5. In this paper, we focus on two applications: (1) presenting long term averages of objective analysis and analysis increments as a form of summer climatology and (2) analyzing long term (decadal) trends and inter-annual fluctuations using OA outputs. Our results show that high percentiles of ozone and PM2.5 are both following a decreasing trend overall in North America, with the eastern part of the United States (US) showing the largest decrease, likely due to more effective pollution controls. Some locations, however, exhibited an increasing trend in mean ozone and PM2.5, such as the northwestern part of North America (northwest US and Alberta). The low percentiles are generally rising for ozone, which may be linked to increasing emissions from emerging countries and the resulting pollution brought by intercontinental transport. After removing the decadal trend, we demonstrate that the inter-annual fluctuations of the high percentiles are significantly correlated with temperature fluctuations for ozone and precipitation fluctuations for PM2.5. We also show that there was a moderately significant correlation between the inter-annual fluctuations of the high percentiles of ozone and PM2.5 with economic indices such as the Industrial Dow Jones and/or the US gross domestic product growth rate.
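The abstract mentions tuning the error statistics "on the fly" with a χ2 diagnostic. A standard innovation-based form of this check in optimal interpolation compares the statistic d' (HBH' + R)^-1 d with its expected value (the number of observations); the sketch below, with entirely synthetic covariances, shows the consistency check and a crude rescaling of the observation-error variance. It is not the operational analysis code.

```python
import numpy as np

rng = np.random.default_rng(1)
n_obs = 50

# Synthetic innovation covariance: HBH' (background term) + R (assumed observation error)
idx = np.arange(n_obs)
hbht = 4.0 * np.exp(-np.abs(np.subtract.outer(idx, idx)) / 5.0)
r_var = 1.0                                   # assumed observation-error variance
S = hbht + r_var * np.eye(n_obs)

# Innovations drawn with a larger *true* observation error than assumed
d = rng.multivariate_normal(np.zeros(n_obs), hbht + 2.5 * np.eye(n_obs))

chi2 = d @ np.linalg.solve(S, d)
print(f"chi2 / n_obs = {chi2 / n_obs:.2f}  (consistent error statistics give ~1)")

# Crude 'on the fly' tuning: inflate the assumed R until chi2 / n_obs is near 1
while chi2 / n_obs > 1.05:
    r_var *= 1.1
    S = hbht + r_var * np.eye(n_obs)
    chi2 = d @ np.linalg.solve(S, d)
print(f"tuned observation-error variance: {r_var:.2f}")
```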
NASA Astrophysics Data System (ADS)
Borri, Claudia; Paggi, Marco
2015-02-01
The random process theory (RPT) has been widely applied to predict the joint probability distribution functions (PDFs) of asperity heights and curvatures of rough surfaces. A check of the predictions of RPT against the actual statistics of numerically generated random fractal surfaces and of real rough surfaces has been only partially undertaken. The present experimental and numerical study provides a deep critical comparison on this matter, offering some insight into the capabilities and limitations in applying RPT and fractal modeling to antireflective and hydrophobic rough surfaces, two important types of textured surfaces. A multi-resolution experimental campaign using a confocal profilometer with different lenses is carried out and a comprehensive software for the statistical description of rough surfaces is developed. It is found that the topology of the analyzed textured surfaces cannot be fully described according to RPT and fractal modeling. The following complexities emerge: (i) the presence of cut-offs or bi-fractality in the power-law power-spectral density (PSD) functions; (ii) a more pronounced shift of the PSD by changing resolution as compared to what was expected from fractal modeling; (iii) inaccuracy of the RPT in describing the joint PDFs of asperity heights and curvatures of textured surfaces; (iv) lack of resolution-invariance of joint PDFs of textured surfaces in case of special surface treatments, not accounted for by fractal modeling.
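One of the checks discussed above concerns the power-law behaviour of the power-spectral density (PSD). The sketch below estimates a 1D PSD for a synthetic self-affine profile and fits the log-log slope to recover a Hurst exponent; real confocal data, windowing choices, and the paper's multi-resolution analysis are outside this toy example.

```python
import numpy as np

rng = np.random.default_rng(3)
n, dx, H = 4096, 1e-6, 0.8          # samples, spacing (m), target Hurst exponent

# Synthetic self-affine profile with PSD ~ q^-(2H+1)
q = np.fft.rfftfreq(n, d=dx) * 2 * np.pi
amp = np.zeros_like(q)
amp[1:] = q[1:] ** (-(2 * H + 1) / 2)
phase = rng.uniform(0, 2 * np.pi, size=q.size)
z = np.fft.irfft(amp * np.exp(1j * phase), n=n)

# Periodogram estimate of the 1D power-spectral density
psd = (dx / n) * np.abs(np.fft.rfft(z)) ** 2

# Fit the power-law exponent on a log-log scale (skip the q = 0 and Nyquist bins)
slope, _ = np.polyfit(np.log(q[1:-1]), np.log(psd[1:-1]), 1)
print(f"fitted PSD slope = {slope:.2f}, implied Hurst exponent H = {-(slope + 1) / 2:.2f}")
```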
Semenov, Alexander V; Elsas, Jan Dirk; Glandorf, Debora C M; Schilthuizen, Menno; Boer, Willem F
2013-01-01
Abstract To fulfill existing guidelines, applicants that aim to place their genetically modified (GM) insect-resistant crop plants on the market are required to provide data from field experiments that address the potential impacts of the GM plants on nontarget organisms (NTO's). Such data may be based on varied experimental designs. The recent EFSA guidance document for environmental risk assessment (2010) does not provide clear and structured suggestions that address the statistics of field trials on effects on NTO's. This review examines existing practices in GM plant field testing such as the way of randomization, replication, and pseudoreplication. Emphasis is placed on the importance of design features used for the field trials in which effects on NTO's are assessed. The importance of statistical power and the positive and negative aspects of various statistical models are discussed. Equivalence and difference testing are compared, and the importance of checking the distribution of experimental data is stressed to decide on the selection of the proper statistical model. While for continuous data (e.g., pH and temperature) classical statistical approaches – for example, analysis of variance (ANOVA) – are appropriate, for discontinuous data (counts) only generalized linear models (GLM) are shown to be efficient. There is no golden rule as to which statistical test is the most appropriate for any experimental situation. In particular, in experiments in which block designs are used and covariates play a role GLMs should be used. Generic advice is offered that will help in both the setting up of field testing and the interpretation and data analysis of the data obtained in this testing. The combination of decision trees and a checklist for field trials, which are provided, will help in the interpretation of the statistical analyses of field trials and to assess whether such analyses were correctly applied. We offer generic advice to risk assessors and applicants that will help in both the setting up of field testing and the interpretation and data analysis of the data obtained in field testing. PMID:24567836
Semenov, Alexander V; Elsas, Jan Dirk; Glandorf, Debora C M; Schilthuizen, Menno; Boer, Willem F
2013-08-01
To fulfill existing guidelines, applicants that aim to place their genetically modified (GM) insect-resistant crop plants on the market are required to provide data from field experiments that address the potential impacts of the GM plants on nontarget organisms (NTO's). Such data may be based on varied experimental designs. The recent EFSA guidance document for environmental risk assessment (2010) does not provide clear and structured suggestions that address the statistics of field trials on effects on NTO's. This review examines existing practices in GM plant field testing such as the way of randomization, replication, and pseudoreplication. Emphasis is placed on the importance of design features used for the field trials in which effects on NTO's are assessed. The importance of statistical power and the positive and negative aspects of various statistical models are discussed. Equivalence and difference testing are compared, and the importance of checking the distribution of experimental data is stressed to decide on the selection of the proper statistical model. While for continuous data (e.g., pH and temperature) classical statistical approaches - for example, analysis of variance (ANOVA) - are appropriate, for discontinuous data (counts) only generalized linear models (GLM) are shown to be efficient. There is no golden rule as to which statistical test is the most appropriate for any experimental situation. In particular, in experiments in which block designs are used and covariates play a role GLMs should be used. Generic advice is offered that will help in both the setting up of field testing and the interpretation and data analysis of the data obtained in this testing. The combination of decision trees and a checklist for field trials, which are provided, will help in the interpretation of the statistical analyses of field trials and to assess whether such analyses were correctly applied. We offer generic advice to risk assessors and applicants that will help in both the setting up of field testing and the interpretation and data analysis of the data obtained in field testing.
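The review's central statistical point (classical ANOVA for continuous responses, GLMs for count data) can be illustrated with a short sketch: a Poisson GLM with a block factor and a covariate fitted with statsmodels. The simulated field-trial data and variable names are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n_blocks, n_plots = 4, 10

# Simulated non-target-organism counts per plot: crop type, block design, one covariate
df = pd.DataFrame({
    "crop": np.tile(["GM", "conventional"], n_blocks * n_plots // 2),
    "block": np.repeat([f"B{i}" for i in range(n_blocks)], n_plots),
    "soil_moisture": rng.normal(20, 3, n_blocks * n_plots),
})
linpred = 1.5 + 0.1 * (df["crop"] == "GM") + 0.02 * (df["soil_moisture"] - 20)
df["count"] = rng.poisson(np.exp(linpred))

# Poisson GLM (log link) rather than ANOVA on raw counts
model = smf.glm("count ~ crop + block + soil_moisture", data=df,
                family=sm.families.Poisson()).fit()
print(np.exp(model.params))     # rate ratios for each term
```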
Query Language for Location-Based Services: A Model Checking Approach
NASA Astrophysics Data System (ADS)
Hoareau, Christian; Satoh, Ichiro
We present a model checking approach to the rationale, implementation, and applications of a query language for location-based services. Such query mechanisms are necessary so that users, objects, and/or services can effectively benefit from the location-awareness of their surrounding environment. The underlying data model is founded on a symbolic model of space organized in a tree structure. Once extended to a semantic model for modal logic, we regard location query processing as a model checking problem, and thus define location queries as hybrid logic-based formulas. Our approach is unique to existing research because it explores the connection between location models and query processing in ubiquitous computing systems, relies on a sound theoretical basis, and provides modal logic-based query mechanisms for expressive searches over a decentralized data structure. A prototype implementation is also presented and will be discussed.
Sugisawa, Hidehiro; Sugihara, Yoko
2011-09-01
Nursing care prevention programs cannot accomplish their goals without effective screening of pre-frail older people. Health check-up services provide a good opportunity for this purpose. In the present study we examined not only the direct and indirect effects of social networks on check-up service use among candidate pre-frail older people, but also whether these effects differ from those among older people in general. Subjects for this study were respondents to a survey of a probability sample of people aged 65 and over living in a city in Tokyo. Individuals who gave effective responses to items used in our analysis made up 55.8 percent of the sample. 734 candidate pre-frail older people were selected using the screening criteria provided by the Ministry of Health, Labour and Welfare. The general category of older people numbered 2,057, excluding the candidates and elderly certified for long-term care. Social networks were measured from five aspects: family size; contact with children or relatives living separately; contact with neighbors or friends; involvement in community activities; and seeing a doctor. Our model of indirect effects of social networks on check-up use included awareness of nursing care prevention programs as a mediating factor. Information about whether the subjects used the health check-up service was provided by the regional government. Magnitude of the effects was evaluated from two aspects: using statistical tests and focusing on marginal effects. Although none of the social network indicators had direct significant impacts on check-up use, contact with children or relatives living separately, contact with neighbors or friends, and involvement with community activities demonstrated significant indirect influence. Contact with neighbors or friends, involvement with community activities, and seeing a doctor had direct significant effects on use among the general category of older people, but none of the social network indicators demonstrated significant indirect effects. Involvement with community activities had the strongest total (direct plus indirect) effect on use among the social network indicators for the candidates when viewed with a focus on marginal effects. However, it was estimated that the rate of use would rise by only about 5 percent even if the average frequency of contact with community activities were to increase from less than once to once a month among the candidates. It is suggested that the effects of social networks on health check-up service use among candidate pre-frail older people could be produced by improving awareness of nursing care prevention programs.
Hashmi, Noreen Rahat; Khan, Shazad Ali
2018-05-31
To check if mobile health (m-Health) short message service (SMS) can improve the knowledge and practice of the American Diabetic Association preventive care guidelines (ADA guidelines) recommendations among physicians. Quasi-experimental pre-post study design with a control group. The participants of the study were 62 medical officers/medical postgraduate trainees from two hospitals in Lahore, Pakistan. A pretested questionnaire was used to collect baseline information about physicians' knowledge and adherence according to the ADA guidelines. All the respondents attended a 1-day workshop about the guidelines. The intervention group received regular reminders by SMS about the ADA guidelines for the next 5 months. Postintervention knowledge and practice scores of 13 variables were checked again using the same questionnaire. Statistical analysis included χ2 and McNemar's tests for categorical variables and t-tests for continuous variables. Pearson's correlation analysis was done to check the correlation between knowledge and practice scores in the intervention group. P values of <0.05 were considered statistically significant. The total number of participating physicians was 62. Fifty-three (85.5%) respondents completed the study. Composite scores within the intervention group showed statistically significant improvement in knowledge (p<0.001) and practice (p<0.001) postintervention. The overall composite scores preintervention and postintervention also showed a statistically significant difference of improvement in knowledge (p=0.002) and practice (p=0.001) between the non-intervention and intervention groups. Adherence to the individual 13 ADA preventive care guidelines was noted to be suboptimal at baseline. Statistically significant improvement in the intervention group was seen in the following individual variables: review of symptoms of hypoglycaemia and hyperglycaemia, eye examination, neurological examination, lipid examination, referral to ophthalmologist, and counselling about non-smoking. m-Health technology can be a useful educational tool to help with improving knowledge and practice of diabetic guidelines. Future multicentre trials will help to scale this intervention for wider use in resource-limited countries. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
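For paired pre/post categorical outcomes of the kind described above, McNemar's test compares the discordant pairs. A small sketch with hypothetical adherence counts (not the study's data):

```python
import numpy as np
from statsmodels.stats.contingency_tables import mcnemar

# Hypothetical paired data for one guideline item in the intervention group:
# rows = adherent before SMS reminders (yes/no), columns = adherent after
table = np.array([[18, 3],    # adherent before: 18 stayed adherent, 3 lapsed
                  [17, 15]])  # not adherent before: 17 improved, 15 did not
result = mcnemar(table, exact=True)   # exact binomial test on the discordant pairs
print(f"McNemar statistic = {result.statistic}, p-value = {result.pvalue:.4f}")
```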
Abstraction Techniques for Parameterized Verification
2006-11-01
... approach for applying model checking to unbounded systems is to extract finite state models from them using conservative abstraction techniques. ... Applying model checking to complex pieces of code like device drivers depends on the use of abstraction methods. An abstraction method extracts a small finite ...
Aslam, M N; Sudár, S; Hussain, M; Malik, A A; Shah, H A; Qaim, S M
2010-09-01
Cross-section data for the production of medically important radionuclide (124)I via five proton and deuteron induced reactions on enriched tellurium isotopes were evaluated. The nuclear model codes, STAPRE, EMPIRE and TALYS, were used for consistency checks of the experimental data. Recommended excitation functions were derived using a well-defined statistical procedure. Therefrom integral yields were calculated. The various production routes of (124)I were compared. Presently the (124)Te(p,n)(124)I reaction is the method of choice; however, the (125)Te(p,2n)(124)I reaction also appears to have great potential.
The Automation of Nowcast Model Assessment Processes
2016-09-01
... that will automate real-time WRE-N model simulations, collect and quality-control check weather observations for assimilation and verification, and ... domains centered near White Sands Missile Range, New Mexico, where the Meteorological Sensor Array (MSA) will be located. The MSA will provide ... observations and performing quality-control checks for the pre-forecast data assimilation period. 2. Run the WRE-N model to generate model forecast data ...
Is a quasi-3D dosimeter better than a 2D dosimeter for Tomotherapy delivery quality assurance?
NASA Astrophysics Data System (ADS)
Xing, Aitang; Deshpande, Shrikant; Arumugam, Sankar; George, Armia; Holloway, Lois; Vial, Philip; Goozee, Gary
2015-01-01
Delivery quality assurance (DQA) has been performed for each Tomotherapy patient using either ArcCHECK or MatriXX Evolution in our clinic since 2012. ArcCHECK is a quasi-3D dosimeter, whereas MatriXX is a 2D detector. A review of DQA results was performed for all patients over the last three years, a total of 221 DQA plans. These DQA plans came from 215 patients with a variety of treatment sites, including head-neck, pelvis, and chest wall. The acceptable Gamma pass rate in our clinic is over 95% using 3 mm and 3% of maximum planned dose with a 10% dose threshold. The mean value and standard deviation of Gamma pass rates were 98.2% ± 1.98 (1 SD) for MatriXX and 98.5% ± 1.88 (1 SD) for ArcCHECK. A paired t-test was also performed for the group of patients whose DQA was performed with both the ArcCHECK and MatriXX. No statistically significant difference in Gamma pass rate was found between ArcCHECK and MatriXX. The considered quasi-3D and 2D dosimeters achieved similar results in performing routine patient-specific DQA for patients treated on a TomoTherapy unit.
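A sketch of the paired comparison used for patients measured with both devices; the gamma pass rates below are hypothetical and the test is a standard paired t-test, not the clinic's actual records.

```python
import numpy as np
from scipy import stats

# Hypothetical gamma pass rates (%) for patients measured on both systems
arccheck = np.array([98.9, 97.5, 99.2, 96.8, 98.4, 99.0, 97.9])
matrixx  = np.array([98.5, 97.9, 99.0, 97.2, 98.1, 98.7, 98.3])

t_stat, p_value = stats.ttest_rel(arccheck, matrixx)
print(f"paired t = {t_stat:.2f}, p = {p_value:.3f}")
print("both means above the 95% action level:",
      arccheck.mean() > 95 and matrixx.mean() > 95)
```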
Temporal Precedence Checking for Switched Models and its Application to a Parallel Landing Protocol
NASA Technical Reports Server (NTRS)
Duggirala, Parasara Sridhar; Wang, Le; Mitra, Sayan; Viswanathan, Mahesh; Munoz, Cesar A.
2014-01-01
This paper presents an algorithm for checking temporal precedence properties of nonlinear switched systems. This class of properties subsumes bounded safety and captures requirements about visiting a sequence of predicates within given time intervals. The algorithm handles nonlinear predicates that arise from dynamics-based predictions used in alerting protocols for state-of-the-art transportation systems. It is sound and complete for nonlinear switched systems that robustly satisfy the given property. The algorithm is implemented in the Compare Execute Check Engine (C2E2) using validated simulations. As a case study, a simplified model of an alerting system for closely spaced parallel runways is considered. The proposed approach is applied to this model to check safety properties of the alerting logic for different operating conditions such as initial velocities, bank angles, aircraft longitudinal separation, and runway separation.
The method of a joint intraday security check system based on cloud computing
NASA Astrophysics Data System (ADS)
Dong, Wei; Feng, Changyou; Zhou, Caiqi; Cai, Zhi; Dan, Xu; Dai, Sai; Zhang, Chuancheng
2017-01-01
The intraday security check is the core application in the dispatching control system. The existing security check calculation uses only the dispatch center’s local model and data as the functional margin. This paper introduces the design of an all-grid intraday joint security check system based on cloud computing and its implementation. To reduce the effect of subarea bad data on the all-grid security check, a new power flow algorithm based on comparison and adjustment with the inter-provincial tie-line plan is presented. A numerical example illustrates the effectiveness and feasibility of the proposed method.
Experimental study of the γp → π0ηp reaction with the A2 setup at the Mainz Microtron
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sokhoyan, V.; Prakhov, S.; Fix, A.
2018-05-29
Here, the data available from the A2 Collaboration at MAMI were analyzed to select the γp → π0ηp reaction on an event-by-event basis, which allows for partial-wave analyses of three-body final states to obtain more reliable results, compared to fits to measured distributions. These data provide the world’s best statistical accuracy in the energy range from threshold to Eγ = 1.45 GeV, allowing a finer energy binning in the measurement of all observables needed for understanding the reaction dynamics. The results obtained for the measured observables are compared to existing models, and the impact from the new data is checked by the fit with the revised Mainz model.
76 FR 35344 - Airworthiness Directives; Costruzioni Aeronautiche Tecnam srl Model P2006T Airplanes
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-17
... specified products. The mandatory continuing airworthiness information (MCAI) states: During Landing Gear retraction/extension ground checks performed on the P2006T, a loose Seeger ring was found on the nose landing...
Molecular dynamics of conformational substates for a simplified protein model
NASA Astrophysics Data System (ADS)
Grubmüller, Helmut; Tavan, Paul
1994-09-01
Extended molecular dynamics simulations covering a total of 0.232 μs have been carried out on a simplified protein model. Despite its simplified structure, that model exhibits properties similar to those of more realistic protein models. In particular, the model was found to undergo transitions between conformational substates at a time scale of several hundred picoseconds. The computed trajectories turned out to be sufficiently long to permit a statistical analysis of that conformational dynamics. To check whether effective descriptions neglecting memory effects can reproduce the observed conformational dynamics, two stochastic models were studied. A one-dimensional Langevin effective potential model derived by elimination of subpicosecond dynamical processes could not describe the observed conformational transition rates. In contrast, a simple Markov model describing the transitions between but neglecting dynamical processes within conformational substates reproduced the observed distribution of first passage times. These findings suggest that protein dynamics generally does not exhibit memory effects at time scales above a few hundred picoseconds, but they confirm the existence of memory effects at a picosecond time scale.
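The Markov picture above (memoryless transitions between conformational substates) can be illustrated with a two-state chain: simulate jumps with a fixed per-step switching probability and inspect the waiting (first-passage) times, which are approximately exponential for a memoryless process. The rates and time units are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)
p_switch = 0.01              # per-step probability of leaving the current substate
n_steps, dt = 200_000, 1.0   # arbitrary time step (e.g. picoseconds)

# Simulate a two-state Markov chain and record the waiting time in each substate
state, t_enter, waits = 0, 0, []
for t in range(1, n_steps):
    if rng.random() < p_switch:
        waits.append((t - t_enter) * dt)
        state, t_enter = 1 - state, t

waits = np.array(waits)
# For a memoryless process the waiting times are approximately exponential:
print(f"mean waiting time    : {waits.mean():.1f}")
print(f"std of waiting times : {waits.std():.1f}  (close to the mean for an exponential)")
print(f"expected 1/p_switch  : {1 / p_switch:.1f}")
```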
Emotional Intelligence in Secondary Education Students in Multicultural Contexts
ERIC Educational Resources Information Center
Pegalajar-Palomino, Ma. del Carmen; Colmenero-Ruiz, Ma. Jesus
2014-01-01
Introduction: The study analyzes the level of development in emotional intelligence of Secondary Education students. It also checks for statistically significant differences in educational level between Spanish and immigrant students, under the integration program "Intercultural Open Classrooms". Method: 94 students of Secondary…
Sediment trapping efficiency of adjustable check dam in laboratory and field experiment
NASA Astrophysics Data System (ADS)
Wang, Chiang; Chen, Su-Chin; Lu, Sheng-Jui
2014-05-01
Check dams have been constructed in mountain areas to block debris flows, but they fill after several events and lose their trapping function. For this reason, the main facility in our research is an adjustable steel slit check dam, which has the advantages of fast construction and being easy to remove or adjust: transverse beams can be removed to drain sediments off and maintain channel continuity. We constructed an adjustable steel slit check dam on the Landow torrent, Huisun Experiment Forest station, as the prototype to compare with a model in the laboratory. In the laboratory experiments, Froude number similarity was used to design the dam model. The main comparisons focused on the types of sediment trapping and removal, sediment discharge, and the trapping rate of the slit check dam. Different ways of removing the transverse beams produced different kinds of sediment removal and differences in the rate of sediment removal and particle size distribution. The sediment discharge of the check dam with beams is about 40%~80% of that of the check dam without beams. Furthermore, the spacing of the beams is a considerable factor in the sediment discharge. In the field experiment, this research used time-lapse photography to record the adjustable steel slit check dam on the Landow torrent. Typhoon Soulik produced rainfall amounts of 600 mm in eight hours and induced a debris flow in the Landow torrent. Image data from the time-lapse photography demonstrated that after several sediment transport events the adjustable steel slit check dam was buried by debris flow. The results of the laboratory and field experiments show that: (1) the adjustable check dam could trap boulders, stop woody debris flows, and flush out fine sediment to supply the needs of the downstream river; (2) the efficiency of sediment trapping in the adjustable check dam with transverse beams was significantly improved; and (3) the check dam without transverse beams can remove the sediment and maintain ecosystem continuity.
Eagle, Dawn M; Noschang, Cristie; d'Angelo, Laure-Sophie Camilla; Noble, Christie A; Day, Jacob O; Dongelmans, Marie Louise; Theobald, David E; Mar, Adam C; Urcelay, Gonzalo P; Morein-Zamir, Sharon; Robbins, Trevor W
2014-05-01
Excessive checking is a common, debilitating symptom of obsessive-compulsive disorder (OCD). In an established rodent model of OCD checking behaviour, quinpirole (dopamine D2/3-receptor agonist) increased checking in open-field tests, indicating dopaminergic modulation of checking-like behaviours. We designed a novel operant paradigm for rats (observing response task (ORT)) to further examine cognitive processes underpinning checking behaviour and clarify how and why checking develops. We investigated i) how quinpirole increases checking, ii) dependence of these effects on D2/3 receptor function (following treatment with D2/3 receptor antagonist sulpiride) and iii) effects of reward uncertainty. In the ORT, rats pressed an 'observing' lever for information about the location of an 'active' lever that provided food reinforcement. High- and low-checkers (defined from baseline observing) received quinpirole (0.5mg/kg, 10 treatments) or vehicle. Parametric task manipulations assessed observing/checking under increasing task demands relating to reinforcement uncertainty (variable response requirement and active-lever location switching). Treatment with sulpiride further probed the pharmacological basis of long-term behavioural changes. Quinpirole selectively increased checking, both functional observing lever presses (OLPs) and non-functional extra OLPs (EOLPs). The increase in OLPs and EOLPs was long-lasting, without further quinpirole administration. Quinpirole did not affect the immediate ability to use information from checking. Vehicle and quinpirole-treated rats (VEH and QNP respectively) were selectively sensitive to different forms of uncertainty. Sulpiride reduced non-functional EOLPs in QNP rats but had no effect on functional OLPs. These data have implications for treatment of compulsive checking in OCD, particularly for serotonin-reuptake-inhibitor treatment-refractory cases, where supplementation with dopamine receptor antagonists may be beneficial. Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ma, R; Zhu, X; Li, S
Purpose: High Dose Rate (HDR) brachytherapy forward planning is principally an iterative process; hence, plan quality is affected by planners’ experiences and limited planning time. Thus, this may lead to sporadic errors and inconsistencies in planning. A statistical tool based on previously approved clinical treatment plans would help to maintain the consistency of planning quality and improve the efficiency of second checking. Methods: An independent dose calculation tool was developed from commercial software. Thirty-three previously approved cervical HDR plans with the same prescription dose (550 cGy), applicator type, and treatment protocol were examined, and ICRU-defined reference point doses (bladder, vaginal mucosa, rectum, and points A/B) along with dwell times were collected. The dose calculation tool then calculated an appropriate range with a 95% confidence interval for each parameter obtained, which would be used as the benchmark for evaluation of those parameters in future HDR treatment plans. Model quality was verified using five randomly selected approved plans from the same dataset. Results: Dose variations appear to be larger at the bladder and mucosa reference points as compared with the rectum. Most reference point doses from the verification plans fell within the predicted range, except the doses of two points of the rectum and two points of reference position A (owing to rectal anatomical variations and clinical adjustment in prescription points, respectively). Similar results were obtained for tandem and ring dwell times despite relatively larger uncertainties. Conclusion: This statistical tool provides insight into the clinically acceptable range of cervical HDR plans, which could be useful in plan checking and identifying potential planning errors, thus improving the consistency of plan quality.
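The statistical second check described above amounts to flagging plan parameters that fall outside a tolerance band learned from previously approved plans. A minimal sketch, with made-up reference-point doses and a simple mean ± 1.96 SD band standing in for the 95% interval:

```python
import numpy as np

# Hypothetical bladder reference-point doses (cGy) from previously approved plans
approved = np.array([389, 402, 371, 410, 395, 380, 366, 405, 398, 377,
                     391, 384, 409, 372, 401, 388, 396, 379, 407, 393])

mean, sd = approved.mean(), approved.std(ddof=1)
lo, hi = mean - 1.96 * sd, mean + 1.96 * sd      # stand-in for the 95% interval
print(f"acceptable bladder-point range: {lo:.0f} - {hi:.0f} cGy")

def second_check(new_dose, lo=lo, hi=hi):
    """Flag a new plan's reference-point dose if it falls outside the learned band."""
    return "OK" if lo <= new_dose <= hi else "REVIEW: outside historical range"

print("new plan, 397 cGy:", second_check(397))
print("new plan, 452 cGy:", second_check(452))
```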
User's manual for computer program BASEPLOT
Sanders, Curtis L.
2002-01-01
The checking and reviewing of daily records of streamflow within the U.S. Geological Survey is traditionally accomplished by hand-plotting and mentally collating tables of data. The process is time consuming, difficult to standardize, and subject to errors in computation, data entry, and logic. In addition, the presentation of flow data on the internet requires more timely and accurate computation of daily flow records. BASEPLOT was developed for checking and review of primary streamflow records within the U.S. Geological Survey. Use of BASEPLOT enables users to (1) provide efficiencies during the record checking and review process, (2) improve quality control, (3) achieve uniformity of checking and review techniques of simple stage-discharge relations, and (4) provide a tool for teaching streamflow computation techniques. The BASEPLOT program produces tables of quality control checks and produces plots of rating curves and discharge measurements; variable shift (V-shift) diagrams; and V-shifts converted to stage-discharge plots, using data stored in the U.S. Geological Survey Automatic Data Processing System database. In addition, the program plots unit-value hydrographs that show unit-value stages, shifts, and datum corrections; input shifts, datum corrections, and effective dates; discharge measurements; effective dates for rating tables; and numeric quality control checks. Checklist/tutorial forms are provided for reviewers to ensure completeness of review and standardize the review process. The program was written for the U.S. Geological Survey SUN computer using the Statistical Analysis System (SAS) software produced by SAS Institute, Incorporated.
Discontinued dental attendance among elderly people in Sweden
Grönbeck-Linden, Ingela; Hägglin, Catharina; Petersson, Anita; Linander, Per O.; Gahnberg, Lars
2016-01-01
Aim: Our objective was to study the loss of dental attendance and a possible age trend among patients aged ≥65 years in Sweden. Regular dental check-ups are considered to be an important factor in maintaining oral health. Approximately 80% of the adult population in Sweden are enrolled in a regular check-up system; however, dental practitioners often find that older patients attend fewer check-ups. Old people may naturally lose contact with dental services as they move to special housing or die. In this systematic study, these factors were investigated and used as exclusion criteria. Materials and Methods: Data were collected for all patients (n = 4759) aged 65 or older from the electronic journal system in 3 large public dental clinics in 3 communities. Their dental records for the years 2004–2009 were studied longitudinally by 1 person at each clinic; 1111 patients were excluded (patients died during study period, wanted emergency care only, obtained special dental care allowance, moved from the community or moved to special housing, or left the clinic for another caregiver). The statistical analyses were performed using the Statistical Package for the Social Sciences version 21 (IBM). Results: Of the 3648 patients (1690 men and 1958 women) included in the study, 13% lost contact with their dental service over the course of the study (10% of those were aged 65–79 and 21% ≥80). The decrease in regular dental contact had a statistically significant association with increasing age (P < 0.001). Conclusion: A considerable number of older people living independently or with moderate supportive care in their own homes lost contact with dental service despite enrolment in a recall system. PMID:27382538
Repeatability Modeling for Wind-Tunnel Measurements: Results for Three Langley Facilities
NASA Technical Reports Server (NTRS)
Hemsch, Michael J.; Houlden, Heather P.
2014-01-01
Data from extensive check standard tests of seven measurement processes in three NASA Langley Research Center wind tunnels are statistically analyzed to test a simple model previously presented in 2000 for characterizing short-term, within-test and across-test repeatability. The analysis is intended to support process improvement and development of uncertainty models for the measurements. The analysis suggests that the repeatability can be estimated adequately as a function of only the test section dynamic pressure over a two-orders-of-magnitude dynamic pressure range. As expected for low instrument loading, short-term coefficient repeatability is determined by the resolution of the instrument alone (air off). However, as previously pointed out, for the highest dynamic pressure range the coefficient repeatability appears to be independent of dynamic pressure, thus presenting a lower floor for the standard deviation for all three time frames. The simple repeatability model is shown to be adequate for all of the cases presented and for all three time frames.
Parametric regression model for survival data: Weibull regression model as an example
2016-01-01
The Weibull regression model is one of the most popular forms of parametric regression model, in that it provides an estimate of the baseline hazard function as well as coefficients for covariates. Because of technical difficulties, the Weibull regression model is seldom used in the medical literature compared to the semi-parametric proportional hazards model. To make clinical investigators familiar with the Weibull regression model, this article introduces some basic knowledge on the Weibull regression model and then illustrates how to fit the model with R software. The SurvRegCensCov package is useful in converting estimated coefficients to clinically relevant statistics such as the hazard ratio (HR) and event time ratio (ETR). Model adequacy can be assessed by inspecting Kaplan-Meier curves stratified by categorical variables. The eha package provides an alternative method to fit the Weibull regression model. The check.dist() function helps to assess the goodness-of-fit of the model. Variable selection is based on the importance of a covariate, which can be tested using the anova() function. Alternatively, backward elimination starting from a full model is an efficient way for model development. Visualization of the Weibull regression model after model development is useful, as it provides another way to report findings. PMID:28149846
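The article's workflow is in R (survreg, SurvRegCensCov, eha); as a rough Python analogue, the sketch below fits a Weibull accelerated-failure-time model with the lifelines package (assumed available) on simulated data. The comment on converting coefficients to an ETR or HR follows the standard Weibull relationships, not the article's example.

```python
import numpy as np
import pandas as pd
from lifelines import WeibullAFTFitter   # assumed installed; not part of the article's R workflow

rng = np.random.default_rng(11)
n = 300
age = rng.normal(60, 10, n)
treatment = rng.integers(0, 2, n)

# Simulate Weibull event times whose scale depends on the covariates (shape rho = 1.5)
scale = np.exp(4.0 - 0.02 * (age - 60) + 0.5 * treatment)
time = scale * rng.weibull(1.5, n)
censor = rng.uniform(0, 120, n)
df = pd.DataFrame({"T": np.minimum(time, censor),
                   "E": (time <= censor).astype(int),
                   "age": age, "treatment": treatment})

aft = WeibullAFTFitter().fit(df, duration_col="T", event_col="E")
aft.print_summary()
# exp(coef) on the lambda_ rows is the event time ratio (ETR);
# for a Weibull model the hazard ratio is exp(-coef * shape), with the shape given by rho_.
```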
Statistical physics inspired energy-efficient coded-modulation for optical communications.
Djordjevic, Ivan B; Xu, Lei; Wang, Ting
2012-04-15
Because Shannon's entropy can be obtained by Stirling's approximation of the thermodynamic entropy, statistical physics energy minimization methods are directly applicable to signal constellation design. We demonstrate that statistical physics inspired energy-efficient (EE) signal constellation designs, in combination with large-girth low-density parity-check (LDPC) codes, significantly outperform conventional LDPC-coded polarization-division multiplexed quadrature amplitude modulation schemes. We also describe an EE signal constellation design algorithm. Finally, we propose the discrete-time implementation of a D-dimensional transceiver and the corresponding EE polarization-division multiplexed system. © 2012 Optical Society of America
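The authors' EE design algorithm is not given in the abstract; as a loose illustration of the energy-minimization idea, the toy sketch below places constellation points by gradient descent on a repulsive pairwise potential under a unit average-power constraint. It is a generic optimization sketch, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(2)
M, dim = 16, 2                          # 16-point, two-dimensional constellation

def normalize(p):
    """Rescale the constellation to unit average symbol energy."""
    return p / np.sqrt(np.mean(np.sum(p**2, axis=1)))

pts = normalize(rng.normal(size=(M, dim)))
lr = 0.01
for _ in range(2000):
    diff = pts[:, None, :] - pts[None, :, :]          # pairwise differences
    dist2 = np.sum(diff**2, axis=-1) + np.eye(M)      # +eye avoids division by zero
    # Gradient of a Coulomb-like repulsive energy sum_{i<j} 1 / d_ij
    grad = -np.sum(diff / dist2[..., None]**1.5, axis=1)
    pts = normalize(pts - lr * grad)                  # descend, then re-project onto unit power

d2 = np.sum((pts[:, None] - pts[None, :])**2, axis=-1) + 1e9 * np.eye(M)
print(f"minimum distance after optimization: {np.sqrt(d2.min()):.3f} (unit average energy)")
```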
NASA Astrophysics Data System (ADS)
Brigatti, E.; Vieira, M. V.; Kajin, M.; Almeida, P. J. A. L.; de Menezes, M. A.; Cerqueira, R.
2016-02-01
We study the population size time series of a Neotropical small mammal with the intent of detecting and modelling population regulation processes generated by density-dependent factors and their possible delayed effects. The application of analysis tools based on principles of statistical generality is nowadays a common practice for describing these phenomena, but, in general, such tools are more capable of generating a clear diagnosis than of providing valuable modelling. For this reason, in our approach, we detect the principal temporal structures on the basis of different correlation measures, and from these results we build an ad-hoc minimalist autoregressive model that incorporates the main drivers of the dynamics. Surprisingly, our model is capable of reproducing very well the time patterns of the empirical series and, for the first time, clearly outlines the importance of the time of attaining sexual maturity as a central temporal scale for the dynamics of this species. In fact, an important advantage of this analysis scheme is that all the model parameters are directly biologically interpretable and potentially measurable, allowing a consistency check between model outputs and independent measurements.
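The minimalist autoregressive model is not specified in the abstract, but the general idea of fitting an AR model whose lags include a delayed density-dependent term (e.g. at the age of sexual maturity) can be sketched with statsmodels; the simulated series and chosen lags are hypothetical.

```python
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(8)
n, maturity_lag = 300, 6            # hypothetical delay, in census intervals

# Simulate a log-abundance series with direct and delayed density dependence
x = np.zeros(n)
for t in range(maturity_lag, n):
    x[t] = 0.6 * x[t - 1] - 0.3 * x[t - maturity_lag] + rng.normal(scale=0.2)

# Fit an AR model restricted to the two biologically motivated lags
res = AutoReg(x, lags=[1, maturity_lag]).fit()
print(res.params)                   # intercept, lag-1 coefficient, maturity-lag coefficient
print(f"AIC = {res.aic:.1f}")
```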
Analyzing Planck and low redshift data sets with advanced statistical methods
NASA Astrophysics Data System (ADS)
Eifler, Tim
The recent ESA/NASA Planck mission has provided a key data set to constrain cosmology that is most sensitive to physics of the early Universe, such as inflation and primordial non-Gaussianity (Planck 2015 results XIII). In combination with cosmological probes of the Large-Scale Structure (LSS), the Planck data set is a powerful source of information to investigate late time phenomena (Planck 2015 results XIV), e.g. the accelerated expansion of the Universe, the impact of baryonic physics on the growth of structure, and the alignment of galaxies in their dark matter halos. It is the main objective of this proposal to re-analyze the archival Planck data, 1) with different, more recently developed statistical methods for cosmological parameter inference, and 2) to combine Planck and ground-based observations in an innovative way. We will make the corresponding analysis framework publicly available and believe that it will set a new standard for future CMB-LSS analyses. Advanced statistical methods, such as the Gibbs sampler (Jewell et al 2004, Wandelt et al 2004), have been critical in the analysis of Planck data. More recently, Approximate Bayesian Computation (ABC, see Weyant et al 2012, Akeret et al 2015, Ishida et al 2015, for cosmological applications) has matured into an interesting tool in cosmological likelihood analyses. It circumvents several assumptions that enter the standard Planck (and most LSS) likelihood analyses, most importantly, the assumption that the functional form of the likelihood of the CMB observables is a multivariate Gaussian. Beyond applying new statistical methods to Planck data in order to cross-check and validate existing constraints, we plan to combine Planck and DES data in a new and innovative way and run multi-probe likelihood analyses of CMB and LSS observables. The complexity of multi-probe likelihood analyses scales (non-linearly) with the level of correlations amongst the individual probes that are included. For the multi-probe analysis proposed here we will use the existing CosmoLike software, a computationally efficient analysis framework that is unique in its integrated ansatz of jointly analyzing probes of large-scale structure (LSS) of the Universe. We plan to combine CosmoLike with publicly available CMB analysis software (Camb, CLASS) to include modeling capabilities of CMB temperature, polarization, and lensing measurements. The resulting analysis framework will be capable of independently and jointly analyzing data from the CMB and from various probes of the LSS of the Universe. After completion we will utilize this framework to check for consistency amongst the individual probes and subsequently run a joint likelihood analysis of probes that are not in tension. The inclusion of Planck information in a joint likelihood analysis substantially reduces DES uncertainties in cosmological parameters, and allows for unprecedented constraints on parameters that describe astrophysics. In their recent review Observational Probes of Cosmic Acceleration (Weinberg et al 2013) the authors emphasize the value of a balanced program that employs several of the most powerful methods in combination, both to cross-check systematic uncertainties and to take advantage of complementary information. The work we propose follows exactly this idea: 1) cross-checking existing Planck results with alternative methods in the data analysis, 2) checking for consistency of Planck and DES data, and 3) running a joint analysis to constrain cosmology and astrophysics.
It is now expedient to develop and refine multi-probe analysis strategies that allow the comparison and inclusion of information from disparate probes to optimally obtain cosmology and astrophysics. Analyzing Planck and DES data poses an ideal opportunity for this purpose and corresponding lessons will be of great value for the science preparation of Euclid and WFIRST.
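As a toy illustration of the Approximate Bayesian Computation (ABC) approach mentioned in the proposal, the rejection sampler below accepts parameter draws whose simulated summary statistic lies within a tolerance of the observed one. It is a generic one-parameter Gaussian example, not the CosmoLike/Planck analysis.

```python
import numpy as np

rng = np.random.default_rng(4)

# "Observed" data: 100 draws from a Gaussian with unknown mean (truth = 0.7)
observed = rng.normal(0.7, 1.0, size=100)
obs_summary = observed.mean()

def simulate(theta, n=100):
    """Forward model: data given the parameter theta (here, a Gaussian mean)."""
    return rng.normal(theta, 1.0, size=n)

# ABC rejection sampling: flat prior on [-2, 2], tolerance on the summary statistic
n_proposals, tolerance = 20_000, 0.05
proposals = rng.uniform(-2, 2, size=n_proposals)
accepted = np.array([th for th in proposals
                     if abs(simulate(th).mean() - obs_summary) < tolerance])

print(f"accepted {accepted.size} / {n_proposals} proposals")
print(f"ABC posterior mean = {accepted.mean():.3f}, std = {accepted.std():.3f}")
```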
NASA Astrophysics Data System (ADS)
Fortugno, Diego; Zema, Demetrio Antonio; Bombino, Giuseppe; Tamburino, Vincenzo; Quinonero Rubio, Juan Manuel; Boix-Fayos, Carolina
2016-04-01
In Mediterranean semi-arid conditions the geomorphic effects of land-use changes and check dam installation on active channel headwater morphology are not completely understood. In such environments, the availability of specific studies, which monitor channel adjustments as a response to reforestation and check dams over representative observation periods, could help develop new management strategies and erosion control measures. This investigation is an integrated approach assessing the adjustments of channel morphology in a typical torrent (Sant'Agata, Calabria, Southern Italy) after land-use changes (e.g. fire, reforestation, land abandonment) and check dam construction across a period of about 60 years (1955-2012). A statistical analysis of historical rainfall records, an analysis of land-use change in the catchment area and a geomorphological mapping of channel adjustments were carried out and combined with field surveys of bed surface grain-size over a 5-km reach including 14 check dams. The analysis of the historical rainfall records showed a slight decrease in the amount and erosivity of precipitation. Mapping of land-use changes highlighted a general increase in vegetation cover on the slopes adjacent to the monitored reaches. Together with the installation of the check dam network, this increase could have induced a reduction in water and sediment supply. The different erosional and depositional forms and adjustments showed a general narrowing between consecutive check dams, together with local modifications detected upstream (bed aggradation and cross-section expansion together with low-flow realignments) and downstream (local incision) of the installed check dams. Changes in the torrent bends were also detected as a response to erosional and depositional processes of different intensities. The study highlighted: (i) the efficiency of check dams against the disrupting power of the most intense floods by stabilising the active channel; and (ii) the influence of reforestation in increasing hillslope protection from erosion and the disconnectivity of water and sediment flows towards the active channel. The residual sediment deficit circulating in the watershed suggests the need for slight management interventions, such as the conversion of the existing check dams into open structures, while preserving channel and coast stability.
An experimental method to verify soil conservation by check dams on the Loess Plateau, China.
Xu, X Z; Zhang, H W; Wang, G Q; Chen, S C; Dang, W Q
2009-12-01
A successful experiment with a physical model requires necessary conditions of similarity. This study presents an experimental method with a semi-scale physical model. The model is used to monitor and verify soil conservation by check dams in a small watershed on the Loess Plateau of China. During experiments, the model-prototype ratio of geomorphic variables was kept constant under each rainfall event. Consequently, experimental data are available for verification of soil erosion processes in the field and for predicting soil loss in a model watershed with check dams. Thus, it can predict the amount of soil loss in a catchment. This study also proposes four criteria: similarities of watershed geometry, grain size and bare land, Froude number (Fr) for rainfall events, and soil erosion in downscaled models. The efficacy of the proposed method was confirmed using these criteria in two different downscaled model experiments. The B-Model, a large-scale model, simulates the watershed prototype. The two small-scale models, D(a) and D(b), have different erosion rates but are the same size. These two models simulate the hydraulic processes of the B-Model. Experimental results show that when soil loss in the small-scale models was converted by multiplying by the soil-loss scale number, it was very close to that of the B-Model. Thus, with a semi-scale physical model, experiments can be used to verify and predict soil loss in a small watershed with a check dam system on the Loess Plateau, China.
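A short numeric sketch of the Froude-similarity scaling underlying such downscaled models: with a geometric scale ratio λ, velocities and times scale as sqrt(λ) and discharge as λ^2.5 when the Froude number is held constant. The scale ratio and prototype flow values below are hypothetical.

```python
import math

g = 9.81            # m/s^2
scale = 1 / 50      # hypothetical model-to-prototype length ratio (lambda)

def froude(velocity, depth):
    """Froude number Fr = V / sqrt(g * h)."""
    return velocity / math.sqrt(g * depth)

# Hypothetical prototype flow upstream of a check dam
v_proto, h_proto, q_proto = 3.0, 0.8, 12.0      # m/s, m, m^3/s

# Froude similarity: Fr is identical in model and prototype
v_model = v_proto * math.sqrt(scale)
h_model = h_proto * scale
q_model = q_proto * scale**2.5
t_ratio = math.sqrt(scale)                      # model time / prototype time

print(f"Fr prototype = {froude(v_proto, h_proto):.2f}, Fr model = {froude(v_model, h_model):.2f}")
print(f"model velocity = {v_model:.2f} m/s, depth = {h_model*100:.1f} cm")
print(f"model discharge = {q_model*1000:.2f} L/s, time ratio = {t_ratio:.3f}")
```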
Molla, Azaher Ali; Chi, Chunhuei; Mondaca, Alicia Lorena Núñez
2017-01-31
Predictors of high out-of-pocket household healthcare expenditure are essential for creating effective health system finance policy. In Bangladesh, 63.3% of health expenditure is out-of-pocket and borne by households. It is imperative to know what determines household health expenditure. This study aims to investigate the predicting factors of high out-of-pocket household healthcare expenditure, with the goal of putting forward policy recommendations on equity in financial burden. The Bangladesh Household Income and Expenditure Survey 2010 provides the data for this study. Predictors of high out-of-pocket household healthcare expenditure were analyzed using multiple linear regression. We modeled the non-linear relationship using the logarithmic form of linear regression. Heteroscedasticity and multicollinearity were checked using the Breusch-Pagan/Cook-Weisberg and VIF tests. Normality of the residuals was checked using a kernel density curve. We applied the required adjustments for survey data, so that standard errors and parameter estimates are valid. Presence of chronic disease and household income were found to be the most influential and statistically significant (p < 0.001) predictors of high household healthcare expenditure. Households in rural areas spend 7% less than urban dwellers. The results show that a 100% increase in female members in a family leads to a 2% decrease in household health expenditure. Household income, health shocks in families, and family size are other statistically significant predictors of household healthcare expenditure. The proportion of elderly and under-five members in the family shows some positive influence on health expenditure, though statistically nonsignificant. The findings call for emphasizing prevention of chronic diseases, as chronic disease is a strong predictor of household health expenditure. Innovative insurance schemes need to be devised to prevent households from being impoverished due to health shocks in the family. Policy makers are urged to design an alternative source of healthcare financing in Bangladesh to minimize the burden of high OOP healthcare expenditure.
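A compact sketch of the modelling and diagnostic steps described above (log-transformed expenditure, Breusch-Pagan test, VIF) using statsmodels on simulated household data; the variable names and effect sizes are invented.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.diagnostic import het_breuschpagan
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(9)
n = 1000
df = pd.DataFrame({
    "income": rng.lognormal(10, 0.5, n),
    "chronic_disease": rng.integers(0, 2, n),
    "household_size": rng.integers(1, 9, n),
    "rural": rng.integers(0, 2, n),
})
df["health_exp"] = np.exp(5 + 0.3 * np.log(df["income"]) + 0.8 * df["chronic_disease"]
                          + 0.05 * df["household_size"] - 0.07 * df["rural"]
                          + rng.normal(0, 0.6, n))

# Log-linear regression of out-of-pocket health expenditure
res = smf.ols("np.log(health_exp) ~ np.log(income) + chronic_disease"
              " + household_size + rural", data=df).fit()
print(res.params)

# Heteroscedasticity (Breusch-Pagan) and multicollinearity (VIF) checks
bp_stat, bp_pvalue, _, _ = het_breuschpagan(res.resid, res.model.exog)
print(f"Breusch-Pagan p-value = {bp_pvalue:.3f}")
vifs = [variance_inflation_factor(res.model.exog, i)
        for i in range(1, res.model.exog.shape[1])]   # skip the intercept column
print("VIFs:", np.round(vifs, 2))
```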
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-17
... BD-100 Time Limits/Maintenance Checks. The actions described in this service information are... Challenger 300 BD-100 Time Limits/Maintenance Checks. (1) For the new tasks identified in Bombardier TR 5-2... Requirements,'' in Part 2 of Chapter 5 of Bombardier Challenger 300 BD-100 Time Limits/Maintenance Checks...
75 FR 66655 - Airworthiness Directives; PILATUS Aircraft Ltd. Model PC-7 Airplanes
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-29
... December 3, 2010 (the effective date of this AD), check the airplane maintenance records to determine if... of the airplane. Do this check following paragraph 3.A. of Pilatus Aircraft Ltd. PC-7 Service... maintenance records check required in paragraph (f)(1) of this AD or it is unclear whether or not the left and...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-05
... Bombardier Challenger 300 BD-100 Time Limits/Maintenance Checks Manual. For this task, the initial compliance..., of Part 2, of the Bombardier Challenger 300 BD-100 Time Limits/Maintenance Checks Manual, the general.../Maintenance Checks Manual, provided that the relevant information in the general revision is identical to that...
Morse Code, Scrabble, and the Alphabet
ERIC Educational Resources Information Center
Richardson, Mary; Gabrosek, John; Reischman, Diann; Curtiss, Phyliss
2004-01-01
In this paper we describe an interactive activity that illustrates simple linear regression. Students collect data and analyze it using simple linear regression techniques taught in an introductory applied statistics course. The activity is extended to illustrate checks for regression assumptions and regression diagnostics taught in an…
Hendriks, Jacqueline; Fyfe, Sue; Styles, Irene; Skinner, S Rachel; Merriman, Gareth
2012-01-01
Measurement scales seeking to quantify latent traits like attitudes, are often developed using traditional psychometric approaches. Application of the Rasch unidimensional measurement model may complement or replace these techniques, as the model can be used to construct scales and check their psychometric properties. If data fit the model, then a scale with invariant measurement properties, including interval-level scores, will have been developed. This paper highlights the unique properties of the Rasch model. Items developed to measure adolescent attitudes towards abortion are used to exemplify the process. Ten attitude and intention items relating to abortion were answered by 406 adolescents aged 12 to 19 years, as part of the "Teen Relationships Study". The sampling framework captured a range of sexual and pregnancy experiences. Items were assessed for fit to the Rasch model including checks for Differential Item Functioning (DIF) by gender, sexual experience or pregnancy experience. Rasch analysis of the original dataset initially demonstrated that some items did not fit the model. Rescoring of one item (B5) and removal of another (L31) resulted in fit, as shown by a non-significant item-trait interaction total chi-square and a mean log residual fit statistic for items of -0.05 (SD=1.43). No DIF existed for the revised scale. However, items did not distinguish as well amongst persons with the most intense attitudes as they did for other persons. A person separation index of 0.82 indicated good reliability. Application of the Rasch model produced a valid and reliable scale measuring adolescent attitudes towards abortion, with stable measurement properties. The Rasch process provided an extensive range of diagnostic information concerning item and person fit, enabling changes to be made to scale items. This example shows the value of the Rasch model in developing scales for both social science and health disciplines.
Ceppi, Marcello; Gallo, Fabio; Bonassi, Stefano
2011-01-01
The most common study design performed in population studies based on the micronucleus (MN) assay is the cross-sectional study, which is largely performed to evaluate the DNA damaging effects of exposure to genotoxic agents in the workplace, in the environment, as well as from diet or lifestyle factors. Sample size is still a critical issue in the design of MN studies since most recent studies considering gene-environment interaction often require a sample size of several hundred subjects, which is in many cases difficult to achieve. The control of confounding is another major threat to the validity of causal inference. The most popular confounders considered in population studies using MN are age, gender and smoking habit. Extensive attention is given to the assessment of effect modification, given the increasing inclusion of biomarkers of genetic susceptibility in the study design. Selected issues concerning the statistical treatment of data have been addressed in this mini-review, starting from data description, which is a critical step of statistical analysis, since it allows possible errors in the dataset to be detected and the validity of assumptions required for more complex analyses to be checked. Basic issues dealing with statistical analysis of biomarkers are extensively evaluated, including methods to explore the dose-response relationship between two continuous variables and inferential analysis. A critical approach to the use of parametric and non-parametric methods is presented, before addressing the issue of the most suitable multivariate models to fit MN data. In the last decade, the quality of statistical analysis of MN data has certainly evolved, although even nowadays only a small number of studies apply the Poisson model, which is the most suitable method for the analysis of MN data.
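Since the review singles out the Poisson model as the most suitable for MN count data, a minimal sketch of a Poisson regression adjusting for the usual confounders (age, gender, smoking) is given below. The data and variable names are hypothetical, not taken from any study in the review.

```python
# Minimal sketch of a Poisson regression for micronucleus (MN) counts,
# adjusting for typical confounders. Synthetic data, hypothetical names.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 300
df = pd.DataFrame({
    "age":     rng.integers(20, 65, n),
    "male":    rng.binomial(1, 0.5, n),
    "smoker":  rng.binomial(1, 0.3, n),
    "exposed": rng.binomial(1, 0.5, n),   # occupational exposure group
})
mu = np.exp(0.5 + 0.01 * df["age"] + 0.1 * df["smoker"] + 0.3 * df["exposed"])
df["mn_count"] = rng.poisson(mu)          # MN frequency per subject

fit = smf.glm("mn_count ~ age + male + smoker + exposed",
              data=df, family=sm.families.Poisson()).fit()
print(fit.summary())
# Quick overdispersion check: Pearson chi2 / residual df should be close to 1
print("dispersion:", fit.pearson_chi2 / fit.df_resid)
```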
Health check documentation of psychosocial factors using the WAI.
Uronen, L; Heimonen, J; Puukka, P; Martimo, K-P; Hartiala, J; Salanterä, S
2017-03-01
Health checks in occupational health (OH) care should prevent deterioration of work ability and promote well-being at work. Documentation of health checks should reflect and support continuity of prevention and practice. To analyse how OH nurses (OHNs) undertaking health checks document psychosocial factors at work and use the Work Ability Index (WAI). Analysis of two consecutive OHN health check records and WAI scores with statistical analyses and annotations of 13 psychosocial factors based on a publicly available standard on psychosocial risk management: British Standards Institution specification PAS 1010, part of European Council Directive 89/391/EEC, with a special focus on work-related stress and workplace violence. We analysed health check records for 196 employees. The most frequently documented psychosocial risk factors were home-work interface, work environment and equipment, job content, workload and work pace and work schedule. The correlations between the number of documented risk and non-risk factors and WAI scores were significant: OHNs documented more risk factors in employees with lower WAI scores. However, documented psychosocial risk factors were not followed up, and the OHNs' most common response to detected psychosocial risks was an appointment with a physician. The number of psychosocial risk factors documented by OHNs correlated with subjects' WAI scores. However, the documentation was not systematic and the interventions were not always relevant. OHNs need a structure to document psychosocial factors and more guidance in how to use the documentation as a tool in their decision making in health checks. © The Author 2016. Published by Oxford University Press on behalf of the Society of Occupational Medicine. All rights reserved. For Permissions, please email: journals.permissions@oup.com
"I share, therefore I am": personality traits, life satisfaction, and Facebook check-ins.
Wang, Shaojung Sharon
2013-12-01
This study explored whether agreeableness, extraversion, and openness function to influence self-disclosure behavior, which in turn impacts the intensity of checking in on Facebook. A complete path from extraversion to Facebook check-in through self-disclosure and sharing was found. The indirect effect from sharing to check-in intensity through life satisfaction was particularly salient. The central component of check-in is for users to selectively disclose a specific location, which has implications for demonstrating their social lives, lifestyles, and tastes, enabling a selective and optimized self-image. Implications for the hyperpersonal model and warranting principle are discussed.
pcr: an R package for quality assessment, analysis and testing of qPCR data
Ahmed, Mahmoud
2018-01-01
Background Real-time quantitative PCR (qPCR) is a broadly used technique in biomedical research. Currently, a few different analysis models are used to determine the quality of data and to quantify the mRNA level across experimental conditions. Methods We developed an R package to implement methods for quality assessment, analysis and testing of qPCR data for statistical significance. Double Delta CT and standard curve models were implemented to quantify the relative expression of target genes from CT values in standard qPCR control-group experiments. In addition, calculation of amplification efficiency and curves from serial dilution qPCR experiments is used to assess the quality of the data. Finally, two-group testing and linear models were used to test for significance of the difference in expression between control groups and conditions of interest. Results Using two datasets from qPCR experiments, we applied different quality assessment, analysis and statistical testing in the pcr package and compared the results to the original published articles. The final relative expression values from the different models, as well as the intermediary outputs, were checked against the expected results in the original papers and were found to be accurate and reliable. Conclusion The pcr package provides an intuitive and unified interface for its main functions to allow biologists to perform all necessary steps of qPCR analysis and produce graphs in a uniform way. PMID:29576953
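The pcr package itself is written in R; as an illustration of the underlying double delta CT calculation it implements, here is a minimal Python sketch of the standard 2^(-ΔΔCt) relative-expression formula. The CT values are made up for illustration.

```python
# Minimal sketch of the double delta CT (2^-ΔΔCt) relative expression formula.
# CT values below are made up for illustration.

# Mean CT values for target and reference genes in control and treated groups
ct_target_control, ct_ref_control = 30.1, 22.3
ct_target_treated, ct_ref_treated = 27.8, 22.1

delta_ct_control = ct_target_control - ct_ref_control   # normalize to reference gene
delta_ct_treated = ct_target_treated - ct_ref_treated
delta_delta_ct = delta_ct_treated - delta_ct_control    # compare to control group

relative_expression = 2 ** (-delta_delta_ct)
print(f"fold change = {relative_expression:.2f}")        # ~4.3-fold up-regulation
```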
A Game-Theoretic Approach to Branching Time Abstract-Check-Refine Process
NASA Technical Reports Server (NTRS)
Wang, Yi; Tamai, Tetsuo
2009-01-01
Since the complexity of software systems continues to grow, most engineers face two serious problems: the state space explosion problem and the problem of how to debug systems. In this paper, we propose a game-theoretic approach to full branching time model checking on three-valued semantics. The three-valued models and logics provide successful abstraction that overcomes the state space explosion problem. The game style model checking that generates counter-examples can guide refinement or identify validated formulas, which solves the system debugging problem. Furthermore, output of our game style method will give significant information to engineers in detecting where errors have occurred and what the causes of the errors are.
Efficient Translation of LTL Formulae into Büchi Automata
NASA Technical Reports Server (NTRS)
Giannakopoulou, Dimitra; Lerda, Flavio
2001-01-01
Model checking is a fully automated technique for checking that a system satisfies a set of required properties. With explicit-state model checkers, properties are typically defined in linear-time temporal logic (LTL), and are translated into Büchi automata in order to be checked. This report presents how we have combined and improved existing techniques to obtain an efficient LTL to Büchi automata translator. In particular, we optimize the core of existing tableau-based approaches to generate significantly smaller automata. Our approach has been implemented and is being released as part of the Java PathFinder software (JPF), an explicit state model checker under development at the NASA Ames Research Center.
Concrete Model Checking with Abstract Matching and Refinement
NASA Technical Reports Server (NTRS)
Pasareanu, Corina S.; Pelanek, Radek; Visser, Willem
2005-01-01
We propose an abstraction-based model checking method which relies on refinement of an under-approximation of the feasible behaviors of the system under analysis. The method preserves errors to safety properties, since all analyzed behaviors are feasible by definition. The method does not require an abstract transition relation to be generated, but instead executes the concrete transitions while storing abstract versions of the concrete states, as specified by a set of abstraction predicates. For each explored transition, the method checks, with the help of a theorem prover, whether there is any loss of precision introduced by abstraction. The results of these checks are used to decide termination or to refine the abstraction, by generating new abstraction predicates. If the (possibly infinite) concrete system under analysis has a finite bisimulation quotient, then the method is guaranteed to eventually explore an equivalent finite bisimilar structure. We illustrate the application of the approach for checking concurrent programs. We also show how a lightweight variant can be used for efficient software testing.
Galvanin, Federico; Ballan, Carlo C; Barolo, Massimiliano; Bezzo, Fabrizio
2013-08-01
The use of pharmacokinetic (PK) and pharmacodynamic (PD) models is a common and widespread practice in the preliminary stages of drug development. However, PK-PD models may be affected by structural identifiability issues intrinsically related to their mathematical formulation. A preliminary structural identifiability analysis is usually carried out to check if the set of model parameters can be uniquely determined from experimental observations under the ideal assumptions of noise-free data and no model uncertainty. However, even for structurally identifiable models, real-life experimental conditions and model uncertainty may strongly affect the practical possibility to estimate the model parameters in a statistically sound way. A systematic procedure coupling the numerical assessment of structural identifiability with advanced model-based design of experiments formulations is presented in this paper. The objective is to propose a general approach to design experiments in an optimal way, detecting a proper set of experimental settings that ensure the practical identifiability of PK-PD models. Two simulated case studies based on in vitro bacterial growth and killing models are presented to demonstrate the applicability and generality of the methodology to tackle model identifiability issues effectively, through the design of feasible and highly informative experiments.
Logic Model Checking of Unintended Acceleration Claims in Toyota Vehicles
NASA Technical Reports Server (NTRS)
Gamble, Ed
2012-01-01
Part of the US Department of Transportation investigation of Toyota sudden unintended acceleration (SUA) involved analysis of the throttle control software. The JPL Laboratory for Reliable Software applied several techniques, including static analysis and logic model checking, to the software. A handful of logic models were built. Some weaknesses were identified; however, no cause for SUA was found. The full NASA report includes numerous other analyses.
NASA Technical Reports Server (NTRS)
Gamble, Ed; Holzmann, Gerard
2011-01-01
Part of the US DOT investigation of Toyota SUA involved analysis of the throttle control software. JPL LaRS applied several techniques, including static analysis and logic model checking, to the software. A handful of logic models were built. Some weaknesses were identified; however, no cause for SUA was found. The full NASA report includes numerous other analyses
Graffelman, Jan; Sánchez, Milagros; Cook, Samantha; Moreno, Victor
2013-01-01
In genetic association studies, tests for Hardy-Weinberg proportions are often employed as a quality control checking procedure. Missing genotypes are typically discarded prior to testing. In this paper we show that inference for Hardy-Weinberg proportions can be biased when missing values are discarded. We propose to use multiple imputation of missing values in order to improve inference for Hardy-Weinberg proportions. For imputation we employ a multinomial logit model that uses information from allele intensities and/or neighbouring markers. Analysis of an empirical data set of single nucleotide polymorphisms possibly related to colon cancer reveals that missing genotypes are not missing completely at random. Deviation from Hardy-Weinberg proportions is mostly due to a lack of heterozygotes. Inbreeding coefficients estimated by multiple imputation of the missings are typically lowered with respect to inbreeding coefficients estimated by discarding the missings. Accounting for missings by multiple imputation qualitatively changed the results of 10 to 17% of the statistical tests performed. Estimates of inbreeding coefficients obtained by multiple imputation showed high correlation with estimates obtained by single imputation using an external reference panel. Our conclusion is that imputation of missing data leads to improved statistical inference for Hardy-Weinberg proportions.
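As background to the quality-control checking described above, a minimal sketch of the usual chi-square test for Hardy-Weinberg proportions at a biallelic marker (applied to complete genotype counts, i.e. before any treatment of missingness or imputation) is given below. The counts are made up for illustration.

```python
# Minimal sketch of a chi-square test for Hardy-Weinberg proportions at a
# biallelic SNP, using made-up genotype counts (AA, Aa, aa).
import numpy as np
from scipy.stats import chisquare

obs = np.array([405, 468, 127])              # observed AA, Aa, aa counts
n = obs.sum()
p = (2 * obs[0] + obs[1]) / (2 * n)          # allele frequency of A
q = 1 - p
expected = n * np.array([p**2, 2 * p * q, q**2])

# df = 3 genotypes - 1 - 1 estimated allele frequency = 1, hence ddof=1
stat, pval = chisquare(obs, expected, ddof=1)
print(f"chi2 = {stat:.2f}, p = {pval:.3f}")
```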
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-26
... Maintenance Manual (AMM) includes chapters 05-10 ``Time Limits'', 05-15 ``Critical Design Configuration... 05, ``Time Limits/Maintenance Checks,'' of BAe 146 Series/AVRO 146-RJ Series Aircraft Maintenance... Chapter 05, ``Time Limits/ Maintenance Checks,'' of the BAE SYSTEMS (Operations) Limited BAe 146 Series...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-03
... Sikorsky Model S-64E helicopters. The AD requires repetitive checks of the Blade Inspection Method (BIM... and check procedures for BIM blades installed on the Model S-64F helicopters. Several blade spars with a crack emanating from corrosion pits and other damage have been found because of BIM pressure...
[Flavour estimation of the quality of grape wines using methods of mathematical statistics].
Yakuba, Yu F; Khalaphyan, A A; Temerdashev, Z A; Bessonov, V V; Malinkin, A D
2016-01-01
The formation of an integral estimate of wine flavour during tasting is discussed, along with the advantages and disadvantages of the procedures used. The materials investigated were natural white and red wines of Russian manufacture made by traditional technologies from Vitis vinifera, direct hybrids, blended wines and experimental wines (more than 300 different samples). The aim of the research was to establish, by methods of mathematical statistics, the correlation between the content of the wines' nonvolatile matter and their tasting quality rating. The contents of organic acids, amino acids and cations were considered the main factors influencing flavour, as they largely define the beverage's quality. These components were determined in the wine samples by capillary electrophoresis («CAPEL»). In parallel with the analytical quality checks, a representative group of specialists carried out tasting evaluation of the wines using a 100-point system. The possibility of statistically modelling the correlation between the tasting scores and the analytical data on amino acids and cations, which reasonably describe the wine's flavour, was examined. Statistical modelling of the correlation between the tasting scores and the content of major cations (ammonium, potassium, sodium, magnesium, calcium) and free amino acids (proline, threonine, arginine), taking into account their level of influence on flavour and the analytical values within fixed limits of quality conformance, was performed with Statistica. Adequate statistical models have been constructed that can predict the tasting score, that is, determine the wine's quality, from the content of the components forming its flavour properties. It is emphasized that, along with aromatic (volatile) substances, the nonvolatile matter - mineral substances and organic substances such as the amino acids proline, threonine and arginine - influences the wine's flavour properties. The nonvolatile components contribute to the organoleptic and flavour quality estimation of wines alongside the aromatic volatile substances and take part in forming the expert evaluation.
Slicing AADL Specifications for Model Checking
NASA Technical Reports Server (NTRS)
Odenbrett, Maximilian; Nguyen, Viet Yen; Noll, Thomas
2010-01-01
To combat the state-space explosion problem in model checking larger systems, abstraction techniques can be employed. Here, methods that operate on the system specification before constructing its state space are preferable to those that try to minimize the resulting transition system as they generally reduce peak memory requirements. We sketch a slicing algorithm for system specifications written in (a variant of) the Architecture Analysis and Design Language (AADL). Given a specification and a property to be verified, it automatically removes those parts of the specification that are irrelevant for model checking the property, thus reducing the size of the corresponding transition system. The applicability and effectiveness of our approach is demonstrated by analyzing the state-space reduction for an example, employing a translator from AADL to Promela, the input language of the SPIN model checker.
Investigation of estimators of probability density functions
NASA Technical Reports Server (NTRS)
Speed, F. M.
1972-01-01
Four research projects are summarized which include: (1) the generation of random numbers on the IBM 360/44, (2) statistical tests used to check out random number generators, (3) Specht density estimators, and (4) use of estimators of probability density functions in analyzing large amounts of data.
Computer programs and documentation
NASA Technical Reports Server (NTRS)
Speed, F. M.; Broadwater, S. L.
1971-01-01
Various statistical tests that were used to check out random number generators are described. A total of twelve different tests were considered, and from these, six were chosen to be used. The frequency test, max t test, run test, lag product test, gap test, and the matrix test are included.
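As an illustration of the simplest of these checks, the frequency (equidistribution) test, here is a minimal Python sketch that bins uniform variates and applies a chi-square goodness-of-fit test. This illustrates only the general idea, not the original IBM 360/44 implementation.

```python
# Minimal sketch of the frequency (equidistribution) test for a uniform
# random number generator: bin the variates and chi-square against equal counts.
import numpy as np
from scipy.stats import chisquare

rng = np.random.default_rng(42)   # generator under test
u = rng.random(100_000)

n_bins = 10
observed, _ = np.histogram(u, bins=n_bins, range=(0.0, 1.0))
expected = np.full(n_bins, len(u) / n_bins)

stat, pval = chisquare(observed, expected)
print(f"chi2 = {stat:.2f}, p = {pval:.3f}")   # small p-values suggest non-uniformity
```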
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-22
... (Low Stage Bleed Check Valve) specified in Section 1 of the EMBRAER 170 Maintenance Review Board Report...-11-02-002 (Low Stage Bleed Check Valve), specified in Section 1 of the EMBRAER 170 Maintenance Review... Task 36-11-02-002 (Low Stage Bleed Check Valve) specified in Section 1 of the EMBRAER 170 Maintenance...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-04
... maintenance plan to include repetitive functional tests of the low-stage check valve. For certain other... program to include maintenance Task Number 36-11-02- 002 (Low Stage Bleed Check Valve), specified in... Check Valve) in Section 1 of the EMBRAER 170 Maintenance Review Board Report MRB-1621. Issued in Renton...
75 FR 39811 - Airworthiness Directives; The Boeing Company Model 777 Airplanes
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-13
... Service Bulletin 777-57A0064, dated March 26, 2009, it is not necessary to perform the torque check on the... instructions in Boeing Alert Service Bulletin 777-57A0064, dated March 26, 2009, a torque check is redundant... are less than those for the torque check. Boeing notes that it plans to issue a new revision to this...
Too much ado about instrumental variable approach: is the cure worse than the disease?
Baser, Onur
2009-01-01
To review the efficacy of instrumental variable (IV) models in addressing a variety of assumption violations to ensure standard ordinary least squares (OLS) estimates are consistent. IV models gained popularity in outcomes research because of their ability to consistently estimate the average causal effects even in the presence of unmeasured confounding. However, in order for this consistent estimation to be achieved, several conditions must hold. In this article, we provide an overview of the IV approach, examine possible tests to check the prerequisite conditions, and illustrate how weak instruments may produce inconsistent and inefficient results. We use two IVs and apply Shea's partial R-square method, the Anderson canonical correlation, and Cragg-Donald tests to check for weak instruments. Hall-Peixe tests are applied to see if any of these instruments are redundant in the analysis. A total of 14,952 asthma patients from the MarketScan Commercial Claims and Encounters Database were examined in this study. Patient health care was provided under a variety of fee-for-service, fully capitated, and partially capitated health plans, including preferred provider organizations, point of service plans, indemnity plans, and health maintenance organizations. We used the controller-reliever copay ratio and physician practice/prescribing patterns as instruments. We demonstrated that the former was a weak and redundant instrument, producing inconsistent and inefficient estimates of the effect of treatment. The results were worse than the results from standard regression analysis. Despite the obvious benefit of IV models, the method should not be used blindly. Several strong conditions are required for these models to work, and each of them should be tested. Otherwise, bias and precision of the results will be statistically worse than the results achieved by simply using standard OLS.
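A minimal sketch of the kind of check discussed here, two-stage least squares with an explicit look at first-stage strength (a small first-stage F statistic is the classic weak-instrument warning sign), is given below on synthetic data. It is not the authors' analysis and uses only generic variable names.

```python
# Minimal sketch of manual two-stage least squares (2SLS) with a first-stage
# F statistic as a weak-instrument check. Synthetic data, generic names.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 5000
z = rng.normal(size=n)                      # instrument
u = rng.normal(size=n)                      # unmeasured confounder
x = 0.15 * z + u + rng.normal(size=n)       # endogenous treatment
y = 1.0 * x + u + rng.normal(size=n)        # outcome (true effect = 1.0)

# First stage: regress treatment on instrument; F on the excluded instrument
Z = sm.add_constant(z)
first = sm.OLS(x, Z).fit()
f_stat = first.tvalues[1] ** 2              # single instrument: F = t^2
print(f"first-stage F = {f_stat:.1f} (rule of thumb: F < 10 is weak)")

# Second stage: replace x with its first-stage fitted values
x_hat = sm.add_constant(first.fittedvalues)
second = sm.OLS(y, x_hat).fit()
print(f"2SLS estimate of the treatment effect: {second.params[1]:.2f}")

# Naive OLS for comparison (biased upward by the confounder u)
print(f"naive OLS estimate: {sm.OLS(y, sm.add_constant(x)).fit().params[1]:.2f}")
```

Note that the standard errors from this manual second stage are not valid; a dedicated IV routine would correct them.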
Roy, Kunal; Mitra, Indrani
2011-07-01
Quantitative structure-activity relationships (QSARs) have important applications in drug discovery research, environmental fate modeling, property prediction, etc. Validation has been recognized as a very important step for QSAR model development. As one of the important objectives of QSAR modeling is to predict activity/property/toxicity of new chemicals falling within the domain of applicability of the developed models and QSARs are being used for regulatory decisions, checking reliability of the models and confidence of their predictions is a very important aspect, which can be judged during the validation process. One prime application of a statistically significant QSAR model is virtual screening for molecules with improved potency based on the pharmacophoric features and the descriptors appearing in the QSAR model. Validated QSAR models may also be utilized for design of focused libraries which may be subsequently screened for the selection of hits. The present review focuses on various metrics used for validation of predictive QSAR models together with an overview of the application of QSAR models in the fields of virtual screening and focused library design for diverse series of compounds with citation of some recent examples.
NASA Astrophysics Data System (ADS)
El Sharif, H.; Teegavarapu, R. S.
2012-12-01
Spatial interpolation methods used for estimation of missing precipitation data at a site seldom check for their ability to preserve site and regional statistics. Such statistics are primarily defined by spatial correlations and other site-to-site statistics in a region. Preservation of site and regional statistics represents a means of assessing the validity of missing precipitation estimates at a site. This study evaluates the efficacy of a fuzzy-logic methodology for infilling missing historical daily precipitation data in preserving site and regional statistics. Rain gauge sites in the state of Kentucky, USA, are used as a case study for evaluation of this newly proposed method in comparison to traditional data infilling techniques. Several error and performance measures will be used to evaluate the methods and trade-offs in accuracy of estimation and preservation of site and regional statistics.
Mehdipoor, Hamed; Zurita-Milla, Raul; Rosemartin, Alyssa; Gerst, Katharine L.; Weltzin, Jake F.
2015-01-01
Recent improvements in online information communication and mobile location-aware technologies have led to the production of large volumes of volunteered geographic information. Widespread, large-scale efforts by volunteers to collect data can inform and drive scientific advances in diverse fields, including ecology and climatology. Traditional workflows to check the quality of such volunteered information can be costly and time consuming as they heavily rely on human interventions. However, identifying factors that can influence data quality, such as inconsistency, is crucial when these data are used in modeling and decision-making frameworks. Recently developed workflows use simple statistical approaches that assume that the majority of the information is consistent. However, this assumption is not generalizable, and ignores underlying geographic and environmental contextual variability that may explain apparent inconsistencies. Here we describe an automated workflow to check inconsistency based on the availability of contextual environmental information for sampling locations. The workflow consists of three steps: (1) dimensionality reduction to facilitate further analysis and interpretation of results, (2) model-based clustering to group observations according to their contextual conditions, and (3) identification of inconsistent observations within each cluster. The workflow was applied to volunteered observations of flowering in common and cloned lilac plants (Syringa vulgaris and Syringa x chinensis) in the United States for the period 1980 to 2013. About 97% of the observations for both common and cloned lilacs were flagged as consistent, indicating that volunteers provided reliable information for this case study. Relative to the original dataset, the exclusion of inconsistent observations changed the apparent rate of change in lilac bloom dates by two days per decade, indicating the importance of inconsistency checking as a key step in data quality assessment for volunteered geographic information. Initiatives that leverage volunteered geographic information can adapt this workflow to improve the quality of their datasets and the robustness of their scientific analyses.
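A minimal sketch of the three-step workflow described above (dimensionality reduction, model-based clustering, within-cluster inconsistency flagging) is shown below on synthetic data. The cluster count and flagging threshold are arbitrary illustrations, not the values used in the study.

```python
# Minimal sketch of the three-step consistency-checking workflow:
# (1) PCA for dimensionality reduction, (2) Gaussian-mixture (model-based)
# clustering of contextual variables, (3) flagging low-likelihood observations
# within the fitted mixture as potentially inconsistent. Synthetic data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(7)
# Pretend contextual environmental variables for each observation site
X = np.vstack([
    rng.normal([0, 0, 0, 0], 1.0, size=(480, 4)),
    rng.normal([5, 5, 5, 5], 1.0, size=(480, 4)),
    rng.normal([20, -20, 20, -20], 1.0, size=(40, 4)),   # odd observations
])

# (1) dimensionality reduction
Z = PCA(n_components=2).fit_transform(X)

# (2) model-based clustering
gmm = GaussianMixture(n_components=2, random_state=0).fit(Z)

# (3) flag observations in the lowest few percent of the mixture log-likelihood
log_lik = gmm.score_samples(Z)
threshold = np.percentile(log_lik, 3)        # arbitrary 3% cut-off for illustration
inconsistent = log_lik < threshold
print(f"flagged {inconsistent.sum()} of {len(X)} observations as inconsistent")
```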
Development and in-flight performance of the Mariner 9 spacecraft propulsion system
NASA Technical Reports Server (NTRS)
Evans, D. D.; Cannova, R. D.; Cork, M. J.
1973-01-01
On November 14, 1971, Mariner 9 was decelerated into orbit about Mars by a 1334 N (300 lbf) liquid bipropellant propulsion system. This paper describes and summarizes the development and in-flight performance of this pressure-fed, nitrogen tetroxide/monomethyl hydrazine bipropellant system. The design of all Mariner propulsion subsystems has been predicated upon the premise that simplicity of approach, coupled with thorough qualification and margin-limits testing, is the key to cost-effective reliability. The qualification test program and analytical modeling are also discussed. Since the propulsion subsystem is modular in nature, it was completely checked, serviced, and tested independently of the spacecraft. Proper prediction of in-flight performance required the development of three significant modeling tools to predict and account for nitrogen saturation of the propellant during the six-month coast period and to predict and statistically analyze in-flight data.
Rendall, Michael S.; Ghosh-Dastidar, Bonnie; Weden, Margaret M.; Baker, Elizabeth H.; Nazarov, Zafar
2013-01-01
Within-survey multiple imputation (MI) methods are adapted to pooled-survey regression estimation where one survey has more regressors, but typically fewer observations, than the other. This adaptation is achieved through: (1) larger numbers of imputations to compensate for the higher fraction of missing values; (2) model-fit statistics to check the assumption that the two surveys sample from a common universe; and (3) specifying the analysis model completely from variables present in the survey with the larger set of regressors, thereby excluding variables never jointly observed. In contrast to the typical within-survey MI context, cross-survey missingness is monotonic and easily satisfies the Missing At Random (MAR) assumption needed for unbiased MI. Large efficiency gains and substantial reduction in omitted variable bias are demonstrated in an application to sociodemographic differences in the risk of child obesity estimated from two nationally representative cohort surveys. PMID:24223447
Exact and Approximate Probabilistic Symbolic Execution
NASA Technical Reports Server (NTRS)
Luckow, Kasper; Pasareanu, Corina S.; Dwyer, Matthew B.; Filieri, Antonio; Visser, Willem
2014-01-01
Probabilistic software analysis seeks to quantify the likelihood of reaching a target event under uncertain environments. Recent approaches compute probabilities of execution paths using symbolic execution, but do not support nondeterminism. Nondeterminism arises naturally when no suitable probabilistic model can capture a program behavior, e.g., for multithreading or distributed systems. In this work, we propose a technique, based on symbolic execution, to synthesize schedulers that resolve nondeterminism to maximize the probability of reaching a target event. To scale to large systems, we also introduce approximate algorithms to search for good schedulers, speeding up established random sampling and reinforcement learning results through the quantification of path probabilities based on symbolic execution. We implemented the techniques in Symbolic PathFinder and evaluated them on nondeterministic Java programs. We show that our algorithms significantly improve upon a state-of-the-art statistical model checking algorithm, originally developed for Markov Decision Processes.
Maximization of fructose esters synthesis by response surface methodology.
Neta, Nair Sampaio; Peres, António M; Teixeira, José A; Rodrigues, Ligia R
2011-07-01
Enzymatic synthesis of fructose fatty acid ester was performed in organic solvent media, using a purified lipase from Candida antarctica B immobilized in acrylic resin. Response surface methodology with a central composite rotatable design based on five levels was implemented to optimize three experimental operating conditions (temperature, agitation and reaction time). A statistically significant cubic model was established. Temperature and reaction time were found to be the most significant parameters. The optimum operational conditions for maximizing the synthesis of fructose esters were 57.1°C, 100 rpm and 37.8 h. The model was validated under the identified optimal conditions to check its adequacy and accuracy, and an experimental esterification percentage of 88.4% (±0.3%) was obtained. These results showed that an improvement of the enzymatic synthesis of fructose esters was obtained under the optimized conditions. Copyright © 2011 Elsevier B.V. All rights reserved.
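To illustrate the general response-surface idea (the study itself fitted a cubic model to a central composite design; the sketch below fits a simpler second-order surface to made-up data and locates its maximum numerically), here is a minimal Python sketch.

```python
# Minimal sketch of response surface methodology: fit a second-order model of
# esterification yield versus temperature, agitation and time, then locate the
# optimum numerically. Made-up data; the original study used a cubic model.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy.optimize import minimize

rng = np.random.default_rng(11)
n = 60
df = pd.DataFrame({
    "T":    rng.uniform(40, 70, n),     # temperature, deg C
    "rpm":  rng.uniform(60, 140, n),    # agitation
    "time": rng.uniform(12, 48, n),     # reaction time, h
})
# Simulated yield with an interior maximum plus noise
df["yield_pct"] = (90 - 0.05 * (df["T"] - 57) ** 2
                      - 0.002 * (df["rpm"] - 100) ** 2
                      - 0.03 * (df["time"] - 38) ** 2
                      + rng.normal(0, 1, n))

fit = smf.ols("yield_pct ~ T + rpm + time + I(T**2) + I(rpm**2) + I(time**2)",
              data=df).fit()

def neg_yield(x):
    point = pd.DataFrame({"T": [x[0]], "rpm": [x[1]], "time": [x[2]]})
    return -fit.predict(point).iloc[0]

opt = minimize(neg_yield, x0=[55, 100, 30],
               bounds=[(40, 70), (60, 140), (12, 48)])
print("optimum (T, rpm, time):", np.round(opt.x, 1),
      "predicted yield:", round(-opt.fun, 1))
```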
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leppik, P.A.
This paper presents results of a study designed to confirm that the interaction of the neutron flux and the coolant flow plays an important role in the mechanism of high-frequency (HF) resonant instability of the VK-50 boiling water reactor. To do this and to check the working model, signals from probes measuring the flow rate of the coolant and the neutron flux were recorded simultaneously (with the help of a magnetograph) in experiments performed in 1981 on driving the VK-50 reactor into the HF resonant instability regimes. Estimates were then obtained for the statistical characteristics of the pulsations of the flow rate and of the neutron flux, including the cross-correlation functions and coherence functions. The basic results of these studies are reported here.
Preschoolers' Choice: Tofu or Potato Chips?
ERIC Educational Resources Information Center
Jacobson, Linda
2004-01-01
Statistics show a growing need for American children to develop healthy nutrition habits and make physical activity a routine part of their day. This article discusses the importance of lessons on healthy lifestyles for the preschool population as a check on obesity. By targeting preschoolers, the New York City obesity-prevention program and…
Analyzing Randomized Controlled Interventions: Three Notes for Applied Linguists
ERIC Educational Resources Information Center
Vanhove, Jan
2015-01-01
I discuss three common practices that obfuscate or invalidate the statistical analysis of randomized controlled interventions in applied linguistics. These are (a) checking whether randomization produced groups that are balanced on a number of possibly relevant covariates, (b) using repeated measures ANOVA to analyze pretest-posttest designs, and…
Thermochemistry of Gaseous Compounds of Metals.
1981-03-01
… signal to displacement of the molecular beam defining slit was checked to ascertain the effusion cell origin. … negligible. … compilation by Hultgren et al. Only the electronic ground states of the lanthanide monoxides were considered, and the statistical weights…
ERIC Educational Resources Information Center
Chou, Yeh-Tai; Wang, Wen-Chung
2010-01-01
Dimensionality is an important assumption in item response theory (IRT). Principal component analysis on standardized residuals has been used to check dimensionality, especially under the family of Rasch models. It has been suggested that a first eigenvalue greater than 1.5 signifies a violation of unidimensionality when there…
Stress analysis of 27% scale model of AH-64 main rotor hub
NASA Technical Reports Server (NTRS)
Hodges, R. V.
1985-01-01
Stress analysis of an AH-64 27% scale model rotor hub was performed. Component loads and stresses were calculated based upon blade root loads and motions. The static and fatigue analysis indicates positive margins of safety in all components checked. Using the format developed here, the hub can be stress checked for future application.
Do alcohol compliance checks decrease underage sales at neighboring establishments?
Erickson, Darin J; Smolenski, Derek J; Toomey, Traci L; Carlin, Bradley P; Wagenaar, Alexander C
2013-11-01
Underage alcohol compliance checks conducted by law enforcement agencies can reduce the likelihood of illegal alcohol sales at checked alcohol establishments, and theory suggests that an alcohol establishment that is checked may warn nearby establishments that compliance checks are being conducted in the area. In this study, we examined whether the effects of compliance checks diffuse to neighboring establishments. We used data from the Complying with the Minimum Drinking Age trial, which included more than 2,000 compliance checks conducted at more than 900 alcohol establishments. The primary outcome was the sale of alcohol to a pseudo-underage buyer without the need for age identification. A multilevel logistic regression was used to model the effect of a compliance check at each establishment as well as the effect of compliance checks at neighboring establishments within 500 m (stratified into four equal-radius concentric rings), after buyer, license, establishment, and community-level variables were controlled for. We observed a decrease in the likelihood of establishments selling alcohol to underage youth after they had been checked by law enforcement, but these effects quickly decayed over time. Establishments that had a close neighbor (within 125 m) checked in the past 90 days were also less likely to sell alcohol to young-appearing buyers. The spatial effect of compliance checks on other establishments decayed rapidly with increasing distance. Results confirm the hypothesis that the effects of police compliance checks do spill over to neighboring establishments. These findings have implications for the development of an optimal schedule of police compliance checks.
NASA Astrophysics Data System (ADS)
Hacker, Joshua; Vandenberghe, Francois; Jung, Byoung-Jo; Snyder, Chris
2017-04-01
Effective assimilation of cloud-affected radiance observations from space-borne imagers, with the aim of improving cloud analysis and forecasting, has proven to be difficult. Large observation biases, nonlinear observation operators, and non-Gaussian innovation statistics present many challenges. Ensemble-variational data assimilation (EnVar) systems offer the benefits of flow-dependent background error statistics from an ensemble, and the ability of variational minimization to handle nonlinearity. The specific benefits of ensemble statistics, relative to static background errors more commonly used in variational systems, have not been quantified for the problem of assimilating cloudy radiances. A simple experiment framework is constructed with a regional NWP model and operational variational data assimilation system, to provide a basis for understanding the importance of ensemble statistics in cloudy radiance assimilation. Restricting the observations to those corresponding to clouds in the background forecast leads to innovations that are more Gaussian. The number of large innovations is reduced compared to the more general case of all observations, but not eliminated. The Huber norm is investigated to handle the fat tails of the distributions, and allow more observations to be assimilated without the need for strict background checks that eliminate them. Comparing assimilation using only ensemble background error statistics with assimilation using only static background error statistics elucidates the importance of the ensemble statistics. Although the cost functions in both experiments converge to similar values after sufficient outer-loop iterations, the resulting cloud water, ice, and snow content are greater in the ensemble-based analysis. The subsequent forecasts from the ensemble-based analysis also retain more condensed water species, indicating that the local environment is more supportive of clouds. In this presentation we provide details that explain the apparent benefit from using ensembles for cloudy radiance assimilation in an EnVar context.
Statistical analysis of global horizontal solar irradiation GHI in Fez city, Morocco
NASA Astrophysics Data System (ADS)
Bounoua, Z.; Mechaqrane, A.
2018-05-01
An accurate knowledge of the solar energy reaching the ground is necessary for sizing and optimizing the performance of solar installations. This paper describes a statistical analysis of the global horizontal solar irradiation (GHI) at Fez city, Morocco. For better reliability, we first applied a set of check procedures to test the quality of hourly GHI measurements. We then eliminated the erroneous values, which are generally due to measurement errors or the cosine effect. Statistical analysis shows that the annual mean daily value of GHI is approximately 5 kWh/m²/day. Monthly mean daily values and other parameters are also calculated.
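A minimal sketch of this kind of quality-control filtering and aggregation is shown below; the thresholds are illustrative plausibility limits, not the study's exact check procedures, and the data are simulated.

```python
# Minimal sketch of quality control and aggregation for hourly GHI data:
# drop physically implausible values, then compute daily and monthly means.
# Thresholds are illustrative, not the study's exact check procedures.
import numpy as np
import pandas as pd

# Hypothetical hourly GHI series in W/m2 for one year
idx = pd.date_range("2015-01-01", "2015-12-31 23:00", freq="h")
rng = np.random.default_rng(5)
ghi = pd.Series(np.clip(rng.normal(250, 300, len(idx)), -50, 1200), index=idx)

# Basic plausibility checks: non-negative and below a loose upper cap
valid = (ghi >= 0) & (ghi <= 1100)
ghi_qc = ghi.where(valid)

daily_kwh = ghi_qc.resample("D").sum(min_count=1) / 1000.0   # Wh/m2 -> kWh/m2/day
print("annual mean daily GHI: %.2f kWh/m2/day" % daily_kwh.mean())
print(ghi_qc.resample("MS").mean().head())                   # monthly mean hourly GHI
```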
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amadio, G.; et al.
An intensive R&D and programming effort is required to accomplish new challenges posed by future experimental high-energy particle physics (HEP) programs. The GeantV project aims to narrow the gap between the performance of the existing HEP detector simulation software and the ideal performance achievable, exploiting latest advances in computing technology. The project has developed a particle detector simulation prototype capable of transporting in parallel particles in complex geometries exploiting instruction level microparallelism (SIMD and SIMT), task-level parallelism (multithreading) and high-level parallelism (MPI), leveraging both the multi-core and the many-core opportunities. We present preliminary verification results concerning the electromagnetic (EM) physics models developed for parallel computing architectures within the GeantV project. In order to exploit the potential of vectorization and accelerators and to make the physics model effectively parallelizable, advanced sampling techniques have been implemented and tested. In this paper we introduce a set of automated statistical tests in order to verify the vectorized models by checking their consistency with the corresponding Geant4 models and to validate them against experimental data.
Economic inequality and mobility in kinetic models for social sciences
NASA Astrophysics Data System (ADS)
Letizia Bertotti, Maria; Modanese, Giovanni
2016-10-01
Statistical evaluations of the economic mobility of a society are more difficult than measurements of the income distribution, because they require following the evolution of individuals' incomes for at least one or two generations. In micro-to-macro theoretical models of economic exchanges based on kinetic equations, the income distribution depends only on the asymptotic equilibrium solutions, while mobility estimates also involve the detailed structure of the transition probabilities of the model, and are thus an important tool for assessing its validity. Empirical data show a remarkably general negative correlation between economic inequality and mobility, whose explanation is still unclear. It is therefore particularly interesting to study this correlation in analytical models. In previous work we investigated the behavior of the Gini inequality index in kinetic models as a function of several parameters which define the binary interactions and the taxation and redistribution processes: saving propensity, taxation rates gap, tax evasion rate, welfare means-testing, etc. Here, we check the correlation of mobility with inequality by analyzing the dependence of mobility on the same parameters. According to several numerical solutions, the correlation is confirmed to be negative.
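As a small concrete complement, a minimal sketch of the Gini inequality index computed from a sample of incomes, using the mean-absolute-difference definition, is shown below; the income samples are synthetic.

```python
# Minimal sketch: Gini inequality index of an income sample via the
# mean-absolute-difference definition G = sum_ij |x_i - x_j| / (2 n^2 mean).
import numpy as np

def gini(x):
    x = np.asarray(x, dtype=float)
    diff_sum = np.abs(x[:, None] - x[None, :]).sum()
    return diff_sum / (2 * len(x) ** 2 * x.mean())

rng = np.random.default_rng(2)
incomes_equal = rng.normal(50_000, 5_000, 2_000)                # fairly equal society
incomes_skewed = rng.lognormal(mean=10, sigma=1.0, size=2_000)  # heavy-tailed incomes

print(f"Gini (near-equal): {gini(incomes_equal):.2f}")
print(f"Gini (skewed):     {gini(incomes_skewed):.2f}")
```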
Requeno, José Ignacio; Colom, José Manuel
2014-12-01
Model checking is a generic verification technique that allows the phylogeneticist to focus on models and specifications instead of on implementation issues. Phylogenetic trees are considered as transition systems over which we interrogate phylogenetic questions written as formulas of temporal logic. Nonetheless, standard logics become insufficient for certain practices of phylogenetic analysis since they do not allow the inclusion of explicit time and probabilities. The aim of this paper is to extend the application of model checking techniques beyond qualitative phylogenetic properties and adapt the existing logical extensions and tools to the field of phylogeny. The introduction of time and probabilities in phylogenetic specifications is motivated by the study of a real example: the analysis of the ratio of lactose intolerance in some populations and the date of appearance of this phenotype.
Requeno, José Ignacio; Colom, José Manuel
2014-10-23
Model checking is a generic verification technique that allows the phylogeneticist to focus on models and specifications instead of on implementation issues. Phylogenetic trees are considered as transition systems over which we interrogate phylogenetic questions written as formulas of temporal logic. Nonetheless, standard logics become insufficient for certain practices of phylogenetic analysis since they do not allow the inclusion of explicit time and probabilities. The aim of this paper is to extend the application of model checking techniques beyond qualitative phylogenetic properties and adapt the existing logical extensions and tools to the field of phylogeny. The introduction of time and probabilities in phylogenetic specifications is motivated by the study of a real example: the analysis of the ratio of lactose intolerance in some populations and the date of appearance of this phenotype.
Model Checking for Verification of Interactive Health IT Systems
Butler, Keith A.; Mercer, Eric; Bahrami, Ali; Tao, Cui
2015-01-01
Rigorous methods for design and verification of health IT systems have lagged far behind their proliferation. The inherent technical complexity of healthcare, combined with the added complexity of health information technology, makes their resulting behavior unpredictable and introduces serious risk. We propose to mitigate this risk by formalizing the relationship between HIT and the conceptual work that increasingly typifies modern care. We introduce new techniques for modeling clinical workflows and the conceptual products within them that allow established, powerful model checking technology to be applied to interactive health IT systems. The new capability can evaluate the workflows of a new HIT system performed by clinicians and computers to improve safety and reliability. We apply the method to a patient contact system to show that model checking is effective for interactive systems and that much of it can be automated. PMID:26958166
Automated structure solution, density modification and model building.
Terwilliger, Thomas C
2002-11-01
The approaches that form the basis of automated structure solution in SOLVE and RESOLVE are described. The use of a scoring scheme to convert decision making in macromolecular structure solution to an optimization problem has proven very useful and in many cases a single clear heavy-atom solution can be obtained and used for phasing. Statistical density modification is well suited to an automated approach to structure solution because the method is relatively insensitive to choices of numbers of cycles and solvent content. The detection of non-crystallographic symmetry (NCS) in heavy-atom sites and checking of potential NCS operations against the electron-density map has proven to be a reliable method for identification of NCS in most cases. Automated model building beginning with an FFT-based search for helices and sheets has been successful in automated model building for maps with resolutions as low as 3 Å. The entire process can be carried out in a fully automatic fashion in many cases.
Bertacche, Vittorio; Pini, Elena; Stradi, Riccardo; Stratta, Fabio
2006-01-01
The purpose of this study is the development of a quantification method to detect the amount of amorphous cyclosporine using Fourier transform infrared (FTIR) spectroscopy. The mixing of different percentages of crystalline cyclosporine with amorphous cyclosporine was used to obtain a set of standards, composed of cyclosporine samples characterized by different percentages of amorphous cyclosporine. Using a wavelength range of 450-4,000 cm(-1), FTIR spectra were obtained from samples in potassium bromide pellets and then a partial least squares (PLS) model was exploited to correlate the features of the FTIR spectra with the percentage of amorphous cyclosporine in the samples. This model gave a standard error of estimate (SEE) of 0.3562, with an r value of 0.9971 and a standard error of prediction (SEP) of 0.4168, which derives from the cross validation function used to check the precision of the model. Statistical values reveal the applicability of the method to the quantitative determination of amorphous cyclosporine in crystalline cyclosporine samples.
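A minimal sketch of PLS calibration with leave-one-out cross-validation (analogous in spirit to the SEE and SEP figures reported, but on synthetic "spectra" and with an arbitrary number of components) is given below.

```python
# Minimal sketch of a PLS calibration of "spectra" against an amorphous-content
# response, with leave-one-out cross-validation to estimate prediction error.
# Synthetic spectra; number of components chosen arbitrarily for illustration.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(4)
n_samples, n_wavenumbers = 25, 400
y = rng.uniform(0, 20, n_samples)                    # % amorphous content
basis = rng.normal(size=n_wavenumbers)               # a spectral feature tied to y
X = np.outer(y, basis) + rng.normal(0, 0.5, (n_samples, n_wavenumbers))

pls = PLSRegression(n_components=3)
pls.fit(X, y)
see = np.sqrt(np.mean((pls.predict(X).ravel() - y) ** 2))   # calibration error

y_cv = cross_val_predict(pls, X, y, cv=LeaveOneOut()).ravel()
sep = np.sqrt(np.mean((y_cv - y) ** 2))                     # cross-validated error
r = np.corrcoef(pls.predict(X).ravel(), y)[0, 1]
print(f"SEE = {see:.3f}, SEP = {sep:.3f}, r = {r:.4f}")
```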
Influence of the plasma environment on atomic structure using an ion-sphere model
Belkhiri, Madeny Jean; Fontes, Christopher John; Poirier, Michel
2015-09-03
Plasma environment effects on atomic structure are analyzed using various atomic structure codes. To monitor the effect of high free-electron density or low temperatures, Fermi-Dirac and Maxwell-Boltzmann statistics are compared. After a discussion of the implementation of the Fermi-Dirac approach within the ion-sphere model, several applications are considered. In order to check the consistency of the modifications brought here to extant codes, calculations have been performed using the Los Alamos Cowan Atomic Structure (cats) code in its Hartree-Fock or Hartree-Fock-Slater form and the parametric potential Flexible Atomic Code (fac). The ground-state energy shifts due to the plasma effects for the six most ionized aluminum ions have been calculated using the fac and cats codes and agree fairly well. For the intercombination resonance line in Fe 22+, the plasma effect within the uniform electron gas model results in a positive shift that agrees with the MCDF value of B. Saha et al.
Influence of the plasma environment on atomic structure using an ion-sphere model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Belkhiri, Madeny Jean; Fontes, Christopher John; Poirier, Michel
Plasma environment effects on atomic structure are analyzed using various atomic structure codes. To monitor the effect of high free-electron density or low temperatures, Fermi-Dirac and Maxwell-Boltzmann statistics are compared. After a discussion of the implementation of the Fermi-Dirac approach within the ion-sphere model, several applications are considered. In order to check the consistency of the modifications brought here to extant codes, calculations have been performed using the Los Alamos Cowan Atomic Structure (cats) code in its Hartree-Fock or Hartree-Fock-Slater form and the parametric potential Flexible Atomic Code (fac). The ground-state energy shifts due to the plasma effects for the six most ionized aluminum ions have been calculated using the fac and cats codes and agree fairly well. For the intercombination resonance line in Fe 22+, the plasma effect within the uniform electron gas model results in a positive shift that agrees with the MCDF value of B. Saha et al.
Correlating N2 and CH4 adsorption on microporous carbon using a new analytical model
Sun, Jielun; Chen, S.; Rood, M.J.; Rostam-Abadi, M.
1998-01-01
A new pore size distribution (PSD) model is developed to readily describe PSDs of microporous materials with an analytical expression. Results from this model can be used to calculate the corresponding adsorption isotherm to compare the calculated isotherm to the experimental isotherm. This aspect of the model provides another check on the validity of the model's results. The model is developed on the basis of a 3-D adsorption isotherm equation that is derived from statistical mechanical principles. Least-squares error minimization is used to solve the PSD without any preassumed distribution function. In comparison with several well-accepted analytical methods from the literature, this 3-D model offers a relatively realistic PSD description for select reference materials, including activated-carbon fibers. N2 and CH4 adsorption is correlated using the 3-D model for commercial carbons BPL and AX-21. Predicted CH4 adsorption isotherms at 296 K based on N2 adsorption at 77 K are in reasonable agreement with experimental CH4 isotherms. Use of the model is also described for characterizing PSDs of tire-derived activated carbons and coal-derived activated carbons for air-quality control applications.
The Application of Lidar to Synthetic Vision System Integrity
NASA Technical Reports Server (NTRS)
Campbell, Jacob L.; UijtdeHaag, Maarten; Vadlamani, Ananth; Young, Steve
2003-01-01
One goal in the development of a Synthetic Vision System (SVS) is to create a system that can be certified by the Federal Aviation Administration (FAA) for use at various flight criticality levels. As part of NASA s Aviation Safety Program, Ohio University and NASA Langley have been involved in the research and development of real-time terrain database integrity monitors for SVS. Integrity monitors based on a consistency check with onboard sensors may be required if the inherent terrain database integrity is not sufficient for a particular operation. Sensors such as the radar altimeter and weather radar, which are available on most commercial aircraft, are currently being investigated for use in a real-time terrain database integrity monitor. This paper introduces the concept of using a Light Detection And Ranging (LiDAR) sensor as part of a real-time terrain database integrity monitor. A LiDAR system consists of a scanning laser ranger, an inertial measurement unit (IMU), and a Global Positioning System (GPS) receiver. Information from these three sensors can be combined to generate synthesized terrain models (profiles), which can then be compared to the stored SVS terrain model. This paper discusses an initial performance evaluation of the LiDAR-based terrain database integrity monitor using LiDAR data collected over Reno, Nevada. The paper will address the consistency checking mechanism and test statistic, sensitivity to position errors, and a comparison of the LiDAR-based integrity monitor to a radar altimeter-based integrity monitor.
Verification and Planning Based on Coinductive Logic Programming
NASA Technical Reports Server (NTRS)
Bansal, Ajay; Min, Richard; Simon, Luke; Mallya, Ajay; Gupta, Gopal
2008-01-01
Coinduction is a powerful technique for reasoning about unfounded sets, unbounded structures, infinite automata, and interactive computations [6]. Where induction corresponds to least fixed point semantics, coinduction corresponds to greatest fixed point semantics. Recently coinduction has been incorporated into logic programming and an elegant operational semantics developed for it [11, 12]. This operational semantics is the greatest fixed point counterpart of SLD resolution (SLD resolution imparts operational semantics to least fixed point based computations) and is termed co-SLD resolution. In co-SLD resolution, a predicate goal p(t) succeeds if it unifies with one of its ancestor calls. In addition, rational infinite terms are allowed as arguments of predicates. Infinite terms are represented as solutions to unification equations and the occurs check is omitted during the unification process. Coinductive Logic Programming (Co-LP) and Co-SLD resolution can be used to elegantly perform model checking and planning. A combined SLD and Co-SLD resolution based LP system forms the common basis for planning, scheduling, verification, model checking, and constraint solving [9, 4]. This is achieved by amalgamating SLD resolution, co-SLD resolution, and constraint logic programming [13] in a single logic programming system. Given that parallelism in logic programs can be implicitly exploited [8], complex, compute-intensive applications (planning, scheduling, model checking, etc.) can be executed in parallel on multi-core machines. Parallel execution can result in speed-ups as well as in larger instances of the problems being solved. In the remainder we elaborate on (i) how planning can be elegantly and efficiently performed under real-time constraints, (ii) how real-time systems can be elegantly and efficiently model-checked, as well as (iii) how hybrid systems can be verified in a combined system with both co-SLD and SLD resolution. Implementations of co-SLD resolution as well as preliminary implementations of the planning and verification applications have been developed [4]. Co-LP and Model Checking: The vast majority of properties that are to be verified can be classified into safety properties and liveness properties. It is well known within model checking that safety properties can be verified by reachability analysis, i.e., if a counter-example to the property exists, it can be finitely determined by enumerating all the reachable states of the Kripke structure.
Jadhav, Vivek Dattatray; Motwani, Bhagwan K.; Shinde, Jitendra; Adhapure, Prasad
2017-01-01
Aims: The aim of this study was to evaluate the marginal fit and surface roughness of complete cast crowns made by a conventional and an accelerated casting technique. Settings and Design: This study was divided into three parts. In Part I, the marginal fit of full metal crowns made by both casting techniques in the vertical direction was checked, in Part II, the fit of sectional metal crowns in the horizontal direction made by both casting techniques was checked, and in Part III, the surface roughness of disc-shaped metal plate specimens made by both casting techniques was checked. Materials and Methods: A conventional technique was compared with an accelerated technique. In Part I of the study, the marginal fit of the full metal crowns as well as in Part II, the horizontal fit of sectional metal crowns made by both casting techniques was determined, and in Part III, the surface roughness of castings made with the same techniques was compared. Statistical Analysis Used: The results of the t-test and independent sample test do not indicate statistically significant differences in the marginal discrepancy detected between the two casting techniques. Results: For the marginal discrepancy and surface roughness, crowns fabricated with the accelerated technique were significantly different from those fabricated with the conventional technique. Conclusions: Accelerated casting technique showed quite satisfactory results, but the conventional technique was superior in terms of marginal fit and surface roughness. PMID:29042726
Bayesian model checking: A comparison of tests
NASA Astrophysics Data System (ADS)
Lucy, L. B.
2018-06-01
Two procedures for checking Bayesian models are compared using a simple test problem based on the local Hubble expansion. Over four orders of magnitude, p-values derived from a global goodness-of-fit criterion for posterior probability density functions agree closely with posterior predictive p-values. The former can therefore serve as an effective proxy for the difficult-to-calculate posterior predictive p-values.
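For readers unfamiliar with the second quantity compared above, here is a minimal sketch of a posterior predictive p-value on synthetic Gaussian data with a conjugate normal prior. The model, discrepancy statistic, and data are assumptions chosen for brevity; they are not the Hubble-expansion test problem used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Observed data (synthetic): model says y ~ Normal(mu, 1) with a N(0, 10^2) prior on mu.
y_obs = rng.normal(1.0, 1.0, size=50)

# Conjugate posterior for mu.
n, sigma2, tau2 = len(y_obs), 1.0, 100.0
post_var = 1.0 / (n / sigma2 + 1.0 / tau2)
post_mean = post_var * (y_obs.sum() / sigma2)
mu_draws = rng.normal(post_mean, np.sqrt(post_var), size=4000)

def T(y):
    """Discrepancy statistic; the sample variance is sensitive to misfit in spread."""
    return y.var(ddof=1)

# Posterior predictive p-value: fraction of replicated datasets with T at least as extreme.
t_rep = np.array([T(rng.normal(mu, 1.0, size=n)) for mu in mu_draws])
ppp = np.mean(t_rep >= T(y_obs))
print(f"posterior predictive p-value = {ppp:.3f}")
```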
NASA Astrophysics Data System (ADS)
Vijay Singh, Ran; Agilandeeswari, L.
2017-11-01
To handle the large amounts of client data in the open cloud, many security issues need to be addressed. A client's private data should not be visible to other group members without the data owner's valid permission. Sometimes clients are also prevented from accessing open cloud servers due to certain restrictions. To overcome these security issues and restrictions related to storage, data sharing in an inter-domain network, and privacy checking, we propose a model based on identity-based cryptography for data transmission, an intermediate entity that holds the client's identity reference and controls the handling of data transmission in an open cloud environment, and an extended remote privacy checking technique that works on the admin side. Acting on the data owner's authority, the proposed model offers secure cryptography for data transmission and remote privacy checking in private, public, or instructed modes. The hardness of the Computational Diffie-Hellman assumption underlying the key exchange makes the proposed model more secure than existing models used in public cloud environments.
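The security argument above rests on the Computational Diffie-Hellman (CDH) assumption. The sketch below shows the textbook Diffie-Hellman key exchange that the assumption protects, with toy-sized parameters chosen only for illustration; it is not the identity-based protocol proposed in the abstract.

```python
import secrets

# Toy Diffie-Hellman key exchange. Real deployments use standardized large
# prime groups or elliptic curves; this 64-bit prime is far too small for
# real security and serves only to make the arithmetic visible.
p = 0xFFFFFFFFFFFFFFC5   # a 64-bit prime
g = 5

a = secrets.randbelow(p - 2) + 1   # client's private exponent
b = secrets.randbelow(p - 2) + 1   # server's private exponent

A = pow(g, a, p)                   # client publishes A
B = pow(g, b, p)                   # server publishes B

shared_client = pow(B, a, p)
shared_server = pow(A, b, p)
assert shared_client == shared_server
# Recovering the shared secret from (g, p, A, B) alone is the CDH problem.
```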
Global map of lithosphere thermal thickness on a 1 deg x 1 deg grid - digitally available
NASA Astrophysics Data System (ADS)
Artemieva, Irina
2014-05-01
This presentation reports a 1 deg × 1 deg global thermal model for the continental lithosphere (TC1). The model is digitally available from the author's web-site: www.lithosphere.info. Geotherms for continental terranes of different ages (early Archean to present) are constrained by reliable data on borehole heat flow measurements (Artemieva and Mooney, 2001), checked with the original publications for data quality, and corrected for paleo-temperature effects where needed. These data are supplemented by cratonic geotherms based on xenolith data. Since heat flow measurements cover not more than half of the continents, the remaining areas (ca. 60% of the continents) are filled by the statistical numbers derived from the thermal model constrained by borehole data. Continental geotherms are statistically analyzed as a function of age and are used to estimate lithospheric temperatures in continental regions with no or low quality heat flow data. This analysis requires knowledge of lithosphere age globally. A compilation of tectono-thermal ages of lithospheric terranes on a 1 deg × 1 deg grid forms the basis for the statistical analysis. It shows that, statistically, lithospheric thermal thickness z (in km) depends on tectono-thermal age t (in Ma) as: z = 0.04t + 93.6. This relationship formed the basis for a global thermal model of the continental lithosphere (TC1). Statistical analysis of continental geotherms also reveals that this relationship holds for the Archean cratons in general, but not in detail. In particular, thick (more than 250 km) lithosphere is restricted solely to young Archean terranes (3.0-2.6 Ga), while in old Archean cratons (3.6-3.0 Ga) lithospheric roots do not extend deeper than 200-220 km. The TC1 model is presented by a set of maps, which show significant thermal heterogeneity within continental upper mantle. The strongest lateral temperature variations (as large as 800 deg C) are typical of the shallow mantle (depth less than 100 km). A map of the depth to a 600 deg C isotherm in continental upper mantle is presented as a proxy to the elastic thickness of the cratonic lithosphere, in which flexural rigidity is dominated by olivine rheology of the mantle. The TC1 model of the lithosphere thickness is used to calculate the growth and preservation rates of the lithosphere since the Archean.
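The age-thickness regression quoted above is easy to apply directly. A small hedged example follows; the ages are illustrative values, not data from the TC1 compilation.

```python
def lithosphere_thermal_thickness_km(age_ma):
    """Statistical TC1 relation quoted above: z = 0.04 * t + 93.6,
    with t the tectono-thermal age in Ma and z the thickness in km."""
    return 0.04 * age_ma + 93.6

# Illustrative ages: Phanerozoic, Proterozoic, and Archean terranes.
for age in (100, 1000, 2700):
    print(f"{age} Ma -> {lithosphere_thermal_thickness_km(age):.1f} km")
```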
Spot-checks to measure general hygiene practice.
Sonego, Ina L; Mosler, Hans-Joachim
2016-01-01
A variety of hygiene behaviors are fundamental to the prevention of diarrhea. We used spot-checks in a survey of 761 households in Burundi to examine whether something we could call general hygiene practice is responsible for more specific hygiene behaviors, ranging from handwashing to sweeping the floor. Using structural equation modeling, we showed that clusters of hygiene behavior, such as primary caregivers' cleanliness and household cleanliness, explained the spot-check findings well. Within our model, general hygiene practice as an overall concept explained the more specific clusters of hygiene behavior well. Furthermore, the higher the general hygiene practice, the more likely children were to be categorized as healthy (r = 0.46). General hygiene practice was correlated with commitment to hygiene (r = 0.52), indicating a strong association with psychosocial determinants. The results show that different hygiene behaviors co-occur regularly. Using spot-checks, the general hygiene practice of a household can be rated quickly and easily.
Network Meta-Analysis Using R: A Review of Currently Available Automated Packages
Neupane, Binod; Richer, Danielle; Bonner, Ashley Joel; Kibret, Taddele; Beyene, Joseph
2014-01-01
Network meta-analysis (NMA) – a statistical technique that allows comparison of multiple treatments in the same meta-analysis simultaneously – has become increasingly popular in the medical literature in recent years. The statistical methodology underpinning this technique and software tools for implementing the methods are evolving. Both commercial and freely available statistical software packages have been developed to facilitate the statistical computations using NMA with varying degrees of functionality and ease of use. This paper aims to introduce the reader to three R packages, namely, gemtc, pcnetmeta, and netmeta, which are freely available software tools implemented in R. Each automates the process of performing NMA so that users can perform the analysis with minimal computational effort. We present, compare and contrast the availability and functionality of different important features of NMA in these three packages so that clinical investigators and researchers can determine which R packages to implement depending on their analysis needs. Four summary tables detailing (i) data input and network plotting, (ii) modeling options, (iii) assumption checking and diagnostic testing, and (iv) inference and reporting tools, are provided, along with an analysis of a previously published dataset to illustrate the outputs available from each package. We demonstrate that each of the three packages provides a useful set of tools, and combined provide users with nearly all functionality that might be desired when conducting a NMA. PMID:25541687
Network meta-analysis using R: a review of currently available automated packages.
Neupane, Binod; Richer, Danielle; Bonner, Ashley Joel; Kibret, Taddele; Beyene, Joseph
2014-01-01
Network meta-analysis (NMA)--a statistical technique that allows comparison of multiple treatments in the same meta-analysis simultaneously--has become increasingly popular in the medical literature in recent years. The statistical methodology underpinning this technique and software tools for implementing the methods are evolving. Both commercial and freely available statistical software packages have been developed to facilitate the statistical computations using NMA with varying degrees of functionality and ease of use. This paper aims to introduce the reader to three R packages, namely, gemtc, pcnetmeta, and netmeta, which are freely available software tools implemented in R. Each automates the process of performing NMA so that users can perform the analysis with minimal computational effort. We present, compare and contrast the availability and functionality of different important features of NMA in these three packages so that clinical investigators and researchers can determine which R packages to implement depending on their analysis needs. Four summary tables detailing (i) data input and network plotting, (ii) modeling options, (iii) assumption checking and diagnostic testing, and (iv) inference and reporting tools, are provided, along with an analysis of a previously published dataset to illustrate the outputs available from each package. We demonstrate that each of the three packages provides a useful set of tools, and combined provide users with nearly all functionality that might be desired when conducting a NMA.
Deformations resulting from the movements of a shear or tensile fault in an anisotropic half space
NASA Astrophysics Data System (ADS)
Sheu, Guang Y.
2004-04-01
Earlier solutions (Bull. Seismol. Soc. Amer. 1985; 75:1135-1154; Bull. Seismol. Soc. Amer. 1992; 82:1018-1040) of deformations caused by the movements of a shear or tensile fault in an isotropic half-space for finite rectangular sources of strain nucleus have been extended to a transversely isotropic half-space. Results of integrating previous solutions (Int. J. Numer. Anal. Meth. Geomech. 2001; 25(10): 1175-1193) of deformations due to a shear or tensile fault in a transversely isotropic half-space for point sources of strain nucleus over the fault plane are presented. In addition, a boundary element (BEM) model (POLY3D: A three-dimensional, polygonal element, displacement discontinuity boundary element computer program with applications to fractures, faults, and cavities in the Earth's crust. M.S. Thesis, Stanford University, Department of Geology, 1993; 62) is given. Unlike similar research (e.g. Thomas), Akaike's view on Bayesian statistics (Akaike Information Criterion Statistics. D. Reidel Publication: Dordrecht, 1986) is applied for inverting deformations due to a fault to obtain displacement discontinuities on the fault plane.
Franceschi, Massimo; Caffarra, Paolo; Savarè, Rita; Cerutti, Renata; Grossi, Enzo
2011-01-01
The early differentiation of Alzheimer's disease (AD) from frontotemporal dementia (FTD) may be difficult. The Tower of London (ToL), thought to assess executive functions such as planning and visuo-spatial working memory, could help for this purpose. Twenty-two Dementia Centers consecutively recruited patients with early FTD or AD. ToL performances of these groups were analyzed using both conventional statistical approaches and Artificial Neural Networks (ANNs) modelling. Ninety-four non-aphasic FTD and 160 AD patients were recruited. The ToL Accuracy Score (AS) significantly (p < 0.05) differentiated FTD from AD patients. However, the discriminant validity of AS, checked by ROC curve analysis, yielded no significant results in terms of sensitivity and specificity (AUC 0.63). The performances on the 12 Success Subscores (SS), together with age, gender and schooling years, were entered into advanced ANNs developed by the Semeion Institute. The best ANNs were selected and submitted to ROC curves. The non-linear model was able to discriminate FTD from AD with an average AUC for 7 independent trials of 0.82. The use of hidden information contained in the different items of the ToL and the non-linear processing of the data through ANNs allow high discrimination between FTD and AD in individual patients.
Visually Evoked Potential Markers of Concussion History in Patients with Convergence Insufficiency
Poltavski, Dmitri; Lederer, Paul; Cox, Laurie Kopko
2017-01-01
ABSTRACT Purpose We investigated whether differences in the pattern visual evoked potentials exist between patients with convergence insufficiency and those with convergence insufficiency and a history of concussion using stimuli designed to differentiate between magnocellular (transient) and parvocellular (sustained) neural pathways. Methods Sustained stimuli included 2-rev/s, 85% contrast checkerboard patterns of 1- and 2-degree check sizes, whereas transient stimuli comprised 4-rev/s, 10% contrast vertical sinusoidal gratings with column width of 0.25 and 0.50 cycles/degree. We tested two models: an a priori clinical model based on an assumption of at least a minimal (beyond instrumentation’s margin of error) 2-millisecond lag of transient response latencies behind sustained response latencies in concussed patients and a statistical model derived from the sample data. Results Both models discriminated between concussed and nonconcussed groups significantly above chance (with 76% and 86% accuracy, respectively). In the statistical model, patients with mean vertical sinusoidal grating response latencies greater than 119 milliseconds to 0.25-cycle/degree stimuli (or mean vertical sinusoidal latencies >113 milliseconds to 0.50-cycle/degree stimuli) and mean vertical sinusoidal grating amplitudes of less than 14.75 mV to 0.50-cycle/degree stimuli were classified as having had a history of concussion. The resultant receiver operating characteristic curve for this model had excellent discrimination between the concussed and nonconcussed (area under the curve = 0.857; P < .01) groups with sensitivity of 0.92 and specificity of 0.80. Conclusions The results suggest a promising electrophysiological approach to identifying individuals with convergence insufficiency and a history of concussion. PMID:28609417
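A literal reading of the statistical model's cut-offs quoted above can be written as a small classification rule. The conjunction/disjunction grouping below is one plausible reading of the sentence, and the variable names and example values are ours, not the study's data.

```python
def concussion_history_flag(lat_025_ms, lat_050_ms, amp_050):
    """One plausible coding of the cut-offs quoted in the abstract: flag a
    history of concussion when either transient-stimulus latency criterion
    is exceeded and the 0.50-cycle/degree amplitude is reduced (amplitude in
    the units reported above)."""
    latency_criterion = (lat_025_ms > 119.0) or (lat_050_ms > 113.0)
    amplitude_criterion = amp_050 < 14.75
    return latency_criterion and amplitude_criterion

print(concussion_history_flag(121.0, 110.0, 12.0))  # True
print(concussion_history_flag(115.0, 110.0, 12.0))  # False
```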
XMI2USE: A Tool for Transforming XMI to USE Specifications
NASA Astrophysics Data System (ADS)
Sun, Wuliang; Song, Eunjee; Grabow, Paul C.; Simmonds, Devon M.
The UML-based Specification Environment (USE) tool supports syntactic analysis, type checking, consistency checking, and dynamic validation of invariants and pre-/post conditions specified in the Object Constraint Language (OCL). Due to its animation and analysis power, it is useful when checking critical non-functional properties such as security policies. However, the USE tool requires one to specify (i.e., "write") a model using its own textual language and does not allow one to import any model specification files created by other UML modeling tools. Hence, to make the best use of existing UML tools, we often create a model with OCL constraints using a modeling tool such as the IBM Rational Software Architect (RSA) and then use the USE tool for model validation. This approach, however, requires a manual transformation between the specifications of two different tool formats, which is error-prone and diminishes the benefit of automated model-level validations. In this paper, we describe our own implementation of a specification transformation engine that is based on the Model Driven Architecture (MDA) framework and currently supports automatic tool-level transformations from RSA to USE.
Use of Longitudinal Regression in Quality Control. Research Report. ETS RR-14-31
ERIC Educational Resources Information Center
Lu, Ying; Yen, Wendy M.
2014-01-01
This article explores the use of longitudinal regression as a tool for identifying scoring inaccuracies. Student progression patterns, as evaluated through longitudinal regressions, typically are more stable from year to year than are scale score distributions and statistics, which require representative samples to conduct credibility checks.…
Report of the 64th National Conference on Weights and Measures
NASA Astrophysics Data System (ADS)
Wollin, H. F.; Babeoq, L. E.; Heffernan, A. P.
1980-03-01
Major issues discussed at this conference include metric conversion in the United States (particularly the conversion of gasoline dispensers), problems relating to the quantity fill of packaged commodities, especially as affected by moisture loss, and a statistical approach to package checking. Federal grain inspection and a legal metrology control system are also discussed.
Evaluation of properties over phylogenetic trees using stochastic logics.
Requeno, José Ignacio; Colom, José Manuel
2016-06-14
Model checking has been recently introduced as an integrated framework for extracting information from phylogenetic trees using temporal logics as a querying language, an extension of modal logics that imposes restrictions of a boolean formula along a path of events. The phylogenetic tree is considered a transition system modeling evolution as a sequence of genomic mutations (we understand mutation as the different ways that DNA can be changed), while this kind of logic is suitable for traversing it in a strict and exhaustive way. Given a biological property that we desire to inspect over the phylogeny, the verifier returns true if the specification is satisfied or a counterexample that falsifies it. However, this approach has only been considered over qualitative aspects of the phylogeny. In this paper, we address the limitations of the previous framework by including and handling quantitative information such as explicit time or probability. To this end, we apply current probabilistic continuous-time extensions of model checking to phylogenetics. We reinterpret a catalog of qualitative properties in a numerical way, and we also present new properties that could not be analyzed before. For instance, we obtain the likelihood of a tree topology according to a mutation model. As a case study, we analyze several phylogenies in order to obtain the maximum likelihood with the model checking tool PRISM. In addition, we have adapted the software to optimize the computation of maximum likelihoods. We have shown that probabilistic model checking is a competitive framework for describing and analyzing quantitative properties over phylogenetic trees. This formalism adds soundness and readability to the definition of models and specifications. Besides, the existence of model checking tools hides the underlying technology, sparing biologists the extension, upgrade, debugging and maintenance of a software tool. A set of benchmarks justifies the feasibility of our approach.
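To make "the likelihood of a tree topology according to a mutation model" concrete, here is a hedged sketch computing the likelihood of the divergence between two aligned sequences under the Jukes-Cantor (JC69) substitution model. This is a standard textbook calculation shown for orientation only; it is not the PRISM encoding used by the authors.

```python
import math

def jc69_likelihood(n_sites, n_diff, branch_length):
    """Likelihood of observing n_diff mismatching sites out of n_sites between
    two sequences separated by `branch_length` expected substitutions per site,
    under the Jukes-Cantor (JC69) mutation model."""
    p_diff = 0.75 * (1.0 - math.exp(-4.0 * branch_length / 3.0))
    p_same = 1.0 - p_diff
    return (p_same ** (n_sites - n_diff)) * (p_diff ** n_diff)

# Crude grid search for the maximum-likelihood branch length given
# 12 observed differences in 100 aligned sites.
grid = [i / 1000.0 for i in range(1, 1000)]
best = max(grid, key=lambda t: jc69_likelihood(100, 12, t))
print(f"ML branch length ~ {best:.3f} substitutions/site")
```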
da Silva, Luiz Augusto; de Freitas, Leandro; Medeiros, Thiago Emannuel; Osiecki, Raul; Garcia Michel, Renan; Snak, André Luiz
2014-01-01
Objective: The study investigated the effect of supplementation with maltodextrin (CHO), alone or combined with caffeine, during exercise in T2DM subjects. Methods: In this pilot study, eight subjects with T2DM, aged 55±10 years, received CHO (1 g/kg) or caffeine (1.5 mg/kg), alone or combined, before the exercise protocol. The exercise was executed at 40% heart rate (HR) reserve for 40 min, with 10-min recovery. Blood pressure (BP) and the perceived exertion scale (Borg) were checked every 2 min. Blood glucose (BG) was checked every 10 min. For statistical analysis, the ANOVA test was used and values were considered statistically significant at p < 0.05. Results: The results showed that BP and HR did not change significantly among all treatments. Caffeine promoted a significant reduction in BG of 75 mg/dL (65%, p < 0.05) during the 40 min of the exercise protocol compared to all groups. Conclusion: Supplementation with 1.5 mg/kg of caffeine reduces BG concentration during prolonged exercise in T2DM patients. PMID:25100892
NASA Astrophysics Data System (ADS)
Onac, I.; Pop, L.; Ungur, Rodica; Giurgiu, Ioana
2001-06-01
We checked the changes occurring in the metabolism of proteins (seric cholinesterase, total proteins) and in the metabolism of glycosides (seric glucose) in Cavia cobaia. A simple blind study was carried out and the results were checked on the first, tenth and twentieth days of treatment. The data thus obtained were graphically represented and statistically processed according to the Duncan test. The technique and treatment doses were similar and they were compared with the data obtained from controls and environment controls. In the groups biostimulated with He-Ne laser, seric cholinesterase levels increased proportionally with the dose reaching a peak on day 10, which was not the case with the controls. Monochromatic red light caused a similar but quantitatively lower effect. The same results were obtained in the case of seric proteins as well, however, the effect did not depend on the dose and it was less significant statistically than in the case of seric cholinesterase both in laser treated and in monochromatic red light treated groups.
Lukášová, Ivana; Muselík, Jan; Franc, Aleš; Goněc, Roman; Mika, Filip; Vetchý, David
2017-11-15
Warfarin is an intensively discussed drug with a narrow therapeutic range. There have been cases of bleeding attributed to varying content or altered quality of the active substance. Factor analysis is useful for finding suitable technological parameters leading to high content uniformity of tablets containing a low amount of active substance. The composition of the tabletting blend and the technological procedure were set with respect to factor analysis of previously published results. The correctness of the set parameters was checked by manufacturing and evaluating tablets containing 1-10 mg of warfarin sodium. The robustness of the suggested technology was checked by using a "worst case scenario" and statistical evaluation of European Pharmacopoeia (EP) content uniformity limits with respect to the Bergum division and the process capability index (Cpk). To evaluate the quality of the active substance and tablets, a dissolution method was developed (water; EP apparatus II; 25 rpm), allowing for statistical comparison of dissolution profiles. The obtained results prove the suitability of factor analysis to optimize the composition with respect to previously manufactured batches, and thus the use of meta-analysis under industrial conditions is feasible. Copyright © 2017 Elsevier B.V. All rights reserved.
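The process capability index mentioned above has a standard definition: the distance of the process mean from the nearer specification limit, in units of three standard deviations. A small hedged example follows; the assay values and 90-110% limits are made up for illustration, not taken from the study.

```python
import statistics

def cpk(values, lower_limit, upper_limit):
    """Process capability index: min distance of the mean from either
    specification limit, divided by three standard deviations."""
    mean = statistics.mean(values)
    sd = statistics.stdev(values)
    return min(upper_limit - mean, mean - lower_limit) / (3.0 * sd)

# Hypothetical content-uniformity results (% of label claim) for 1 mg tablets,
# judged against illustrative 90-110% specification limits.
assays = [98.5, 101.2, 99.8, 100.4, 97.9, 102.1, 100.0, 99.1, 101.5, 98.8]
print(f"Cpk = {cpk(assays, 90.0, 110.0):.2f}")
```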
Influence of visual angle on pattern reversal visual evoked potentials
Kothari, Ruchi; Singh, Smita; Singh, Ramji; Shukla, A. K.; Bokariya, Pradeep
2014-01-01
Purpose: The aim of this study was to find whether or not the visual evoked potential (VEP) latencies and amplitude are altered with different visual angles in healthy adult volunteers, and to determine the visual angle that is the optimum and most appropriate among a wide range of check sizes for the reliable interpretation of pattern reversal VEPs (PRVEPs). Materials and Methods: The present study was conducted on 40 healthy volunteers. The subjects were divided into two groups. One group consisted of 20 individuals (nine males and 11 females) in the age range of 25-57 years, and they were exposed to checks subtending a visual angle of 90, 120, and 180 minutes of arc. The other group comprised 20 individuals (10 males and 10 females) in the age range of 36-60 years, and they were subjected to checks subtending a visual angle of 15, 30, and 120 minutes of arc. The stimulus configuration was the transient pattern reversal method, in which a black and white checkerboard is generated (full field) on a VEP monitor by an Evoked Potential Recorder (RMS EMG EPMARK II). The statistical analysis was done by one-way Analysis of Variance (ANOVA) using EPI INFO 6. Results: In Group I, the maximum (max.) P100 latency of 98.8 ± 4.7 msec and the max. P100 amplitude of 10.05 ± 3.1 μV were obtained with checks of 90 minutes. In Group II, the max. P100 latency of 105.19 ± 4.75 msec as well as the max. P100 amplitude of 8.23 ± 3.30 μV were obtained with checks of 15 minutes. The minimum (min.) P100 latency in both groups was obtained with checks of 120 minutes, while the min. P100 amplitude was obtained with checks of 180 minutes. A statistically significant difference was found between the means of P100 latency for 15 and 30 minutes with reference to its value for 120 minutes, and between the mean value of P100 amplitude for 120 minutes and that of 90 and 180 minutes. Conclusion: Altering the size of the stimulus (visual angle) has an effect on the PRVEP parameters. Our study found that 120 minutes of arc is the appropriate (and optimal) check size that can be used for accurate interpretation of PRVEPs. This will help in better assessment of the optic nerve function and integrity of the anterior visual pathways. PMID:25378875
Doreleijers, J F; Vriend, G; Raves, M L; Kaptein, R
1999-11-15
A statistical analysis is reported of 1,200 of the 1,404 nuclear magnetic resonance (NMR)-derived protein and nucleic acid structures deposited in the Protein Data Bank (PDB) before 1999. Excluded from this analysis were the entries not yet fully validated by the PDB and the more than 100 entries that contained < 95% of the expected hydrogens. The aim was to assess the geometry of the hydrogens in the remaining structures and to provide a check on their nomenclature. Deviations in bond lengths, bond angles, improper dihedral angles, and planarity with respect to estimated values were checked. More than 100 entries showed anomalous protonation states for some of their amino acids. Approximately 250,000 (1.7%) atom names differed from the consensus PDB nomenclature. Most of the inconsistencies are due to swapped prochiral labeling. Large deviations from the expected geometry exist for a considerable number of entries, many of which are average structures. The most common causes for these deviations seem to be poor minimization of average structures and an improper balance between force-field constraints for experimental and holonomic data. Some specific geometric outliers are related to the refinement programs used. A number of recommendations for biomolecular databases, modeling programs, and authors submitting biomolecular structures are given.
Kupek, Emil
2006-03-15
Structural equation modelling (SEM) has been increasingly used in medical statistics for solving a system of related regression equations. However, a great obstacle for its wider use has been its difficulty in handling categorical variables within the framework of generalised linear models. A large data set with a known structure among two related outcomes and three independent variables was generated to investigate the use of Yule's transformation of odds ratio (OR) into Q-metric by (OR-1)/(OR+1) to approximate Pearson's correlation coefficients between binary variables whose covariance structure can be further analysed by SEM. Percent of correctly classified events and non-events was compared with the classification obtained by logistic regression. The performance of SEM based on Q-metric was also checked on a small (N = 100) random sample of the data generated and on a real data set. SEM successfully recovered the generated model structure. SEM of real data suggested a significant influence of a latent confounding variable which would have not been detectable by standard logistic regression. SEM classification performance was broadly similar to that of the logistic regression. The analysis of binary data can be greatly enhanced by Yule's transformation of odds ratios into estimated correlation matrix that can be further analysed by SEM. The interpretation of results is aided by expressing them as odds ratios which are the most frequently used measure of effect in medical statistics.
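The Yule transformation used above is straightforward to apply to a 2×2 table. A minimal sketch follows; the table counts are synthetic and only illustrate the (OR-1)/(OR+1) mapping described in the abstract.

```python
def odds_ratio(a, b, c, d):
    """Odds ratio from a 2x2 table with cells a, b (row 1) and c, d (row 2)."""
    return (a * d) / (b * c)

def yule_q(or_value):
    """Yule's transformation of an odds ratio into a correlation-like
    Q-metric in (-1, 1): Q = (OR - 1) / (OR + 1)."""
    return (or_value - 1.0) / (or_value + 1.0)

# Synthetic 2x2 table for two binary variables.
a, b, c, d = 40, 10, 20, 30
OR = odds_ratio(a, b, c, d)
print(f"OR = {OR:.2f}, Q = {yule_q(OR):.2f}")
```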
NASA Astrophysics Data System (ADS)
Kurnia, H.; Noerhadi, N. A. I.
2017-08-01
Three-dimensional digital study models were introduced following advances in digital technology. This study was carried out to assess the reliability of digital study models scanned by a laser scanning device newly assembled. The aim of this study was to compare the digital study models and conventional models. Twelve sets of dental impressions were taken from patients with mild-to-moderate crowding. The impressions were taken twice, one with alginate and the other with polyvinylsiloxane. The alginate impressions were made into conventional models, and the polyvinylsiloxane impressions were scanned to produce digital models. The mesiodistal tooth width and Little’s irregularity index (LII) were measured manually with digital calipers on the conventional models and digitally on the digital study models. Bolton analysis was performed on each study models. Each method was carried out twice to check for intra-observer variability. The reproducibility (comparison of the methods) was assessed using independent-sample t-tests. The mesiodistal tooth width between conventional and digital models did not significantly differ (p > 0.05). Independent-sample t-tests did not identify statistically significant differences for Bolton analysis and LII (p = 0.603 for Bolton and p = 0894 for LII). The measurements of the digital study models are as accurate as those of the conventional models.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Potter, Kristin C; Brunhart-Lupo, Nicholas J; Bush, Brian W
We have developed a framework for the exploration, design, and planning of energy systems that combines interactive visualization with machine-learning based approximations of simulations through a general purpose dataflow API. Our system provides a visual interface allowing users to explore an ensemble of energy simulations representing a subset of the complex input parameter space, and spawn new simulations to 'fill in' input regions corresponding to new energy system scenarios. Unfortunately, many energy simulations are far too slow to provide interactive responses. To support interactive feedback, we are developing reduced-form models via machine learning techniques, which provide statistically sound estimates of the full simulations at a fraction of the computational cost and which are used as proxies for the full-form models. Fast computation and an agile dataflow enhance the engagement with energy simulations, and allow researchers to better allocate computational resources to capture informative relationships within the system and provide a low-cost method for validating and quality-checking large-scale modeling efforts.
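The reduced-form-model idea above (train a cheap statistical approximation on an ensemble of expensive simulation runs, then query it interactively) can be sketched as follows. The scikit-learn random forest, the synthetic three-parameter input space, and the stand-in simulation are assumptions for illustration; the framework's actual API and learners are not shown here.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)

# Pretend ensemble: 500 prior simulation runs over a 3-parameter input space.
X_ensemble = rng.uniform(0.0, 1.0, size=(500, 3))

def expensive_simulation(x):
    """Stand-in for a slow energy-system simulation (synthetic response)."""
    return 10.0 * x[0] + np.sin(6.0 * x[1]) + 0.5 * x[2] ** 2

y_ensemble = np.array([expensive_simulation(x) for x in X_ensemble])

# Reduced-form proxy trained once, then evaluated interactively for new scenarios.
proxy = RandomForestRegressor(n_estimators=200, random_state=0)
proxy.fit(X_ensemble, y_ensemble)

new_scenarios = rng.uniform(0.0, 1.0, size=(3, 3))
print("proxy estimates:", proxy.predict(new_scenarios))
print("full model     :", [expensive_simulation(x) for x in new_scenarios])
```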
2014-01-01
Background Our objective was to evaluate the measurement properties of the Pain Stages of Change Questionnaire (PSOCQ) and its four subscales Precontemplation, Contemplation, Action and Maintenance. Methods A total of 231 patients, median age 42 years, with chronic musculoskeletal pain responded to the 30 items in the PSOCQ. Thresholds for item scores, as well as unidimensionality and invariance of the PSOCQ and its four subscales, were evaluated by Rasch analysis (partial credit model). Results The items had disordered thresholds and needed to be rescored. The 30 items in the PSOCQ did not fit the Rasch model according to the Chi-square item-trait statistics. All subscales fitted the Rasch models. The associations with pain (11-point numeric rating scale), emotional distress (Hopkins Symptom Checklist-25) and self-efficacy (Arthritis Self-Efficacy Scale) were highest for the Precontemplation subscale. Conclusion The present analysis revealed that all four subscales of the PSOCQ fitted the Rasch model. No common construct for all subscales was identified, but the Action and Maintenance subscales were closely related. PMID:24646065
Eljamel, M Sam; Mahboob, Syed Osama
2016-12-01
Surgical resection of high-grade gliomas (HGG) is standard therapy because it imparts significant progression-free survival (PFS) and overall survival (OS) benefits. However, HGG-tumor margins are indistinguishable from normal brain during surgery. Hence intraoperative technologies such as fluorescence (ALA, fluorescein), intraoperative ultrasound (IoUS) and intraoperative MRI (IoMRI) have been deployed. This study compares the effectiveness and cost-effectiveness of these technologies. Critical literature review and meta-analyses, using the MEDLINE/PubMed service. The list of references in each article was double-checked for any missing references. We included all studies that reported the use of ALA, fluorescein (FLCN), IoUS or IoMRI to guide HGG-surgery. The meta-analyses were conducted according to statistical heterogeneity between studies. If there was no heterogeneity, a fixed effects model was used; otherwise, a random effects model was used. Statistical heterogeneity was explored by χ² and inconsistency (I²) statistics. To assess cost-effectiveness, we calculated the incremental cost per quality-adjusted life-year (QALY). Gross total resection (GTR) after ALA, FLCN, IoUS and IoMRI was 69.1%, 84.4%, 73.4% and 70%, respectively. The differences were not statistically significant. All four techniques led to significant prolongation of PFS and tended to prolong OS. However, none of these technologies led to significant prolongation of OS compared to controls. The cost/QALY was $16,218, $3181, $6049 and $32,954 for ALA, FLCN, IoUS and IoMRI, respectively. ALA, FLCN, IoUS and IoMRI significantly improve GTR and PFS of HGG. Their incremental cost was below the threshold for cost-effectiveness of HGG-therapy, denoting that each intraoperative technology was cost-effective on its own. Copyright © 2016 Elsevier B.V. All rights reserved.
[Path analysis of lifestyle habits to the metabolic syndrome].
Zhu, Zhen-xin; Zhang, Cheng-qi; Tang, Fang; Song, Xin-hong; Xue, Fu-zhong
2013-04-01
To evaluate the relationship between lifestyle habits and the components of metabolic syndrome (MS). Based on the routine health check-up system in a Center for Health Management of Shandong Province, a longitudinal surveillance health check-up cohort from 2005 to 2010 was set up. A total of 13 225 urban workers in Jinan were included in the analysis. The content of the survey included demographic information, medical history, lifestyle habits, body mass index (BMI) and the levels of blood pressure, fasting blood glucose and blood lipids. The distributions of BMI, blood pressure, fasting blood glucose, blood lipids and lifestyle habits between MS patients and the non-MS population were compared, latent variables were extracted by exploratory factor analysis to determine the structure model, and then a partial least squares path model was constructed between lifestyle habits and the components of MS. Participants' age was (46.62 ± 12.16) years. The overall prevalence of MS was 22.43% (2967/13 225): 26.49% (2535/9570) in males and 11.82% (432/3655) in females. The prevalence of MS was statistically different between males and females (χ(2) = 327.08, P < 0.01). Between MS patients and the non-MS population, the difference in dietary habits was statistically significant (χ(2) = 166.31, P < 0.01): in MS patients, the rates of vegetarian, mixed and animal food were 23.39% (694/2967), 42.50% (1261/2967) and 34.11% (1012/2967), respectively, while in the non-MS population they were 30.80% (3159/10 258), 46.37% (4757/10 258) and 22.83% (2342/10 258), respectively. Alcohol consumption also differed significantly (χ(2) = 374.22, P < 0.01): in MS patients, the rates of never or past, occasional and regular drinking were 27.37% (812/2967), 24.71% (733/2967) and 47.93% (1422/2967), respectively, while in the non-MS population they were 39.60% (4062/10 258), 31.36% (3217/10 258) and 29.04% (2979/10 258), respectively. The difference in smoking status was also statistically significant (χ(2) = 115.86, P < 0.01): in MS patients, the rates of never or past, occasional and regular smoking were 59.72% (1772/2967), 6.24% (185/2967) and 34.04% (1010/2967), respectively, while in the non-MS population they were 70.03% (7184/10 258), 5.35% (549/10 258) and 24.61% (2525/10 258), respectively. Both lifestyle habits and the components of MS were attributable to only one latent variable. After adjustment for age and gender, the path coefficient between the latent component of lifestyle habits and the latent component of MS was 0.22 and statistically significant (t = 6.46, P < 0.01) by bootstrap test. Reliability and validity of the model: for the lifestyle latent variable, the average variance extracted was 0.53, composite reliability was 0.77 and Cronbach's α was 0.57; for the MS latent variable, the average variance extracted was 0.45, composite reliability was 0.76 and Cronbach's α was 0.59. Unhealthy lifestyle habits are closely related to MS. A meat-based diet, excessive drinking and smoking are risk factors for MS.
Model Checking Verification and Validation at JPL and the NASA Fairmont IV and V Facility
NASA Technical Reports Server (NTRS)
Schneider, Frank; Easterbrook, Steve; Callahan, Jack; Montgomery, Todd
1999-01-01
We show how a technology transfer effort was carried out. The successful use of model checking on a pilot JPL flight project demonstrates the usefulness and the efficacy of the approach. The pilot project was used to model a complex spacecraft controller. Software design and implementation validation were carried out successfully. To suggest future applications we also show how the implementation validation step can be automated. The effort was followed by the formal introduction of the modeling technique as a part of the JPL Quality Assurance process.
Logistic regression for risk factor modelling in stuttering research.
Reed, Phil; Wu, Yaqionq
2013-06-01
To outline the uses of logistic regression and other statistical methods for risk factor analysis in the context of research on stuttering. The principles underlying the application of a logistic regression are illustrated, and the types of questions to which such a technique has been applied in the stuttering field are outlined. The assumptions and limitations of the technique are discussed with respect to existing stuttering research, and with respect to formulating appropriate research strategies to accommodate these considerations. Finally, some alternatives to the approach are briefly discussed. The way the statistical procedures are employed is demonstrated with some hypothetical data. Research into several practical issues concerning stuttering could benefit if risk factor modelling were used. Important examples are early diagnosis, prognosis (whether a child will recover or persist) and assessment of treatment outcome. After reading this article you will: (a) Summarize the situations in which logistic regression can be applied to a range of issues about stuttering; (b) Follow the steps in performing a logistic regression analysis; (c) Describe the assumptions of the logistic regression technique and the precautions that need to be checked when it is employed; (d) Be able to summarize its advantages over other techniques like estimation of group differences and simple regression. Copyright © 2012 Elsevier Inc. All rights reserved.
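As a concrete companion to the steps outlined above, here is a hedged sketch of a risk-factor logistic regression on synthetic data using statsmodels. The predictor names (age at onset, family history) and the data-generating model are invented for illustration; they are not drawn from the stuttering literature.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 400

# Synthetic risk factors: age at onset (years) and a binary family-history flag.
age_at_onset = rng.normal(4.0, 1.0, size=n)
family_history = rng.integers(0, 2, size=n)

# Synthetic outcome (1 = persistence) generated from a known logistic model.
lin_pred = -2.0 + 0.4 * age_at_onset + 1.0 * family_history
persist = rng.binomial(1, 1.0 / (1.0 + np.exp(-lin_pred)))

X = sm.add_constant(np.column_stack([age_at_onset, family_history]))
fit = sm.Logit(persist, X).fit(disp=False)

print(fit.summary())
print("odds ratios:", np.exp(fit.params))   # effect size per unit change in each factor
```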
A Model-Driven Approach for Telecommunications Network Services Definition
NASA Astrophysics Data System (ADS)
Chiprianov, Vanea; Kermarrec, Yvon; Alff, Patrick D.
The present-day telecommunications market imposes a short concept-to-market time on service providers. To reduce it, we propose a computer-aided, model-driven, service-specific tool, with support for collaborative work and for checking properties on models. We started by defining a prototype of the Meta-model (MM) of the service domain. Using this prototype, we defined a simple graphical modeling language specific for service designers. We are currently enlarging the MM of the domain using model transformations from Network Abstractions Layers (NALs). In the future, we will investigate approaches to ensure the support for collaborative work and for checking properties on models.
Proposed standby gasoline rationing plan: public comments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1978-12-01
Under the proposed plan, DOE would allocate ration rights (rights to purchase gasoline) to owners of registered vehicles. All vehicles in a given class would receive the same entitlement. Essential services would receive supplemental allotments of ration rights as priority firms. Once every 3 months, ration checks would be mailed out to all vehicle registrants, allotting them a certain amount of ration rights. These checks would then be cashed at Coupon Issuance Points, where the bearer would receive ration coupons to be used at gasoline stations. Large users of gasoline could deposit their allotment checks in accounts at ration banks. Coupons or checks would be freely exchangeable in a white market. A certain percentage of the gasoline supply would be set aside in reserve for use in national emergencies. When the plan was published in the Federal Register, public comments were requested. DOE also solicited comments from private citizens, public interest groups, business and industry, and state and local governments. A total of 1126 responses were received, and these are analyzed in this paper. The second part of the report describes how the comments were classified, and gives a statistical breakdown of the major responses. The last section is a discussion and analysis of the issues raised by commenting agencies, firms, associations, and individuals. (MCW)
2014-06-19
urgent and compelling. Recent efforts in this area automate program analysis techniques using model checking and symbolic execution [2, 5–7]. These ... bounded model checking tool for x86 binary programs developed at the Air Force Institute of Technology (AFIT). Jiseki creates a bit-vector logic model based ... assume there are n different paths through the function foo. The program could potentially call the function foo a bounded number of times, resulting in n
Compact disk error measurements
NASA Technical Reports Server (NTRS)
Howe, D.; Harriman, K.; Tehranchi, B.
1993-01-01
The objectives of this project are as follows: provide hardware and software that will perform simple, real-time, high resolution (single-byte) measurement of the error burst and good data gap statistics seen by a photoCD player read channel when recorded CD write-once discs of variable quality (i.e., condition) are being read; extend the above system to enable measurement of the hard decision (i.e., 1-bit error flags) and soft decision (i.e., 2-bit error flags) decoding information that is produced/used by the Cross-Interleaved Reed-Solomon Code (CIRC) block decoder employed in the photoCD player read channel; construct a model that uses data obtained via the systems described above to produce meaningful estimates of output error rates (due to both uncorrected ECC words and misdecoded ECC words) when a CD disc having specific (measured) error statistics is read (completion date to be determined); and check the hypothesis that current adaptive CIRC block decoders are optimized for pressed (DAD/ROM) CD discs. If warranted, do a conceptual design of an adaptive CIRC decoder that is optimized for write-once CD discs.
Spatial distribution of psychotic disorders in an urban area of France: an ecological study.
Pignon, Baptiste; Schürhoff, Franck; Baudin, Grégoire; Ferchiou, Aziz; Richard, Jean-Romain; Saba, Ghassen; Leboyer, Marion; Kirkbride, James B; Szöke, Andrei
2016-05-18
Previous analyses of neighbourhood variations of non-affective psychotic disorders (NAPD) have focused mainly on incidence. However, prevalence studies provide important insights on factors associated with disease evolution as well as for healthcare resource allocation. This study aimed to investigate the distribution of prevalent NAPD cases in an urban area in France. The number of cases in each neighbourhood was modelled as a function of potential confounders and ecological variables, namely: migrant density, economic deprivation and social fragmentation. This was modelled using statistical models of increasing complexity: frequentist models (using Poisson and negative binomial regressions), and several Bayesian models. For each model, the validity of the assumptions and the fit to the data were checked and compared, in order to test for possible spatial variation in prevalence. Data showed significant overdispersion (invalidating the Poisson regression model) and residual autocorrelation (suggesting the need to use Bayesian models). The best Bayesian model was Leroux's model (i.e. a model with both strong correlation between neighbouring areas and weaker correlation between areas further apart), with economic deprivation as an explanatory variable (OR = 1.13, 95% CI [1.02-1.25]). In comparison with frequentist methods, the Bayesian model showed a better fit. The number of cases showed a non-random spatial distribution and was linked to economic deprivation.
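The overdispersion that invalidated the Poisson model above is commonly diagnosed with a Pearson dispersion statistic. Below is a minimal numpy sketch under the simplest possible (intercept-only) Poisson fit, with synthetic neighbourhood counts that are deliberately overdispersed; it only illustrates the diagnostic, not the study's models.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic neighbourhood case counts drawn from a negative binomial,
# i.e. deliberately more variable than a Poisson distribution.
counts = rng.negative_binomial(n=2, p=0.2, size=60)

# Intercept-only Poisson fit: the fitted mean is simply the sample mean.
mu_hat = counts.mean()
pearson_chi2 = np.sum((counts - mu_hat) ** 2 / mu_hat)
dispersion = pearson_chi2 / (len(counts) - 1)   # ~1 if the Poisson model is adequate

print(f"Pearson dispersion = {dispersion:.2f}")  # values well above 1 signal overdispersion
```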
Blakes, Jonathan; Twycross, Jamie; Romero-Campero, Francisco Jose; Krasnogor, Natalio
2011-12-01
The Infobiotics Workbench is an integrated software suite incorporating model specification, simulation, parameter optimization and model checking for Systems and Synthetic Biology. A modular model specification allows for straightforward creation of large-scale models containing many compartments and reactions. Models are simulated either using stochastic simulation or numerical integration, and visualized in time and space. Model parameters and structure can be optimized with evolutionary algorithms, and model properties calculated using probabilistic model checking. Source code and binaries for Linux, Mac and Windows are available at http://www.infobiotics.org/infobiotics-workbench/; released under the GNU General Public License (GPL) version 3. Natalio.Krasnogor@nottingham.ac.uk.
Bayesian logistic regression approaches to predict incorrect DRG assignment.
Suleiman, Mani; Demirhan, Haydar; Boyd, Leanne; Girosi, Federico; Aksakalli, Vural
2018-05-07
Episodes of care involving similar diagnoses and treatments and requiring similar levels of resource utilisation are grouped into the same Diagnosis-Related Group (DRG). In jurisdictions which implement DRG-based payment systems, DRGs are a major determinant of funding for inpatient care. Hence, service providers often dedicate auditing staff to the task of checking that episodes have been coded to the correct DRG. The use of statistical models to estimate an episode's probability of DRG error can significantly improve the efficiency of clinical coding audits. This study implements Bayesian logistic regression models with weakly informative prior distributions to estimate the likelihood that episodes require a DRG revision, comparing these models with each other and to classical maximum likelihood estimates. All Bayesian approaches had more stable model parameters than maximum likelihood. The best-performing Bayesian model improved overall classification performance by 6% compared to maximum likelihood, and by 34% compared to random classification. We found that the original DRG, the coder and the day of coding all have a significant effect on the likelihood of DRG error. Use of Bayesian approaches has improved model parameter stability and classification accuracy. This method has already led to improved audit efficiency in an operational capacity.
Analysis of 3d Building Models Accuracy Based on the Airborne Laser Scanning Point Clouds
NASA Astrophysics Data System (ADS)
Ostrowski, W.; Pilarska, M.; Charyton, J.; Bakuła, K.
2018-05-01
Creating 3D building models on a large scale is becoming more popular and finds many applications. Nowadays, the broad term "3D building models" can be applied to several types of products: well-known CityGML solid models (available at a few Levels of Detail), which are mainly generated from Airborne Laser Scanning (ALS) data, as well as 3D mesh models that can be created from both nadir and oblique aerial images. City authorities and national mapping agencies are interested in obtaining 3D building models. Apart from the completeness of the models, the accuracy aspect is also important. The final accuracy of a building model depends on various factors (accuracy of the source data, complexity of the roof shapes, etc.). In this paper the methodology for inspection of a dataset containing 3D models is presented. The proposed approach checks all buildings in the dataset by comparison with ALS point clouds, testing both accuracy and level of detail. Analysis of statistical parameters of normal heights for the reference point cloud and the tested planes, together with segmentation of the point cloud, provides a tool that can indicate which building and which roof plane do not fulfil the requirements of model accuracy and detail correctness. The proposed method was tested on two datasets: a solid and a mesh model.
Parallel Software Model Checking
2015-01-08
checker. This project will explore this strategy to parallelize the generalized PDR algorithm for software model checking. It belongs to TF1 due to its ... focus on formal verification. Generalized PDR. Generalized Property Driven Reachability (GPDR) is an algorithm for solving HORN-SMT reachability ...
Stochastic Game Analysis and Latency Awareness for Self-Adaptation
2014-01-01
In this paper, we introduce a formal analysis technique based on model checking of stochastic multiplayer games (SMGs) that enables us to quantify the ... Additional Key Words and Phrases: Proactive adaptation, Stochastic multiplayer games, Latency. 1. INTRODUCTION: When planning how to adapt, self-adaptive ... contribution of this paper is twofold: (1) A novel analysis technique based on model checking of stochastic multiplayer games (SMGs) that enables us to
Population Synthesis of Radio & Gamma-Ray Millisecond Pulsars
NASA Astrophysics Data System (ADS)
Frederick, Sara; Gonthier, P. L.; Harding, A. K.
2014-01-01
In recent years, the number of known gamma-ray millisecond pulsars (MSPs) in the Galactic disk has risen substantially thanks to confirmed detections by Fermi Gamma-ray Space Telescope (Fermi). We have developed a new population synthesis of gamma-ray and radio MSPs in the galaxy which uses Markov Chain Monte Carlo techniques to explore the large and small worlds of the model parameter space and allows for comparisons of the simulated and detected MSP distributions. The simulation employs empirical radio and gamma-ray luminosity models that are dependent upon the pulsar period and period derivative with freely varying exponents. Parameters associated with the birth distributions are also free to vary. The computer code adjusts the magnitudes of the model luminosities to reproduce the number of MSPs detected by a group of ten radio surveys, thus normalizing the simulation and predicting the MSP birth rates in the Galaxy. Computing many Markov chains leads to preferred sets of model parameters that are further explored through two statistical methods. Marginalized plots define confidence regions in the model parameter space using maximum likelihood methods. A secondary set of confidence regions is determined in parallel using Kuiper statistics calculated from comparisons of cumulative distributions. These two techniques provide feedback to affirm the results and to check for consistency. Radio flux and dispersion measure constraints have been imposed on the simulated gamma-ray distributions in order to reproduce realistic detection conditions. The simulated and detected distributions agree well for both sets of radio and gamma-ray pulsar characteristics, as evidenced by our various comparisons.
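The Kuiper statistic used above for comparing cumulative distributions has a compact definition: the sum of the largest positive and negative deviations between two empirical CDFs. Below is a hedged two-sample sketch on synthetic period distributions; the lognormal "pulsar period" samples are invented stand-ins, not the paper's simulated or detected populations.

```python
import numpy as np

def kuiper_two_sample(x, y):
    """Two-sample Kuiper statistic V = D+ + D-, the sum of the largest
    positive and negative deviations between the two empirical CDFs."""
    x, y = np.sort(x), np.sort(y)
    grid = np.concatenate([x, y])
    cdf_x = np.searchsorted(x, grid, side="right") / len(x)
    cdf_y = np.searchsorted(y, grid, side="right") / len(y)
    return np.max(cdf_x - cdf_y) + np.max(cdf_y - cdf_x)

rng = np.random.default_rng(3)
simulated_periods = rng.lognormal(mean=1.5, sigma=0.4, size=500)   # synthetic "simulated" sample
detected_periods = rng.lognormal(mean=1.6, sigma=0.35, size=90)    # synthetic "detected" sample
print(f"Kuiper V = {kuiper_two_sample(simulated_periods, detected_periods):.3f}")
```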
Probabilistic Priority Message Checking Modeling Based on Controller Area Networks
NASA Astrophysics Data System (ADS)
Lin, Cheng-Min
Although the probabilistic model checking tool called PRISM has been applied in many communication systems, such as wireless local area network, Bluetooth, and ZigBee, the technique is not used in a controller area network (CAN). In this paper, we use PRISM to model the mechanism of priority messages for CAN because the mechanism has allowed CAN to become the leader in serial communication for automobile and industry control. Through modeling CAN, it is easy to analyze the characteristic of CAN for further improving the security and efficiency of automobiles. The Markov chain model helps us to model the behaviour of priority messages.
ERIC Educational Resources Information Center
Hatami, Gissou; Motamed, Niloofar; Ashrafzadeh, Mahshid
2010-01-01
The validity and reliability of the Persian adaptation of the MSLSS were checked in 12-18-year-old middle and high school students (430 students in grades 6-12 in Bushehr port, Iran) using confirmatory factor analysis by means of the LISREL statistical package. Internal consistency reliability estimates (Cronbach's coefficient [alpha]) were all above the…
Collaboration of Students and Faculty Creating a Web-Site Based for Homework.
ERIC Educational Resources Information Center
Packard, Abbot L.; Holmes, Glen A.
This paper chronicles the building of a student-based Web site method for quickly getting homework graded and returned to the students with feedback. A Web site-supported statistics class offers an opportunity for students to check answers, get immediate feedback, and submit homework. A web-based support system should provide assistance for students of…
ERIC Educational Resources Information Center
Gjevik, Elen; Sandstad, Berit; Andreassen, Ole A.; Myhre, Anne M.; Sponheim, Eili
2015-01-01
Autism spectrum disorders are often comorbid with other psychiatric symptoms and disorders. However, identifying psychiatric comorbidity in children with autism spectrum disorders is challenging. We explored how a questionnaire, the Child Behavior Check List, agreed with a "Diagnostic and Statistical Manual of Mental Disorders-Fourth…
NASA Astrophysics Data System (ADS)
Langella, Giuliano; Basile, Angelo; Bonfante, Antonello; De Mascellis, Roberto; Manna, Piero; Terribile, Fabio
2016-04-01
WeatherProg is a computer program for the semi-automatic handling of data measured at ground stations within a climatic network. The program performs a set of tasks ranging from gathering raw point-based sensor measurements to the production of digital climatic maps. Originally the program was developed as the baseline asynchronous engine for weather records management within the SOILCONSWEB Project (LIFE08 ENV/IT/000408), in which daily and hourly data were used to run the water balance in the soil-plant-atmosphere continuum or pest simulation models. WeatherProg can be configured to automatically perform the following main operations: 1) data retrieval; 2) data decoding and ingestion into a database (e.g. SQL based); 3) data checking to recognize missing and anomalous values (using a set of differently combined checks including logical, climatological, spatial, temporal and persistence checks); 4) infilling of data flagged as missing or anomalous (deterministic or statistical methods); 5) spatial interpolation based on alternative/comparative methods such as inverse distance weighting, iterative regression kriging, and a weighted least squares regression (based on physiography), using an approach similar to PRISM; and 6) data ingestion into a geodatabase (e.g. PostgreSQL+PostGIS or rasdaman). There is an increasing demand for digital climatic maps both for research and development (there is a gap between the majority of scientific modelling approaches, which require digital climate maps, and the gauged measurements) and for practical applications (e.g. the need to improve the management of weather records, which in turn improves the support provided to farmers). The demand is particularly burdensome considering the requirement to handle climatic data at the daily (e.g. in soil hydrological modelling) or even at the hourly time step (e.g. risk modelling in phytopathology). The key advantage of WeatherProg is the ability to perform all the required operations and calculations in an automatic fashion, except for the need for human interaction on specific issues (such as the decision whether a measurement is an anomaly or not, according to the detected temporal and spatial variations with respect to contiguous points). The presented computer program runs from the command line and shows peculiar characteristics in cascade modelling within different contexts belonging to agriculture, phytopathology and the environment. In particular, it can be a powerful tool to set up cutting-edge regional web services based on weather information. Indeed, it can support territorial agencies in charge of meteorological and phytopathological bulletins.
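Of the interpolation options listed above, inverse distance weighting is the simplest to illustrate. Here is a minimal hedged sketch with made-up station coordinates and temperatures; the program's own implementation, weighting power and neighbour selection are not shown here.

```python
import numpy as np

def idw_interpolate(stations_xy, values, target_xy, power=2.0):
    """Inverse distance weighting: a weighted mean of station values with
    weights 1 / distance**power; an exact hit returns the station value."""
    d = np.linalg.norm(stations_xy - target_xy, axis=1)
    if np.any(d == 0.0):
        return values[d == 0.0][0]
    w = 1.0 / d ** power
    return np.sum(w * values) / np.sum(w)

# Hypothetical station coordinates (km) and daily mean temperatures (deg C).
stations = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
temps = np.array([14.2, 15.1, 13.6, 14.9])
node = np.array([4.0, 6.0])
print(f"interpolated value at grid node: {idw_interpolate(stations, temps, node):.2f} deg C")
```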
Using the Benford's Law as a First Step to Assess the Quality of the Cancer Registry Data.
Crocetti, Emanuele; Randi, Giorgia
2016-01-01
Benford's law states that the distribution of the first digit different from 0 [first significant digit (FSD)] in many collections of numbers is not uniform. The aim of this study is to evaluate whether population-based cancer incidence rates follow Benford's law, and whether this can be used in their data quality check process. We sampled 43 population-based cancer registry populations (CRPs) from the Cancer Incidence in 5 Continents-volume X (CI5-X). The distribution of cancer incidence rate FSD was evaluated overall, by sex, and by CRP. Several statistics, including Pearson's coefficient of correlation and distance measures, were applied to check adherence to Benford's law. In the whole dataset (146,590 incidence rates) and for each sex (70,722 male and 75,868 female incidence rates), the FSD distributions were Benford-like. The coefficient of correlation between observed and expected FSD distributions was extremely high (0.999), and the distance measures were low. Considering single CRPs (from 933 to 7,222 incidence rates), the results were in agreement with Benford's law, and only a few CRPs showed possible discrepancies from it. This study demonstrated for the first time that cancer incidence rates follow Benford's law. This characteristic can be used as a new, simple, and objective tool in data quality evaluation. The analyzed data had already been checked for publication in CI5-X. Therefore, their quality was expected to be good. In fact, only for a few CRPs were several statistics consistent with possible violations.
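As an illustration of the kind of check described above (a minimal sketch, not the registries' actual pipeline), the following Python snippet compares observed first-significant-digit frequencies with Benford's expectation using a Pearson correlation and a simple distance measure; the synthetic lognormal "rates" are placeholders for real incidence rates.

    import numpy as np

    def first_significant_digit(x):
        """Return the first digit different from 0 of a positive number."""
        s = f"{abs(x):.10e}"          # scientific notation, e.g. '3.4500000000e-02'
        return int(s[0])

    def benford_check(values):
        """Compare the observed FSD distribution with Benford's expectation."""
        digits = np.array([first_significant_digit(v) for v in values if v > 0])
        observed = np.array([(digits == d).mean() for d in range(1, 10)])
        expected = np.log10(1 + 1 / np.arange(1, 10))      # Benford probabilities
        r = np.corrcoef(observed, expected)[0, 1]          # Pearson correlation
        max_dist = np.max(np.abs(observed - expected))     # a simple distance measure
        return observed, expected, r, max_dist

    # Example with synthetic "rates" spanning several orders of magnitude
    rates = np.random.lognormal(mean=0.0, sigma=2.0, size=10000)
    obs, exp, r, d = benford_check(rates)
    print(f"correlation with Benford: {r:.3f}, max absolute deviation: {d:.3f}")

A high correlation and small deviations would be read, as in the study, as the data being "Benford-like"; the thresholds for flagging a registry would need to be calibrated separately.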
Braun, T; Dochtermann, S; Krause, E; Schmidt, M; Schorn, K; Hempel, J M
2011-09-01
The present study analyzes the best combination of frequencies for the calculation of mean hearing loss in pure tone threshold audiometry for correlation with hearing loss for numbers in speech audiometry, since the literature describes different calculation variations for plausibility checking in expertise. Three calculation variations, A (250, 500 and 1000 Hz), B (500 and 1000 Hz) and C (500, 1000 and 2000 Hz), were compared. Audiograms of 80 patients with normal hearing, 106 patients with hearing loss and 135 expertise patients were analyzed in a retrospective manner. Differences between mean pure tone audiometry thresholds and hearing loss for numbers were calculated and statistically compared separately for the right and the left ear in the three patient collectives. We found calculation variation A to be the best combination of frequencies, since it yielded the smallest standard deviations while being statistically different from calculation variations B and C. The 1- and 2.58-fold standard deviations (representing 68.3% and 99.0% of all values) were ±4.6 and ±11.8 dB, respectively, for calculation variation A in patients with hearing loss. For plausibility checking in expertise, the mean threshold from the frequencies 250, 500 and 1000 Hz should be compared to the hearing loss for numbers. The common recommendation reported in the literature to doubt plausibility when the difference of these values exceeds ±5 dB is too strict, as shown by this study.
MOM: A meteorological data checking expert system in CLIPS
NASA Technical Reports Server (NTRS)
Odonnell, Richard
1990-01-01
Meteorologists have long faced the problem of verifying the data they use. Experience shows that there is a sizable number of errors in the data reported by meteorological observers. This is unacceptable for computer forecast models, which depend on accurate data for accurate results. Most errors that occur in meteorological data are obvious to the meteorologist, but time constraints prevent hand-checking. For this reason, it is necessary to have a 'front end' to the computer model to ensure the accuracy of input. Various approaches to automatic data quality control have been developed by several groups. MOM is a rule-based system implemented in CLIPS and utilizing 'consistency checks' and 'range checks'. The system is generic in the sense that it knows some meteorological principles, regardless of specific station characteristics. Specific constraints kept as CLIPS facts in a separate file provide for system flexibility. Preliminary results show that the expert system has detected some inconsistencies not noticed by a local expert.
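MOM itself is written in CLIPS; purely to illustrate the kinds of 'range checks' and 'consistency checks' it describes, here is a minimal Python sketch in which a limits table plays the role of MOM's separate file of CLIPS facts. All variable names and thresholds are hypothetical.

    # Hypothetical limits; a real system would load these per station, like MOM's facts file.
    LIMITS = {"temperature_c": (-60.0, 55.0), "dewpoint_c": (-70.0, 35.0), "pressure_hpa": (870.0, 1085.0)}

    def range_checks(obs):
        """Flag any variable that falls outside its climatological range."""
        flags = []
        for var, (lo, hi) in LIMITS.items():
            if var in obs and not (lo <= obs[var] <= hi):
                flags.append(f"range check failed for {var}: {obs[var]}")
        return flags

    def consistency_checks(obs):
        """Simple physical consistency rules between variables."""
        flags = []
        if "dewpoint_c" in obs and "temperature_c" in obs and obs["dewpoint_c"] > obs["temperature_c"]:
            flags.append("consistency check failed: dewpoint exceeds temperature")
        return flags

    report = {"temperature_c": 21.3, "dewpoint_c": 24.0, "pressure_hpa": 1012.0}
    print(range_checks(report) + consistency_checks(report))

Keeping the limits in a data table rather than in the rules mirrors the design choice in MOM of separating generic meteorological principles from station-specific constraints.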
Learning Assumptions for Compositional Verification
NASA Technical Reports Server (NTRS)
Cobleigh, Jamieson M.; Giannakopoulou, Dimitra; Pasareanu, Corina; Clancy, Daniel (Technical Monitor)
2002-01-01
Compositional verification is a promising approach to addressing the state explosion problem associated with model checking. One compositional technique advocates proving properties of a system by checking properties of its components in an assume-guarantee style. However, the application of this technique is difficult because it involves non-trivial human input. This paper presents a novel framework for performing assume-guarantee reasoning in an incremental and fully automated fashion. To check a component against a property, our approach generates assumptions that the environment needs to satisfy for the property to hold. These assumptions are then discharged on the rest of the system. Assumptions are computed by a learning algorithm. They are initially approximate, but become gradually more precise by means of counterexamples obtained by model checking the component and its environment, alternately. This iterative process may at any stage conclude that the property is either true or false in the system. We have implemented our approach in the LTSA tool and applied it to the analysis of a NASA system.
Silva, Luiz Eduardo Virgilio; Lataro, Renata Maria; Castania, Jaci Airton; Silva, Carlos Alberto Aguiar; Salgado, Helio Cesar; Fazan, Rubens; Porta, Alberto
2017-08-01
Heart rate variability (HRV) has been extensively explored by traditional linear approaches (e.g., spectral analysis); however, several studies have pointed to the presence of nonlinear features in HRV, suggesting that linear tools might fail to account for the complexity of the HRV dynamics. Even though the prevalent notion is that HRV is nonlinear, the actual presence of nonlinear features is rarely verified. In this study, the presence of nonlinear dynamics was checked as a function of time scales in three experimental models of rats with different impairment of the cardiac control: namely, rats with heart failure (HF), spontaneously hypertensive rats (SHRs), and sinoaortic denervated (SAD) rats. Multiscale entropy (MSE) and refined MSE (RMSE) were chosen as the discriminating statistic for the surrogate test utilized to detect nonlinearity. Nonlinear dynamics is less present in HF animals at both short and long time scales compared with controls. A similar finding was found in SHR only at short time scales. SAD increased the presence of nonlinear dynamics exclusively at short time scales. Those findings suggest that a working baroreflex contributes to linearize HRV and to reduce the likelihood to observe nonlinear components of the cardiac control at short time scales. In addition, an increased sympathetic modulation seems to be a source of nonlinear dynamics at long time scales. Testing nonlinear dynamics as a function of the time scales can provide a characterization of the cardiac control complementary to more traditional markers in time, frequency, and information domains. NEW & NOTEWORTHY Although heart rate variability (HRV) dynamics is widely assumed to be nonlinear, nonlinearity tests are rarely used to check this hypothesis. By adopting multiscale entropy (MSE) and refined MSE (RMSE) as the discriminating statistic for the nonlinearity test, we show that nonlinear dynamics varies with time scale and the type of cardiac dysfunction. Moreover, as complexity metrics and nonlinearities provide complementary information, we strongly recommend using the test for nonlinearity as an additional index to characterize HRV. Copyright © 2017 the American Physiological Society.
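The study adopts MSE and RMSE as the discriminating statistic of a surrogate-data test. As a hedged illustration of the test's logic only, the sketch below substitutes a much simpler statistic (time-reversal asymmetry) and phase-randomized surrogates, which preserve the linear correlation structure while destroying nonlinear features; the series, lags and surrogate counts are arbitrary placeholders.

    import numpy as np

    def phase_randomized_surrogate(x, rng):
        """Surrogate with the same power spectrum but randomized Fourier phases."""
        n = len(x)
        spectrum = np.fft.rfft(x - x.mean())
        phases = rng.uniform(0, 2 * np.pi, len(spectrum))
        phases[0] = 0.0                      # keep the zero-frequency component real
        if n % 2 == 0:
            phases[-1] = 0.0                 # keep the Nyquist component real
        return np.fft.irfft(np.abs(spectrum) * np.exp(1j * phases), n) + x.mean()

    def time_reversal_asymmetry(x, lag=1):
        """Simple discriminating statistic; zero in expectation for linear Gaussian processes."""
        return np.mean((x[lag:] - x[:-lag]) ** 3)

    def nonlinearity_test(x, n_surrogates=99, seed=0):
        rng = np.random.default_rng(seed)
        stat = abs(time_reversal_asymmetry(x))
        surrogate_stats = [abs(time_reversal_asymmetry(phase_randomized_surrogate(x, rng)))
                           for _ in range(n_surrogates)]
        # One-sided rank-based p-value: how often surrogates match or exceed the original statistic.
        p = (1 + sum(s >= stat for s in surrogate_stats)) / (1 + n_surrogates)
        return stat, p

    # A linear AR(1) series, so the test should typically not reject linearity.
    rng = np.random.default_rng(1)
    noise = rng.normal(size=2048)
    x = np.empty(2048)
    x[0] = noise[0]
    for t in range(1, 2048):
        x[t] = 0.9 * x[t - 1] + noise[t]
    print(nonlinearity_test(x))

In the paper, entropy-based statistics computed at different time scales take the place of the simple statistic here, which is what makes the presence of nonlinearity scale-dependent.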
Short- and Long-Term Earthquake Forecasts Based on Statistical Models
NASA Astrophysics Data System (ADS)
Console, Rodolfo; Taroni, Matteo; Murru, Maura; Falcone, Giuseppe; Marzocchi, Warner
2017-04-01
The epidemic-type aftershock sequences (ETAS) models have been experimentally used to forecast the space-time earthquake occurrence rate during the sequence that followed the 2009 L'Aquila earthquake and for the 2012 Emilia earthquake sequence. These forecasts represented the first two pioneering attempts to check the feasibility of providing operational earthquake forecasting (OEF) in Italy. After the 2009 L'Aquila earthquake the Italian Department of Civil Protection nominated an International Commission on Earthquake Forecasting (ICEF) for the development of the first official OEF in Italy, which was implemented for testing purposes by the newly established "Centro di Pericolosità Sismica" (CPS, the Seismic Hazard Center) at the Istituto Nazionale di Geofisica e Vulcanologia (INGV). According to the ICEF guidelines, the system is open, transparent, reproducible and testable. The scientific information delivered by OEF-Italy is shaped in different formats according to the interested stakeholders, such as scientists, national and regional authorities, and the general public. The communication to the general public is certainly the most challenging issue, and careful pilot tests are necessary to check the effectiveness of the communication strategy before the information is opened to the public. With regard to long-term time-dependent earthquake forecasts, the application of a newly developed simulation algorithm to the Calabria region provided typical features of the seismicity in time, space and magnitude, which can be compared with those of the real observations. These features include long-term pseudo-periodicity and clustering of strong earthquakes, and a realistic earthquake magnitude distribution departing from the Gutenberg-Richter distribution in the moderate and higher magnitude range.
Generalized Symbolic Execution for Model Checking and Testing
NASA Technical Reports Server (NTRS)
Khurshid, Sarfraz; Pasareanu, Corina; Visser, Willem; Kofmeyer, David (Technical Monitor)
2003-01-01
Modern software systems, which often are concurrent and manipulate complex data structures, must be extremely reliable. We present a novel framework based on symbolic execution for automated checking of such systems. We provide a two-fold generalization of traditional symbolic execution based approaches: one, we define a program instrumentation, which enables standard model checkers to perform symbolic execution; two, we give a novel symbolic execution algorithm that handles dynamically allocated structures (e.g., lists and trees), method preconditions (e.g., acyclicity of lists), data (e.g., integers and strings) and concurrency. The program instrumentation enables a model checker to automatically explore program heap configurations (using a systematic treatment of aliasing) and manipulate logical formulae on program data values (using a decision procedure). We illustrate two applications of our framework: checking correctness of multi-threaded programs that take inputs from unbounded domains with complex structure, and generation of non-isomorphic test inputs that satisfy a testing criterion. Our implementation for Java uses the Java PathFinder model checker.
Model Checking Degrees of Belief in a System of Agents
NASA Technical Reports Server (NTRS)
Raimondi, Franco; Primero, Giuseppe; Rungta, Neha
2014-01-01
Reasoning about degrees of belief has been investigated in the past by a number of authors and has a number of practical applications in real life. In this paper we present a unified framework to model and verify degrees of belief in a system of agents. In particular, we describe an extension of the temporal-epistemic logic CTLK and we introduce a semantics based on interpreted systems for this extension. In this way, degrees of beliefs do not need to be provided externally, but can be derived automatically from the possible executions of the system, thereby providing a computationally grounded formalism. We leverage the semantics to (a) construct a model checking algorithm, (b) investigate its complexity, (c) provide a Java implementation of the model checking algorithm, and (d) evaluate our approach using the standard benchmark of the dining cryptographers. Finally, we provide a detailed case study: using our framework and our implementation, we assess and verify the situational awareness of the pilot of Air France 447 flying in off-nominal conditions.
Order statistics applied to the most massive and most distant galaxy clusters
NASA Astrophysics Data System (ADS)
Waizmann, J.-C.; Ettori, S.; Bartelmann, M.
2013-06-01
In this work, we present an analytic framework for calculating the individual and joint distributions of the nth most massive or nth highest redshift galaxy cluster for a given survey characteristic, allowing us to formulate Λ cold dark matter (ΛCDM) exclusion criteria. We show that the cumulative distribution functions steepen with increasing order, giving them a higher constraining power with respect to extreme value statistics. Additionally, we find that the order statistics in mass (being dominated by clusters at lower redshifts) is sensitive to the matter density and the normalization of the matter fluctuations, whereas the order statistics in redshift is particularly sensitive to the geometric evolution of the Universe. For a fixed cosmology, both order statistics are efficient probes of the functional shape of the mass function at the high-mass end. To allow a quick assessment of both order statistics, we provide fits as a function of the survey area that allow percentile estimation with an accuracy better than 2 per cent. Furthermore, we discuss the joint distributions in the two-dimensional case and find that, for the combination of the largest and the second largest observation, it is most likely to find them realized with similar values with a broadly peaked distribution. When combining the largest observation with higher orders, it is more likely to find a larger gap between the observations, and when combining higher orders in general, the joint probability density function peaks more strongly. Having introduced the theory, we apply the order statistical analysis to the South Pole Telescope (SPT) massive cluster sample and the metacatalogue of X-ray detected clusters of galaxies, and find that the 10 most massive clusters in the sample are consistent with ΛCDM and the Tinker mass function. For the order statistics in redshift, we find a discrepancy between the data and the theoretical distributions, which could in principle indicate a deviation from the standard cosmology. However, we attribute this deviation to the uncertainty in the modelling of the SPT survey selection function. In turn, by assuming the ΛCDM reference cosmology, order statistics can also be utilized for consistency checks of the completeness of the observed sample and of the modelling of the survey selection function.
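As a hedged illustration of the generic machinery behind such order statistics (not the cosmological calculation of the paper, in which the parent distribution comes from the halo mass function and survey selection), the sketch below evaluates the distribution of the nth largest of N independent draws from a placeholder parent CDF: the nth largest is below x exactly when at most n-1 draws exceed x.

    import math

    def cdf_nth_largest(F_x, N, n):
        """P(the nth largest of N i.i.d. draws <= x), given F_x = F(x) for one draw."""
        return sum(math.comb(N, j) * (1 - F_x) ** j * F_x ** (N - j) for j in range(n))

    # Placeholder parent CDF (unit exponential); in the paper F is built from the
    # mass function integrated over the survey volume and selection function.
    F = lambda x: 1.0 - math.exp(-x)

    N = 1000          # number of clusters in a hypothetical survey
    for n in (1, 2, 10):
        # 90th percentile of the nth-largest distribution, found by bisection
        lo, hi = 0.0, 50.0
        for _ in range(60):
            mid = 0.5 * (lo + hi)
            if cdf_nth_largest(F(mid), N, n) < 0.9:
                lo = mid
            else:
                hi = mid
        print(f"n={n}: 90th percentile of the nth largest is about {0.5 * (lo + hi):.2f}")

Evaluating several orders at once shows the steepening of the cumulative distributions with increasing order noted in the abstract.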
Transformation and model choice for RNA-seq co-expression analysis.
Rau, Andrea; Maugis-Rabusseau, Cathy
2018-05-01
Although a large number of clustering algorithms have been proposed to identify groups of co-expressed genes from microarray data, the question of whether and how such methods may be applied to RNA sequencing (RNA-seq) data remains unaddressed. In this work, we investigate the use of data transformations in conjunction with Gaussian mixture models for RNA-seq co-expression analyses, as well as a penalized model selection criterion to select both an appropriate transformation and the number of clusters present in the data. This approach has the advantage of accounting for per-cluster correlation structures among samples, which can be strong in RNA-seq data. In addition, it provides a rigorous statistical framework for parameter estimation, an objective assessment of data transformations and the number of clusters, and the possibility of performing diagnostic checks on the quality and homogeneity of the identified clusters. We analyze four varied RNA-seq data sets to illustrate the use of transformations and model selection in conjunction with Gaussian mixture models. Finally, we propose a Bioconductor package, coseq (co-expression of RNA-seq data), to facilitate implementation and visualization of the recommended RNA-seq co-expression analyses.
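The method above is implemented in the R/Bioconductor package coseq. Purely as a hedged Python sketch of the general workflow (fit Gaussian mixtures over several candidate transformations and numbers of clusters, then pick by a penalized criterion), the snippet below uses fake counts and BIC in place of the paper's dedicated criterion; note that criteria computed on differently transformed data are not strictly comparable without the correction the paper provides, so this only illustrates the mechanics.

    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)
    counts = rng.poisson(lam=rng.gamma(2.0, 50.0, size=(500, 1)), size=(500, 6))  # fake RNA-seq counts

    # Candidate transformations of per-gene expression (placeholders for those in the paper)
    shifted = counts + 1.0
    profiles = shifted / shifted.sum(axis=1, keepdims=True)          # per-gene profiles in (0, 1)
    transforms = {"logit": np.log(profiles / (1 - profiles)),
                  "log": np.log(shifted)}

    best = None
    for name, X in transforms.items():
        for k in range(2, 8):
            gm = GaussianMixture(n_components=k, covariance_type="full", random_state=0).fit(X)
            bic = gm.bic(X)      # stand-in for the paper's penalized (ICL-type) criterion
            if best is None or bic < best[0]:
                best = (bic, name, k, gm)

    bic, name, k, gm = best
    print(f"selected transformation: {name}, K = {k}, BIC = {bic:.1f}")
    labels = gm.predict(transforms[name])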
Manual hierarchical clustering of regional geochemical data using a Bayesian finite mixture model
Ellefsen, Karl J.; Smith, David
2016-01-01
Interpretation of regional scale, multivariate geochemical data is aided by a statistical technique called “clustering.” We investigate a particular clustering procedure by applying it to geochemical data collected in the State of Colorado, United States of America. The clustering procedure partitions the field samples for the entire survey area into two clusters. The field samples in each cluster are partitioned again to create two subclusters, and so on. This manual procedure generates a hierarchy of clusters, and the different levels of the hierarchy show geochemical and geological processes occurring at different spatial scales. Although there are many different clustering methods, we use Bayesian finite mixture modeling with two probability distributions, which yields two clusters. The model parameters are estimated with Hamiltonian Monte Carlo sampling of the posterior probability density function, which usually has multiple modes. Each mode has its own set of model parameters; each set is checked to ensure that it is consistent both with the data and with independent geologic knowledge. The set of model parameters that is most consistent with the independent geologic knowledge is selected for detailed interpretation and partitioning of the field samples.
Calibration plots for risk prediction models in the presence of competing risks.
Gerds, Thomas A; Andersen, Per K; Kattan, Michael W
2014-08-15
A predicted risk of 17% can be called reliable if it can be expected that the event will occur in about 17 of 100 patients who all received a predicted risk of 17%. Statistical models can predict the absolute risk of an event such as cardiovascular death in the presence of competing risks such as death due to other causes. For personalized medicine and patient counseling, it is necessary to check that the model is calibrated in the sense that it provides reliable predictions for all subjects. There are three often encountered practical problems when the aim is to display or test whether a risk prediction model is well calibrated. The first is lack of independent validation data, the second is right censoring, and the third is that when the risk scale is continuous, the estimation problem is as difficult as density estimation. We propose to estimate calibration curves for competing risks models based on jackknife pseudo-values combined with a nearest neighborhood smoother and a cross-validation approach to deal with all three problems. Copyright © 2014 John Wiley & Sons, Ltd.
Low-cost and high-speed optical mark reader based on an intelligent line camera
NASA Astrophysics Data System (ADS)
Hussmann, Stephan; Chan, Leona; Fung, Celine; Albrecht, Martin
2003-08-01
Optical Mark Recognition (OMR) is thoroughly reliable and highly efficient provided that high standards are maintained at both the planning and implementation stages. It is necessary to ensure that OMR forms are designed with due attention to data integrity checks, that the best use is made of features built into the OMR, and that data integrity is checked and the data validated before processing. This paper describes the design and implementation of an OMR prototype system for marking multiple-choice tests automatically. Parameter testing was carried out before the platform and the multiple-choice answer sheet were designed. Position recognition and position verification methods have been developed and implemented in an intelligent line scan camera. The position recognition process is implemented in a Field Programmable Gate Array (FPGA), whereas the verification process is implemented in a micro-controller. The verified results are then sent to the Graphical User Interface (GUI) for answer checking and statistical analysis. At the end of the paper the proposed OMR system is compared with commercially available systems on the market.
Test and Evaluation Report of the IVAC (Trademark) Vital Check Monitor Model 4000AEE
1992-02-01
USAARL Report No. 92-14: Test and Evaluation Report of the IVAC® Vital Check Monitor Model 4000AEE. Citation of trade names does not constitute an official Department of the Army endorsement or approval of the use of such commercial items. The frequency range up to 12.4 GHz was scanned for emissions; the IVAC® Model 4000AEE was operated with both ac and battery power. The radiated susceptibility…
Automated Verification of Specifications with Typestates and Access Permissions
NASA Technical Reports Server (NTRS)
Siminiceanu, Radu I.; Catano, Nestor
2011-01-01
We propose an approach to formally verify Plural specifications based on access permissions and typestates, by model-checking automatically generated abstract state-machines. Our exhaustive approach captures all the possible behaviors of abstract concurrent programs implementing the specification. We describe the formal methodology employed by our technique and provide an example as proof of concept for the state-machine construction rules. The implementation of a fully automated algorithm to generate and verify models, currently underway, provides model checking support for the Plural tool, which currently supports only program verification via data flow analysis (DFA).
Quality of reporting statistics in two Indian pharmacology journals.
Jaykaran; Yadav, Preeti
2011-04-01
To evaluate the reporting of statistical methods in articles published in two Indian pharmacology journals. All original articles published since 2002 were downloaded from the journals' (Indian Journal of Pharmacology (IJP) and Indian Journal of Physiology and Pharmacology (IJPP)) websites. These articles were evaluated on the basis of the appropriateness of descriptive and inferential statistics. Descriptive statistics were evaluated on the basis of reporting of the method of description and central tendencies. Inferential statistics were evaluated on the basis of fulfilment of the assumptions of statistical methods and the appropriateness of statistical tests. Values are described as frequencies, percentages, and 95% confidence intervals (CI) around the percentages. Inappropriate descriptive statistics were observed in 150 (78.1%, 95% CI 71.7-83.3%) articles. The most common reason for inappropriate descriptive statistics was the use of mean ± SEM in place of "mean (SD)" or "mean ± SD." The most common statistical method used was one-way ANOVA (58.4%). Information regarding checking of the assumptions of statistical tests was mentioned in only two articles. An inappropriate statistical test was observed in 61 (31.7%, 95% CI 25.6-38.6%) articles. The most common reason for an inappropriate statistical test was the use of a two-group test for three or more groups. Articles published in two Indian pharmacology journals are not devoid of statistical errors.
Labeit, Alexander; Peinemann, Frank; Baker, Richard
2013-01-01
Objectives To analyse and compare the determinants of screening uptake for different National Health Service (NHS) health check-ups in the UK. Design Individual-level analysis of repeated cross-sectional surveys with balanced panel data. Setting The UK. Participants Individuals taking part in the British Household Panel Survey (BHPS), 1992–2008. Outcome measure Uptake of NHS health check-ups for cervical cancer screening, breast cancer screening, blood pressure checks, cholesterol tests, dental screening and eyesight tests. Methods Dynamic panel data models (random effects panel probit with initial conditions). Results Having had a health check-up 1 year before, and previously in accordance with the recommended schedule, was associated with higher uptake of health check-ups. Individuals who visited a general practitioner (GP) had a significantly higher uptake in 5 of the 6 health check-ups. Uptake was highest in the recommended age group for breast and cervical cancer screening. For all health check-ups, age had a non-linear relationship. Lower self-rated health status was associated with increased uptake of blood pressure checks and cholesterol tests; smoking was associated with decreased uptake of 4 health check-ups. The effects of socioeconomic variables differed for the different health check-ups. Ethnicity did not have a significant influence on any health check-up. Permanent household income had an influence only on eyesight tests and dental screening. Conclusions Common determinants for having health check-ups are age, screening history and a GP visit. Policy interventions to increase uptake should consider the central role of the GP in promoting screening examinations and in preserving a high level of uptake. Possible economic barriers to access for prevention exist for dental screening and eyesight tests, and could be a target for policy intervention. Trial registration This observational study was not registered. PMID:24366576
Malacarne, Mario; Nardin, Tiziana; Bertoldi, Daniela; Nicolini, Giorgio; Larcher, Roberto
2016-09-01
Commercial tannins from several botanical sources and with different chemical and technological characteristics are used in the food and winemaking industries. Different ways to check their botanical authenticity have been studied in the last few years, through investigation of different analytical parameters. This work proposes a new, effective approach based on the quantification of 6 carbohydrates, 7 polyalcohols, and 55 phenols. In total, 87 tannins from 12 different botanical sources were analysed following a very simple sample preparation procedure. Using Forward Stepwise Discriminant Analysis, 3 statistical models were created based on sugar content, phenol concentration, and the combination of the two classes of compounds for the 8 most abundant categories (i.e. oak, grape seed, grape skin, gall, chestnut, quebracho, tea and acacia). The last approach provided good results in attributing tannins to the correct botanical origin. Validation, repeated 3 times on subsets of 10% of samples, confirmed the reliability of this model. Copyright © 2016 Elsevier Ltd. All rights reserved.
Development and in-flight performance of the Mariner 9 spacecraft propulsion system
NASA Technical Reports Server (NTRS)
Evans, D. D.; Cannova, R. D.; Cork, M. J.
1972-01-01
On November 14, 1971, Mariner 9 was decelerated into orbit about Mars by a 1334-newton (300-lbf) liquid bipropellant propulsion system. The development and in-flight performance of this pressure-fed, nitrogen tetroxide/monomethyl hydrazine bipropellant system are described and summarized. The design of all Mariner propulsion subsystems has been predicated upon the premise that simplicity of approach, coupled with thorough qualification and margin-limits testing, is the key to cost-effective reliability. The qualification test program and analytical modeling of the Mariner 9 subsystem are discussed. Since the propulsion subsystem is modular in nature, it was completely checked, serviced, and tested independently of the spacecraft. Proper prediction of in-flight performance required the development of three significant modeling tools to predict and account for nitrogen saturation of the propellant during the six-month coast period and to predict and statistically analyze in-flight data. The flight performance of the subsystem was excellent, as were the performance prediction correlations. These correlations are presented.
Generating community-built tools for data sharing and analysis in environmental networks
Read, Jordan S.; Gries, Corinna; Read, Emily K.; Klug, Jennifer; Hanson, Paul C.; Hipsey, Matthew R.; Jennings, Eleanor; O'Reilley, Catherine; Winslow, Luke A.; Pierson, Don; McBride, Christopher G.; Hamilton, David
2016-01-01
Rapid data growth in many environmental sectors has necessitated tools to manage and analyze these data. The development of tools often lags behind the proliferation of data, however, which may slow exploratory opportunities and scientific progress. The Global Lake Ecological Observatory Network (GLEON) collaborative model supports an efficient and comprehensive data–analysis–insight life cycle, including implementations of data quality control checks, statistical calculations/derivations, models, and data visualizations. These tools are community-built and openly shared. We discuss the network structure that enables tool development and a culture of sharing, leading to optimized output from limited resources. Specifically, data sharing and a flat collaborative structure encourage the development of tools that enable scientific insights from these data. Here we provide a cross-section of scientific advances derived from global-scale analyses in GLEON. We document enhancements to science capabilities made possible by the development of analytical tools and highlight opportunities to expand this framework to benefit other environmental networks.
NASA Astrophysics Data System (ADS)
Kim, Sehjeong; Chang, Dong Eui
2017-06-01
There have been many studies of border screening using simple mathematical models or statistical analyses to investigate the ineffectiveness of border screening during the 2003 and 2009 pandemics. However, the use of border screening is still a controversial issue. This is because such studies focus only on the functionality of border screening without considering the timing of its use. In this paper, we attempt to answer qualitatively whether the use of border screening is a desirable action during a disease pandemic. To this end, a novel mathematical model with a transition probability of status change during flight and border screening is developed. A condition for checking the timing of border screening is established in terms of a lower bound of the basic reproduction number. If the lower bound is greater than one, which indicates a pandemic, then border screening may not be effective and the disease persists. In this case, a community-level control strategy should be conducted.
Ichikawa, Daisuke; Saito, Toki; Ujita, Waka; Oyama, Hiroshi
2016-12-01
Our purpose was to develop a new machine-learning approach (a virtual health check-up) toward identification of those at high risk of hyperuricemia. Applying the system to general health check-ups is expected to reduce medical costs compared with administering an additional test. Data were collected during annual health check-ups performed in Japan between 2011 and 2013 (inclusive). We prepared training and test datasets from the health check-up data to build prediction models; these were composed of 43,524 and 17,789 persons, respectively. Gradient-boosting decision tree (GBDT), random forest (RF), and logistic regression (LR) approaches were trained using the training dataset and were then used to predict hyperuricemia in the test dataset. Undersampling was applied to build the prediction models to deal with the imbalanced class dataset. The results showed that the RF and GBDT approaches afforded the best performances in terms of sensitivity and specificity, respectively. The area under the curve (AUC) values of the models, which reflected the total discriminative ability of the classification, were 0.796 [95% confidence interval (CI): 0.766-0.825] for the GBDT, 0.784 [95% CI: 0.752-0.815] for the RF, and 0.785 [95% CI: 0.752-0.819] for the LR approaches. No significant differences were observed between pairs of each approach. Small changes occurred in the AUCs after applying undersampling to build the models. We developed a virtual health check-up that predicted the development of hyperuricemia using machine-learning methods. The GBDT, RF, and LR methods had similar predictive capability. Undersampling did not remarkably improve predictive power. Copyright © 2016 Elsevier Inc. All rights reserved.
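As a hedged sketch of the workflow described (not the authors' exact pipeline or features), the snippet below trains gradient boosting, random forest, and logistic regression on a random undersample of the majority class and compares them by AUC on a held-out set; the imbalanced data come from a synthetic placeholder rather than health check-up records.

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    # Placeholder for the health check-up data (imbalanced outcome, roughly 10% positives)
    X, y = make_classification(n_samples=20000, n_features=20, weights=[0.9, 0.1], random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

    # Undersample the majority class in the training data to a 1:1 ratio
    rng = np.random.default_rng(0)
    pos = np.flatnonzero(y_train == 1)
    neg = rng.choice(np.flatnonzero(y_train == 0), size=len(pos), replace=False)
    idx = np.concatenate([pos, neg])
    X_bal, y_bal = X_train[idx], y_train[idx]

    models = {"GBDT": GradientBoostingClassifier(random_state=0),
              "RF": RandomForestClassifier(n_estimators=300, random_state=0),
              "LR": LogisticRegression(max_iter=1000)}
    for name, model in models.items():
        model.fit(X_bal, y_bal)
        auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
        print(f"{name}: AUC = {auc:.3f}")

Evaluating on the untouched, still-imbalanced test set is what lets the AUC reflect real-world discriminative ability even though the models were fit on balanced data.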
Reopen parameter regions in two-Higgs doublet models
NASA Astrophysics Data System (ADS)
Staub, Florian
2018-01-01
The stability of the electroweak potential is a very important constraint for models of new physics. At the moment, it is standard for two-Higgs doublet models (THDM) and singlet or triplet extensions of the standard model to perform these checks at tree level. However, these models are often studied in the presence of very large couplings. Therefore, it can be expected that radiative corrections to the potential are important. We study these effects using the example of the THDM type-II and find that loop corrections can revive more than 50% of the phenomenologically viable points which are ruled out by the tree-level vacuum stability checks. Similar effects are expected for other extensions of the standard model.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramanathan, Arvind; Steed, Chad A; Pullum, Laura L
Compartmental models in epidemiology are widely used as a means to model disease spread mechanisms and understand how one can best control the disease in case an outbreak of a widespread epidemic occurs. However, a significant challenge within the community is in the development of approaches that can be used to rigorously verify and validate these models. In this paper, we present an approach to rigorously examine and verify the behavioral properties of compartmental epidemiological models under several common modeling scenarios including birth/death rates and multi-host/pathogen species. Using metamorphic testing, a novel visualization tool and model checking, we build a workflow that provides insights into the functionality of compartmental epidemiological models. Our initial results indicate that metamorphic testing can be used to verify the implementation of these models and provide insights into special conditions where these mathematical models may fail. The visualization front-end allows the end-user to scan through a variety of parameters commonly used in these models to elucidate the conditions under which an epidemic can occur. Further, specifying these models using a process algebra allows one to automatically construct behavioral properties that can be rigorously verified using model checking. Taken together, our approach allows for detecting implementation errors as well as handling conditions under which compartmental epidemiological models may fail to provide insights into disease spread dynamics.
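To make the idea of metamorphic testing concrete (these relations are illustrative, not the ones used by the authors, and the SIR integrator is a deliberately simple stand-in for their models), the sketch below checks two relations on a basic SIR implementation: scaling the initial population scales all compartments proportionally, and setting transmission to zero produces no epidemic.

    import numpy as np

    def sir(beta, gamma, s0, i0, r0, days, dt=0.1):
        """Forward-Euler integration of the basic SIR compartmental model."""
        n = s0 + i0 + r0
        s, i, r = float(s0), float(i0), float(r0)
        for _ in range(int(days / dt)):
            new_inf = beta * s * i / n * dt
            new_rec = gamma * i * dt
            s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        return s, i, r

    # Metamorphic relation 1: scaling the population scales all compartments proportionally.
    a = sir(beta=0.3, gamma=0.1, s0=990, i0=10, r0=0, days=200)
    b = sir(beta=0.3, gamma=0.1, s0=9900, i0=100, r0=0, days=200)
    assert np.allclose(np.array(b) / 10.0, a, rtol=1e-6)

    # Metamorphic relation 2: with beta = 0 there is no transmission, so S stays constant.
    s, i, r = sir(beta=0.0, gamma=0.1, s0=990, i0=10, r0=0, days=200)
    assert abs(s - 990) < 1e-9
    print("metamorphic checks passed")

The appeal of such relations is that they require no ground-truth solution: only the relationship between two runs of the implementation is checked, which is why they are useful for verifying models whose exact outputs are hard to know in advance.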
NASA Astrophysics Data System (ADS)
Mazilu, Irina; Gonzalez, Joshua
2008-03-01
From the point of view of a physicist, a bio-molecular motor represents an interesting non-equilibrium system, and it is directly amenable to analysis using standard methods of non-equilibrium statistical physics. We conduct a rigorous Monte Carlo study of three different driven lattice gas models that retain the basic behavior of three types of cytoskeletal molecular motors. Our models incorporate novel features such as realistic dynamics rules and complex motor-motor interactions. We are interested in gaining a deeper understanding of how various parameters influence the macroscopic behavior of these systems, what the density profile is, and whether the system undergoes a phase transition. On the analytical front, we computed the steady-state probability distributions exactly for one of the models using the matrix method established in 1993 by B. Derrida et al. We also explored the possibilities offered by the "Bethe ansatz" method by mapping some well studied spin models onto asymmetric simple exclusion models (already analyzed using computer simulations), and by using the results obtained for the spin models to find an exact solution for our problem. We have conducted exhaustive computational studies of the kinesin and dynein molecular motor models that prove to be very useful in checking our analytical work.
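For readers unfamiliar with driven lattice gases, the following minimal Python sketch simulates the totally asymmetric simple exclusion process (TASEP) with open boundaries, the prototypical model behind such motor studies; the lattice size, injection rate alpha, ejection rate beta, and sweep counts are arbitrary, and the actual motor models of the abstract add richer dynamics rules and motor-motor interactions.

    import numpy as np

    def tasep(L=100, alpha=0.3, beta=0.3, sweeps=5000, burn_in=2000, seed=0):
        """Random sequential update TASEP with open boundaries; returns the mean density profile."""
        rng = np.random.default_rng(seed)
        lattice = np.zeros(L, dtype=int)
        profile = np.zeros(L)
        samples = 0
        for sweep in range(sweeps):
            for _ in range(L + 1):
                site = rng.integers(-1, L)            # -1 stands for the injection step
                if site == -1:
                    if lattice[0] == 0 and rng.random() < alpha:
                        lattice[0] = 1                # inject a particle at the left boundary
                elif site == L - 1:
                    if lattice[-1] == 1 and rng.random() < beta:
                        lattice[-1] = 0               # eject at the right boundary
                elif lattice[site] == 1 and lattice[site + 1] == 0:
                    lattice[site], lattice[site + 1] = 0, 1   # hop right (hard-core exclusion)
            if sweep >= burn_in:
                profile += lattice
                samples += 1
        return profile / samples

    density = tasep()
    print(density[:5], density[-5:])   # boundary densities reflect the phase selected by alpha and beta

Scanning alpha and beta in such a simulation is the numerical counterpart of the phase-transition question raised in the abstract, and the exact steady state of this process is what Derrida's matrix method provides.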
Modeling of adipose/blood partition coefficient for environmental chemicals.
Papadaki, K C; Karakitsios, S P; Sarigiannis, D A
2017-12-01
A Quantitative Structure Activity Relationship (QSAR) model was developed in order to predict the adipose/blood partition coefficient of environmental chemical compounds. The first step of QSAR modeling was the collection of inputs. Input data included the experimental values of the adipose/blood partition coefficient and two sets of molecular descriptors for 67 organic chemical compounds: a) the descriptors from Linear Free Energy Relationships (LFER) and b) the PaDEL descriptors. The datasets were split into training and prediction sets and were analysed using two statistical methods: Genetic Algorithm based Multiple Linear Regression (GA-MLR) and Artificial Neural Networks (ANN). The models with LFER and PaDEL descriptors, coupled with ANN, produced satisfactory performance results. The fitting performance (R²) of the models, using LFER and PaDEL descriptors, was 0.94 and 0.96, respectively. The Applicability Domain (AD) of the models was assessed and then the models were applied to a large number of chemical compounds with unknown values of the adipose/blood partition coefficient. In conclusion, the proposed models were checked for fit, validity and applicability. It was demonstrated that they are stable, reliable and capable of predicting the values of the adipose/blood partition coefficient of "data poor" chemical compounds that fall within the applicability domain. Copyright © 2017. Published by Elsevier Ltd.
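As a hedged sketch of the general workflow only (random placeholder features stand in for the LFER/PaDEL descriptors, plain linear regression and a small multilayer perceptron stand in for GA-MLR and the ANN, and no applicability-domain assessment is included), the snippet below shows the split-train-evaluate pattern such QSAR studies follow.

    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.metrics import r2_score
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    X = rng.normal(size=(67, 10))                            # placeholder molecular descriptors
    y = X @ rng.normal(size=10) + 0.3 * rng.normal(size=67)  # placeholder log partition coefficients

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

    mlr = LinearRegression().fit(X_train, y_train)
    ann = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0).fit(X_train, y_train)

    for name, model in (("MLR", mlr), ("ANN", ann)):
        print(name,
              "train R2:", round(r2_score(y_train, model.predict(X_train)), 3),
              "test R2:", round(r2_score(y_test, model.predict(X_test)), 3))

Reporting the test-set R² separately from the training fit is the analogue of the fitting versus prediction performance distinction made in the abstract.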
NASA Technical Reports Server (NTRS)
Liu, W. T.; Tang, Wenqing; Wentz, Frank J.
1992-01-01
Global fields of precipitable water W from the special sensor microwave imager were compared with those from the European Center for Medium Range Weather Forecasts (ECMWF) model. They agree over most ocean areas; both data sets capture the two annual cycles examined and the interannual anomalies during an ENSO episode. They show significant differences in the dry air masses over the eastern tropical-subtropical oceans, particularly in the Southern Hemisphere. In these regions, comparisons with radiosonde data indicate that overestimation by the ECMWF model accounts for a large part of the differences. As a check on the W differences, surface-level specific humidity Q derived from W, using a statistical relation, was compared with Q from the ECMWF model. The differences in Q were found to be consistent with the differences in W, indirectly validating the Q-W relation. In both W and Q, SSMI was able to discern clearly the equatorial extension of the tongues of dry air in the eastern tropical ocean, while both ECMWF and climatological fields have reduced spatial gradients and weaker intensity.
10 CFR 35.2642 - Records of periodic spot-checks for teletherapy units.
Code of Federal Regulations, 2010 CFR
2010-01-01
....2642 Section 35.2642 Energy NUCLEAR REGULATORY COMMISSION MEDICAL USE OF BYPRODUCT MATERIAL Records... must include— (1) The date of the spot-check; (2) The manufacturer's name, model number, and serial... device; (6) The determined accuracy of each distance measuring and localization device; (7) The...
10 CFR 35.2642 - Records of periodic spot-checks for teletherapy units.
Code of Federal Regulations, 2011 CFR
2011-01-01
....2642 Section 35.2642 Energy NUCLEAR REGULATORY COMMISSION MEDICAL USE OF BYPRODUCT MATERIAL Records... must include— (1) The date of the spot-check; (2) The manufacturer's name, model number, and serial... device; (6) The determined accuracy of each distance measuring and localization device; (7) The...
12 CFR Appendix A to Part 205 - Model Disclosure Clauses and Forms
Code of Federal Regulations, 2010 CFR
2010-01-01
... your checking account using information from your check to: (i) Pay for purchases. (ii) Pay bills. (3... disclose information to third parties about your account or the transfers you make: (i) Where it is...) Disclosure by government agencies of information about obtaining account balances and account histories...
Using computer models to design gully erosion control structures for humid northern Ethiopia
USDA-ARS?s Scientific Manuscript database
Classic gully erosion control measures such as check dams have been unsuccessful in halting gully formation and growth in the humid northern Ethiopian highlands. Gullies are typically formed in vertisols and flow often bypasses the check dams as elevated groundwater tables make gully banks unstable....
Posterior Predictive Checks for Conditional Independence between Response Time and Accuracy
ERIC Educational Resources Information Center
Bolsinova, Maria; Tijmstra, Jesper
2016-01-01
Conditional independence (CI) between response time and response accuracy is a fundamental assumption of many joint models for time and accuracy used in educational measurement. In this study, posterior predictive checks (PPCs) are proposed for testing this assumption. These PPCs are based on three discrepancy measures reflecting different…
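As a generic, hedged illustration of the posterior predictive check machinery (not the joint response time and accuracy models the paper addresses), the sketch below fits a deliberately over-simple Beta-Binomial model to fake item-response counts, simulates replicated data from the posterior, and compares a discrepancy measure between observed and replicated data.

    import numpy as np

    rng = np.random.default_rng(0)

    # Placeholder data: number of correct responses out of 20 items for 50 persons,
    # generated with person-varying ability, so they are overdispersed.
    n_items = 20
    obs = rng.binomial(n_items, rng.beta(6, 4, size=50))

    def discrepancy(counts):
        """Variance of per-person proportions; sensitive to overdispersion the simple model ignores."""
        return np.var(counts / n_items)

    # Deliberately too-simple model: one common success probability p with a Beta(1,1) prior.
    # Conjugacy gives the posterior directly: Beta(1 + total correct, 1 + total incorrect).
    a_post = 1 + obs.sum()
    b_post = 1 + (n_items * len(obs) - obs.sum())

    # Posterior predictive check: simulate replicated datasets and compare the discrepancy.
    reps = 2000
    d_obs = discrepancy(obs)
    d_rep = np.empty(reps)
    for r in range(reps):
        p = rng.beta(a_post, b_post)
        d_rep[r] = discrepancy(rng.binomial(n_items, p, size=len(obs)))

    ppp = np.mean(d_rep >= d_obs)      # posterior predictive p-value; values near 0 or 1 signal misfit
    print(f"observed discrepancy {d_obs:.4f}, posterior predictive p-value {ppp:.3f}")

In the paper the discrepancy measures target conditional independence between response time and accuracy rather than overdispersion, but the logic of comparing observed and replicated discrepancies is the same.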
Building Program Verifiers from Compilers and Theorem Provers
2015-05-14
Checking with SMT: UFO • LLVM-based front-end (partially reused in SeaHorn) • combines abstract interpretation with interpolation-based model checking. Counter-examples are long, and it is hard to determine (from main) what is relevant to an assertion. (Gurfinkel, 2015)
Tien, Han-Kuang; Chung, Wen
2018-05-10
This research addressed adults' health check-ups through the lens of Role Transportation Theory. This theory is applied to narrative advertising that encourages adults to seek health check-ups by causing audiences to empathize with the advertisement's character. This study explored the persuasive mechanism behind narrative advertising and reinforced the Protection Motivation Theory model. We added two key perturbation variables: optimistic bias and truth avoidance. To test the hypotheses, we performed two experiments. In Experiment 1, we recruited 77 respondents online for testing. We used analyses of variance to verify the effectiveness of narrative and informative advertising. Then, in Experiment 2, we recruited 228 respondents to perform offline physical experiments and conducted a path analysis through structural equation modelling. The findings showed that narrative advertising positively impacted participants' disease prevention intentions. The use of Role Transportation Theory in advertising enables the audience to be emotionally connected with the character, which enhances persuasiveness. In Experiment 2, we found that the degree of role transference can interfere with optimistic bias, improve perceived health risk, and promote behavioral intentions for health check-ups. Furthermore, truth avoidance can interfere with perceived health risks, which, in turn, reduces behavioral intentions for health check-ups.
Augmentation of Teaching Tools: Outsourcing the HSD Computing for SPSS Application
ERIC Educational Resources Information Center
Wang, Jianjun
2010-01-01
The widely-used Tukey's HSD index is not produced in the current version of SPSS (i.e., PASW Statistics, version 18), and a computer program named "HSD Calculator" has been chosen to address this problem. In comparison to hand calculation, this program does not require table checking, which eliminates potential concern about the size of a…
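As an aside (this is not the "HSD Calculator" program described above), equivalent pairwise Tukey HSD comparisons can be obtained in Python with statsmodels, again with no critical-value table lookup; the scores, group labels, and sample sizes below are synthetic.

    import numpy as np
    from statsmodels.stats.multicomp import pairwise_tukeyhsd

    rng = np.random.default_rng(0)
    scores = np.concatenate([rng.normal(70, 8, 30), rng.normal(75, 8, 30), rng.normal(72, 8, 30)])
    groups = np.repeat(["method_A", "method_B", "method_C"], 30)

    # Tukey's HSD for all pairwise group comparisons at alpha = 0.05
    result = pairwise_tukeyhsd(endog=scores, groups=groups, alpha=0.05)
    print(result.summary())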
Spectra were acquired using different instrumental settings for the blue and red parts of the spectrum; additional exposures were extracted for systematics checks of the wavelength calibration, and wavelength and flux calibration were applied. Tabulated quantities include the observed flux and its statistical error, based on the optimal extraction algorithm of the IRAF packages.
ERIC Educational Resources Information Center
McGrath, April L.
2014-01-01
Office hours provide time outside of class for students to consult with instructors about course material, progress, and evaluation. Yet office hours, at times, remain an untapped source of academic support. The current study examined whether office hour attendance in combination with a learning reflection would help students learn material in an…
Collection of Medical Original Data with Search Engine for Decision Support.
Orthuber, Wolfgang
2016-01-01
Medicine is becoming more and more complex, and humans can capture total medical knowledge only partially. For specific access, a high-resolution search engine is demonstrated which allows, besides conventional text search, the search of precise quantitative data of medical findings, therapies and results. Users can define metric spaces ("Domain Spaces", DSs) containing all searchable quantitative data ("Domain Vectors", DVs). An implementation of the search engine is online at http://numericsearch.com. In future medicine the doctor could first make a rough diagnosis and check which fine diagnostics (quantitative data) colleagues had collected in such a case. Then the doctor decides about fine diagnostics, and the results are sent (semi-automatically) to the search engine, which filters a group of patients that best fits these data. In this specific group the various therapies can be checked together with the associated therapeutic results, like an individual scientific study for the current patient. The statistical (anonymous) results could be used for specific decision support. Conversely, the therapeutic decision (in the best case with later results) could be used to enhance the collection of precise pseudonymous medical original data, which is used for better and better statistical (anonymous) search results.
Quality Control of Meteorological Observations
NASA Technical Reports Server (NTRS)
Collins, William; Dee, Dick; Rukhovets, Leonid
1999-01-01
The problem of meteorological observation quality control (QC) was first formulated by L.S. Gandin at the Main Geophysical Observatory in the 1970s. Later, in 1988, L.S. Gandin began adapting his ideas on complex quality control (CQC) to the operational environment at the National Centers for Environmental Prediction. The CQC was first applied by L.S. Gandin and his colleagues to the detection and correction of errors in rawinsonde heights and temperatures using a complex of hydrostatic residuals. Later, a full complex of residuals, vertical and horizontal optimal interpolations and baseline checks were added for the checking and correction of a wide range of meteorological variables. Some other of Gandin's ideas were applied and substantially developed at other meteorological centers. A new statistical QC was recently implemented in the Goddard Data Assimilation System. The central component of any quality control is a buddy check, which is a test of individual suspect observations against available nearby non-suspect observations. A novel feature of this test is that the error variances used for the QC decision are re-estimated on-line. As a result, the allowed tolerances for suspect observations can depend on local atmospheric conditions. The system is then better able to accept extreme values observed in deep cyclones, jet streams and so on. The basic statements of this adaptive buddy check are described. Some results of the on-line QC, including moisture QC, are presented.
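As a minimal hedged sketch of the buddy-check idea (not the Goddard or NCEP implementation), the snippet below compares a suspect observation with nearby non-suspect neighbors and flags it when the residual exceeds a tolerance that scales with an error variance re-estimated from the neighbors themselves; the weighting, tolerance factor and variance floor are placeholders.

    import numpy as np

    def buddy_check(suspect_value, neighbor_values, k_sigma=3.0, floor_var=0.25):
        """Flag a suspect observation against nearby non-suspect observations.

        The tolerance adapts to local conditions: the error variance is re-estimated
        from the spread of the neighbors, so tolerances that are tight in quiet
        conditions relax where large local gradients (e.g. deep cyclones) are legitimate.
        """
        neighbors = np.asarray(neighbor_values, dtype=float)
        background = neighbors.mean()                         # stand-in for an interpolated estimate
        local_var = max(neighbors.var(ddof=1), floor_var)     # on-line variance estimate with a floor
        residual = suspect_value - background
        tolerance = k_sigma * np.sqrt(local_var)
        return abs(residual) > tolerance, residual, tolerance

    flagged, res, tol = buddy_check(988.0, [1002.1, 1001.4, 1003.0, 1002.6])
    print(f"flagged={flagged}, residual={res:.1f}, tolerance={tol:.1f}")

Operational systems replace the simple neighbor mean with an optimal interpolation estimate and weight neighbors by distance and expected error, but the accept/reject logic is the same.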
Digital tape unit test facility software
NASA Technical Reports Server (NTRS)
Jackson, J. T.
1971-01-01
Two computer programs are described which are used for the collection and analysis of data from the digital tape unit test facility (DTUTF). The data are the recorded results of skew tests made on magnetic digital tapes which are used on computers as input/output media. The results of each tape test are keypunched onto an 80 column computer card. The format of the card is checked and the card image is stored on a master summary tape via the DTUTF card checking and tape updating system. The master summary tape containing the results of all the tape tests is then used for analysis as input to the DTUTF histogram generating system which produces a histogram of skew vs. date for selected data, followed by some statistical analysis of the data.
[Instruments for quantitative methods of nursing research].
Vellone, E
2000-01-01
Instruments for quantitative nursing research are a means to objectify and measure a variable or a phenomenon in scientific research. There are direct instruments to measure concrete variables and indirect instruments to measure abstract concepts (Burns, Grove, 1997). Indirect instruments measure the attributes of which a concept is made. Furthermore, there are instruments for physiologic variables (e.g. for weight), observational instruments (check-lists and rating scales), interviews, questionnaires, diaries and scales (check-lists, rating scales, Likert scales, semantic differential scales and visual analogue scales). The choice of one instrument over another depends on the research question and design. Research instruments are very useful both to describe variables and to detect statistically significant relationships. Their use in clinical practice for diagnostic assessment should be considered very carefully.
Application of real rock pore-throat statistics to a regular pore network model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rakibul, M.; Sarker, H.; McIntyre, D.
2011-01-01
This work reports the application of real rock statistical data to a previously developed regular pore network model in an attempt to produce an accurate simulation tool with low computational overhead. A core plug from the St. Peter Sandstone formation in Indiana was scanned with a high resolution micro CT scanner. The pore-throat statistics of the three-dimensional reconstructed rock were extracted and the distribution of the pore-throat sizes was applied to the regular pore network model. In order to keep the equivalent model regular, only the throat area or the throat radius was varied. Ten realizations of randomly distributed throat sizes were generated to simulate the drainage process, and relative permeability was calculated and compared with the experimentally determined values of the original rock sample. The numerical and experimental procedures are explained in detail and the performance of the model in relation to the experimental data is discussed and analyzed. Petrophysical properties such as relative permeability are important in many applied fields such as production of petroleum fluids, enhanced oil recovery, carbon dioxide sequestration, ground water flow, etc. Relative permeability data are used for a wide range of conventional reservoir engineering calculations and in numerical reservoir simulation. Two-phase oil-water relative permeability data were generated on the same core plug from both the pore network model and the experimental procedure. The shape and size of the relative permeability curves were compared and analyzed, and a good match was observed for the wetting phase relative permeability, but for the non-wetting phase the simulation results were found to deviate from the experimental ones. Efforts to determine petrophysical properties of rocks using numerical techniques aim to eliminate the necessity of regular core analysis, which can be time consuming and expensive. A numerical technique is therefore expected to be fast and to produce reliable results. In applied engineering, a quick result with reasonable accuracy is sometimes more acceptable than more time-consuming results. The present work is an effort to check the accuracy and validity of a previously developed pore network model for obtaining important petrophysical properties of rocks based on cutting-sized sample data.
NASA Astrophysics Data System (ADS)
Zhang, Feng-Liang; Ni, Yan-Chun; Au, Siu-Kui; Lam, Heung-Fai
2016-03-01
The identification of modal properties from field testing of civil engineering structures is becoming economically viable, thanks to the advent of modern sensor and data acquisition technology. Its demand is driven by innovative structural designs and increased performance requirements of dynamic-prone structures that call for a close cross-checking or monitoring of their dynamic properties and responses. Existing instrumentation capabilities and modal identification techniques allow structures to be tested under free vibration, forced vibration (known input) or ambient vibration (unknown broadband loading). These tests can be considered complementary rather than competing as they are based on different modeling assumptions in the identification model and have different implications on costs and benefits. Uncertainty arises naturally in the dynamic testing of structures due to measurement noise, sensor alignment error, modeling error, etc. This is especially relevant in field vibration tests because the test condition in the field environment can hardly be controlled. In this work, a Bayesian statistical approach is developed for modal identification using the free vibration response of structures. A frequency domain formulation is proposed that makes statistical inference based on the Fast Fourier Transform (FFT) of the data in a selected frequency band. This significantly simplifies the identification model because only the modes dominating the frequency band need to be included. It also legitimately ignores the information in the excluded frequency bands that are either irrelevant or difficult to model, thereby significantly reducing modeling error risk. The posterior probability density function (PDF) of the modal parameters is derived rigorously from modeling assumptions and Bayesian probability logic. Computational difficulties associated with calculating the posterior statistics, including the most probable value (MPV) and the posterior covariance matrix, are addressed. Fast computational algorithms for determining the MPV are proposed so that the method can be practically implemented. In the companion paper (Part II), analytical formulae are derived for the posterior covariance matrix so that it can be evaluated without resorting to finite difference method. The proposed method is verified using synthetic data. It is also applied to modal identification of full-scale field structures.
Development of an Advanced Respirator Fit-Test Headform
Bergman, Michael S.; Zhuang, Ziqing; Hanson, David; Heimbuch, Brian K.; McDonald, Michael J.; Palmiero, Andrew J.; Shaffer, Ronald E.; Harnish, Delbert; Husband, Michael; Wander, Joseph D.
2015-01-01
Improved respirator test headforms are needed to measure the fit of N95 filtering facepiece respirators (FFRs) for protection studies against viable airborne particles. A Static (i.e., non-moving, non-speaking) Advanced Headform (StAH) was developed for evaluating the fit of N95 FFRs. The StAH was developed based on the anthropometric dimensions of a digital headform reported by the National Institute for Occupational Safety and Health (NIOSH) and has a silicone polymer skin with defined local tissue thicknesses. Quantitative fit factor evaluations were performed on seven N95 FFR models of various sizes and designs. Donnings were performed with and without a pre-test leak checking method. For each method, four replicate FFR samples of each of the seven models were tested with two donnings per replicate, resulting in a total of 56 tests per donning method. Each fit factor evaluation comprised three 86-sec exercises: “Normal Breathing” (NB, 11.2 liters per min (lpm)), “Deep Breathing” (DB, 20.4 lpm), then NB again. A fit factor for each exercise and an overall test fit factor were obtained. Analysis of variance methods were used to identify statistical differences among fit factors (analyzed as logarithms) for different FFR models, exercises, and testing methods. For each FFR model and for each testing method, the NB and DB fit factor data were not significantly different (P > 0.05). Significant differences were seen in the overall exercise fit factor data for the two donning methods among all FFR models (pooled data) and in the overall exercise fit factor data for the two testing methods within certain models. Utilization of the leak checking method improved the rate of obtaining overall exercise fit factors ≥100. The FFR models that are expected to achieve overall fit factors ≥100 on human subjects achieved overall exercise fit factors ≥100 on the StAH. Further research is needed to evaluate the correlation between FFRs fitted on the StAH and FFRs fitted on people. PMID:24369934
Development of a Sigma-2 Receptor affinity filter through a Monte Carlo based QSAR analysis.
Rescifina, Antonio; Floresta, Giuseppe; Marrazzo, Agostino; Parenti, Carmela; Prezzavento, Orazio; Nastasi, Giovanni; Dichiara, Maria; Amata, Emanuele
2017-08-30
For the first time in the sigma-2 (σ2) receptor field, a quantitative structure-activity relationship (QSAR) model has been built using pKi values of the whole set of known selective σ2 receptor ligands (548 compounds), taken from the Sigma-2 Receptor Selective Ligands Database (S2RSLDB) (http://www.researchdsf.unict.it/S2RSLDB/), through the Monte Carlo technique and employing the software CORAL. The model has been developed using a large and structurally diverse set of compounds, allowing prediction of the endpoint (σ2 receptor pKi) for different populations of chemical compounds. The statistical quality reached suggests that the model for pKi determination is robust and possesses satisfactory predictive potential. The statistical quality is high for both the visible and invisible sets. Screening of the FDA-approved drugs, external to our dataset, suggested that sixteen compounds might be repositioned as σ2 receptor ligands (predicted pKi ≥8). A literature check showed that six of these compounds have already been tested for affinity at the σ2 receptor and, of these, two (Flunarizine and Terbinafine) have shown an experimental σ2 receptor pKi >7. This suggests that this QSAR model may be used as a focused screening filter to prospectively find or repurpose drugs with high affinity for the σ2 receptor, overall allowing an enhanced hit rate with respect to random screening. Copyright © 2017 Elsevier B.V. All rights reserved.
Spatiotemporal Variation in Distance Dependent Animal Movement Contacts: One Size Doesn’t Fit All
Brommesson, Peter; Wennergren, Uno; Lindström, Tom
2016-01-01
The structure of contacts that mediate transmission has a pronounced effect on the outbreak dynamics of infectious disease, and simulation models are powerful tools to inform policy decisions. Most simulation models of livestock disease spread rely to some degree on predictions of animal movement between holdings. Typically, movements are more common between nearby farms than between those located far away from each other. Here, we assessed spatiotemporal variation in such distance dependence of animal movement contacts from an epidemiological perspective. We evaluated and compared nine statistical models, applied to Swedish movement data from 2008. The models differed in at what level (if at all) they accounted for regional and/or seasonal heterogeneities in the distance dependence of the contacts. Using a kernel approach to describe how the probability of contacts between farms changes with distance, we developed a hierarchical Bayesian framework and estimated parameters by using Markov Chain Monte Carlo techniques. We evaluated models by three different approaches of model selection. First, we used the Deviance Information Criterion to evaluate their performance relative to each other. Second, we estimated the log predictive posterior distribution, which was also used to evaluate their relative performance. Third, we performed posterior predictive checks by simulating movements with each of the parameterized models and evaluated their ability to recapture relevant summary statistics. Independent of selection criteria, we found that accounting for regional heterogeneity improved model accuracy. We also found that accounting for seasonal heterogeneity was beneficial, in terms of model accuracy, according to two of the three methods used for model selection. Our results have important implications for livestock disease spread models where movement is an important risk factor for between-farm transmission. We argue that modelers should refrain from using methods to simulate animal movements that assume the same pattern across all regions and seasons without explicitly testing for spatiotemporal variation. PMID:27760155
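A rough sketch of the distance-dependent contact kernel idea is given below; the functional form, parameter values, and region/season strata are assumptions for illustration, not the fitted Swedish model.

```python
import numpy as np

# A minimal sketch of a distance-dependent contact kernel of the kind used in
# livestock movement models; the half-Cauchy-like form and the parameter values
# are illustrative assumptions, not the paper's fitted model.
def contact_kernel(d_km, scale_km=30.0, shape=2.0):
    """Relative probability that two farms separated by d_km exchange animals."""
    return 1.0 / (1.0 + (d_km / scale_km) ** shape)

# Regional and/or seasonal heterogeneity enters simply by letting the kernel
# parameters vary by stratum (strata names are hypothetical).
params_by_region_season = {
    ("south", "summer"): dict(scale_km=25.0, shape=2.2),
    ("north", "winter"): dict(scale_km=45.0, shape=1.8),
}

d = np.array([5.0, 50.0, 200.0])   # inter-farm distances in km
for key, p in params_by_region_season.items():
    print(key, np.round(contact_kernel(d, **p), 3))
```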
The theory precision analyse of RFM localization of satellite remote sensing imagery
NASA Astrophysics Data System (ADS)
Zhang, Jianqing; Xv, Biao
2009-11-01
The traditional method of assessing the precision of the Rational Function Model (RFM) is to use a large number of check points and to calculate the mean square error by comparing the computed coordinates with the known coordinates. This method is rooted in probability theory: the mean square error is estimated statistically from a large sample, and the estimate can be regarded as approaching the true value only when the sample is large enough. This paper instead approaches the problem from the perspective of survey adjustment, taking the law of propagation of error as the theoretical basis, and calculates the theoretical precision of RFM localization. SPOT5 three-line-array imagery is then used as experimental data, and the results of the traditional method are compared with those of the method described in this paper; the comparison confirms the feasibility of the traditional method and answers the question of its theoretical precision from the survey adjustment point of view.
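The law of propagation of error used for the theoretical precision can be sketched as Sigma_out = J Sigma_in J^T; in the sketch below a toy mapping stands in for the RFM (which is a ratio of cubic polynomials), and the parameter covariance is invented.

```python
import numpy as np

# Minimal sketch of the law of propagation of error applied to a ground-to-image
# mapping: Sigma_image = J * Sigma_params * J^T, with an illustrative (not RFM) model.
def project(params, lon, lat, h):
    """Toy image-coordinate model; stands in for the RFM ratio of polynomials."""
    a, b, c, d = params
    line = a * lon + b * lat
    samp = c * lon + d * h
    return np.array([line, samp])

def numerical_jacobian(f, params, eps=1e-6):
    p = np.asarray(params, dtype=float)
    f0 = f(p)
    J = np.zeros((f0.size, p.size))
    for j in range(p.size):
        dp = p.copy()
        dp[j] += eps
        J[:, j] = (f(dp) - f0) / eps
    return J

params = np.array([1.2, -0.5, 0.8, 0.3])
Sigma_params = np.diag([1e-4, 1e-4, 1e-4, 1e-4])   # assumed parameter covariance

J = numerical_jacobian(lambda p: project(p, lon=0.1, lat=0.2, h=0.05), params)
Sigma_image = J @ Sigma_params @ J.T               # propagated image-coordinate covariance
print("theoretical std of (line, sample):", np.sqrt(np.diag(Sigma_image)))
```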
REACT: Resettable Hold Down and Release Actuator for Space Applications
NASA Astrophysics Data System (ADS)
Nava, Nestor; Collado, Marcelo; Cabás, Ramiro
2014-07-01
A new HDRA based on SMA technology, called REACT, has been designed for the hold-down and release of loads and appendages in space applications. The design involves a rod supported by spheres that block its axial movement while a preload is applied. The rod shape accommodates misalignment and blocks rotation around the axial axis for proper installation of the device. Because of the high preload requirements for this type of actuator, finite element analysis (FEA) has been performed in order to check the structural resistance. The results of the FEA have constrained the REACT design in terms of dimensions, materials, and the shape of the mechanical parts. A complete test campaign for qualification of REACT is proposed. Several qualification models are to be built and tested in parallel, which provides a way to demonstrate margins and to gather statistics.
A rule-based approach to model checking of UML state machines
NASA Astrophysics Data System (ADS)
Grobelna, Iwona; Grobelny, Michał; Stefanowicz, Łukasz
2016-12-01
In this paper a new approach to the formal verification of control process specifications expressed by means of UML state machines in version 2.x is proposed. In contrast to other approaches from the literature, we use an abstract and universal rule-based logical model suitable both for model checking (using the nuXmv model checker) and for logical synthesis in the form of rapid prototyping. Hence, a prototype implementation in the hardware description language VHDL can be obtained that fully reflects the primary, already formally verified specification in the form of UML state machines. The presented approach increases the assurance that the implemented system meets the user-defined requirements.
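As a rough illustration of what a model checker does with a state machine, the sketch below enumerates the reachable states of a toy transition system and checks a simple safety property; it is plain Python, not the authors' rule-based logical model or the nuXmv input language, and the state machine is invented.

```python
from collections import deque

# A minimal explicit-state sketch of model checking a state machine: enumerate
# reachable states and check a simple safety property.  The transition relation
# below is a toy example, not the paper's UML model.
transitions = {
    "idle":    {"start": "running"},
    "running": {"pause": "paused", "stop": "idle", "fail": "error"},
    "paused":  {"resume": "running", "stop": "idle"},
    "error":   {},                       # absorbing error state
}

def reachable(initial):
    """Breadth-first exploration of all states reachable from `initial`."""
    seen, queue = {initial}, deque([initial])
    while queue:
        state = queue.popleft()
        for nxt in transitions[state].values():
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# Safety property: the error state must not be reachable.
states = reachable("idle")
print("reachable states:", sorted(states))
print("safety property holds:", "error" not in states)
```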
Introduction of Virtualization Technology to Multi-Process Model Checking
NASA Technical Reports Server (NTRS)
Leungwattanakit, Watcharin; Artho, Cyrille; Hagiya, Masami; Tanabe, Yoshinori; Yamamoto, Mitsuharu
2009-01-01
Model checkers find failures in software by exploring every possible execution schedule. Java PathFinder (JPF), a Java model checker, has been extended recently to cover networked applications by caching data transferred in a communication channel. A target process is executed by JPF, whereas its peer process runs on a regular virtual machine outside. However, non-deterministic target programs may produce different output data in each schedule, causing the cache to restart the peer process to handle the different set of data. Virtualization tools could help us restore previous states of peers, eliminating peer restart. This paper proposes the application of virtualization technology to networked model checking, concentrating on JPF.
Solomon-Krakus, Shauna; Sabiston, Catherine M
2017-12-01
This study examined whether body checking was a correlate of weight- and body-related shame and guilt for men and women. Participants were 537 adults (386 women) between the ages of 17 and 74 (M age = 28.29, SD = 14.63). Preliminary analyses showed women reported significantly more body checking (p<.001), weight- and body-related shame (p<.001), and weight- and body-related guilt (p<.001) than men. In sex-stratified hierarchical linear regression models, body checking was significantly and positively associated with weight- and body-related shame (R2 = .29 and .43, p<.001) and weight- and body-related guilt (R2 = .34 and .45, p<.001) for men and women, respectively. Based on these findings, body checking is associated with negative weight- and body-related self-conscious emotions. Intervention and prevention efforts aimed at reducing negative weight- and body-related self-conscious emotions should consider focusing on body checking for adult men and women. Copyright © 2017 Elsevier Ltd. All rights reserved.
Cibulka, Nancy J; Forney, Sandra; Goodwin, Kathy; Lazaroff, Patricia; Sarabia, Rebecca
2011-05-01
To test the effectiveness of an advanced practice nurse model of care to improve oral health in low-income pregnant women. Pregnant women (n=170) were randomized to either control or experimental group in a hospital-based inner city clinic in the Midwest. Participants completed pretest and posttest questionnaires. Those in the experimental group participated in an educational session, received dental supplies, and were scheduled for an oral care appointment. Descriptive statistics, paired samples t-tests, and Pearson's chi-square test were used to analyze the data. Knowledge scores showed a small positive trend while favorable self-perception of oral health increased significantly in the experimental group. The experimental group demonstrated a significant increase in frequency of brushing and flossing teeth, marked reduction in intake of high sugar drinks, and reported more than twice as many visits for a dental check-up than the control group. Significant barriers to obtaining oral health services were identified. Because adverse pregnancy outcomes have been linked to periodontitis in numerous research studies, pregnant women must be educated about the importance of oral health and the necessity of a check-up. APNs are in an ideal position to educate women and assist them to obtain necessary oral health services. ©2011 The Author(s) Journal compilation ©2011 American Academy of Nurse Practitioners.
The impact of an immunization check-up at a pharmacist-provided employee health screening.
Sparkman, Amy; Brookhart, Andrea L; Goode, Jean-Venable Kelly R
To determine which types of vaccine recommendations were accepted and acted upon by patients after an immunization check-up at a pharmacist-provided employee health screening, and to evaluate if there was a difference between influenza and non-influenza vaccines. Retrospective, observational. Supermarket chain. Employees and covered spouses. Immunization check-up. Acceptance rate of immunization recommendation. This retrospective observational study evaluated the impact of an immunization check-up in individuals who participated in one of the 252 pharmacist-provided health screenings in central Virginia in 2015. All employee health screenings were completed from July 1, 2015, to September 30, 2015. Because immunization status was assessed 6 months after each person received his or her health screening, data were collected from January 1, 2016, to March 30, 2016, and analyzed to collect the number and type of vaccines recommended during the immunization check-up. Each eligible participant's profile was evaluated to determine if he or she received the vaccines at any Kroger pharmacy within 6 months. Patient identifiers were not collected; however, demographics including age, relevant disease state history, and smoking status were collected with immunization recommendations and uptake. Data were analyzed with the use of descriptive statistics. A total of 349 immunization recommendations were made, including 248 influenza; 42 pneumococcal polysaccharide (PPSV23); 40 tetanus, diphtheria, and pertussis (Tdap); 12 herpes zoster; 4 pneumococcal conjugate (PCV13); and 3 hepatitis B. Both influenza and PCV13 had acceptance rates of 50%, and herpes zoster, Tdap, hepatitis B, and PPSV23 had 42%, 35%, 33%, and 24% acceptance rates, respectively. Influenza recommendations had a 50% acceptance rate compared with a 32% acceptance rate of non-influenza recommendations (P = 0.002). An immunization check-up performed at a pharmacist-provided employee health screening can lead to patient acceptance of recommendations and receipt of needed immunizations. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Faria, J. M.; Mahomad, S.; Silva, N.
2009-05-01
The deployment of complex safety-critical applications requires rigorous techniques and powerful tools for both the development and V&V stages. Model-based technologies are increasingly being used to develop safety-critical software, and turning to them can arguably bring significant benefits to such processes, along with new challenges. This paper presents the results of a research project in which we extended current V&V methodologies to UML/SysML models in order to answer the demands related to validation. Two quite different but complementary approaches were investigated: (i) model checking and (ii) the extraction of robustness test cases from the same models. These two approaches do not overlap, and when combined they provide a wider-reaching model/design validation ability than either one alone, thus offering improved safety assurance. Results are encouraging, even though they fell short of the desired outcome for model checking and are not yet fully mature for robustness test case extraction. For model checking, we verified that the automatic model validation process can become fully operational, and even expand in scope, once tool vendors help (inevitably) to improve the XMI standard interoperability situation. The early robustness test case extraction methodology produced interesting results but needs further systematisation and consolidation in order to produce results more predictably and to reduce reliance on expert heuristics. Finally, directions for further improvement and innovation were apparent for both approaches: circumventing the current limitations in XMI interoperability on the one hand, and on the other bringing test case specification onto the same graphical level as the models themselves and then automating the generation of executable test cases from the standard UML notation.
Reeves, J; Reynolds, S; Coker, S; Wilson, C
2010-09-01
The objective of this study was to investigate whether Salkovskis' (1985) inflated responsibility model of obsessive-compulsive disorder (OCD) applied to children. In an experimental design, 81 children aged 9-12 years were randomly allocated to three conditions: an inflated responsibility group, a moderate responsibility group, and a reduced responsibility group. In all groups children were asked to sort sweets according to whether or not they contained nuts. At baseline the groups did not differ on children's self-reported anxiety, depression, obsessive-compulsive symptoms or on inflated responsibility beliefs. The experimental manipulation successfully changed children's perceptions of responsibility. During the sorting task, the time taken to complete the task, checking behaviours, hesitations, and anxiety were recorded. There was a significant effect of responsibility level on the behavioural variables of time taken, hesitations and checking; as perceived responsibility increased, children took longer to complete the task and checked and hesitated more often. There was no between-group difference in children's self-reported state anxiety. The results offer preliminary support for the link between inflated responsibility and increased checking behaviours in children and add to the small but growing literature suggesting that cognitive models of OCD may apply to children. (c) 2010 Elsevier Ltd. All rights reserved.
75 FR 52482 - Airworthiness Directives; PILATUS Aircraft Ltd. Model PC-7 Airplanes
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-26
..., check the airplane maintenance records to determine if the left and/or right aileron outboard bearing... an entry is found during the airplane maintenance records check required in paragraph (f)(1) of this...-0849; Directorate Identifier 2010-CE-043-AD] RIN 2120-AA64 Airworthiness Directives; PILATUS Aircraft...
77 FR 50644 - Airworthiness Directives; Cessna Airplane Company Airplanes
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-22
... airplanes that have P/N 1134104-1 or 1134104-5 A/C compressor motor installed; an aircraft logbook check for... following: (1) Inspect the number of hours on the A/C compressor hour meter; and (2) Check the aircraft.... Do the replacement following Cessna Aircraft Company Model 525 Maintenance Manual, Revision 23, dated...
CCM-C, Collins checks the middeck experiment
1999-07-24
S93-E-5016 (23 July 1999) --- Astronaut Eileen M. Collins, mission commander, checks on an experiment on Columbia's middeck during Flight Day 1 activity. The experiment is called the Cell Culture Model, Configuration C. Objectives of it are to validate cell culture models for muscle, bone and endothelial cell biochemical and functional loss induced by microgravity stress; to evaluate cytoskeleton, metabolism, membrane integrity and protease activity in target cells; and to test tissue loss pharmaceuticals for efficacy. The photo was recorded with an electronic still camera (ESC).
Merchán, D; Auqué, L F; Acero, P; Gimeno, M J; Causapé, J
2015-01-01
Salinization of water bodies represents a significant risk in water systems. The salinization of waters in a small irrigated hydrological basin is studied herein through an integrated hydrogeochemical study including multivariate statistical analyses and geochemical modeling. The study zone has two well-differentiated geologic materials: (i) Quaternary sediments of low salinity and high permeability and (ii) Tertiary sediments of high salinity and very low permeability. In this work, soil samples were collected and leaching experiments conducted on them in the laboratory. In addition, water samples were collected from precipitation, irrigation, groundwater, spring and surface waters. The waters show an increase in salinity from precipitation and irrigation water to ground- and, finally, surface water. The enrichment in salinity is related to the dissolution of soluble minerals present mainly in the Tertiary materials. Cation exchange, precipitation of calcite and, probably, incongruent dissolution of dolomite have been inferred from the hydrochemical data set. Multivariate statistical analysis provided information about the structure of the data, differentiating the group of surface waters from the groundwaters, and the salinization processes from the nitrate pollution processes. The available information was included in geochemical models in which hypotheses of consistency and thermodynamic feasibility were checked. The assessment of the collected information pointed to a natural control on salinization processes in the Lerma Basin with minimal influence of anthropogenic factors. Copyright © 2014 Elsevier B.V. All rights reserved.
The (mis)reporting of statistical results in psychology journals.
Bakker, Marjan; Wicherts, Jelte M
2011-09-01
In order to study the prevalence, nature (direction), and causes of reporting errors in psychology, we checked the consistency of reported test statistics, degrees of freedom, and p values in a random sample of high- and low-impact psychology journals. In a second study, we established the generality of reporting errors in a random sample of recent psychological articles. Our results, on the basis of 281 articles, indicate that around 18% of statistical results in the psychological literature are incorrectly reported. Inconsistencies were more common in low-impact journals than in high-impact journals. Moreover, around 15% of the articles contained at least one statistical conclusion that proved, upon recalculation, to be incorrect; that is, recalculation rendered the previously significant result insignificant, or vice versa. These errors were often in line with researchers' expectations. We classified the most common errors and contacted authors to shed light on the origins of the errors.
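The consistency check described above can be sketched by recomputing p values from the reported test statistics and degrees of freedom; the reported triples below are invented examples, and the rounding tolerance is an assumption.

```python
from scipy import stats

# A minimal sketch of the kind of consistency check applied to reported results:
# recompute the p value from the reported test statistic and degrees of freedom
# and compare it with the reported p value (example numbers are invented).
reported = [
    # (test, statistic, df, reported_p)
    ("t", 2.31, 28, 0.028),
    ("t", 1.20, 45, 0.001),     # inconsistent on purpose
    ("F", 4.56, (2, 57), 0.015),
]

def recomputed_p(test, statistic, df):
    if test == "t":
        return 2 * stats.t.sf(abs(statistic), df)   # two-sided t test
    if test == "F":
        return stats.f.sf(statistic, *df)
    raise ValueError(test)

for test, stat, df, p_rep in reported:
    p_new = recomputed_p(test, stat, df)
    # Flag gross inconsistencies, allowing some rounding in the reported value.
    flag = "OK" if abs(p_new - p_rep) < 0.01 else "INCONSISTENT"
    print(f"{test}{df} = {stat}: reported p = {p_rep}, recomputed p = {p_new:.3f} -> {flag}")
```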
Class Model Development Using Business Rules
NASA Astrophysics Data System (ADS)
Skersys, Tomas; Gudas, Saulius
New developments in the area of computer-aided system engineering (CASE) greatly improve processes of the information systems development life cycle (ISDLC). Much effort is put into quality improvement issues, but IS development projects still suffer from the poor quality of models during the system analysis and design cycles. To some degree, the quality of models developed using CASE tools can be assured using various automated model comparison and syntax checking procedures. It is also reasonable to check these models against business domain knowledge, but the domain knowledge stored in the repository of a CASE tool (the enterprise model) is insufficient (Gudas et al. 2004). Involvement of business domain experts in these processes is complicated because non-IT people often find it difficult to understand models that were developed by IT professionals using some specific modeling language.
Kaiser, Jan Christian; Meckbach, Reinhard; Eidemüller, Markus; Selmansberger, Martin; Unger, Kristian; Shpak, Viktor; Blettner, Maria; Zitzelsberger, Horst; Jacob, Peter
2016-12-01
Strong evidence for the statistical association between radiation exposure and disease has been produced for thyroid cancer by epidemiological studies after the Chernobyl accident. However, the limitations of the epidemiological approach for exploring health risks, especially at low doses of radiation, are obvious. Statistical fluctuations due to small case numbers dominate the uncertainty of risk estimates. Molecular radiation markers have been sought extensively to separate radiation-induced cancer cases from sporadic cases. The overexpression of the CLIP2 gene is the most promising of these markers. It was found in the majority of papillary thyroid cancers (PTCs) from young patients included in the Chernobyl tissue bank. Motivated by the CLIP2 findings, we propose a mechanistic model which describes PTC development as a sequence of rate-limiting events in two distinct paths of CLIP2-associated and multistage carcinogenesis. It integrates molecular measurements of the dichotomous CLIP2 marker from 141 patients into the epidemiological risk analysis for about 13 000 subjects from the Ukrainian-American cohort who were exposed below the age of 19 years and have been under enhanced medical surveillance since 1998. For the first time, a radiation risk has been estimated solely from marker measurements. Cross-checking with epidemiological estimates and model validation suggests that CLIP2 is a marker of high precision. CLIP2 leaves an imprint in the epidemiological incidence data which is typical for a driver gene. With the mechanistic model, we explore the impact of radiation on the molecular landscape of PTC. The model constitutes a unique interface between molecular biology and radiation epidemiology. © The Author 2016. Published by Oxford University Press.
Towards a framework for testing general relativity with extreme-mass-ratio-inspiral observations
NASA Astrophysics Data System (ADS)
Chua, A. J. K.; Hee, S.; Handley, W. J.; Higson, E.; Moore, C. J.; Gair, J. R.; Hobson, M. P.; Lasenby, A. N.
2018-07-01
Extreme-mass-ratio-inspiral observations from future space-based gravitational-wave detectors such as LISA will enable strong-field tests of general relativity with unprecedented precision, but at prohibitive computational cost if existing statistical techniques are used. In one such test that is currently employed for LIGO black hole binary mergers, generic deviations from relativity are represented by N deformation parameters in a generalized waveform model; the Bayesian evidence for each of its 2N combinatorial submodels is then combined into a posterior odds ratio for modified gravity over relativity in a null-hypothesis test. We adapt and apply this test to a generalized model for extreme-mass-ratio inspirals constructed on deformed black hole spacetimes, and focus our investigation on how computational efficiency can be increased through an evidence-free method of model selection. This method is akin to the algorithm known as product-space Markov chain Monte Carlo, but uses nested sampling and improved error estimates from a rethreading technique. We perform benchmarking and robustness checks for the method, and find order-of-magnitude computational gains over regular nested sampling in the case of synthetic data generated from the null model.
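The combination of submodel evidences into a posterior odds ratio can be sketched as below; the log-evidences are random placeholders, and equal prior odds across the alternative submodels are an assumption.

```python
import itertools
import numpy as np

# Sketch of combining the evidences of the 2**N combinatorial submodels into a
# posterior odds ratio for "modified gravity" over "relativity" in a
# null-hypothesis test.  The log-evidences below are invented placeholders.
N = 3
rng = np.random.default_rng(2)
log_Z = {subset: rng.normal(0.0, 1.0)
         for r in range(N + 1)
         for subset in itertools.combinations(range(N), r)}

log_Z_gr = log_Z[()]                       # null model: no deformation parameters
alt = [v for k, v in log_Z.items() if k]   # all submodels with >= 1 deformation

# Equal prior odds over the 2**N - 1 alternative submodels versus the null model.
log_odds = np.logaddexp.reduce(alt) - np.log(len(alt)) - log_Z_gr
print(f"log posterior odds (modified gravity vs GR): {log_odds:.2f}")
```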
Piha, Kustaa; Sumanen, Hilla; Lahelma, Eero; Rahkonen, Ossi
2017-04-01
There is contradictory evidence on the association between health check-ups and future morbidity. Among the general population, those with a high socioeconomic position participate more often in health check-ups. The main aims of this study were to analyse whether attendance at health check-ups is socioeconomically patterned and affects sickness absence over a 10-year follow-up. This register-based follow-up study included municipal employees of the City of Helsinki. 13 037 employees were invited to an age-based health check-up during 2000-2002, with a 62% attendance rate. Education, occupational class and individual income were used to measure socioeconomic position. Medically certified sickness absence of 4 days or more was measured and controlled for at baseline and used as an outcome over follow-up. The mean follow-up time was 7.5 years. Poisson regression was used. Men and employees with lower socioeconomic position participated more actively in health check-ups. Among women, non-attendance at the health check-up predicted higher sickness absence during follow-up (relative risk = 1.26, 95% CI 1.17 to 1.37) in the fully adjusted model. Health check-ups were not effective in reducing socioeconomic differences in sickness absence. Age-based health check-ups reduced subsequent sickness absence and should be promoted. Attendance at health check-ups should be as high as possible. Contextual factors need to be taken into account when applying the results in interventions in other settings. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
Wolgin, M; Grabowski, S; Elhadad, S; Frank, W; Kielbassa, A M
2018-03-25
This study aimed to evaluate the educational outcome of a digitally based self-assessment concept (prepCheck; DentsplySirona, Wals, Austria) for pre-clinical undergraduates in the context of a regular phantom-laboratory course. A sample of 47 third-year dental students participated in the course. Students were randomly divided into a prepCheck-supervised (self-assessment) intervention group (IG; n = 24); conventionally supervised students constituted the control group (CG; n = 23). During the preparation of three-surface (MOD) class II amalgam cavities, each IG participant could analyse a superimposed 3D image of his/her preparation against the "master preparation" using the prepCheck software. In the CG, several course instructors performed the evaluations according to pre-defined assessment criteria. After completing the course, a mandatory (blinded) practical examination was taken by all course participants (both IG and CG students), and this assessment involved the preparation of a MOD amalgam cavity. Then, optical impressions by means of a CEREC-Omnicam were taken to digitalize all examination preparations, followed by surveying and assessing the latter using prepCheck. The statistical analysis of the digitalized samples (Mann-Whitney U test) revealed no significant differences between the cavity dimensions achieved in the IG and CG (P = .406). Additionally, the sum score of the degree of conformity with the "master preparation" (maximum permissible 10% of plus or minus deviation) was comparable in both groups (P = .259). The implemented interactive digitally based, self-assessment learning tool for undergraduates appears to be equivalent to the conventional form of supervision. Therefore, such digital learning tools could significantly address the ever-increasing student to faculty ratio. © 2018 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Soh, Keng Chuan; Tay, Kiang Hiong, E-mail: tay.kiang.hiong@sgh.com.sg; Tan, Bien Soo
2008-05-15
Our aim was to review our experience with percutaneous antegrade ureteric stent (PAUS) placement and to determine if the routinely conducted check nephrostogram on the day following ureteric stent placement was necessary. Retrospective review of patients who had undergone PAUS placement between January 2004 and December 2005 was performed. There were 83 subjects (36 males, 47 females), with a mean age of 59.9 years (range, 22-94 years). Average follow-up duration was 7.1 months (range, 1-24 months). The most common indications for PAUS placement were ureteric obstruction due to metastatic disease (n = 56) and urinary calculi (n = 34). Technical success was 93.2% (96/103 attempts), with no major immediate procedure-related complications or mortalities. The Bard 7Fr Urosoft DJ Stent was used in more than 95% of the cases. Eighty-one of 89 (91.0%) check nephrostograms demonstrated a patent ureteric stent with resultant safety catheter removal. Three check nephrostograms revealed distal stent migration requiring repositioning by a goose-snare, while five others showed stent occlusion necessitating permanent external drainage by nephrostomy drainage catheter reinsertion. Following PAUS placement, the serum creatinine level improved or stabilized in 82% of patients. The serum creatinine outcome difference between the groups with benign and malignant indications for PAUS placement was not statistically significant (p = 0.145) but resolution of hydronephrosis was significantly better (p = 0.008) in patients with benign indications. Percutaneous antegrade ureteric stent placement is a safe and effective means of relief for ureteric obstruction. The check nephrostogram following ureteric stent placement was unnecessary in the majority of patients.
Litho hotspots fixing using model based algorithm
NASA Astrophysics Data System (ADS)
Zhang, Meili; Yu, Shirui; Mao, Zhibiao; Shafee, Marwa; Madkour, Kareem; ElManhawy, Wael; Kwan, Joe; Hu, Xinyi; Wan, Qijian; Du, Chunshan
2017-04-01
As technology advances, IC designs are getting more sophisticated, thus it becomes more critical and challenging to fix printability issues in the design flow. Running lithography checks before tapeout is now mandatory for designers, which creates a need for more advanced and easy-to-use techniques for fixing hotspots found after lithographic simulation without creating a new design rule checking (DRC) violation or generating a new hotspot. This paper presents a new methodology for fixing hotspots on layouts while using the same engine currently used to detect the hotspots. The fix is achieved by applying minimum movement of edges causing the hotspot, with consideration of DRC constraints. The fix is internally simulated by the lithographic simulation engine to verify that the hotspot is eliminated and that no new hotspot is generated by the new edge locations. Hotspot fix checking is enhanced by adding DRC checks to the litho-friendly design (LFD) rule file to guarantee that any fix options that violate DRC checks are removed from the output hint file. This extra checking eliminates the need to re-run both DRC and LFD checks to ensure the change successfully fixed the hotspot, which saves time and simplifies the designer's workflow. This methodology is demonstrated on industrial designs, where the fixing rate of single and dual layer hotspots is reported.
Model-Checking with Edge-Valued Decision Diagrams
NASA Technical Reports Server (NTRS)
Roux, Pierre; Siminiceanu, Radu I.
2010-01-01
We describe an algebra of Edge-Valued Decision Diagrams (EVMDDs) to encode arithmetic functions and its implementation in a model checking library along with state-of-the-art algorithms for building the transition relation and the state space of discrete state systems. We provide efficient algorithms for manipulating EVMDDs and give upper bounds of the theoretical time complexity of these algorithms for all basic arithmetic and relational operators. We also demonstrate that the time complexity of the generic recursive algorithm for applying a binary operator on EVMDDs is no worse than that of Multi-Terminal Decision Diagrams. We have implemented a new symbolic model checker with the intention to represent in one formalism the best techniques available at the moment across a spectrum of existing tools: EVMDDs for encoding arithmetic expressions, identity-reduced MDDs for representing the transition relation, and the saturation algorithm for reachability analysis. We compare our new symbolic model checking EVMDD library with the widely used CUDD package and show that, in many cases, our tool is several orders of magnitude faster than CUDD.
Assume-Guarantee Verification of Source Code with Design-Level Assumptions
NASA Technical Reports Server (NTRS)
Giannakopoulou, Dimitra; Pasareanu, Corina S.; Cobleigh, Jamieson M.
2004-01-01
Model checking is an automated technique that can be used to determine whether a system satisfies certain required properties. To address the 'state explosion' problem associated with this technique, we propose to integrate assume-guarantee verification at different phases of system development. During design, developers build abstract behavioral models of the system components and use them to establish key properties of the system. To increase the scalability of model checking at this level, we have developed techniques that automatically decompose the verification task by generating component assumptions for the properties to hold. The design-level artifacts are subsequently used to guide the implementation of the system, but also to enable more efficient reasoning at the source-code level. In particular, we propose to use design-level assumptions to similarly decompose the verification of the actual system implementation. We demonstrate our approach on a significant NASA application, where design-level models were used to identify and correct a safety property violation, and design-level assumptions allowed us to check successfully that the property was preserved by the implementation.
Tools for Basic Statistical Analysis
NASA Technical Reports Server (NTRS)
Luz, Paul L.
2005-01-01
Statistical Analysis Toolset is a collection of eight Microsoft Excel spreadsheet programs, each of which performs calculations pertaining to an aspect of statistical analysis. These programs present input and output data in user-friendly, menu-driven formats, with automatic execution. The following types of calculations are performed: Descriptive statistics are computed for a set of data x(i) (i = 1, 2, 3 . . . ) entered by the user. Normal Distribution Estimates will calculate the statistical value that corresponds to cumulative probability values, given a sample mean and standard deviation of the normal distribution. Normal Distribution from two Data Points will extend and generate a cumulative normal distribution for the user, given two data points and their associated probability values. Two programs perform two-way analysis of variance (ANOVA) with no replication or generalized ANOVA for two factors with four levels and three repetitions. Linear Regression-ANOVA will curvefit data to the linear equation y=f(x) and will do an ANOVA to check its significance.
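A minimal sketch of the "Linear Regression-ANOVA" idea, written here in Python rather than as an Excel spreadsheet: fit a straight line and test the significance of the regression with an F statistic (the data are synthetic).

```python
import numpy as np
from scipy import stats

# Fit y = a*x + b and test the significance of the regression with an ANOVA-style
# F statistic; for simple linear regression, F equals t^2 of the slope.
rng = np.random.default_rng(3)
x = np.linspace(0, 10, 25)
y = 2.0 * x + 1.0 + rng.normal(0, 2.0, x.size)

res = stats.linregress(x, y)
n = x.size
t = res.slope / res.stderr      # t statistic of the slope
F = t**2                        # equivalent ANOVA F statistic with (1, n-2) dof
p = stats.f.sf(F, 1, n - 2)
print(f"slope = {res.slope:.3f}, F(1,{n-2}) = {F:.1f}, p = {p:.2e}")
```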
Computation of statistical secondary structure of nucleic acids.
Yamamoto, K; Kitamura, Y; Yoshikura, H
1984-01-01
This paper presents a computer analysis of the statistical secondary structure of nucleic acids. For a given single-stranded nucleic acid, we generated a "structure map" which included all the annealing structures in the sequence. The map was transformed into an "energy map" by rough approximation; here, the energy level of every pairing structure consisting of more than 2 successive nucleic acid pairs was calculated. Using the "energy map", the probability of occurrence of each annealed structure was computed, i.e., the structure was computed statistically. The basis of the computation was the eight-queens problem from chess. The validity of our computer programme was checked by computing the well-established tRNA structure. Successful application of this programme to small nuclear RNAs of various origins is demonstrated. PMID:6198622
Antipersistent Markov behavior in foreign exchange markets
NASA Astrophysics Data System (ADS)
Baviera, Roberto; Pasquini, Michele; Serva, Maurizio; Vergni, Davide; Vulpiani, Angelo
2002-09-01
A quantitative check of efficiency in US dollar/Deutsche mark exchange rates is developed using high-frequency (tick by tick) data. The antipersistent Markov behavior of log-price fluctuations of given size implies, in principle, the possibility of a statistical forecast. We introduce and measure the available information of the quote sequence, and we show how it can be profitable following a particular trading rule.
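A rough sketch of the antipersistence check: estimate the probability that a log-price fluctuation is followed by one of the opposite sign. Synthetic returns stand in for the USD/DEM tick data, and the conditioning on fluctuation size used in the paper is omitted.

```python
import numpy as np

# Sketch of an antipersistence (sign-reversal) check on log-price fluctuations.
# Synthetic random-walk data replace tick-by-tick USD/DEM quotes.
rng = np.random.default_rng(4)
log_price = np.cumsum(0.001 * rng.standard_normal(100_000))
returns = np.diff(log_price)

signs = np.sign(returns)
signs = signs[signs != 0]             # drop zero fluctuations
flips = signs[1:] != signs[:-1]       # True where the sign reverses

p_flip = flips.mean()
print(f"P(sign reversal) = {p_flip:.3f}  (> 0.5 would indicate antipersistence)")
```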
The Solid Rocket Motor Slag Population: Results of a Radar-based Regressive Statistical Evaluation
NASA Technical Reports Server (NTRS)
Horstman, Matthew F.; Xu, Yu-Lin
2008-01-01
Solid rocket motor (SRM) slag has been identified as a significant source of man-made orbital debris. The propensity of SRMs to generate particles of 100 μm and larger has caused concern regarding their contribution to the debris environment. Radar observation, rather than in-situ gathered evidence, is currently the only measurable source for the NASA/ODPO model of the on-orbit slag population. This simulated model includes the time evolution of the resultant orbital populations using a historical database of SRM launches, propellant masses, and estimated locations and times of tail-off. However, due to the small amount of observational evidence, there can be no direct comparison to check the validity of this model. Rather than using the assumed population developed from purely historical and physical assumptions, a regression approach was used that utilized the populations observed by the Haystack radar from 1996 to the present. The estimated trajectories from the historical model of slag sources, and the corresponding plausible detections by the Haystack radar, were identified. Comparisons with observational data from the ensuing years were made, and the SRM model was altered with respect to the size and mass production of slag particles to reflect the historical data obtained. The result is a model SRM population that fits within the bounds of the observed environment.
Quality of reporting statistics in two Indian pharmacology journals
Jaykaran; Yadav, Preeti
2011-01-01
Objective: To evaluate the reporting of statistical methods in articles published in two Indian pharmacology journals. Materials and Methods: All original articles published since 2002 were downloaded from the journals’ (Indian Journal of Pharmacology (IJP) and Indian Journal of Physiology and Pharmacology (IJPP)) websites. These articles were evaluated on the basis of the appropriateness of descriptive statistics and inferential statistics. Descriptive statistics were evaluated on the basis of the reporting of the method of description and central tendencies. Inferential statistics were evaluated on the basis of whether the assumptions of the statistical methods were fulfilled and whether the statistical tests were appropriate. Values are described as frequencies, percentages, and 95% confidence intervals (CI) around the percentages. Results: Inappropriate descriptive statistics were observed in 150 (78.1%, 95% CI 71.7–83.3%) articles. The most common reason for inappropriate descriptive statistics was the use of mean ± SEM in place of “mean (SD)” or “mean ± SD.” The most common statistical method used was one-way ANOVA (58.4%). Information regarding checking of the assumptions of the statistical test was mentioned in only two articles. An inappropriate statistical test was observed in 61 (31.7%, 95% CI 25.6–38.6%) articles. The most common reason for an inappropriate statistical test was the use of a two-group test for three or more groups. Conclusion: Articles published in these two Indian pharmacology journals are not devoid of statistical errors. PMID:21772766
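The 95% confidence interval around a reported percentage can be sketched with a Wilson score interval; the abstract does not state which method was used, so the interval type is an assumption, and the denominator of 192 articles is inferred from 150 corresponding to 78.1%.

```python
import math

# Wilson score interval for a proportion, expressed as a percentage.
def wilson_ci(successes, n, z=1.96):
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return 100 * (centre - half), 100 * (centre + half)

# Example: 150 articles with inappropriate descriptive statistics out of an
# inferred 192 assessed articles (150/192 = 78.1%).
lo, hi = wilson_ci(150, 192)
print(f"78.1% (95% CI {lo:.1f}-{hi:.1f}%)")
```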
van der Heijden, Amy; Mulder, Bob C; Poortvliet, P Marijn; van Vliet, Arnold J H
2017-11-25
Performing a tick check after visiting nature is considered the most important preventive measure to avoid contracting Lyme disease. Checking the body for ticks after visiting nature is the only measure that can fully guarantee whether one has been bitten by a tick and provides the opportunity to remove the tick as soon as possible, thereby greatly reducing the chance of contracting Lyme disease. However, compliance to performing the tick check is low. In addition, most previous studies on determinants of preventive measures to avoid Lyme disease lack a clear definition and/or operationalization of the term "preventive measures". Those that do distinguish multiple behaviors including the tick check, fail to describe the systematic steps that should be followed in order to perform the tick check effectively. Hence, the purpose of this study was to identify determinants of systematically performing the tick check, based on social cognitive theory. A cross-sectional self-administered survey questionnaire was filled out online by 508 respondents (M age = 51.7, SD = 16.0; 50.2% men; 86.4% daily or weekly nature visitors). Bivariate correlations and multivariate regression analyses were conducted to identify associations between socio-cognitive determinants (i.e. concepts related to humans' intrinsic and extrinsic motivation to perform certain behavior), and the tick check, and between socio-cognitive determinants and proximal goal to do the tick check. The full regression model explained 28% of the variance in doing the tick check. Results showed that performing the tick check was associated with proximal goal (β = .23, p < 0.01), self-efficacy (β = .22, p < 0.01), self-evaluative outcome expectations (β = .21, p < 0.01), descriptive norm (β = .16, p < 0.01), and experience (β = .13, p < 0.01). Our study is among the first to examine the determinants of systematic performance of the tick check, using an extended version of social cognitive theory to identify determinants. Based on the results, a number of practical recommendations can be made to promote the performance of the tick check.
TU-FG-201-05: Varian MPC as a Statistical Process Control Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carver, A; Rowbottom, C
Purpose: Quality assurance in radiotherapy requires the measurement of various machine parameters to ensure they remain within permitted values over time. In Truebeam release 2.0 the Machine Performance Check (MPC) was released, allowing beam output and machine axis movements to be assessed in a single test. We aim to evaluate the Varian Machine Performance Check (MPC) as a tool for Statistical Process Control (SPC). Methods: Varian’s MPC tool was used on three Truebeam and one EDGE linac for a period of approximately one year. MPC was commissioned against independent systems. After this period the data were reviewed to determine whether or not the MPC was useful as a process control tool. Individual tests were analysed using Shewhart control charts, with Matlab used for the analysis. Principal component analysis was used to determine whether a multivariate model was of any benefit in analysing the data. Results: Control charts were found to be useful for detecting beam output changes, worn T-nuts and jaw calibration issues. Upper and lower control limits were defined at the 95% level. Multivariate SPC was performed using principal component analysis. We found little evidence of clustering beyond that which might be naively expected, such as beam uniformity and beam output. Whilst this makes multivariate analysis of little use, it suggests that each test gives independent information. Conclusion: The variety of independent parameters tested in MPC makes it a sensitive tool for routine machine QA. We have determined that using control charts in our QA programme would rapidly detect changes in machine performance. The use of control charts allows large quantities of tests to be performed on all linacs without visual inspection of all results. The use of control limits alerts users when data are inconsistent with previous measurements before they become out of specification. A. Carver has received a speaker’s honorarium from Varian.
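A minimal sketch of an individuals (Shewhart) chart with control limits at the 95% level, as mentioned in the abstract; the daily beam-output values and the 30-point baseline window are synthetic assumptions.

```python
import numpy as np

# Individuals control chart for a daily MPC beam-output measurement, with
# control limits at the 95% level (synthetic data).
rng = np.random.default_rng(5)
output = 100 + 0.3 * rng.standard_normal(250)       # daily output, % of baseline

baseline = output[:30]                               # assumed commissioning period
centre = baseline.mean()
sigma = baseline.std(ddof=1)

ucl, lcl = centre + 1.96 * sigma, centre - 1.96 * sigma   # 95% control limits
out_of_control = np.flatnonzero((output > ucl) | (output < lcl))
print(f"centre = {centre:.2f}, limits = [{lcl:.2f}, {ucl:.2f}]")
print("points outside limits:", out_of_control)
```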
Prediction Interval Development for Wind-Tunnel Balance Check-Loading
NASA Technical Reports Server (NTRS)
Landman, Drew; Toro, Kenneth G.; Commo, Sean A.; Lynn, Keith C.
2014-01-01
Results from the Facility Analysis Verification and Operational Reliability project revealed a critical gap in capability in ground-based aeronautics research applications. Without a standardized process for check-loading the wind-tunnel balance or the model system, the quality of the aerodynamic force data collected varied significantly between facilities. A prediction interval is required in order to confirm a check-loading. The prediction interval provides an expected upper and lower bound on the balance load prediction at a given confidence level. A method has been developed which accounts for sources of variability due to calibration and check-load application. The prediction interval method of calculation and a case study demonstrating its use are provided. Validation of the methods is demonstrated for the case study based on the probability of capture of confirmation points.
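A prediction interval for a check-load confirmation point can be sketched with an ordinary least-squares calibration; the single-component linear calibration and synthetic data below are simplifying assumptions, not the balance model used in the project.

```python
import numpy as np
from scipy import stats

# Fit a toy linear calibration (reading vs applied load), then bound a new
# prediction with a 95% prediction interval.
rng = np.random.default_rng(6)
applied = np.linspace(0, 1000, 20)                    # applied calibration loads
reading = 0.05 + 1.002 * applied + rng.normal(0, 1.5, applied.size)

X = np.column_stack([np.ones_like(applied), applied])
beta, res_ss, *_ = np.linalg.lstsq(X, reading, rcond=None)
dof = applied.size - 2
s2 = res_ss[0] / dof                                  # residual variance estimate

x_new = np.array([1.0, 450.0])                        # check load of 450 units
y_hat = x_new @ beta
var_pred = s2 * (1.0 + x_new @ np.linalg.inv(X.T @ X) @ x_new)
t_crit = stats.t.ppf(0.975, dof)                      # 95% prediction interval
lo, hi = y_hat - t_crit * np.sqrt(var_pred), y_hat + t_crit * np.sqrt(var_pred)
print(f"predicted reading {y_hat:.2f}, 95% PI [{lo:.2f}, {hi:.2f}]")
```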
NASA Technical Reports Server (NTRS)
Shewhart, Mark
1991-01-01
Statistical Process Control (SPC) charts are one of several tools used in quality control. Other tools include flow charts, histograms, cause and effect diagrams, check sheets, Pareto diagrams, graphs, and scatter diagrams. A control chart is simply a graph which indicates process variation over time. The purpose of drawing a control chart is to detect any changes in the process signalled by abnormal points or patterns on the graph. The Artificial Intelligence Support Center (AISC) of the Acquisition Logistics Division has developed a hybrid machine learning expert system prototype which automates the process of constructing and interpreting control charts.
Gupta, Shikha; Basant, Nikita; Rai, Premanjali; Singh, Kunwar P
2015-11-01
The binding affinity of a chemical to carbon is an important characteristic, as it finds vast industrial applications. Experimental determination of the adsorption capacity of diverse chemicals onto carbon is both time and resource intensive, and the development of computational approaches has widely been advocated. In this study, ten different artificial intelligence (AI)-based qualitative and quantitative structure-property relationship (QSPR) models (MLPN, RBFN, PNN/GRNN, CCN, SVM, GEP, GMDH, SDT, DTF, DTB) were established for the prediction of the adsorption capacity of structurally diverse chemicals to activated carbon following the OECD guidelines. Structural diversity of the chemicals and nonlinear dependence in the data were evaluated using the Tanimoto similarity index and Brock-Dechert-Scheinkman statistics. The generalization and prediction abilities of the constructed models were established through rigorous internal and external validation procedures performed using a wide series of statistical checks. On the complete dataset, the qualitative models rendered classification accuracies between 97.04 and 99.93%, while the quantitative models yielded correlation (R(2)) values of 0.877-0.977 between the measured and the predicted endpoint values. The quantitative prediction accuracies for the higher molecular weight (MW) compounds (class 4) were relatively better than those for the low MW compounds. In both the qualitative and quantitative models, polarizability was the most influential descriptor. Structural alerts responsible for the extreme adsorption behavior of the compounds were identified. A higher number of carbon atoms and the presence of heavier halogens in a molecule resulted in higher binding affinity. The proposed QSPR models performed well and outperformed previous reports. The relatively better performance of the ensemble learning models (DTF, DTB) may be attributed to the strengths of the bagging and boosting algorithms, which enhance the predictive accuracies. The proposed AI models can be useful tools in screening chemicals for their binding affinities toward carbon for their safe management.
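The Tanimoto similarity index used to gauge structural diversity can be sketched on binary fingerprints represented as sets of set-bit positions; the fingerprints below are invented.

```python
# Tanimoto similarity on binary fingerprints stored as Python sets of "on" bits.
def tanimoto(fp_a, fp_b):
    """Tanimoto coefficient of two binary fingerprints (sets of set-bit indices)."""
    inter = len(fp_a & fp_b)
    union = len(fp_a | fp_b)
    return inter / union if union else 1.0

fp1 = {1, 4, 9, 17, 23, 42}
fp2 = {1, 4, 10, 17, 23, 55}
print(f"Tanimoto similarity: {tanimoto(fp1, fp2):.2f}")

# Dataset-level diversity check: a low average pairwise similarity indicates a
# structurally diverse training set.
fps = [fp1, fp2, {2, 3, 5, 7, 11}, {4, 9, 17, 30}]
pairs = [(i, j) for i in range(len(fps)) for j in range(i + 1, len(fps))]
avg = sum(tanimoto(fps[i], fps[j]) for i, j in pairs) / len(pairs)
print(f"average pairwise similarity: {avg:.2f}")
```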
75 FR 43801 - Airworthiness Directives; Eurocopter France (ECF) Model EC225LP Helicopters
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-27
... time. Also, we use inspect rather than check when referring to an action required by a mechanic as... the various levels of government. Therefore, I certify this AD: 1. Is not a ``significant regulatory... compliance time. Also, we use inspect rather than check when referring to an action required by a mechanic as...
Mandatory Identification Bar Checks: How Bouncers Are Doing Their Job
ERIC Educational Resources Information Center
Monk-Turner, Elizabeth; Allen, John; Casten, John; Cowling, Catherine; Gray, Charles; Guhr, David; Hoofnagle, Kara; Huffman, Jessica; Mina, Moises; Moore, Brian
2011-01-01
The behavior of bouncers at on site establishments that served alcohol was observed. Our aim was to better understand how bouncers went about their job when the bar had a mandatory policy to check identification of all customers. Utilizing an ethnographic decision model, we found that bouncers were significantly more likely to card customers that…
ERIC Educational Resources Information Center
Kleinert, Whitney L.; Silva, Meghan R.; Codding, Robin S.; Feinberg, Adam B.; St. James, Paula S.
2017-01-01
Classroom management is essential to promote learning in schools, and as such it is imperative that teachers receive adequate support to maximize their competence implementing effective classroom management strategies. One way to improve teachers' classroom managerial competence is through consultation. The Classroom Check-Up (CCU) is a structured…
76 FR 18964 - Airworthiness Directives; Costruzioni Aeronautiche Tecnam srl Model P2006T Airplanes
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-06
... Landing Gear retraction/extension ground checks performed on the P2006T, a loose Seeger ring was found on... condition for the specified products. The MCAI states: During Landing Gear retraction/extension ground... retraction/extension ground checks performed on the P2006T, a loose Seeger ring was found on the nose landing...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-22
... to require recurring checks of the Blade Inspection Method (BIM) indicator on each blade to determine whether the BIM indicator is signifying that the blade pressure may have been compromised by a blade crack... check procedures for BIM blades installed on the Model S-64E and S-64F helicopters. Several blade spars...
ERIC Educational Resources Information Center
Reinke, Wendy M.; Herman, Keith C.; Sprick, Randy
2011-01-01
Highly accessible and user-friendly, this book focuses on helping K-12 teachers increase their use of classroom management strategies that work. It addresses motivational aspects of teacher consultation that are essential, yet often overlooked. The Classroom Check-Up is a step-by-step model for assessing teachers' organizational, instructional,…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-14
... the fitting and wing structure. Checking the nuts with a suitable torque spanner to the specifications in the torque figures shown in Table 2. of the Accomplishment Instructions of BAE SYSTEMS (OPERATIONS... installed, and Doing either an ultrasonic inspection for damaged bolts or torque check of the tension bolts...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-10
... Recently, during a walk round check, an operator found an aileron trim tab hinge pin that had migrated sufficiently to cause a rubbing ...
NASA Technical Reports Server (NTRS)
Malekpour, Mahyar R.
2007-01-01
This report presents the mechanical verification of a simplified model of a rapid Byzantine-fault-tolerant self-stabilizing protocol for distributed clock synchronization systems. This protocol does not rely on any assumptions about the initial state of the system. This protocol tolerates bursts of transient failures, and deterministically converges within a time bound that is a linear function of the self-stabilization period. A simplified model of the protocol is verified using the Symbolic Model Verifier (SMV) [SMV]. The system under study consists of 4 nodes, where at most one of the nodes is assumed to be Byzantine faulty. The model checking effort is focused on verifying correctness of the simplified model of the protocol in the presence of a permanent Byzantine fault as well as confirmation of claims of determinism and linear convergence with respect to the self-stabilization period. Although model checking results of the simplified model of the protocol confirm the theoretical predictions, these results do not necessarily confirm that the protocol solves the general case of this problem. Modeling challenges of the protocol and the system are addressed. A number of abstractions are utilized in order to reduce the state space. Also, additional innovative state space reduction techniques are introduced that can be used in future verification efforts applied to this and other protocols.
Stochastic Local Search for Core Membership Checking in Hedonic Games
NASA Astrophysics Data System (ADS)
Keinänen, Helena
Hedonic games have emerged as an important tool in economics and show promise as a useful formalism to model multi-agent coalition formation in AI as well as group formation in social networks. We consider a coNP-complete problem of core membership checking in hedonic coalition formation games. No previous algorithms to tackle the problem have been presented. In this work, we overcome this by developing two stochastic local search algorithms for core membership checking in hedonic games. We demonstrate the usefulness of the algorithms by showing experimentally that they find solutions efficiently, particularly for large agent societies.
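To make the setting concrete, the following is a hedged sketch of a stochastic local search for a blocking coalition in an additively separable hedonic game; the preference representation and the simple hill-climbing rule are illustrative assumptions, not the algorithms proposed in the paper, and a failed search certifies nothing (consistent with the problem being coNP-complete).

```python
import random

def utility(i, coalition, v):
    return sum(v[i][j] for j in coalition if j != i)

def find_blocking_coalition(v, partition, restarts=50, steps=200, seed=1):
    """Search for a coalition S whose members all strictly prefer S to their
    coalition in `partition`; return S if found, else None (no certificate)."""
    rng = random.Random(seed)
    n = len(v)
    current = {i: utility(i, next(c for c in partition if i in c), v) for i in range(n)}

    def unhappy(S):  # members of S who do NOT strictly improve over the partition
        return [i for i in S if utility(i, S, v) <= current[i]]

    for _ in range(restarts):
        S = {i for i in range(n) if rng.random() < 0.5} or {rng.randrange(n)}
        for _ in range(steps):
            bad = unhappy(S)
            if not bad:
                return S                      # every member strictly improves: S blocks
            i = rng.randrange(n)              # toggle a random agent in or out of S
            T = S ^ {i}
            if T and len(unhappy(T)) <= len(bad):
                S = T
    return None

# Toy 4-agent game: mutually liked pairs {0,1} and {2,3} are split by the partition.
v = [[0, 5, -1, -1], [5, 0, -1, -1], [-1, -1, 0, 1], [-1, -1, 1, 0]]
print(find_blocking_coalition(v, [{0, 2}, {1, 3}]))   # e.g. {0, 1} or {2, 3}
```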
LACIE performance predictor final operational capability program description, volume 1
NASA Technical Reports Server (NTRS)
1976-01-01
The program EPHEMS computes the orbital parameters for up to two vehicles orbiting the earth for up to 549 days. The data represents a continuous swath about the earth, producing tables which can be used to determine when and if certain land segments will be covered. The program GRID processes NASA's climatology tape to obtain the weather indices along with associated latitudes and longitudes. The program LUMP takes substrata historical data and sample segment ID, crop window, crop window error and statistical data, checks for valid input parameters and generates the segment ID file, crop window file and the substrata historical file. Finally, the System Error Executive (SEE) Program checks YES error and truth data, CAMS error data, and signature extension data for validity and missing elements. A message is printed for each error found.
Nilchian, Firoozeh; Shakibaei, Fereshteh; Jarah, Zeinab Taghi
2017-03-01
This study aimed to evaluate the impact of visual pedagogy on dental check-ups and preventive practices among children with autism aged 6-12. In this randomized double-blind clinical trial, the cooperation of 40 children with autism aged 6-12 was assessed. The selected children were equally divided into two groups, case and control (n = 20 each). The obtained data were analyzed with statistical tests, including the Chi-square and independent t tests. The results of Cochran's test showed a significant increase in the children's cooperation with regard to fluoride therapy in the case group as the visit and training sessions were repeated (p ≤ 0.001). The findings of this study demonstrated that visual pedagogy was effective only for fluoride therapy in the case group.
NASA Astrophysics Data System (ADS)
Song, Yunquan; Lin, Lu; Jian, Ling
2016-07-01
The single-index varying-coefficient model is an important mathematical method for modeling nonlinear phenomena in science and engineering. In this paper, we develop a variable selection method for high-dimensional single-index varying-coefficient models using a shrinkage idea. The proposed procedure can simultaneously select significant nonparametric components and parametric components. Under defined regularity conditions, with appropriate selection of tuning parameters, the consistency of the variable selection procedure and the oracle property of the estimators are established. Moreover, owing to the robustness of the check loss function to outliers in finite samples, our proposed variable selection method is more robust than those based on the least squares criterion. Finally, the method is illustrated with numerical simulations.
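The robustness claim hinges on the check (quantile) loss growing only linearly in the residual, unlike the squared loss. A minimal sketch with made-up residual values:

```python
import numpy as np

def check_loss(u, tau=0.5):
    """Quantile check loss rho_tau(u) = u * (tau - 1{u < 0})."""
    u = np.asarray(u, dtype=float)
    return u * (tau - (u < 0).astype(float))

residuals = np.array([-0.5, 0.2, 1.0, 25.0])                 # the last residual is an outlier
print("check loss total  :", check_loss(residuals).sum())    # outlier enters linearly
print("squared loss total:", (residuals ** 2).sum())         # outlier dominates the criterion
```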
Model Checking Satellite Operational Procedures
NASA Astrophysics Data System (ADS)
Cavaliere, Federico; Mari, Federico; Melatti, Igor; Minei, Giovanni; Salvo, Ivano; Tronci, Enrico; Verzino, Giovanni; Yushtein, Yuri
2011-08-01
We present a model checking approach for the automatic verification of satellite operational procedures (OPs). Building a model for a system as complex as a satellite is a hard task. We overcome this obstacle by using a suitable simulator (SIMSAT) for the satellite. Our approach aims at improving OP quality assurance by automatic exhaustive exploration of all possible simulation scenarios. Moreover, our solution decreases OP verification costs by using a model checker (CMurphi) to automatically drive the simulator. We model OPs as user-executed programs observing the simulator telemetries and sending telecommands to the simulator. In order to assess the feasibility of our approach, we present experimental results on a simple meaningful scenario. Our results show that we can save up to 90% of verification time.
Determinants of preventive oral health behaviour among senior dental students in Nigeria
2013-01-01
Background To study the association between oral health behaviour of senior dental students in Nigeria and their gender, age, knowledge of preventive care, and attitudes towards preventive dentistry. Methods Questionnaires were administered to 179 senior dental students in the six dental schools in Nigeria. The questionnaire obtained information on age, gender, oral self-care, knowledge of preventive dental care and attitudes towards preventive dentistry. Attending a dental clinic for check-up by a dentist or a classmate within the last year was defined as preventive care use. Students who performed oral self-care and attended dental clinic for check-ups were noted to have complied with recommended oral self-care. Chi-square test and binary logistic regression models were used for statistical analyses. Results More male respondents agreed that the use of fluoride toothpaste was more important than the tooth brushing technique for caries prevention (P < 0.001). While the use of dental floss was very low (7.3%), females were more likely to report using dental floss (p=0.03). Older students were also more likely to comply with recommended oral self-care (p<0.001). In binary regression models, respondents who were younger (p=0.04) and those with higher knowledge of preventive dental care (p=0.008) were more likely to consume sugary snacks less than once a day. Conclusion Gender differences were observed in the awareness of the superiority of using fluoridated toothpaste over brushing for caries prevention and in the use of dental floss. While older students were more likely to comply with recommended oral self-care measures, younger students with good knowledge of preventive dental care were more likely to consume sugary snacks less than once a day. PMID:23777298
Pinnock, Claude; Yip, Jennifer L. Y.; Khawaja, Anthony P.; Luben, Robert; Hayat, Shabina; Broadway, David C.; Foster, Paul J.; Khaw, Kay-Tee; Wareham, Nick
2016-01-01
ABSTRACT Purpose: To determine if topical beta-blocker use is associated with increased cardiovascular mortality, particularly among people with self-reported glaucoma. Methods: All participants who participated in the first health check (N = 25,639) of the European Prospective Investigation into Cancer (EPIC) Norfolk cohort (1993–2013) were included in this prospective cohort study, with a median follow-up of 17.0 years. We determined use of topical beta-blockers at baseline through a self-reported questionnaire and prescription check at the first clinical visit. Cardiovascular mortality was ascertained through data linkage with the Office for National Statistics mortality database. Hazard ratios (HRs) were estimated using multivariable Cox regression models. Meta-analysis of the present study’s results together with other identified literature was performed using a random effects model. Results: We did not find an association between the use of topical beta-blockers and cardiovascular mortality (HR 0.93, 95% confidence interval, CI, 0.67–1.30). In the 514 participants with self-reported glaucoma, no association was found between the use of topical beta-blockers and cardiovascular mortality (HR 0.89, 95% CI 0.56–1.40). In the primary meta-analysis of four publications, there was no evidence of an association between the use of topical beta-blockers and cardiovascular mortality (pooled HR estimate 1.10, 95% CI 0.84–1.36). Conclusion: Topical beta-blockers do not appear to be associated with excess cardiovascular mortality. This evidence does not indicate that a change in current practice is warranted, although clinicians should continue to assess individual patients and their cardiovascular risk prior to commencing topical beta-blockers. PMID:27551956
Determinants of preventive oral health behaviour among senior dental students in Nigeria.
Folayan, Morenike O; Khami, Mohammad R; Folaranmi, Nkiru; Popoola, Bamidele O; Sofola, Oyinkan O; Ligali, Taofeek O; Esan, Ayodeji O; Orenuga, Omolola O
2013-06-18
To study the association between oral health behaviour of senior dental students in Nigeria and their gender, age, knowledge of preventive care, and attitudes towards preventive dentistry. Questionnaires were administered to 179 senior dental students in the six dental schools in Nigeria. The questionnaire obtained information on age, gender, oral self-care, knowledge of preventive dental care and attitudes towards preventive dentistry. Attending a dental clinic for check-up by a dentist or a classmate within the last year was defined as preventive care use. Students who performed oral self-care and attended dental clinic for check-ups were noted to have complied with recommended oral self-care. Chi-square test and binary logistic regression models were used for statistical analyses. More male respondents agreed that the use of fluoride toothpaste was more important than the tooth brushing technique for caries prevention (P < 0.001). While the use of dental floss was very low (7.3%), females were more likely to report using dental floss (p=0.03). Older students were also more likely to comply with recommended oral self-care (p<0.001). In binary regression models, respondents who were younger (p=0.04) and those with higher knowledge of preventive dental care (p=0.008) were more likely to consume sugary snacks less than once a day. Gender differences were observed in the awareness of the superiority of using fluoridated toothpaste over brushing for caries prevention and in the use of dental floss. While older students were more likely to comply with recommended oral self-care measures, younger students with good knowledge of preventive dental care were more likely to consume sugary snacks less than once a day.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nakaguchi, Yuji, E-mail: nkgc2003@yahoo.co.jp; Ono, Takeshi; Onitsuka, Ryota
COMPASS system (IBA Dosimetry, Schwarzenbruck, Germany) and ArcCHECK with 3DVH software (Sun Nuclear Corp., Melbourne, FL) are commercial quasi-3-dimensional (3D) dosimetry arrays. Cross-validation to compare them under the same conditions, such as a treatment plan, allows for clear evaluation of such measurement devices. In this study, we evaluated the accuracy of reconstructed dose distributions from the COMPASS system and ArcCHECK with 3DVH software using Monte Carlo simulation (MC) for multi-leaf collimator (MLC) test patterns and clinical VMAT plans. In a phantom study, ArcCHECK 3DVH showed clear differences from COMPASS, measurement and MC due to the detector resolution and the dose reconstruction method. In particular, ArcCHECK 3DVH showed a 7% difference from MC for the heterogeneous phantom. ArcCHECK 3DVH only corrects the 3D dose distribution of the treatment planning system (TPS) using the ArcCHECK measurement, and therefore the accuracy of ArcCHECK 3DVH depends on the TPS. In contrast, COMPASS showed good agreement with MC for all cases. However, the COMPASS system requires many complicated installation procedures, such as beam modeling, and appropriate commissioning is needed. In terms of clinical cases, there were no large differences for either QA device. The accuracy of the COMPASS and ArcCHECK 3DVH systems for phantoms and clinical cases was compared. Both systems have advantages and disadvantages for clinical use, and consideration of the operating environment is important. The QA system selection depends on the purpose and workflow of each hospital.
Model Checking Abstract PLEXIL Programs with SMART
NASA Technical Reports Server (NTRS)
Siminiceanu, Radu I.
2007-01-01
We describe a method to automatically generate discrete-state models of abstract Plan Execution Interchange Language (PLEXIL) programs that can be analyzed using model checking tools. Starting from a high-level description of a PLEXIL program or a family of programs with common characteristics, the generator lays the framework that models the principles of program execution. The concrete parts of the program are not automatically generated, but require the modeler to introduce them by hand. As a case study, we generate models to verify properties of the PLEXIL macro constructs that are introduced as shorthand notation. After an exhaustive analysis, we conclude that the macro definitions obey the intended semantics and behave as expected, but contingent on a few specific requirements on the timing semantics of micro-steps in the concrete executive implementation.
Struck-Lewicka, Wiktoria; Kordalewska, Marta; Bujak, Renata; Yumba Mpanga, Arlette; Markuszewski, Marcin; Jacyna, Julia; Matuszewski, Marcin; Kaliszan, Roman; Markuszewski, Michał J
2015-01-01
Prostate cancer (CaP) is a leading cause of cancer deaths in men worldwide. Despite the alarming statistics, the currently applied biomarkers are still not sufficiently specific and selective. In addition, the pathogenesis of CaP development is not fully understood. Therefore, in the present work, a metabolomics study based on urinary metabolic fingerprinting has been performed in order to scrutinize potential biomarkers that could help explain the pathomechanism of the disease and be potentially useful in its diagnosis and prognosis. Urine samples from CaP patients and healthy volunteers were analyzed with the use of high performance liquid chromatography coupled with time-of-flight mass spectrometry detection (HPLC-TOF/MS) in positive and negative polarity, as well as gas chromatography hyphenated with triple quadrupole mass spectrometry detection (GC-QqQ/MS) in scan mode. The obtained data sets were statistically analyzed using univariate and multivariate statistical analyses. Principal Component Analysis (PCA) was used to check the systems' stability and possible outliers, whereas Partial Least Squares Discriminant Analysis (PLS-DA) was performed to evaluate the quality of the model as well as its predictive ability using statistically significant metabolites. The subsequent identification of selected metabolites using the NIST library and commonly available databases allowed the creation of a list of putative biomarkers and of the related biochemical pathways they are involved in. The selected pathways, such as the urea and tricarboxylic acid cycles and amino acid and purine metabolism, can play a crucial role in the pathogenesis of prostate cancer. Copyright © 2014 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Kawahara, Hajime; Reese, Erik D.; Kitayama, Tetsu; Sasaki, Shin; Suto, Yasushi
2008-11-01
Our previous analysis indicates that small-scale fluctuations in the intracluster medium (ICM) from cosmological hydrodynamic simulations follow the lognormal probability density function. In order to test the lognormal nature of the ICM directly against X-ray observations of galaxy clusters, we develop a method of extracting statistical information about the three-dimensional properties of the fluctuations from the two-dimensional X-ray surface brightness. We first create a set of synthetic clusters with lognormal fluctuations around their mean profile given by spherical isothermal β-models, later considering polytropic temperature profiles as well. Performing mock observations of these synthetic clusters, we find that the resulting X-ray surface brightness fluctuations also follow the lognormal distribution fairly well. Systematic analysis of the synthetic clusters provides an empirical relation between the three-dimensional density fluctuations and the two-dimensional X-ray surface brightness. We analyze Chandra observations of the galaxy cluster Abell 3667, and find that its X-ray surface brightness fluctuations follow the lognormal distribution. While the lognormal model was originally motivated by cosmological hydrodynamic simulations, this is the first observational confirmation of the lognormal signature in a real cluster. Finally we check the synthetic cluster results against clusters from cosmological hydrodynamic simulations. As a result of the complex structure exhibited by simulated clusters, the empirical relation between the two- and three-dimensional fluctuation properties calibrated with synthetic clusters when applied to simulated clusters shows large scatter. Nevertheless we are able to reproduce the true value of the fluctuation amplitude of simulated clusters within a factor of 2 from their two-dimensional X-ray surface brightness alone. Our current methodology combined with existing observational data is useful in describing and inferring the statistical properties of the three-dimensional inhomogeneity in galaxy clusters.
NASA Astrophysics Data System (ADS)
Coppola, E.; Fantini, A.; Raffaele, F.; Torma, C. Z.; Bacer, S.; Giorgi, F.; Ahrens, B.; Dubois, C.; Sanchez, E.; Verdecchia, M.
2017-12-01
We assess the statistics of different daily precipitation indices in ensembles of Med-CORDEX and EURO-CORDEX experiments at high resolution (grid spacing of ~0.11°, or RCM11) and medium resolution (grid spacing of ~0.44°, or RCM44) with regional climate models (RCMs) driven by the ERA-Interim reanalysis of observations for the period 1989-2008. The assessment is carried out by comparison with a set of high resolution observation datasets for 9 European subregions. The statistics analyzed include quantitative metrics for mean precipitation, daily precipitation Probability Density Functions (PDFs), daily precipitation intensity, frequency, 95th percentile and 95th percentile of dry spell length. We assess both an ensemble including all Med-CORDEX and EURO-CORDEX models and one including the Med-CORDEX models alone. For the All Models ensembles, the RCM11 one shows a remarkable performance in reproducing the spatial patterns and seasonal cycle of mean precipitation over all regions, with a consistent and marked improvement compared to the RCM44 ensemble and the ERA-Interim reanalysis. A good consistency with observations by the RCM11 ensemble (and a substantial improvement compared to RCM44 and ERA-Interim) is found also for the daily precipitation PDFs, mean intensity and, to a lesser extent, the 95th percentile. In fact, for some regions the RCM11 ensemble overestimates the occurrence of very high intensity events while for one region the models underestimate the occurrence of the largest extremes. The RCM11 ensemble still shows a general tendency to underestimate the dry day frequency and 95th percentile of dry spell length over wetter regions, with only a marginal improvement compared to the lower resolution models. This indicates that the problem of the excessive production of low precipitation events found in many climate models persists also at relatively high resolutions, at least in wet climate regimes. Concerning the Med-CORDEX model ensembles we find that their performance is of similar quality as that of the all-models over the Mediterranean regions analyzed. Finally, we stress the need of consistent and quality checked fine scale observation datasets for the assessment of RCMs run at increasingly high horizontal resolutions.
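For readers unfamiliar with these indices, the hedged sketch below computes wet-day frequency, mean intensity, the 95th percentile of wet-day precipitation and the 95th percentile of dry spell length on a synthetic daily series; the 1 mm/day threshold and the synthetic data are illustrative choices, not the exact Med-CORDEX/EURO-CORDEX metric definitions.

```python
from itertools import groupby
import numpy as np

rng = np.random.default_rng(42)
pr = rng.gamma(shape=0.4, scale=6.0, size=20 * 365)    # synthetic daily precipitation [mm/day]
pr[rng.random(pr.size) < 0.55] = 0.0                   # impose additional dry days

wet = pr >= 1.0                                        # 1 mm/day wet-day threshold
dry_spells = np.array([sum(1 for _ in run) for is_dry, run in groupby(~wet) if is_dry])

print(f"wet-day frequency        : {wet.mean():.2f}")
print(f"mean wet-day intensity   : {pr[wet].mean():.1f} mm/day")
print(f"95th pct (wet days)      : {np.percentile(pr[wet], 95):.1f} mm/day")
print(f"95th pct dry spell length: {np.percentile(dry_spells, 95):.0f} days")
```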
An approach to checking case-crossover analyses based on equivalence with time-series methods.
Lu, Yun; Symons, James Morel; Geyh, Alison S; Zeger, Scott L
2008-03-01
The case-crossover design has been increasingly applied to epidemiologic investigations of acute adverse health effects associated with ambient air pollution. The correspondence of the design to that of matched case-control studies makes it inferentially appealing for epidemiologic studies. Case-crossover analyses generally use conditional logistic regression modeling. This technique is equivalent to time-series log-linear regression models when there is a common exposure across individuals, as in air pollution studies. Previous methods for obtaining unbiased estimates for case-crossover analyses have assumed that time-varying risk factors are constant within reference windows. In this paper, we rely on the connection between case-crossover and time-series methods to illustrate model-checking procedures from log-linear model diagnostics for time-stratified case-crossover analyses. Additionally, we compare the relative performance of the time-stratified case-crossover approach to time-series methods under 3 simulated scenarios representing different temporal patterns of daily mortality associated with air pollution in Chicago, Illinois, during 1995 and 1996. Whenever a model, be it time-series or case-crossover, fails to account appropriately for fluctuations in time that confound the exposure, the effect estimate will be biased. It is therefore important to perform model-checking in time-stratified case-crossover analyses rather than assume the estimator is unbiased.
Barbieri, Antoine; Anota, Amélie; Conroy, Thierry; Gourgou-Bourgade, Sophie; Juzyna, Beata; Bonnetain, Franck; Lavergne, Christian; Bascoul-Mollevi, Caroline
2016-07-01
A new longitudinal statistical approach was compared to the classical methods currently used to analyze health-related quality-of-life (HRQoL) data. The comparison was made using data in patients with metastatic pancreatic cancer. Three hundred forty-two patients from the PRODIGE4/ACCORD 11 study were randomly assigned to FOLFIRINOX versus gemcitabine regimens. HRQoL was evaluated using the European Organization for Research and Treatment of Cancer (EORTC) QLQ-C30. The classical analysis uses a linear mixed model (LMM), considering an HRQoL score as a good representation of the true value of the HRQoL, following EORTC recommendations. In contrast, built on item response theory (IRT), our approach considered HRQoL as a latent variable directly estimated from the raw data. For polytomous items, we extended the partial credit model to a longitudinal analysis (longitudinal partial credit model [LPCM]), thereby modeling the latent trait as a function of time and other covariates. Both models gave the same conclusions on 11 of 15 HRQoL dimensions. HRQoL evolution was similar between the 2 treatment arms, except for the symptoms of pain. Indeed, according to the LPCM, pain perception was significantly less important in the FOLFIRINOX arm than in the gemcitabine arm. For most of the scales, HRQoL changes over time, and no difference was found between treatments in terms of HRQoL. The use of the LMM to study the HRQoL score does not seem appropriate: it is an easy-to-use model, but its basic statistical assumptions are not satisfied. Our IRT model may be more complex but shows the same qualities and gives similar results. It has the additional advantage of being more precise and suitable because of its direct use of the raw data. © The Author(s) 2015.
Genetic demixing and evolution in linear stepping stone models
NASA Astrophysics Data System (ADS)
Korolev, K. S.; Avlund, Mikkel; Hallatschek, Oskar; Nelson, David R.
2010-04-01
Results for mutation, selection, genetic drift, and migration in a one-dimensional continuous population are reviewed and extended. The population is described by a continuous limit of the stepping stone model, which leads to the stochastic Fisher-Kolmogorov-Petrovsky-Piscounov equation with additional terms describing mutations. Although the stepping stone model was first proposed for population genetics, it is closely related to “voter models” of interest in nonequilibrium statistical mechanics. The stepping stone model can also be regarded as an approximation to the dynamics of a thin layer of actively growing pioneers at the frontier of a colony of micro-organisms undergoing a range expansion on a Petri dish. The population tends to segregate into monoallelic domains. This segregation slows down genetic drift and selection because these two evolutionary forces can only act at the boundaries between the domains; the effects of mutation, however, are not significantly affected by the segregation. Although fixation in the neutral well-mixed (or “zero-dimensional”) model occurs exponentially in time, it occurs only algebraically fast in the one-dimensional model. An unusual sublinear increase is also found in the variance of the spatially averaged allele frequency with time. If selection is weak, selective sweeps occur exponentially fast in both well-mixed and one-dimensional populations, but the time constants are different. The relatively unexplored problem of evolutionary dynamics at the edge of an expanding circular colony is studied as well. Also reviewed are how the observed patterns of genetic diversity can be used for statistical inference and the differences are highlighted between the well-mixed and one-dimensional models. Although the focus is on two alleles or variants, q -allele Potts-like models of gene segregation are considered as well. Most of the analytical results are checked with simulations and could be tested against recent spatial experiments on range expansions of inoculations of Escherichia coli and Saccharomyces cerevisiae.
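A hedged sketch of the demixing phenomenon discussed above: a one-dimensional stepping-stone simulation with nearest-neighbour migration and Wright-Fisher drift. The deme size, migration rate, periodic boundaries and update rule are illustrative choices, not the continuum model analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
L, N, generations, m = 200, 20, 500, 0.1   # demes, deme size, generations, migration rate
f = rng.random(L)                          # initial allele-1 frequency in each deme

for _ in range(generations):
    f_mig = (1 - m) * f + 0.5 * m * (np.roll(f, 1) + np.roll(f, -1))   # migration step
    f = rng.binomial(N, f_mig) / N                                     # drift: Wright-Fisher resampling

boundaries = np.sum(np.abs(np.diff((f > 0.5).astype(int))))
print(f"fraction of fixed demes: {np.mean((f == 0) | (f == 1)):.2f}, domain boundaries: {boundaries}")
```

Running the loop longer coarsens the pattern: most demes fix on one allele and the number of domain boundaries decreases, which is the segregation that slows drift and selection in the text above.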
Bounded Parametric Model Checking for Elementary Net Systems
NASA Astrophysics Data System (ADS)
Knapik, Michał; Szreter, Maciej; Penczek, Wojciech
Bounded Model Checking (BMC) is an efficient verification method for reactive systems. BMC has been applied so far to verification of properties expressed in (timed) modal logics, but never to their parametric extensions. In this paper we show, for the first time that BMC can be extended to PRTECTL - a parametric extension of the existential version of CTL. To this aim we define a bounded semantics and a translation from PRTECTL to SAT. The implementation of the algorithm for Elementary Net Systems is presented, together with some experimental results.
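As a hedged illustration of the "bounded" idea only (the PRTECTL-to-SAT translation itself is not reproduced here), the sketch below unrolls a toy transition system to a depth k and searches for a counterexample path within that bound.

```python
def bmc_reachable(init, trans, bad, k):
    """Return a path of at most k transitions from an initial state to a `bad`
    state, or None if no such path exists within the bound."""
    frontier = [[s] for s in init]
    for _ in range(k):
        next_frontier = []
        for path in frontier:
            if bad(path[-1]):
                return path
            next_frontier += [path + [t] for t in trans(path[-1])]
        frontier = next_frontier
    return next((p for p in frontier if bad(p[-1])), None)

# Toy system: a 2-bit counter that is claimed never to reach (1, 1).
init = [(0, 0)]
trans = lambda s: [((s[0] + (s[1] == 1)) % 2, (s[1] + 1) % 2)]   # binary increment
bad = lambda s: s == (1, 1)
print(bmc_reachable(init, trans, bad, k=4))   # [(0, 0), (0, 1), (1, 0), (1, 1)]
```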
Lee, Kyung-Eun; Park, Hyun-Seok
2015-01-01
Epigenetic computational analyses based on Markov chains can integrate dependencies between regions in the genome that are directly adjacent. In this paper, the BED files of fifteen chromatin states of the Broad Histone Track of the ENCODE project are parsed, and comparative nucleotide frequencies of regional chromatin blocks are thoroughly analyzed to detect the Markov property in them. We perform various tests in the frequency domain to check for the presence of the Markov property in the various chromatin states. We apply these tests to each region of the fifteen chromatin states. The results of our simulation indicate that some of the chromatin states possess a stronger Markov property than others. We discuss the significance of our findings in statistical models of nucleotide sequences that are necessary for the computational analysis of functional units in noncoding DNA.
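One generic way to check for first-order dependence of this kind is a chi-square test of the transition-count table against independence. The sketch below applies it to a toy nucleotide sequence; it is a minimal illustration, not the specific battery of tests used on the ENCODE tracks.

```python
import numpy as np
from scipy.stats import chi2_contingency

rng = np.random.default_rng(0)
P = np.full((4, 4), 0.1) + 0.6 * np.eye(4)     # toy chain biased toward repeating a base
seq = [0]
for _ in range(5000):
    seq.append(rng.choice(4, p=P[seq[-1]]))

counts = np.zeros((4, 4))
for a, b in zip(seq[:-1], seq[1:]):
    counts[a, b] += 1                          # transition (dinucleotide) counts

chi2, pvalue, dof, _ = chi2_contingency(counts)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {pvalue:.3g}")   # tiny p: next base depends on current
```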
Testing for detailed balance in a financial market
NASA Astrophysics Data System (ADS)
Fiebig, H. R.; Musgrove, D. P.
2015-06-01
We test a historical price-time series in a financial market (the NASDAQ 100 index) for a statistical property known as detailed balance. The presence of detailed balance would imply that the market can be modeled by a stochastic process based on a Markov chain, thus leading to equilibrium. In economic terms, a positive outcome of the test would support the efficient market hypothesis, a cornerstone of neo-classical economic theory. In contrast to the usage in prevalent economic theory the term equilibrium here is tied to the returns, rather than the price-time series. The test is based on an action functional S constructed from the elements of the detailed balance condition and the historical data set, and then analyzing S by means of simulated annealing. Checks are performed to verify the validity of the analysis method. We discuss the outcome of this analysis.
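A hedged, much simpler stand-in for the idea of the test: discretize returns into a few states, estimate the transition matrix, and compare the forward and reverse probability fluxes p_i P_ij and p_j P_ji, which are equal under detailed balance. The action-functional and simulated-annealing machinery of the paper is not reproduced, and the data here are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)
returns = rng.standard_normal(20000)                  # placeholder for index returns
states = np.digitize(returns, np.quantile(returns, [0.25, 0.5, 0.75]))   # 4 return bins

K = 4
C = np.zeros((K, K))
for a, b in zip(states[:-1], states[1:]):
    C[a, b] += 1
P = C / C.sum(axis=1, keepdims=True)                  # estimated transition matrix
p = C.sum(axis=1) / C.sum()                           # empirical occupation probabilities

flux_asymmetry = np.abs(p[:, None] * P - (p[:, None] * P).T)
print(f"max |p_i P_ij - p_j P_ji| = {flux_asymmetry.max():.4f}")   # near zero under detailed balance
```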
Toropova, Alla P; Toropov, Andrey A; Rallo, Robert; Leszczynska, Danuta; Leszczynski, Jerzy
2015-02-01
The Monte Carlo technique has been used to build up quantitative structure-activity relationships (QSARs) for the prediction of the dark cytotoxicity and photo-induced cytotoxicity of metal oxide nanoparticles to the bacteria Escherichia coli (minus logarithm of the lethal concentration for 50% of bacteria, pLC50, LC50 in mol/L). The representation of the nanoparticles includes (i) in the case of dark cytotoxicity, a simplified molecular input-line entry system (SMILES) string, and (ii) in the case of photo-induced cytotoxicity, a SMILES string plus the symbol '^'. The predictability of the approach is checked with six random distributions of the available data into visible training and calibration sets and an invisible validation set. The statistical characteristics of these models are correlation coefficients of 0.90-0.94 (training set) and 0.73-0.98 (validation set). Copyright © 2014 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Turkoglu, Danyal
Precise knowledge of prompt gamma-ray intensities following neutron capture is critical for elemental and isotopic analyses, homeland security, modeling nuclear reactors, etc. A recently-developed database of prompt gamma-ray production cross sections and nuclear structure information in the form of a decay scheme, called the Evaluated Gamma-ray Activation File (EGAF), is under revision. Statistical model calculations are useful for checking the consistency of the decay scheme, providing insight on its completeness and accuracy. Furthermore, these statistical model calculations are necessary to estimate the contribution of continuum gamma-rays, which cannot be experimentally resolved due to the high density of excited states in medium- and heavy-mass nuclei. Decay-scheme improvements in EGAF lead to improvements to other databases (Evaluated Nuclear Structure Data File, Reference Input Parameter Library) that are ultimately used in nuclear-reaction models to generate the Evaluated Nuclear Data File (ENDF). Gamma-ray transitions following neutron capture in 93Nb have been studied at the cold-neutron beam facility at the Budapest Research Reactor. Measurements have been performed using a coaxial HPGe detector with Compton suppression. Partial gamma-ray production capture cross sections at a neutron velocity of 2200 m/s have been deduced relative to that of the 255.9-keV transition after cold-neutron capture by 93Nb. With the measurement of a niobium chloride target, this partial cross section was internally standardized to the cross section for the 1951-keV transition after cold-neutron capture by 35Cl. The resulting (0.1377 ± 0.0018) barn (b) partial cross section produced a calibration factor that was 23% lower than previously measured for the EGAF database. The thermal-neutron cross sections were deduced for the 93Nb(n,γ)94mNb and 93Nb(n,γ)94gNb reactions by summing the experimentally-measured partial gamma-ray production cross sections associated with the ground-state transitions below the 396-keV level and combining that summation with the contribution to the ground state from the quasi-continuum above 396 keV, determined with Monte Carlo statistical model calculations using the DICEBOX computer code. These values, σ_m and σ_0, were (0.83 ± 0.05) b and (1.16 ± 0.11) b, respectively, and found to be in agreement with literature values. Comparison of the modeled population and experimental depopulation of individual levels confirmed tentative spin assignments and suggested changes where imbalances existed.
Austrian Daily Climate Data Rescue and Quality Control
NASA Astrophysics Data System (ADS)
Jurkovic, A.; Lipa, W.; Adler, S.; Albenberger, J.; Lechner, W.; Swietli, R.; Vossberg, I.; Zehetner, S.
2010-09-01
Checked climate datasets are a "conditio sine qua non" for all projects that are relevant for environment and climate. In the framework of climate change studies and analysis it is essential to work with quality controlled and trustful data. Furthermore these datasets are used as input for various simulation models. In regard to investigations of extreme events, like strong precipitation periods, drought periods and similar ones we need climate data in high temporal resolution (at least in daily resolution). Because of the historical background - during Second World War the majority of our climate sheets were sent to Berlin, where the historical sheets were destroyed by a bomb attack and so important information got lost - only several climate sheets, mostly duplicates, before 1939 are available and stored in our climate data archive. In 1970 the Central Institute for Meteorology and Geodynamics in Vienna started a first attempt to digitize climate data by means of punch cards. With the introduction of a routinely climate data quality control in 1984 we can speak of high-class-checked daily data (finally checked data, quality flag 6). Our group is working on the processing of digitization and quality control of the historical data for the period 1872 to 1983 for 18 years. Since 2007 it was possible to intensify the work (processes) in the framework of an internal project, namely Austrian Climate Data Rescue and Quality Control. The aim of this initiative was - and still is - to supply daily data in an outstanding good and uniform quality. So this project is a kind of pre-project for all scientific projects which are working with daily data. In addition to routine quality checks (that are running since 1984) using the commercial Bull Software we are testing our data with additional open source software, namely ProClim.db. By the use of this spatial and statistical test procedure, the elements air temperature and precipitation - for several sites in Carinthia - could already be checked, flagged and corrected. Checking the output (so called- error list) of ProClim is very time consuming and needs trained staff; however, in last instance it is necessary. Due to the guideline "Your archive is your business card for quality" the sub-project NEW ARCHIVE was initialized and started at the end of 2009. Our paper archive contains historical, up to 150 year-old, climate sheets that are valuable cultural assets. Unfortunately the storage of these historical and actual data treasures turned out to be more than suboptimal (insufficient protection against dust, dirt, humidity and light incidence). Because of this fact a concept for a new storage system and archive database was generated and already partly realized. In a nutshell this presentation shows on the one hand the importance of recovering historical climate sheets for climate change research - even if it is exhausting and time consuming - and gives on the other hand a general overview of used quality control procedures at our institute.
A Bayesian approach to estimate the biomass of anchovies off the coast of Perú.
Quiroz, Zaida C; Prates, Marcos O; Rue, Håvard
2015-03-01
The Northern Humboldt Current System (NHCS) is the world's most productive ecosystem in terms of fish. In particular, the Peruvian anchovy (Engraulis ringens) is the major prey of the main top predators, like seabirds, fish, humans, and other mammals. In this context, it is important to understand the dynamics of the anchovy distribution to preserve it as well as to exploit its economic capacities. Using the data collected by the "Instituto del Mar del Perú" (IMARPE) during a scientific survey in 2005, we present a statistical analysis that has as main goals: (i) to adapt to the characteristics of the sampled data, such as spatial dependence, high proportions of zeros and big size of samples; (ii) to provide important insights on the dynamics of the anchovy population; and (iii) to propose a model for estimation and prediction of anchovy biomass in the NHCS offshore from Perú. These data were analyzed in a Bayesian framework using the integrated nested Laplace approximation (INLA) method. Further, to select the best model and to study the predictive power of each model, we performed model comparisons and predictive checks, respectively. Finally, we carried out a Bayesian spatial influence diagnostic for the preferred model. © 2014, The International Biometric Society.
NASA Technical Reports Server (NTRS)
Horvath, Joan C.; Alkalaj, Leon J.; Schneider, Karl M.; Amador, Arthur V.; Spitale, Joseph N.
1993-01-01
Robotic spacecraft are controlled by sets of commands called 'sequences.' These sequences must be checked against mission constraints. Making our existing constraint checking program faster would enable new capabilities in our uplink process. Therefore, we are rewriting this program to run on a parallel computer. To do so, we had to determine how to run constraint-checking algorithms in parallel and create a new method of specifying spacecraft models and constraints. This new specification gives us a means of representing flight systems and their predicted response to commands which could be used in a variety of applications throughout the command process, particularly during anomaly or high-activity operations. This commonality could reduce operations cost and risk for future complex missions. Lessons learned in applying some parts of this system to the TOPEX/Poseidon mission will be described.
[Model for unplanned self extubation of ICU patients using system dynamics approach].
Song, Yu Gil; Yun, Eun Kyoung
2015-04-01
In this study a system dynamics methodology was used to identify correlation and nonlinear feedback structure among factors affecting unplanned extubation (UE) of ICU patients and to construct and verify a simulation model. Factors affecting UE were identified through a theoretical background established by reviewing the literature and preceding studies and referencing various statistical data. Related variables were decided through verification of content validity by an expert group. A causal loop diagram (CLD) was made based on the variables. Stock & Flow modeling using Vensim PLE Plus Version 6.0b was performed to establish a model for UE. Based on the literature review and expert verification, 18 variables associated with UE were identified and the CLD was prepared. From the prepared CLD, a model was developed by converting it to a Stock & Flow Diagram. Results of the simulation showed that patient stress, an agitated patient state, restraint application, patient movability, and individual intensive nursing were the variables with the greatest effect on UE probability. To verify agreement of the UE model with real situations, simulation with 5 cases was performed. An equation check and sensitivity analysis on TIME STEP were executed to validate model integrity. Results show that identification of a proper model enables prediction of UE probability. This prediction allows for adjustment of related factors, and provides basic data to develop nursing interventions to decrease UE.
A new statistical methodology predicting chip failure probability considering electromigration
NASA Astrophysics Data System (ADS)
Sun, Ted
In this research thesis, we present a new approach to analyzing chip reliability subject to electromigration (EM), whose fundamental causes and manifestation in different materials are also presented in this thesis. This new approach utilizes the statistical nature of EM failure in order to assess overall EM risk. It includes within-die temperature variations from the chip's temperature map extracted by an Electronic Design Automation (EDA) tool to estimate the failure probability of a design. Both the power estimation and thermal analysis are performed in the EDA flow. We first used the traditional EM approach to analyze the design, which involves 6 metal and 5 via layers, with a single temperature across the entire chip. Next, we used the same traditional approach but with a realistic temperature map. The traditional EM analysis approach, the same approach coupled with a temperature map, and the comparison between the results with and without the temperature map are presented in this research. A comparison between these two results confirms that using a temperature map yields a less pessimistic estimation of the chip's EM risk. Finally, we employed the statistical methodology we developed, considering a temperature map and different use-condition voltages and frequencies, to estimate the overall failure probability of the chip. The statistical model established accounts for scaling through the traditional Black equation and four major use conditions. The statistical result comparisons are within our expectations. The results of this statistical analysis confirm that the chip-level failure probability is higher i) at higher use-condition frequencies for all use-condition voltages, and ii) when a single temperature instead of a temperature map across the chip is considered. In this thesis, I start with an overall review of current design types, common flows, and the necessary verification and reliability checking steps used in the IC design industry. Furthermore, the important concepts of "Scripting Automation", which is used to integrate the diversified EDA tools in this research work, are also described in detail with several examples, and my completed code is included in the appendix for reference. Hopefully, this structure of the thesis will give readers a thorough understanding of my research work, from the automation of EDA tools to statistical data generation, from the nature of EM to the statistical model construction, and the comparison between the traditional EM analysis and the statistical EM analysis approaches.
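A hedged sketch of the kind of roll-up described above: Black's equation gives each segment's median time to failure from its current density and local temperature, a lognormal time-to-failure distribution converts that to a per-segment failure probability, and the chip is assumed to fail if any segment fails. All parameter values (prefactor, current exponent, activation energy, sigma, the temperature and current-density maps) are illustrative assumptions, not values from the thesis.

```python
import numpy as np
from scipy.stats import norm

KB = 8.617e-5                                    # Boltzmann constant [eV/K]

def mttf_black(j, temp_k, a=1e-5, n=2.0, ea=0.9):
    """Median time to failure [h] from Black's equation: A * j**-n * exp(Ea / (k*T))."""
    return a * j ** (-n) * np.exp(ea / (KB * temp_k))

def segment_fail_prob(t, j, temp_k, sigma=0.5):
    """Lognormal time-to-failure: P(fail by t) = Phi((ln t - ln t50) / sigma)."""
    return norm.cdf((np.log(t) - np.log(mttf_black(j, temp_k))) / sigma)

rng = np.random.default_rng(0)
temp_map = rng.uniform(350.0, 400.0, size=1000)    # per-segment temperatures [K] from a map
jdens = rng.uniform(0.5, 2.0, size=1000)           # per-segment current densities (arbitrary units)

p_seg = segment_fail_prob(t=10 * 8760.0, j=jdens, temp_k=temp_map)   # 10 years of use
p_chip = 1.0 - np.prod(1.0 - p_seg)                                  # weakest-link assumption
print(f"chip-level EM failure probability over 10 years: {p_chip:.3e}")
```

Replacing the temperature map with a single worst-case temperature makes every segment's probability pessimistic, which is the qualitative effect reported above.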
Prehospital Use of Plasma for Traumatic Hemorrhage
2014-06-01
... a combination of automated scanning and careful checking by eye, Bush, Gentry, and Glass converted the marks and free text on the surveys into an electronic format that was accessible to statistical methods ... available for later review during transfusion reaction investigations ... All study subjects' blood typing results will be placed in their electronic ...
Image Filtering with Boolean and Statistical Operators.
1983-12-01
NASA Astrophysics Data System (ADS)
Ramezanpour, Abolfazl; Mashaghi, Alireza
2017-07-01
A fundamental problem in medicine and biology is to assign states, e.g. healthy or diseased, to cells, organs or individuals. State assignment or making a diagnosis is often a nontrivial and challenging process and, with the advent of omics technologies, the diagnostic challenge is becoming more and more serious. The challenge lies not only in the increasing number of measured properties and dynamics of the system (e.g. cell or human body) but also in the co-evolution of multiple states and overlapping properties, and degeneracy of states. We develop, from first principles, a generic rational framework for state assignment in cell biology and medicine, and demonstrate its applicability with a few simple theoretical case studies from medical diagnostics. We show how disease-related statistical information can be used to build a comprehensive model that includes the relevant dependencies between clinical and laboratory findings (signs) and diseases. In particular, we include disease-disease and sign-sign interactions and study how one can infer the probability of a disease in a patient with given signs. We perform comparative analysis with simple benchmark models to check the performances of our models. We find that including interactions can significantly change the statistical importance of the signs and diseases. This first principles approach, as we show, facilitates the early diagnosis of disease by taking interactions into accounts, and enables the construction of consensus diagnostic flow charts. Additionally, we envision that our approach will find applications in systems biology, and in particular, in characterizing the phenome via the metabolome, the proteome, the transcriptome, and the genome.
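As a toy illustration of the kind of inference described (not the authors' construction), the sketch below scores the four joint states of two hypothetical diseases given three observed binary signs with a small log-linear model that includes a disease-disease interaction term; all weights are made up.

```python
import itertools
import numpy as np

signs = np.array([1, 0, 1])                    # observed binary signs s1..s3
alpha = np.array([-2.0, -3.0])                 # disease prior biases
W = np.array([[1.5, 0.2, 2.0],                 # coupling of disease d1 to each sign
              [0.3, 2.5, 0.4]])                # coupling of disease d2 to each sign
gamma = 1.0                                    # disease-disease interaction (comorbidity)

def score(d):
    d = np.asarray(d, dtype=float)
    return alpha @ d + d @ W @ signs + gamma * d[0] * d[1]

configs = list(itertools.product([0, 1], repeat=2))            # (d1, d2) in {0,1}^2
weights = np.array([np.exp(score(d)) for d in configs])
probs = weights / weights.sum()                                # normalize over disease states

for d, p in zip(configs, probs):
    print(f"P(d1={d[0]}, d2={d[1]} | signs) = {p:.3f}")
print("P(d1=1 | signs) =", round(probs[2] + probs[3], 3))      # marginal over d2
```

Setting gamma to zero removes the interaction and changes the marginal disease probabilities, illustrating the paper's point that interactions can alter the statistical importance of signs and diseases.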
A flexible bayesian model for testing for transmission ratio distortion.
Casellas, Joaquim; Manunza, Arianna; Mercader, Anna; Quintanilla, Raquel; Amills, Marcel
2014-12-01
Current statistical approaches to investigate the nature and magnitude of transmission ratio distortion (TRD) are scarce and restricted to the most common experimental designs such as F2 populations and backcrosses. In this article, we describe a new Bayesian approach to check TRD within a given biallelic genetic marker in a diploid species, providing a highly flexible framework that can accommodate any kind of population structure. This model relies on the genotype of each offspring and thus integrates all available information from either the parents' genotypes or population-specific allele frequencies and yields TRD estimates that can be corroborated by the calculation of a Bayes factor (BF). This approach has been evaluated on simulated data sets with appealing statistical performance. As a proof of concept, we have also tested TRD in a porcine population with five half-sib families and 352 offspring. All boars and piglets were genotyped with the Porcine SNP60 BeadChip, whereas genotypes from the sows were not available. The SNP-by-SNP screening of the pig genome revealed 84 SNPs with decisive evidences of TRD (BF > 100) after accounting for multiple testing. Many of these regions contained genes related to biological processes (e.g., nucleosome assembly and co-organization, DNA conformation and packaging, and DNA complex assembly) that are critically associated with embryonic viability. The implementation of this method, which overcomes many of the limitations of previous approaches, should contribute to fostering research on TRD in both model and nonmodel organisms. Copyright © 2014 by the Genetics Society of America.
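For intuition, a much simpler, hedged calculation in the same spirit: a Bayes factor comparing Mendelian transmission (probability 0.5) against a Beta(1,1) prior on the transmission probability, given allele-transmission counts at a single marker. The counts are invented, and this ignores the family structure and genotype uncertainty handled by the approach above.

```python
from math import comb, log10

def bf_trd(k, n):
    """BF10 = m1 / m0, with m1 = 1/(n+1) (binomial likelihood with a Beta(1,1)
    prior on p integrated out) and m0 = C(n,k) * 0.5**n (Mendelian transmission)."""
    m1 = 1.0 / (n + 1)
    m0 = comb(n, k) * 0.5 ** n
    return m1 / m0

for k, n in [(180, 352), (230, 352)]:          # roughly balanced vs. clearly distorted counts
    bf = bf_trd(k, n)
    print(f"k = {k}/{n}: BF10 = {bf:.3g} (log10 BF = {log10(bf):.2f})")
```

Under the "decisive evidence" convention quoted above, only transmission counts giving BF > 100 would be flagged as showing TRD.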
ERIC Educational Resources Information Center
Barrett, Jeffrey E.; Sarama, Julie; Clements, Douglas H.; Cullen, Craig; McCool, Jenni; Witkowski-Rumsey, Chepina; Klanderman, David
2012-01-01
We examined children's development of strategic and conceptual knowledge for linear measurement. We conducted teaching experiments with eight students in grades 2 and 3, based on our hypothetical learning trajectory for length to check its coherence and to strengthen the domain-specific model for learning and teaching. We checked the hierarchical…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-08
... requires repetitive inspections and torque checks of the hanger fittings and strut forward bulkhead of the forward engine mount ... corrective actions are replacing the fasteners; removing loose fasteners; tightening all Group A ...
Landy, Rebecca; Cheung, Li C; Schiffman, Mark; Gage, Julia C; Hyun, Noorie; Wentzensen, Nicolas; Kinney, Walter K; Castle, Philip E; Fetterman, Barbara; Poitras, Nancy E; Lorey, Thomas; Sasieni, Peter D; Katki, Hormuzd A
2018-06-01
Electronic health records (EHR) are increasingly used by epidemiologists studying disease following surveillance testing to provide evidence for screening intervals and referral guidelines. Although EHR are cost-effective, undiagnosed prevalent disease and interval censoring (in which asymptomatic disease is only observed at the time of testing) raise substantial analytic issues when estimating risk that cannot be addressed using Kaplan-Meier methods. Based on our experience analysing EHR from cervical cancer screening, we previously proposed the logistic-Weibull model to address these issues. Here we demonstrate how the choice of statistical method can impact risk estimates. We use observed data on 41,067 women in the cervical cancer screening program at Kaiser Permanente Northern California, 2003-2013, as well as simulations to evaluate the ability of different methods (Kaplan-Meier, Turnbull, Weibull and logistic-Weibull) to accurately estimate risk within a screening program. Cumulative risk estimates from the statistical methods varied considerably, with the largest differences occurring for prevalent disease risk when baseline disease ascertainment was random but incomplete. Kaplan-Meier underestimated risk at earlier times and overestimated risk at later times in the presence of interval censoring or undiagnosed prevalent disease. Turnbull performed well, though it was inefficient and not smooth. The logistic-Weibull model performed well, except when event times did not follow a Weibull distribution. We have demonstrated that methods for right-censored data, such as Kaplan-Meier, result in biased estimates of disease risks when applied to interval-censored data, such as screening programs using EHR data. The logistic-Weibull model is attractive, but the model fit must be checked against Turnbull non-parametric risk estimates. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
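A hedged sketch of the interval-censoring issue: each event time is only known to lie between two screening visits (or after the last one), and a Weibull model can be fitted by maximizing the interval-censored likelihood S(L) - S(R). This is only the Weibull piece on simulated visits, not the logistic-Weibull model for undiagnosed prevalent disease evaluated above; visit spacing and parameters are invented.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
t_true = rng.weibull(1.5, size=500) * 5.0                 # latent event times [years]
visits = np.arange(0.0, 12.1, 3.0)                        # screening visits every 3 years
left = np.array([np.max(visits[visits < ti], initial=0.0) for ti in t_true])
right = np.array([np.min(visits[visits >= ti], initial=np.inf) for ti in t_true])  # inf: right-censored

def surv(x, shape, scale):
    return np.exp(-((x / scale) ** shape))

def negloglik(log_params):
    shape, scale = np.exp(log_params)                     # keep both parameters positive
    p = surv(left, shape, scale) - surv(right, shape, scale)   # surv(inf) = 0 handles censoring
    return -np.sum(np.log(np.clip(p, 1e-300, None)))

fit = minimize(negloglik, x0=np.log([1.0, 3.0]), method="Nelder-Mead")
print("estimated Weibull shape and scale:", np.exp(fit.x).round(2))   # true values: 1.5 and 5.0
```

Treating the right interval endpoint as an exact event time (the implicit Kaplan-Meier view) would shift the estimates, which is the bias the comparison above is about.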
Varet, Hugo; Brillet-Guéguen, Loraine; Coppée, Jean-Yves; Dillies, Marie-Agnès
2016-01-01
Several R packages exist for the detection of differentially expressed genes from RNA-Seq data. The analysis process includes three main steps, namely normalization, dispersion estimation and test for differential expression. Quality control steps along this process are recommended but not mandatory, and failing to check the characteristics of the dataset may lead to spurious results. In addition, normalization methods and statistical models are not exchangeable across the packages without adequate transformations the users are often not aware of. Thus, dedicated analysis pipelines are needed to include systematic quality control steps and prevent errors from misusing the proposed methods. SARTools is an R pipeline for differential analysis of RNA-Seq count data. It can handle designs involving two or more conditions of a single biological factor with or without a blocking factor (such as a batch effect or a sample pairing). It is based on DESeq2 and edgeR and is composed of an R package and two R script templates (for DESeq2 and edgeR respectively). Tuning a small number of parameters and executing one of the R scripts, users have access to the full results of the analysis, including lists of differentially expressed genes and a HTML report that (i) displays diagnostic plots for quality control and model hypotheses checking and (ii) keeps track of the whole analysis process, parameter values and versions of the R packages used. SARTools provides systematic quality controls of the dataset as well as diagnostic plots that help to tune the model parameters. It gives access to the main parameters of DESeq2 and edgeR and prevents untrained users from misusing some functionalities of both packages. By keeping track of all the parameters of the analysis process it fits the requirements of reproducible research.
Visual Predictive Check in Models with Time-Varying Input Function.
Largajolli, Anna; Bertoldo, Alessandra; Campioni, Marco; Cobelli, Claudio
2015-11-01
Nonlinear mixed effects models are commonly used modeling techniques in pharmaceutical research, as they enable the characterization of individual profiles together with the population to which the individuals belong. To ensure their correct use, it is fundamental to provide powerful diagnostic tools able to evaluate the predictive performance of the models. The visual predictive check (VPC) is a commonly used tool that helps the user check by visual inspection whether the model is able to reproduce the variability and the main trend of the observed data. However, simulation from the model is not always trivial, for example when using models with a time-varying input function (IF). In this class of models, there is a potential mismatch between each set of simulated parameters and the associated individual IF, which can cause an incorrect profile simulation. We introduce a refinement of the VPC by taking into consideration a correlation term (the Mahalanobis or normalized Euclidean distance) that helps associate the correct IF with the individual set of simulated parameters. We investigate and compare its performance with the standard VPC in models of the glucose and insulin system applied to real and simulated data and in a simulated pharmacokinetic/pharmacodynamic (PK/PD) example. The newly proposed VPC appears to perform better than the standard VPC, especially for models with large variability in the IF, where the probability of simulating incorrect profiles is higher.
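A hedged sketch of the distance-based matching idea only (not the full VPC refinement): each simulated parameter vector is paired with the individual whose estimated parameters are closest in Mahalanobis distance, and that individual's measured input function would then drive the simulated profile. All numbers below are made up.

```python
import numpy as np
from scipy.spatial.distance import mahalanobis

rng = np.random.default_rng(0)
cov = [[0.04, 0.005], [0.005, 0.002]]
individual_params = rng.multivariate_normal([1.0, 0.2], cov, size=40)   # per-subject estimates
simulated_params = rng.multivariate_normal([1.0, 0.2], cov, size=40)    # one simulated set each

VI = np.linalg.inv(np.cov(individual_params, rowvar=False))             # inverse covariance
match = [int(np.argmin([mahalanobis(s, x, VI) for x in individual_params]))
         for s in simulated_params]
print("simulated profile 0 would reuse the input function of individual", match[0])
```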
Model Checker for Java Programs
NASA Technical Reports Server (NTRS)
Visser, Willem
2007-01-01
Java Pathfinder (JPF) is a verification and testing environment for Java that integrates model checking, program analysis, and testing. JPF consists of a custom-made Java Virtual Machine (JVM) that interprets bytecode, combined with a search interface to allow the complete behavior of a Java program to be analyzed, including interleavings of concurrent programs. JPF is implemented in Java, and its architecture is highly modular to support rapid prototyping of new features. JPF is an explicit-state model checker, because it enumerates all visited states and, therefore, suffers from the state-explosion problem inherent in analyzing large programs. It is suited to analyzing programs less than 10kLOC, but has been successfully applied to finding errors in concurrent programs up to 100kLOC. When an error is found, a trace from the initial state to the error is produced to guide the debugging. JPF works at the bytecode level, meaning that all of Java can be model-checked. By default, the software checks for all runtime errors (uncaught exceptions), assertion violations (supports Java's assert), and deadlocks. JPF uses garbage collection and symmetry reductions of the heap during model checking to reduce state-explosion, as well as dynamic partial order reductions to lower the number of interleavings analyzed. JPF is capable of symbolic execution of Java programs, including symbolic execution of complex data such as linked lists and trees. JPF is extensible as it allows for the creation of listeners that can subscribe to events during searches. The creation of dedicated code to be executed in place of regular classes is supported and allows users to easily handle native calls and to improve the efficiency of the analysis.
Combining Static Analysis and Model Checking for Software Analysis
NASA Technical Reports Server (NTRS)
Brat, Guillaume; Visser, Willem; Clancy, Daniel (Technical Monitor)
2003-01-01
We present an iterative technique in which model checking and static analysis are combined to verify large software systems. The role of the static analysis is to compute partial-order information, which the model checker uses to reduce the state space. During exploration, the model checker also computes aliasing information that it gives to the static analyzer, which can then refine its analysis. The result of this refined analysis is then fed back to the model checker, which updates its partial-order reduction. At each step of this iterative process, the static analysis computes optimistic information, which results in an unsafe reduction of the state space. However, we show that the process converges to a fixed point, at which time the partial-order information is safe and the whole state space is explored.
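As a rough sketch of the control flow described above (not the authors' implementation), the Python snippet below alternates a stand-in static analyzer and a stand-in model checker until the exchanged partial-order information stops changing. All function names and the toy "program" are invented; only the iterate-to-a-fixed-point structure is the point.

def combined_analysis(program, static_analyzer, model_checker):
    aliasing = frozenset()                      # no aliasing knowledge yet
    partial_order = static_analyzer(program, aliasing)
    while True:
        verdict, observed_aliasing = model_checker(program, partial_order)
        refined = static_analyzer(program, observed_aliasing)
        if refined == partial_order:            # fixed point: reduction now safe
            return verdict
        partial_order = refined

# Toy stand-ins: the more aliasing we learn about, the fewer transition
# pairs the static analysis may declare independent.
def toy_static_analyzer(program, aliasing):
    return frozenset(p for p in program["pairs"] if p not in aliasing)

def toy_model_checker(program, partial_order):
    observed = frozenset(program["true_aliases"])
    verdict = "no errors" if partial_order.isdisjoint(observed) else "re-check"
    return verdict, observed

prog = {"pairs": frozenset({("a", "b"), ("c", "d")}), "true_aliases": {("a", "b")}}
print(combined_analysis(prog, toy_static_analyzer, toy_model_checker))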
A systematic review of models to predict recruitment to multicentre clinical trials.
Barnard, Katharine D; Dent, Louise; Cook, Andrew
2010-07-06
Less than one third of publicly funded trials managed to recruit according to their original plan, often resulting in requests for additional funding and/or time extensions. The aim was to identify models which might be useful to a major public funder of randomised controlled trials when estimating likely time requirements for recruiting trial participants. The requirements of a useful model were identified as: usability, being based on experience, the ability to reflect time trends, accounting for centre recruitment, and contribution to a commissioning decision. A systematic review of English language articles was conducted using MEDLINE and EMBASE. Search terms included: randomised controlled trial, patient, accrual, predict, enroll, models, statistical; Bayes Theorem; Decision Theory; Monte Carlo Method and Poisson. Only studies discussing prediction of recruitment to trials using a modelling approach were included. Information was extracted from articles by one author, and checked by a second, using a pre-defined form. Out of 326 identified abstracts, only 8 met all the inclusion criteria. These 8 studies discussed five major classes of model: the unconditional model, the conditional model, the Poisson model, Bayesian models and Monte Carlo simulation of Markov models. None of these meets all the pre-identified needs of the funder. To meet the needs of a number of research programmes, a new model is required as a matter of importance. Any model chosen should be validated against both retrospective and prospective data, to ensure that the predictions it gives are superior to those currently used.
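For orientation, the snippet below is a minimal example of the simplest class of model the review covers, a Poisson/Monte Carlo forecast of multicentre accrual: each open centre recruits at a fixed Poisson rate per week and the simulation records when the target is reached. The rates, opening times and target are invented for the example.

import numpy as np

def simulate_recruitment(rates_per_week, open_week, target, n_sims=5000,
                         horizon_weeks=300, rng=None):
    """Return simulated number of weeks needed to reach `target` participants."""
    rng = np.random.default_rng(rng)
    weeks_to_target = np.full(n_sims, np.inf)
    for s in range(n_sims):
        total = 0
        for week in range(horizon_weeks):
            active = open_week <= week                 # centres already open
            total += rng.poisson(rates_per_week[active]).sum()
            if total >= target:
                weeks_to_target[s] = week + 1
                break
    return weeks_to_target

rates = np.array([1.5, 0.8, 0.8, 0.5])     # expected recruits/week per centre
opens = np.array([0, 4, 8, 12])            # staggered centre start-up (weeks)
t = simulate_recruitment(rates, opens, target=200)
print(np.percentile(t, [50, 90]))          # median and pessimistic estimates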
Accuracy of laser-scanned models compared to plaster models and cone-beam computed tomography.
Kim, Jooseong; Heo, Giseon; Lagravère, Manuel O
2014-05-01
To compare the accuracy of measurements obtained from three-dimensional (3D) laser scans with those taken from cone-beam computed tomography (CBCT) scans and those obtained from plaster models. Eighteen different measurements, encompassing mesiodistal width of teeth and both maxillary and mandibular arch length and width, were selected using various landmarks. CBCT scans and plaster models were prepared from 60 patients. Plaster models were scanned using the Ortho Insight 3D laser scanner, and the selected landmarks were measured using its software. CBCT scans were imported and analyzed using the Avizo software, and the 26 landmarks corresponding to the selected measurements were located and recorded. The plaster models were also measured using a digital caliper. Descriptive statistics and the intraclass correlation coefficient (ICC) were used to analyze the data. The ICC results showed that the values obtained by the three different methods were highly correlated for all measurements, with all correlations > 0.808. When checking the differences between values and methods, the largest mean difference found was 0.59 ± 0.38 mm. In conclusion, plaster models, CBCT models, and laser-scanned models are three different diagnostic records, each with its own advantages and disadvantages. The present results showed that the laser-scanned models are highly accurate relative to plaster models and CBCT scans. This gives general clinicians an alternative that allows them to take into consideration the advantages of laser-scanned models over plaster models and CBCT reconstructions.
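As a worked illustration of the agreement statistic used above, the sketch below computes a Shrout-Fleiss-style two-way random-effects, single-measure ICC(2,1) on synthetic data. The formula is the standard one; the data, sample size and measurement noise are invented and do not reproduce the study's values.

import numpy as np

def icc_2_1(x: np.ndarray) -> float:
    """x: subjects x methods matrix of measurements."""
    n, k = x.shape
    grand = x.mean()
    ss_rows = k * ((x.mean(axis=1) - grand) ** 2).sum()
    ss_cols = n * ((x.mean(axis=0) - grand) ** 2).sum()
    ss_err = ((x - grand) ** 2).sum() - ss_rows - ss_cols
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

rng = np.random.default_rng(1)
true_width = rng.normal(8.0, 1.0, size=30)          # 30 synthetic "patients"
methods = np.column_stack([true_width + rng.normal(0, 0.2, 30)
                           for _ in range(3)])      # caliper, laser scan, CBCT
print(round(icc_2_1(methods), 3))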
2012-01-01
Background A statistical analysis plan (SAP) is a critical link between how a clinical trial is conducted and the clinical study report. To secure objective study results, regulatory bodies expect that the SAP will meet requirements in pre-specifying inferential analyses and other important statistical techniques. To write a good SAP for model-based sensitivity and ancillary analyses involves non-trivial decisions on and justification of many aspects of the chosen setting. In particular, trials with longitudinal count data as primary endpoints pose challenges for model choice and model validation. In the random effects setting, frequentist strategies for model assessment and model diagnosis are complex and not easily implemented and have several limitations. Therefore, it is of interest to explore Bayesian alternatives which provide the needed decision support to finalize a SAP. Methods We focus on generalized linear mixed models (GLMMs) for the analysis of longitudinal count data. A series of distributions with over- and under-dispersion is considered. Additionally, the structure of the variance components is modified. We perform a simulation study to investigate the discriminatory power of Bayesian tools for model criticism in different scenarios derived from the model setting. We apply the findings to the data from an open clinical trial on vertigo attacks. These data are seen as pilot data for an ongoing phase III trial. To fit GLMMs we use a novel Bayesian computational approach based on integrated nested Laplace approximations (INLAs). The INLA methodology enables the direct computation of leave-one-out predictive distributions. These distributions are crucial for Bayesian model assessment. We evaluate competing GLMMs for longitudinal count data according to the deviance information criterion (DIC) or probability integral transform (PIT), and by using proper scoring rules (e.g. the logarithmic score). Results The instruments under study provide excellent tools for preparing decisions within the SAP in a transparent way when structuring the primary analysis, sensitivity or ancillary analyses, and specific analyses for secondary endpoints. The mean logarithmic score and DIC discriminate well between different model scenarios. It becomes obvious that the naive choice of a conventional random effects Poisson model is often inappropriate for real-life count data. The findings are used to specify an appropriate mixed model employed in the sensitivity analyses of an ongoing phase III trial. Conclusions The proposed Bayesian methods are not only appealing for inference but notably provide a sophisticated insight into different aspects of model performance, such as forecast verification or calibration checks, and can be applied within the model selection process. The mean of the logarithmic score is a robust tool for model ranking and is not sensitive to sample size. Therefore, these Bayesian model selection techniques offer helpful decision support for shaping sensitivity and ancillary analyses in a statistical analysis plan of a clinical trial with longitudinal count data as the primary endpoint. PMID:22962944
Adrion, Christine; Mansmann, Ulrich
2012-09-10
A statistical analysis plan (SAP) is a critical link between how a clinical trial is conducted and the clinical study report. To secure objective study results, regulatory bodies expect that the SAP will meet requirements in pre-specifying inferential analyses and other important statistical techniques. To write a good SAP for model-based sensitivity and ancillary analyses involves non-trivial decisions on and justification of many aspects of the chosen setting. In particular, trials with longitudinal count data as primary endpoints pose challenges for model choice and model validation. In the random effects setting, frequentist strategies for model assessment and model diagnosis are complex and not easily implemented and have several limitations. Therefore, it is of interest to explore Bayesian alternatives which provide the needed decision support to finalize a SAP. We focus on generalized linear mixed models (GLMMs) for the analysis of longitudinal count data. A series of distributions with over- and under-dispersion is considered. Additionally, the structure of the variance components is modified. We perform a simulation study to investigate the discriminatory power of Bayesian tools for model criticism in different scenarios derived from the model setting. We apply the findings to the data from an open clinical trial on vertigo attacks. These data are seen as pilot data for an ongoing phase III trial. To fit GLMMs we use a novel Bayesian computational approach based on integrated nested Laplace approximations (INLAs). The INLA methodology enables the direct computation of leave-one-out predictive distributions. These distributions are crucial for Bayesian model assessment. We evaluate competing GLMMs for longitudinal count data according to the deviance information criterion (DIC) or probability integral transform (PIT), and by using proper scoring rules (e.g. the logarithmic score). The instruments under study provide excellent tools for preparing decisions within the SAP in a transparent way when structuring the primary analysis, sensitivity or ancillary analyses, and specific analyses for secondary endpoints. The mean logarithmic score and DIC discriminate well between different model scenarios. It becomes obvious that the naive choice of a conventional random effects Poisson model is often inappropriate for real-life count data. The findings are used to specify an appropriate mixed model employed in the sensitivity analyses of an ongoing phase III trial. The proposed Bayesian methods are not only appealing for inference but notably provide a sophisticated insight into different aspects of model performance, such as forecast verification or calibration checks, and can be applied within the model selection process. The mean of the logarithmic score is a robust tool for model ranking and is not sensitive to sample size. Therefore, these Bayesian model selection techniques offer helpful decision support for shaping sensitivity and ancillary analyses in a statistical analysis plan of a clinical trial with longitudinal count data as the primary endpoint.
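A minimal sketch of the model-ranking step described above: given leave-one-out predictive probabilities for the observed counts under two candidate models (in an INLA analysis these would be the CPO values of the fitted GLMMs), rank the models by mean logarithmic score, here taken as the mean negative log predictive probability, so that lower is better. The data and the plug-in predictive distributions below are stand-ins, not the trial data or fitted mixed models.

import numpy as np
from scipy.stats import poisson, nbinom

rng = np.random.default_rng(2)
y = rng.negative_binomial(n=2, p=0.4, size=200)     # overdispersed "attack" counts

# Stand-in predictive distributions; a real analysis would use the fitted
# GLMMs' leave-one-out predictions rather than these plug-in fits.
lam = y.mean()
p_pois = poisson.pmf(y, lam)
r = 2.0
p_nb = nbinom.pmf(y, r, r / (r + lam))

for name, p in [("Poisson", p_pois), ("negative binomial", p_nb)]:
    print(f"{name:>18}: mean log score = {-np.log(p).mean():.3f}")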
Bender, Anne Mette; Kawachi, Ichiro; Jørgensen, Torben; Pisinger, Charlotta
2015-07-22
Participation in population-based preventive health checks has declined over the past decades. More research is needed to determine factors enhancing participation. The objective of this study was to examine the association between two measures of neighborhood-level social capital and participation in the health check phase of a population-based lifestyle intervention. The study population comprised 12,568 residents of 73 Danish neighborhoods in the intervention group of a large population-based lifestyle intervention study, the Inter99. Two measures of social capital were applied: informal socializing and voting turnout. In a multilevel analysis adjusting only for age and sex, a higher level of neighborhood social capital was associated with a higher probability of participating in the health check. Inclusion of both individual socioeconomic position and neighborhood deprivation in the model attenuated the coefficients for informal socializing, while voting turnout became non-significant. A higher level of neighborhood social capital was associated with a higher probability of participating in the health check phase of a population-based lifestyle intervention. Most of the association between neighborhood social capital and participation in preventive health checks can be explained by differences in individual socioeconomic position and level of neighborhood deprivation. Nonetheless, there seems to be some residual association between social capital and health check participation, suggesting that activating social relations in the community may be an avenue for boosting participation rates in population-based health checks. ClinicalTrials.gov (registration no. NCT00289237).
Microscopic analysis and simulation of check-mark stain on the galvanized steel strip
NASA Astrophysics Data System (ADS)
So, Hongyun; Yoon, Hyun Gi; Chung, Myung Kyoon
2010-11-01
When galvanized steel strip is produced through a continuous hot-dip galvanizing process, the thickness of the adhered zinc film is controlled by a plane impinging gas jet referred to as the "air-knife system". In such a gas-jet wiping process, stains of check-mark or sag-line shape frequently appear. The check-mark defect appears as non-uniform zinc coating with oblique patterns such as "W", "V" or "X" on the coated surface. The present paper presents an analysis of the cause of check-mark formation and a numerical simulation of sag lines using the numerical data produced by Large Eddy Simulation (LES) of the three-dimensional compressible turbulent flow field around the air-knife system. It was found that there are alternating plane-wise vortices near the impinging stagnation region, and that these alternating vortices move almost periodically to the right and to the left along the stagnation line owing to the jet flow instability. Meanwhile, in order to simulate the check-mark formation, a novel perturbation model has been developed to predict the variation of coating thickness along the transverse direction. Finally, the three-dimensional zinc coating surface was obtained with the present perturbation model. It was found that sag-line formation is determined by the combination of the instantaneous coating thickness distribution along the transverse direction near the stagnation line and the feed speed of the steel strip.
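The toy sketch below is not the authors' perturbation model; it only illustrates the stated mechanism under invented numbers: a transverse thickness disturbance that sways periodically across the stagnation line is "printed" onto the strip as it moves up at the feed speed, so the stacked instantaneous profiles form oblique, check-mark-like bands.

import numpy as np

width, feed_speed, dt = 1.0, 1.5, 0.002        # strip width (m), feed speed (m/s), time step (s)
z = np.linspace(0.0, width, 400)               # transverse coordinate (m)
n_steps = 1500                                  # time steps of strip travel

thickness = np.empty((n_steps, z.size))
for i in range(n_steps):
    t = i * dt
    sway = 0.15 * np.sin(2 * np.pi * 3.0 * t)  # lateral oscillation of the vortices (m)
    # Instantaneous transverse disturbance around a nominal 10 um coating.
    thickness[i] = 10.0 + 0.8 * np.sin(2 * np.pi * (z + sway) / 0.12)

# Row i of `thickness` ends up at strip position i * feed_speed * dt, so the
# stacked rows form the coated surface; plotting it shows the sag-line pattern.
strip_positions = np.arange(n_steps) * feed_speed * dt
print(thickness.shape, strip_positions[-1])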
Predictors of Health Service Utilization Among Older Men in Jamaica.
Willie-Tyndale, Douladel; McKoy Davis, Julian; Holder-Nevins, Desmalee; Mitchell-Fearon, Kathryn; James, Kenneth; Waldron, Norman K; Eldemire-Shearer, Denise
2018-01-03
To determine the relative influence of sociodemographic, socioeconomic, psychosocial, and health variables on health service utilization in the last 12 months. Data were analyzed for 1,412 men ≥60 years old from a 2012 nationally representative community-based survey in Jamaica. Associations between six health service utilization variables and several explanatory variables were explored. Logistic regression models were used to identify independent predictors of each utilization measure and to determine the strengths of associations. More than 75% of men reported having health visits and blood pressure checks. Blood sugar (69.6%) and cholesterol (63.1%) checks were less common, and a prostate check (35.1%) was the least utilized service. Adjusted models confirmed that the presence of chronic diseases and health insurance most strongly predicted utilization. Having a daughter or son as the main source of financial support (vs self) doubled or tripled, respectively, the odds of routine doctors' visits. Compared with primary or lower education, tertiary education more than doubled the odds of a blood pressure check [2.37 (1.12, 4.95)]. Regular attendance at club/society/religious organizations' meetings increased the odds of having a prostate check by 45%. Although need and financial resources most strongly influenced health service utilization, psychosocial variables may be particularly influential for underutilized services. © The Author(s) 2018. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
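A hedged sketch of the type of adjusted model reported above: a logistic regression for one utilization outcome, with coefficients exponentiated to odds ratios and confidence intervals. The data frame, variable names and effect sizes below are synthetic, not the survey data.

import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 1200
df = pd.DataFrame({
    "chronic_disease": rng.integers(0, 2, n),
    "health_insurance": rng.integers(0, 2, n),
    "tertiary_education": rng.integers(0, 2, n),
})
# Simulate a blood-pressure-check outcome driven by the three covariates.
logit_p = (-1.0 + 1.1 * df.chronic_disease + 0.9 * df.health_insurance
           + 0.5 * df.tertiary_education)
df["bp_check"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

X = sm.add_constant(df[["chronic_disease", "health_insurance", "tertiary_education"]])
fit = sm.Logit(df["bp_check"], X).fit(disp=False)
odds_ratios = np.exp(pd.concat([fit.params, fit.conf_int()], axis=1))
odds_ratios.columns = ["OR", "lower", "upper"]
print(odds_ratios.round(2))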
Historical trends and high-resolution future climate projections in northern Tuscany (Italy)
NASA Astrophysics Data System (ADS)
D'Oria, Marco; Ferraresi, Massimo; Tanda, Maria Giovanna
2017-12-01
This paper analyzes the historical precipitation and temperature trends and the future climate projections for the northern part of Tuscany (Italy). The trends are identified and quantified at the monthly and annual scale at gauging stations with data collected over long periods (60-90 years). An ensemble of 13 Regional Climate Models (RCMs), based on two Representative Concentration Pathways (RCP4.5 and RCP8.5), was then used to assess local-scale future precipitation and temperature projections and to represent the uncertainty in the results. The historical data highlight a general decrease of the annual rainfall at a mean rate of 22 mm per decade but, in many cases, the tendencies are not statistically significant. Conversely, the annual mean temperature exhibits an upward trend, statistically significant in the majority of cases, with a warming rate of about 0.1 °C per decade. With reference to the model projections of annual precipitation, the results are not concordant; the deviations between models in the same period are larger than the future changes at medium term (2031-2040) and long term (2051-2060), which highlights that model uncertainty and variability are high. According to the climate model projections, the warming of the study area is unequivocal; a mean increment of 0.8 °C at medium term and 1.1 °C at long term is expected with respect to the reference period (2003-2012) under scenario RCP4.5, and the increments grow to 0.9 °C and 1.9 °C under RCP8.5. Finally, in order to check the observed climate change signals, the climate model projections were compared with the trends based on the historical data. A satisfactory agreement is obtained for precipitation, whereas for the temperature data the historically based trend values systematically underestimate the model projections at medium and long term.
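As a minimal sketch of the kind of trend estimate quoted above, the snippet below fits an ordinary least-squares slope to an annual series, rescales it to a per-decade rate, and reports the p-value as the significance check. The series is synthetic; the paper's own trend tests may differ (e.g. non-parametric Mann-Kendall tests are also common for such data).

import numpy as np
from scipy.stats import linregress

rng = np.random.default_rng(4)
years = np.arange(1950, 2015)
# Synthetic annual mean temperature with a 0.1 degC/decade warming signal.
temp = 13.0 + 0.01 * (years - years[0]) + rng.normal(0, 0.4, years.size)

fit = linregress(years, temp)
print(f"trend: {fit.slope * 10:+.2f} degC per decade, p = {fit.pvalue:.3f}")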