Sample records for effects model analysis

  1. A Cost-Effectiveness Analysis Model for Evaluating and Planning Secondary Vocational Programs

    ERIC Educational Resources Information Center

    Kim, Jin Eun

    1977-01-01

    This paper conceptualizes a cost-effectiveness analysis and describes a cost-effectiveness analysis model for secondary vocational programs. It generates three kinds of cost-effectiveness measures: program effectiveness, cost efficiency, and cost-effectiveness and/or performance ratio. (Author)

  2. The HIV Cure Research Agenda: The Role of Mathematical Modelling and Cost-Effectiveness Analysis.

    PubMed

    Freedberg, Kenneth A; Possas, Cristina; Deeks, Steven; Ross, Anna Laura; Rosettie, Katherine L; Di Mascio, Michele; Collins, Chris; Walensky, Rochelle P; Yazdanpanah, Yazdan

    The research agenda towards an HIV cure is building rapidly. In this article, we discuss the reasons for and methodological approach to using mathematical modeling and cost-effectiveness analysis in this agenda. We provide a brief description of the proof of concept for cure and the current directions of cure research. We then review the types of clinical economic evaluations, including cost analysis, cost-benefit analysis, and cost-effectiveness analysis. We describe the use of mathematical modeling and cost-effectiveness analysis early in the HIV epidemic as well as in the era of combination antiretroviral therapy. We then highlight the novel methodology of Value of Information analysis and its potential role in the planning of clinical trials. We close with recommendations for modeling and cost-effectiveness analysis in the HIV cure agenda.

  3. A random effects meta-analysis model with Box-Cox transformation.

    PubMed

    Yamaguchi, Yusuke; Maruo, Kazushi; Partlett, Christopher; Riley, Richard D

    2017-07-19

    In a random effects meta-analysis model, true treatment effects for each study are routinely assumed to follow a normal distribution. However, normality is a restrictive assumption, and misspecification of the random effects distribution may result in a misleading estimate of the overall mean treatment effect, an inappropriate quantification of heterogeneity across studies, and a wrongly symmetric prediction interval. We focus on problems caused by an inappropriate normality assumption on the random effects distribution, and propose a novel random effects meta-analysis model in which a Box-Cox transformation is applied to the observed treatment effect estimates. The proposed model aims to normalise the overall distribution of observed treatment effect estimates, which is the sum of the within-study sampling distributions and the random effects distribution. When sampling distributions are approximately normal, non-normality in the overall distribution will be mainly due to the random effects distribution, especially when the between-study variation is large relative to the within-study variation. The Box-Cox transformation addresses this flexibly according to the observed departure from normality. We use a Bayesian approach for estimating parameters in the proposed model, and suggest summarising the meta-analysis results by an overall median, an interquartile range, and a prediction interval. The model can be applied to any kind of variable once the treatment effect estimate is defined from the variable. A simulation study suggested that when the overall distribution of treatment effect estimates is skewed, the overall mean and conventional I² from the normal random effects model can be inappropriate summaries, and the proposed model helps reduce this issue. We illustrated the proposed model using two examples, which revealed some important differences in summary results, heterogeneity measures, and prediction intervals relative to the normal random effects model. The random effects meta-analysis with the Box-Cox transformation may be an important tool for examining the robustness of traditional meta-analysis results against skewness in the observed treatment effect estimates. Further critical evaluation of the method is needed.
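    The transformation step described in this abstract can be sketched in a few lines. The helper below is an illustrative approximation only: the shift rule, the use of scipy's maximum-likelihood λ, and the summary choices are assumptions, not the authors' Bayesian implementation.

```python
import numpy as np
from scipy import stats

def boxcox_meta_summary(estimates, lam=None):
    """Transform observed effect estimates toward normality with a
    Box-Cox transform, then report median/IQR summaries on the
    original scale. Hypothetical helper; defaults are assumptions."""
    y = np.asarray(estimates, dtype=float)
    shift = 1.0 - y.min() if y.min() <= 0 else 0.0   # Box-Cox needs y > 0
    if lam is None:
        z, lam = stats.boxcox(y + shift)             # ML estimate of lambda
    else:
        z = stats.boxcox(y + shift, lmbda=lam)
    q25, q50, q75 = np.percentile(y, [25, 50, 75])   # original-scale summaries
    return {"lambda": lam, "transformed": z, "median": q50, "iqr": (q25, q75)}
```

    The authors fit their model in a Bayesian framework; this sketch shows only the transform-and-summarise idea.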

  4. Hierarchical model analysis of the Atlantic Flyway Breeding Waterfowl Survey

    USGS Publications Warehouse

    Sauer, John R.; Zimmerman, Guthrie S.; Klimstra, Jon D.; Link, William A.

    2014-01-01

    We used log-linear hierarchical models to analyze data from the Atlantic Flyway Breeding Waterfowl Survey. The survey has been conducted by state biologists each year since 1989 in the northeastern United States from Virginia north to New Hampshire and Vermont. Although yearly population estimates from the survey are used by the United States Fish and Wildlife Service for estimating regional waterfowl population status for mallards (Anas platyrhynchos), black ducks (Anas rubripes), wood ducks (Aix sponsa), and Canada geese (Branta canadensis), they are not routinely adjusted to control for time of day effects and other survey design issues. The hierarchical model analysis permits estimation of year effects and population change while accommodating the repeated sampling of plots and controlling for time of day effects in counting. We compared population estimates from the current stratified random sample analysis to population estimates from hierarchical models with alternative model structures that describe year to year changes as random year effects, a trend with random year effects, or year effects modeled as 1-year differences. Patterns of population change from the hierarchical model results generally were similar to the patterns described by stratified random sample estimates, but significant visibility differences occurred between twilight and midday counts for all species. Controlling for the effects of time of day resulted in larger population estimates for all species in the hierarchical model analysis relative to the stratified random sample analysis. The hierarchical models also provided a convenient means of estimating population trend as derived statistics from the analysis. We detected significant declines in mallards and American black ducks and significant increases in wood ducks and Canada geese, trends that had not been significant for 3 of these 4 species in the prior analysis. We recommend using hierarchical models for analysis of the Atlantic Flyway Breeding Waterfowl Survey.

  5. 2012 National Park visitor spending effects: economic contributions to local communities, states, and the nation

    USGS Publications Warehouse

    Cullinane Thomas, Catherine; Huber, Christopher C.; Koontz, Lynne

    2014-01-01

    This 2012 analysis marks a major revision to the NPS visitor spending effects analyses, with the development of a new visitor spending effects model (VSE model) that replaces the former Money Generation Model (MGM2). Many of the hallmarks and processes of the MGM2 model are preserved in the new VSE model, but the new model makes significant strides in improving the accuracy and transparency of the analysis. Because of this change from the MGM2 model to the VSE model, estimates from this year’s analysis are not directly comparable to previous analyses.

  6. Accounting for Heterogeneity in Relative Treatment Effects for Use in Cost-Effectiveness Models and Value-of-Information Analyses

    PubMed Central

    Soares, Marta O.; Palmer, Stephen; Ades, Anthony E.; Harrison, David; Shankar-Hari, Manu; Rowan, Kathy M.

    2015-01-01

    Cost-effectiveness analysis (CEA) models are routinely used to inform health care policy. Key model inputs include relative effectiveness of competing treatments, typically informed by meta-analysis. Heterogeneity is ubiquitous in meta-analysis, and random effects models are usually used when there is variability in effects across studies. In the absence of observed treatment effect modifiers, various summaries from the random effects distribution (random effects mean, predictive distribution, random effects distribution, or study-specific estimate [shrunken or independent of other studies]) can be used depending on the relationship between the setting for the decision (population characteristics, treatment definitions, and other contextual factors) and the included studies. If covariates have been measured that could potentially explain the heterogeneity, then these can be included in a meta-regression model. We describe how covariates can be included in a network meta-analysis model and how the output from such an analysis can be used in a CEA model. We outline a model selection procedure to help choose between competing models and stress the importance of clinical input. We illustrate the approach with a health technology assessment of intravenous immunoglobulin for the management of adult patients with severe sepsis in an intensive care setting, which exemplifies how risk of bias information can be incorporated into CEA models. We show that the results of the CEA and value-of-information analyses are sensitive to the model and highlight the importance of sensitivity analyses when conducting CEA in the presence of heterogeneity. The methods presented extend naturally to heterogeneity in other model inputs, such as baseline risk. PMID:25712447

  7. Accounting for Heterogeneity in Relative Treatment Effects for Use in Cost-Effectiveness Models and Value-of-Information Analyses.

    PubMed

    Welton, Nicky J; Soares, Marta O; Palmer, Stephen; Ades, Anthony E; Harrison, David; Shankar-Hari, Manu; Rowan, Kathy M

    2015-07-01

    Cost-effectiveness analysis (CEA) models are routinely used to inform health care policy. Key model inputs include relative effectiveness of competing treatments, typically informed by meta-analysis. Heterogeneity is ubiquitous in meta-analysis, and random effects models are usually used when there is variability in effects across studies. In the absence of observed treatment effect modifiers, various summaries from the random effects distribution (random effects mean, predictive distribution, random effects distribution, or study-specific estimate [shrunken or independent of other studies]) can be used depending on the relationship between the setting for the decision (population characteristics, treatment definitions, and other contextual factors) and the included studies. If covariates have been measured that could potentially explain the heterogeneity, then these can be included in a meta-regression model. We describe how covariates can be included in a network meta-analysis model and how the output from such an analysis can be used in a CEA model. We outline a model selection procedure to help choose between competing models and stress the importance of clinical input. We illustrate the approach with a health technology assessment of intravenous immunoglobulin for the management of adult patients with severe sepsis in an intensive care setting, which exemplifies how risk of bias information can be incorporated into CEA models. We show that the results of the CEA and value-of-information analyses are sensitive to the model and highlight the importance of sensitivity analyses when conducting CEA in the presence of heterogeneity. The methods presented extend naturally to heterogeneity in other model inputs, such as baseline risk. © The Author(s) 2015.

  8. A Bayesian Nonparametric Meta-Analysis Model

    ERIC Educational Resources Information Center

    Karabatsos, George; Talbott, Elizabeth; Walker, Stephen G.

    2015-01-01

    In a meta-analysis, it is important to specify a model that adequately describes the effect-size distribution of the underlying population of studies. The conventional normal fixed-effect and normal random-effects models assume a normal effect-size population distribution, conditionally on parameters and covariates. For estimating the mean overall…

  9. FMCSA safety program effectiveness measurement compliance review effectiveness model results for carriers with compliance reviews in fiscal year 2009 : [analysis brief].

    DOT National Transportation Integrated Search

    2014-04-01

    This Analysis Brief documents the methodology and results from the Compliance Review Effectiveness Model (CREM) for carriers receiving CRs in fiscal year (FY) 2009. The model measures the effectiveness of the compliance review (CR) program, one of th...

  10. Effect Sizes for Growth-Modeling Analysis for Controlled Clinical Trials in the Same Metric as for Classical Analysis

    ERIC Educational Resources Information Center

    Feingold, Alan

    2009-01-01

    The use of growth-modeling analysis (GMA)--including hierarchical linear models, latent growth models, and generalized estimating equations--to evaluate interventions in psychology, psychiatry, and prevention science has grown rapidly over the last decade. However, an effect size associated with the difference between the trajectories of the…

  11. Macro-level pedestrian and bicycle crash analysis: Incorporating spatial spillover effects in dual state count models.

    PubMed

    Cai, Qing; Lee, Jaeyoung; Eluru, Naveen; Abdel-Aty, Mohamed

    2016-08-01

    This study attempts to explore the viability of dual-state models (i.e., zero-inflated and hurdle models) for traffic analysis zone (TAZ) based pedestrian and bicycle crash frequency analysis. Additionally, spatial spillover effects are explored in the models by employing exogenous variables from neighboring zones. The dual-state models, such as the zero-inflated negative binomial and hurdle negative binomial models (with and without spatial effects), are compared with the conventional single-state model (i.e., negative binomial). The model comparison for pedestrian and bicycle crashes revealed that the models that considered observed spatial effects performed better than those that did not. Across the models with spatial spillover effects, the dual-state models, especially the zero-inflated negative binomial model, offered better performance than the single-state models. Moreover, the model results clearly highlighted the importance of various traffic, roadway, and sociodemographic characteristics of the TAZ as well as neighboring TAZs on pedestrian and bicycle crash frequency. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. Bias and inference from misspecified mixed-effect models in stepped wedge trial analysis.

    PubMed

    Thompson, Jennifer A; Fielding, Katherine L; Davey, Calum; Aiken, Alexander M; Hargreaves, James R; Hayes, Richard J

    2017-10-15

    Many stepped wedge trials (SWTs) are analysed by using a mixed-effect model with a random intercept and fixed effects for the intervention and time periods (referred to here as the standard model). However, it is not known whether this model is robust to misspecification. We simulated SWTs with three groups of clusters and two time periods; one group received the intervention during the first period and two groups in the second period. We simulated period and intervention effects that were either common-to-all or varied-between clusters. Data were analysed with the standard model or with additional random effects for period effect or intervention effect. In a second simulation study, we explored the weight given to within-cluster comparisons by simulating a larger intervention effect in the group of the trial that experienced both the control and intervention conditions and applying the three analysis models described previously. Across 500 simulations, we computed bias and confidence interval coverage of the estimated intervention effect. We found up to 50% bias in intervention effect estimates when period or intervention effects varied between clusters and were treated as fixed effects in the analysis. All misspecified models showed undercoverage of 95% confidence intervals, particularly the standard model. A large weight was given to within-cluster comparisons in the standard model. In the SWTs simulated here, mixed-effect models were highly sensitive to departures from the model assumptions, which can be explained by the high dependence on within-cluster comparisons. Trialists should consider including a random effect for time period in their SWT analysis model. © 2017 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
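    The "standard model" described above (random cluster intercept, fixed intervention and period effects) can be sketched with statsmodels. The simulation settings below, a simplified two-period design with 12 clusters and made-up effect sizes, are illustrative assumptions, not the paper's design.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)

# Simulate a toy stepped wedge layout: 4 clusters already treated in
# period 0, all clusters treated in period 1 (illustrative assumption).
rows = []
for cluster in range(12):
    u = rng.normal(0, 0.5)                       # cluster random intercept
    switches_early = cluster < 4
    for period in (0, 1):
        treated = int(switches_early or period == 1)
        for _ in range(20):                      # 20 subjects per cluster-period
            y = 1.0 * treated + 0.3 * period + u + rng.normal()
            rows.append({"cluster": cluster, "period": period,
                         "treated": treated, "y": y})
df = pd.DataFrame(rows)

# The "standard model": random intercept for cluster, fixed effects
# for intervention and period.
fit = smf.mixedlm("y ~ treated + C(period)", df, groups=df["cluster"]).fit()
print(fit.params["treated"])                     # estimated intervention effect
```

    The simulated true effect is 1.0; the paper's point is that this model can be badly biased when period or intervention effects actually vary between clusters, which this sketch does not simulate.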

  13. Bias and inference from misspecified mixed‐effect models in stepped wedge trial analysis

    PubMed Central

    Fielding, Katherine L.; Davey, Calum; Aiken, Alexander M.; Hargreaves, James R.; Hayes, Richard J.

    2017-01-01

    Many stepped wedge trials (SWTs) are analysed by using a mixed‐effect model with a random intercept and fixed effects for the intervention and time periods (referred to here as the standard model). However, it is not known whether this model is robust to misspecification. We simulated SWTs with three groups of clusters and two time periods; one group received the intervention during the first period and two groups in the second period. We simulated period and intervention effects that were either common‐to‐all or varied‐between clusters. Data were analysed with the standard model or with additional random effects for period effect or intervention effect. In a second simulation study, we explored the weight given to within‐cluster comparisons by simulating a larger intervention effect in the group of the trial that experienced both the control and intervention conditions and applying the three analysis models described previously. Across 500 simulations, we computed bias and confidence interval coverage of the estimated intervention effect. We found up to 50% bias in intervention effect estimates when period or intervention effects varied between clusters and were treated as fixed effects in the analysis. All misspecified models showed undercoverage of 95% confidence intervals, particularly the standard model. A large weight was given to within‐cluster comparisons in the standard model. In the SWTs simulated here, mixed‐effect models were highly sensitive to departures from the model assumptions, which can be explained by the high dependence on within‐cluster comparisons. Trialists should consider including a random effect for time period in their SWT analysis model. © 2017 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd. PMID:28556355

  14. Wayside Bearing Fault Diagnosis Based on a Data-Driven Doppler Effect Eliminator and Transient Model Analysis

    PubMed Central

    Liu, Fang; Shen, Changqing; He, Qingbo; Zhang, Ao; Liu, Yongbin; Kong, Fanrang

    2014-01-01

    A fault diagnosis strategy based on the wayside acoustic monitoring technique is investigated for locomotive bearing fault diagnosis. Inspired by the transient modeling analysis method based on correlation filtering analysis, a so-called Parametric-Mother-Doppler-Wavelet (PMDW) is constructed with six parameters, including a center characteristic frequency and five kinematic model parameters. A Doppler effect eliminator containing a PMDW generator, a correlation filtering analysis module, and a signal resampler is invented to eliminate the Doppler effect embedded in the acoustic signal of the recorded bearing. Through the Doppler effect eliminator, the five kinematic model parameters can be identified based on the signal itself. Then, the signal resampler is applied to eliminate the Doppler effect using the identified parameters. With the ability to detect early bearing faults, the transient model analysis method is employed to detect localized bearing faults after the embedded Doppler effect is eliminated. The effectiveness of the proposed fault diagnosis strategy is verified via simulation studies and applications to diagnose locomotive roller bearing defects. PMID:24803197

  15. Using structural equation modeling for network meta-analysis.

    PubMed

    Tu, Yu-Kang; Wu, Yun-Chun

    2017-07-14

    Network meta-analysis overcomes the limitations of traditional pair-wise meta-analysis by incorporating all available evidence into a general statistical framework for simultaneous comparisons of several treatments. Currently, network meta-analyses are undertaken either within Bayesian hierarchical linear models or frequentist generalized linear mixed models. Structural equation modeling (SEM) is a statistical method originally developed for modeling causal relations among observed and latent variables. As the random effect is explicitly modeled as a latent variable in SEM, it is very flexible for analysts to specify complex random effect structures and to place linear and nonlinear constraints on parameters. The aim of this article is to show how to undertake a network meta-analysis within the statistical framework of SEM. We used an example dataset to demonstrate that the standard fixed and random effect network meta-analysis models can be easily implemented in SEM. It contains results of 26 studies that directly compared three treatment groups A, B and C for prevention of first bleeding in patients with liver cirrhosis. We also showed that a new approach to network meta-analysis based on the unrestricted weighted least squares (UWLS) method can be undertaken using SEM. For both the fixed and random effect network meta-analysis, SEM yielded similar coefficients and confidence intervals to those reported in the previous literature. The point estimates of the two UWLS models were identical to those in the fixed effect model but the confidence intervals were wider. This is consistent with results from the traditional pairwise meta-analyses. Compared to the UWLS model with a common variance adjustment factor, the UWLS model with a unique variance adjustment factor has wider confidence intervals when the heterogeneity is larger in the pairwise comparison. The UWLS model with a unique variance adjustment factor reflects the difference in heterogeneity within each comparison. SEM provides a very flexible framework for univariate and multivariate meta-analysis, and its potential as a powerful tool for advanced meta-analysis is still to be explored.
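    As a rough illustration of the fixed-effect piece of such an analysis (without the SEM machinery), a network meta-analysis can be written as an inverse-variance weighted least-squares problem. The three-treatment setup mirrors the A/B/C example, but the function and the numbers in the usage note are my own sketch, not the article's data.

```python
import numpy as np

def fixed_effect_network_meta(comparisons):
    """Fixed-effect network meta-analysis by inverse-variance weighted
    least squares (generic sketch, not the authors' SEM code).
    `comparisons` is a list of (treat, ref, estimate, se) tuples;
    returned effects are relative to treatment 'A'."""
    treats = sorted({t for t, r, _, _ in comparisons} |
                    {r for _, r, _, _ in comparisons})
    params = [t for t in treats if t != "A"]         # 'A' is the reference
    X, y, w = [], [], []
    for t, r, est, se in comparisons:
        row = [0.0] * len(params)
        if t != "A":
            row[params.index(t)] += 1.0
        if r != "A":
            row[params.index(r)] -= 1.0
        X.append(row); y.append(est); w.append(1.0 / se**2)
    X, y, W = np.array(X), np.array(y), np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)  # weighted normal equations
    return dict(zip(params, beta))
```

    With internally consistent inputs such as B-A = 1, C-A = 2, C-B = 1 (equal standard errors), the solver recovers B = 1 and C = 2 exactly.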

  16. Multilevel mixed effects parametric survival models using adaptive Gauss-Hermite quadrature with application to recurrent events and individual participant data meta-analysis.

    PubMed

    Crowther, Michael J; Look, Maxime P; Riley, Richard D

    2014-09-28

    Multilevel mixed effects survival models are used in the analysis of clustered survival data, such as repeated events, multicenter clinical trials, and individual participant data (IPD) meta-analyses, to investigate heterogeneity in baseline risk and covariate effects. In this paper, we extend parametric frailty models including the exponential, Weibull and Gompertz proportional hazards (PH) models and the log logistic, log normal, and generalized gamma accelerated failure time models to allow any number of normally distributed random effects. Furthermore, we extend the flexible parametric survival model of Royston and Parmar, modeled on the log-cumulative hazard scale using restricted cubic splines, to include random effects while also allowing for non-PH (time-dependent effects). Maximum likelihood is used to estimate the models utilizing adaptive or nonadaptive Gauss-Hermite quadrature. The methods are evaluated through simulation studies representing clinically plausible scenarios of a multicenter trial and IPD meta-analysis, showing good performance of the estimation method. The flexible parametric mixed effects model is illustrated using a dataset of patients with kidney disease and repeated times to infection and an IPD meta-analysis of prognostic factor studies in patients with breast cancer. User-friendly Stata software is provided to implement the methods. Copyright © 2014 John Wiley & Sons, Ltd.

  17. A Regression Framework for Effect Size Assessments in Longitudinal Modeling of Group Differences

    PubMed Central

    Feingold, Alan

    2013-01-01

    The use of growth modeling analysis (GMA)--particularly multilevel analysis and latent growth modeling--to test the significance of intervention effects has increased exponentially in prevention science, clinical psychology, and psychiatry over the past 15 years. Model-based effect sizes for differences in means between two independent groups in GMA can be expressed in the same metric (Cohen’s d) commonly used in classical analysis and meta-analysis. This article first reviews conceptual issues regarding calculation of d for findings from GMA and then introduces an integrative framework for effect size assessments that subsumes GMA. The new approach uses the structure of the linear regression model, from which effect sizes for findings from diverse cross-sectional and longitudinal analyses can be calculated with familiar statistics, such as the regression coefficient, the standard deviation of the dependent measure, and study duration. PMID:23956615
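    The core conversion this abstract describes, a GMA group difference re-expressed in the Cohen's d metric, reduces to scaling the group difference in growth rates by study duration and the raw-score standard deviation. The one-liner below is a sketch of that idea; the argument names are mine.

```python
def gma_effect_size(slope_diff, duration, sd_raw):
    """Model-based d from GMA output: the group difference in growth
    rates (slope_diff), accumulated over the study duration, scaled by
    the raw-score SD of the dependent measure. Illustrative sketch."""
    return slope_diff * duration / sd_raw
```

    For example, a slope difference of 0.25 units per wave over 4 waves, with a raw-score SD of 2.0, gives d = 0.5.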

  18. Effect of High-Frequency Transcranial Magnetic Stimulation on Craving in Substance Use Disorder: A Meta-Analysis.

    PubMed

    Maiti, Rituparna; Mishra, Biswa Ranjan; Hota, Debasish

    2017-01-01

    Repetitive transcranial magnetic stimulation (rTMS), a noninvasive, neuromodulatory tool, has been used to reduce craving in different substance use disorders. There are some studies that have reported conflicting and inconclusive results; therefore, this meta-analysis was conducted to evaluate the effect of high-frequency rTMS on craving in substance use disorder and to investigate the reasons behind the inconsistency across the studies. The authors searched clinical trials from MEDLINE, Cochrane databases, and International Clinical Trials Registry Platform. The PRISMA guidelines, as well as recommended meta-analysis practices, were followed in the selection process, analysis, and reporting of the findings. The effect estimate used was the standardized mean difference (Hedge's g), and heterogeneity across the considered studies was explored using subgroup analyses. The quality assessment was done using the Cochrane risk of bias tool, and sensitivity analysis was performed to check the influences on effect size by statistical models. After screening and assessment of eligibility, finally 10 studies were included for meta-analysis, which includes six studies on alcohol and four studies on nicotine use disorder. The random-model analysis revealed a pooled effect size of 0.75 (95% CI=0.29 to 1.21, p=0.001), whereas the fixed-model analysis showed a large effect size of 0.87 (95% CI=0.63 to 1.12, p<0.00001). Subgroup analysis for alcohol use disorder showed an effect size of -0.06 (95% CI=-0.89 to 0.77, p=0.88). In the case of nicotine use disorder, random-model analysis revealed an effect size of 1.00 (95% CI=0.48 to 1.55, p=0.0001), whereas fixed-model analysis also showed a large effect size of 0.96 (95% CI=0.71 to 1.22). The present meta-analysis identified a beneficial effect of high-frequency rTMS on craving associated with nicotine use disorder but not alcohol use disorder.
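    The fixed- versus random-model pooling contrasted in this abstract can be sketched generically: inverse-variance weighting for the fixed effect, DerSimonian-Laird for the random effects. This is a textbook implementation, not the meta-analysis software the authors used.

```python
import numpy as np

def pool_effects(g, se):
    """Pool standardized mean differences two ways: inverse-variance
    fixed effect, and DerSimonian-Laird random effects. Generic sketch."""
    g, se = np.asarray(g, float), np.asarray(se, float)
    w = 1.0 / se**2
    fixed = np.sum(w * g) / np.sum(w)
    q = np.sum(w * (g - fixed) ** 2)             # Cochran's Q
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(g) - 1)) / c)      # between-study variance
    w_r = 1.0 / (se**2 + tau2)
    random = np.sum(w_r * g) / np.sum(w_r)
    return fixed, random
```

    When the studies are homogeneous the two estimates coincide (τ² = 0); they diverge, as in the abstract's alcohol versus nicotine subgroups, when between-study heterogeneity is present.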

  19. On Two-Dimensional ARMA Models for Image Analysis.

    DTIC Science & Technology

    1980-03-24

    …2-D ARMA models for image analysis. Particular emphasis is placed on restoration of noisy images using 2-D ARMA models. Computer results are… …is concluded that the models are very effective linear models for image analysis. (Author)

  20. Cost-effectiveness Analysis in R Using a Multi-state Modeling Survival Analysis Framework: A Tutorial.

    PubMed

    Williams, Claire; Lewsey, James D; Briggs, Andrew H; Mackay, Daniel F

    2017-05-01

    This tutorial provides a step-by-step guide to performing cost-effectiveness analysis using a multi-state modeling approach. Alongside the tutorial, we provide easy-to-use functions in the statistics package R. We argue that this multi-state modeling approach using a package such as R has advantages over approaches where models are built in a spreadsheet package. In particular, using a syntax-based approach means there is a written record of what was done and the calculations are transparent. Reproducing the analysis is straightforward as the syntax just needs to be run again. The approach can be thought of as an alternative way to build a Markov decision-analytic model, which also has the option to use a state-arrival extended approach. In the state-arrival extended multi-state model, a covariate that represents patients' history is included, allowing the Markov property to be tested. We illustrate the building of multi-state survival models, making predictions from the models and assessing fits. We then proceed to perform a cost-effectiveness analysis, including deterministic and probabilistic sensitivity analyses. Finally, we show how to create 2 common methods of visualizing the results-namely, cost-effectiveness planes and cost-effectiveness acceptability curves. The analysis is implemented entirely within R. It is based on adaptions to functions in the existing R package mstate to accommodate parametric multi-state modeling that facilitates extrapolation of survival curves.
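    Although the tutorial works in R with the mstate package, the final steps it describes, an ICER and a cost-effectiveness acceptability curve from probabilistic draws, can be sketched in a few lines of Python. The cost and effect distributions below are made-up assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(7)

# Probabilistic sensitivity analysis draws (illustrative assumptions):
n = 5000
d_cost = rng.normal(2000.0, 500.0, n)     # incremental cost per patient
d_qaly = rng.normal(0.25, 0.10, n)        # incremental effect (QALYs)

# Incremental cost-effectiveness ratio from mean increments
icer = d_cost.mean() / d_qaly.mean()

# Acceptability curve: P(incremental net benefit > 0) at each
# willingness-to-pay threshold lambda
thresholds = np.array([0.0, 5000.0, 10000.0, 20000.0, 50000.0])
ceac = [(lam * d_qaly - d_cost > 0).mean() for lam in thresholds]
```

    Plotting (d_qaly, d_cost) pairs gives the cost-effectiveness plane; plotting ceac against thresholds gives the acceptability curve the tutorial constructs.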

  1. Assessment of current state of the art in modeling techniques and analysis methods for large space structures

    NASA Technical Reports Server (NTRS)

    Noor, A. K.

    1983-01-01

    Advances in continuum modeling, progress in reduction methods, and analysis and modeling needs for large space structures are covered, with specific attention given to repetitive lattice trusses. As far as continuum modeling is concerned, an effective and verified analysis capability exists for linear thermoelastic stress, bifurcation buckling, and free vibration problems of repetitive lattices. However, application of continuum modeling to nonlinear analysis needs more development. Reduction methods are very effective for bifurcation buckling and static (steady-state) nonlinear analysis. However, more work is needed to realize their full potential for nonlinear dynamic and time-dependent problems. As far as analysis and modeling needs are concerned, three areas are identified: loads determination, modeling and nonclassical behavior characteristics, and computational algorithms. The impact of new advances in computer hardware, software, integrated analysis, CAD/CAM systems, and materials technology is also discussed.

  2. Semiparametric mixed-effects analysis of PK/PD models using differential equations.

    PubMed

    Wang, Yi; Eskridge, Kent M; Zhang, Shunpu

    2008-08-01

    Motivated by the use of semiparametric nonlinear mixed-effects modeling on longitudinal data, we develop a new semiparametric modeling approach to address potential structural model misspecification for population pharmacokinetic/pharmacodynamic (PK/PD) analysis. Specifically, we use a set of ordinary differential equations (ODEs) of the form dx/dt = A(t)x + B(t), where B(t) is a nonparametric function that is estimated using penalized splines. The inclusion of a nonparametric function in the ODEs makes identification of structural model misspecification feasible by quantifying the model uncertainty, and provides flexibility for accommodating possible structural model deficiencies. The resulting model is implemented in a nonlinear mixed-effects modeling setup for population analysis. We illustrate the method with an application to cefamandole data and evaluate its performance through simulations.
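    The structural ODE quoted in the abstract, dx/dt = A(t)x + B(t), is easy to prototype numerically. Here A is a constant elimination rate and B(t) a toy input function standing in for the penalized-spline estimate; both are illustrative assumptions, not the cefamandole analysis.

```python
import numpy as np
from scipy.integrate import solve_ivp

A = -0.5                          # constant elimination rate (assumption)
B = lambda t: np.exp(-t)          # toy stand-in for the spline input B(t)

# Integrate dx/dt = A*x + B(t) from x(0) = 0 over 10 time units
sol = solve_ivp(lambda t, x: A * x + B(t), t_span=(0.0, 10.0),
                y0=[0.0], t_eval=np.linspace(0.0, 10.0, 50))
conc = sol.y[0]                   # concentration-time profile
```

    For this choice of A and B the exact solution is x(t) = 2(e^(-0.5t) - e^(-t)), which the numerical profile tracks closely; in the paper, B(t) is estimated from data rather than fixed.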

  3. Applying a private sector capitation model to the management of type 2 diabetes in the South African public sector: a cost-effectiveness analysis.

    PubMed

    Volmink, Heinrich C; Bertram, Melanie Y; Jina, Ruxana; Wade, Alisha N; Hofman, Karen J

    2014-09-30

    Diabetes mellitus contributes substantially to the non-communicable disease burden in South Africa. The proposed National Health Insurance system provides an opportunity to consider the development of a cost-effective capitation model of care for patients with type 2 diabetes. The objective of the study was to determine the potential cost-effectiveness of adapting a private sector diabetes management programme (DMP) to the South African public sector. Cost-effectiveness analysis was undertaken with a public sector model of the DMP as the intervention and a usual practice model as the comparator. Probabilistic modelling was utilized for incremental cost-effectiveness ratio analysis with life years gained selected as the outcome. Secondary data were used to design the model while cost information was obtained from various sources, taking into account public sector billing. Modelling found an incremental cost-effectiveness ratio (ICER) of ZAR 8 356 (USD 1018) per life year gained (LYG) for the DMP against the usual practice model. This fell substantially below the Willingness-to-Pay threshold with bootstrapping analysis. Furthermore, a national implementation of the intervention could potentially result in an estimated cumulative gain of 96 997 years of life (95% CI 71 073 years - 113 994 years). Probabilistic modelling found the capitation intervention to be cost-effective, with an ICER of ZAR 8 356 (USD 1018) per LYG. Piloting the service within the public sector is recommended as an initial step, as this would provide data for more accurate economic evaluation, and would also allow for qualitative analysis of the programme.

  4. Guidance for the utility of linear models in meta-analysis of genetic association studies of binary phenotypes.

    PubMed

    Cook, James P; Mahajan, Anubha; Morris, Andrew P

    2017-02-01

    Linear mixed models are increasingly used for the analysis of genome-wide association studies (GWAS) of binary phenotypes because they can efficiently and robustly account for population stratification and relatedness through inclusion of random effects for a genetic relationship matrix. However, the utility of linear (mixed) models in the context of meta-analysis of GWAS of binary phenotypes has not been previously explored. In this investigation, we present simulations to compare the performance of linear and logistic regression models under alternative weighting schemes in a fixed-effects meta-analysis framework, considering designs that incorporate variable case-control imbalance, confounding factors and population stratification. Our results demonstrate that linear models can be used for meta-analysis of GWAS of binary phenotypes, without loss of power, even in the presence of extreme case-control imbalance, provided that one of the following schemes is used: (i) effective sample size weighting of Z-scores or (ii) inverse-variance weighting of allelic effect sizes after conversion onto the log-odds scale. Our conclusions thus provide essential recommendations for the development of robust protocols for meta-analysis of binary phenotypes with linear models.
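    The two recommended weighting schemes can be sketched directly; all summary statistics below are hypothetical, not drawn from any real GWAS.

```python
import numpy as np

# Per-study Z-scores and effective sample sizes (hypothetical values;
# n_eff is often taken as 4*n_case*n_ctrl/(n_case + n_ctrl)).
z = np.array([2.1, 1.4, 2.8, 0.9])
n_eff = np.array([4000.0, 2500.0, 6000.0, 1500.0])

# Scheme (i): effective-sample-size weighting of Z-scores.
w = np.sqrt(n_eff)
z_meta = np.sum(w * z) / np.sqrt(np.sum(w ** 2))
print(f"meta Z = {z_meta:.2f}")

# Scheme (ii): inverse-variance weighting of effects on the log-odds scale.
beta = np.array([0.15, 0.12, 0.18, 0.10])  # hypothetical log-odds ratios
se = np.array([0.07, 0.09, 0.06, 0.11])
w_iv = 1 / se ** 2
beta_meta = np.sum(w_iv * beta) / np.sum(w_iv)
se_meta = 1 / np.sqrt(np.sum(w_iv))
print(f"meta log-OR = {beta_meta:.3f} (SE {se_meta:.3f})")
```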

  5. Modeling the effects of study abroad programs on college students

    Treesearch

    Alvin H. Yu; Garry E. Chick; Duarte B. Morais; Chung-Hsien Lin

    2009-01-01

    This study explored the possibility of modeling the effects of a study abroad program on students from a university in the northeastern United States. A program effect model was proposed after conducting an extensive literature review and empirically examining a sample of 265 participants in 2005. Exploratory factor analysis (EFA), confirmatory factor analysis (CFA),...

  6. Quantitative analysis of factors that affect oil pipeline network accident based on Bayesian networks: A case study in China

    NASA Astrophysics Data System (ADS)

    Zhang, Chao; Qin, Ting Xin; Huang, Shuai; Wu, Jian Song; Meng, Xin Yan

    2018-06-01

    Some factors can affect the consequences of an oil pipeline accident, and their effects should be analyzed to improve emergency preparation and emergency response. Although qualitative models of risk factors' effects exist, quantitative models remain under-researched. In this study, we introduce a Bayesian network (BN) model for analyzing risk factors' effects, applied to an oil pipeline accident case that happened in China. An incident evolution diagram is built to identify the risk factors, and the BN model is constructed using the deployment rule for factor nodes together with expert knowledge combined via Dempster-Shafer evidence theory. The probabilities of incident consequences and of risk factors' effects can then be calculated. The most likely consequences given by this model are consistent with the case. Meanwhile, the quantitative estimates of risk factors' effects may provide a theoretical basis for choosing optimal risk treatment measures in oil pipeline management, which can be used in emergency preparation and emergency response.
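    A minimal sketch of the underlying BN arithmetic, inference by enumeration over a two-node network, is shown below; the structure and probabilities are invented, not those of the case study.

```python
# Two-node discrete Bayesian network: a risk factor (leak) and a consequence (fire).
# All probabilities are hypothetical.
p_leak = {True: 0.3, False: 0.7}              # prior on the risk factor
p_fire_given_leak = {True: 0.4, False: 0.05}  # conditional probability table

# Marginal consequence probability by enumeration over the parent.
p_fire = sum(p_leak[l] * p_fire_given_leak[l] for l in (True, False))

# Diagnostic query by Bayes' rule: how likely was the risk factor, given a fire?
p_leak_given_fire = p_leak[True] * p_fire_given_leak[True] / p_fire
print(f"P(fire) = {p_fire:.3f}, P(leak | fire) = {p_leak_given_fire:.3f}")
```

    Real BN software performs the same enumeration (or a factored equivalent) over many nodes; Dempster-Shafer combination would additionally merge conflicting expert-supplied tables before inference.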

  7. [How to fit and interpret multilevel models using SPSS].

    PubMed

    Pardo, Antonio; Ruiz, Miguel A; San Martín, Rafael

    2007-05-01

    Hierarchic or multilevel models are used to analyse data when cases belong to known groups and sample units are selected both from the individual level and from the group level. In this work, the multilevel models most commonly discussed in the statistical literature are described, explaining how to fit these models using the SPSS program (version 11 or later) and how to interpret the outcomes of the analysis. Five particular models are described, fitted, and interpreted: (1) one-way analysis of variance with random effects, (2) regression analysis with means-as-outcomes, (3) one-way analysis of covariance with random effects, (4) regression analysis with random coefficients, and (5) regression analysis with means- and slopes-as-outcomes. All models are explained, with the aim of making them understandable to researchers in the health and behavioural sciences.
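    Model (1), one-way analysis of variance with random effects, can also be fitted outside SPSS; below is a minimal method-of-moments sketch in Python on simulated data (the variance components are invented for illustration).

```python
import numpy as np

rng = np.random.default_rng(2)

# One-way random-effects ANOVA: y_ij = mu + u_j + e_ij, u_j ~ N(0, 2^2), e ~ N(0, 1).
groups, n_per = 30, 10
u = rng.normal(0, 2.0, groups)
y = u[:, None] + rng.normal(0, 1.0, (groups, n_per))

# Method-of-moments variance components from the ANOVA mean squares.
gm = y.mean()
ms_between = n_per * np.sum((y.mean(axis=1) - gm) ** 2) / (groups - 1)
ms_within = np.sum((y - y.mean(axis=1, keepdims=True)) ** 2) / (groups * (n_per - 1))
var_between = (ms_between - ms_within) / n_per

# Intraclass correlation: share of variance at the group level.
icc = var_between / (var_between + ms_within)
print(f"sigma2_u = {var_between:.2f}, sigma2_e = {ms_within:.2f}, ICC = {icc:.2f}")
```

    SPSS's MIXED procedure estimates the same components by (RE)ML rather than moments, so its answers will differ slightly on small samples.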

  8. Factor Analysis of Drawings: Application to College Student Models of the Greenhouse Effect

    ERIC Educational Resources Information Center

    Libarkin, Julie C.; Thomas, Stephen R.; Ording, Gabriel

    2015-01-01

    Exploratory factor analysis was used to identify models underlying drawings of the greenhouse effect made by over 200 entering university freshmen. Initial content analysis allowed deconstruction of drawings into salient features, with grouping of these features via factor analysis. A resulting 4-factor solution explains 62% of the data variance,…

  9. DAMS: A Model to Assess Domino Effects by Using Agent-Based Modeling and Simulation.

    PubMed

    Zhang, Laobing; Landucci, Gabriele; Reniers, Genserik; Khakzad, Nima; Zhou, Jianfeng

    2017-12-19

    Historical data analysis shows that escalation accidents, so-called domino effects, have an important role in disastrous accidents in the chemical and process industries. In this study, an agent-based modeling and simulation approach is proposed to study the propagation of domino effects in the chemical and process industries. Different from the analytical or Monte Carlo simulation approaches, which normally study the domino effect at probabilistic network levels, the agent-based modeling technique explains the domino effects from a bottom-up perspective. In this approach, the installations involved in a domino effect are modeled as agents whereas the interactions among the installations (e.g., by means of heat radiation) are modeled via the basic rules of the agents. Application of the developed model to several case studies demonstrates the ability of the model not only in modeling higher-level domino effects and synergistic effects but also in accounting for temporal dependencies. The model can readily be applied to large-scale complicated cases. © 2017 Society for Risk Analysis.
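    The bottom-up flavour of the approach can be sketched in a few lines: installations as agents, with a simple heat-radiation rule driving escalation. All coordinates, thresholds, and intensities below are toy values, not the paper's model.

```python
import numpy as np

# Four storage-tank agents on a line; tank 0 is already burning.
pos = np.array([[0.0, 0.0], [30.0, 0.0], [60.0, 0.0], [90.0, 0.0]])  # metres
threshold = np.array([np.inf, 500.0, 500.0, 500.0])  # hypothetical heat capacity
burning = np.array([True, False, False, False])
load = np.zeros(4)

for step in range(50):  # 1-s time steps
    # Agent rule: each burning installation radiates heat onto intact ones,
    # falling off as 1/d^2 (hypothetical intensity constant).
    for i in np.where(~burning)[0]:
        d = np.linalg.norm(pos[burning] - pos[i], axis=1)
        load[i] += np.sum(1e5 / d ** 2)
    burning |= load >= threshold  # synchronous escalation update

print("burning after 50 s:", burning.tolist())
```

    The temporal dependence the abstract mentions appears naturally here: downstream tanks ignite later, and each new fire accelerates the next escalation (a synergistic effect).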

  10. Evaluation of a Stratified National Breast Screening Program in the United Kingdom: An Early Model-Based Cost-Effectiveness Analysis.

    PubMed

    Gray, Ewan; Donten, Anna; Karssemeijer, Nico; van Gils, Carla; Evans, D Gareth; Astley, Sue; Payne, Katherine

    2017-09-01

    To identify the incremental costs and consequences of stratified national breast screening programs (stratified NBSPs) and drivers of relative cost-effectiveness. A decision-analytic model (discrete event simulation) was conceptualized to represent four stratified NBSPs (risk 1, risk 2, masking [supplemental screening for women with higher breast density], and masking and risk 1) compared with the current UK NBSP and no screening. The model assumed a lifetime horizon, the health service perspective to identify costs (£, 2015), and measured consequences in quality-adjusted life-years (QALYs). Multiple data sources were used: systematic reviews of effectiveness and utility, published studies reporting costs, and cohort studies embedded in existing NBSPs. Model parameter uncertainty was assessed using probabilistic sensitivity analysis and one-way sensitivity analysis. The base-case analysis, supported by probabilistic sensitivity analysis, suggested that the risk stratified NBSPs (risk 1 and risk 2) were relatively cost-effective when compared with the current UK NBSP, with incremental cost-effectiveness ratios of £16,689 per QALY and £23,924 per QALY, respectively. Stratified NBSP including masking approaches (supplemental screening for women with higher breast density) was not a cost-effective alternative, with incremental cost-effectiveness ratios of £212,947 per QALY (masking) and £75,254 per QALY (risk 1 and masking). When compared with no screening, all stratified NBSPs could be considered cost-effective. Key drivers of cost-effectiveness were discount rate, natural history model parameters, mammographic sensitivity, and biopsy rates for recalled cases. A key assumption was that the risk model used in the stratification process was perfectly calibrated to the population. This early model-based cost-effectiveness analysis provides indicative evidence for decision makers to understand the key drivers of costs and QALYs for exemplar stratified NBSPs.
Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  11. Analysis of mortality data from the former USSR: age-period-cohort analysis.

    PubMed

    Willekens, F; Scherbov, S

    1992-01-01

    The objective of this article is to review research on age-period-cohort (APC) analysis of mortality and to trace the effects of contemporary and historical factors on mortality change in the former USSR. Several events in USSR history have exerted a lasting influence on its people. These influences may be captured by an APC model in which the period effects measure the impact of contemporary factors and the cohort effects capture the past history of individuals that cannot be attributed to age or stage in the life cycle. APC models are extensively applied in the study of mortality. This article presents the statistical theory of the APC models and shows that they belong to the family of generalized linear models. The parameters of the APC model may therefore be estimated by any package for loglinear analysis that allows for hybrid loglinear models.
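    Since APC models are generalized linear models, their parameters can be estimated by Poisson log-linear fitting. The sketch below fits an age-period model to a hypothetical death-count table by iteratively reweighted least squares; a full APC model would need an extra identifiability constraint because cohort = period - age.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical age-period table of death counts (log-linear rates).
ages, periods = 4, 3
true_age = np.array([0.0, 0.5, 1.0, 1.5])
true_period = np.array([0.0, -0.2, -0.4])
deaths = rng.poisson(np.exp(2.0 + true_age[:, None] + true_period[None, :]))

# Design matrix: intercept + age dummies + period dummies (reference coding).
rows = [(a, p) for a in range(ages) for p in range(periods)]
X = np.array([[1.0]
              + [float(a == k) for k in range(1, ages)]
              + [float(p == k) for k in range(1, periods)] for a, p in rows])
y = np.array([float(deaths[a, p]) for a, p in rows])

# Iteratively reweighted least squares (Fisher scoring) for the Poisson GLM.
beta = np.zeros(X.shape[1])
beta[0] = np.log(y.mean())
for _ in range(25):
    mu = np.exp(X @ beta)
    beta += np.linalg.solve(X.T @ (mu[:, None] * X), X.T @ (y - mu))

print("estimated age effects:", np.round(beta[1:ages], 2))
```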

  12. Using global sensitivity analysis of demographic models for ecological impact assessment.

    PubMed

    Aiello-Lammens, Matthew E; Akçakaya, H Resit

    2017-02-01

    Population viability analysis (PVA) is widely used to assess population-level impacts of environmental changes on species. When combined with sensitivity analysis, PVA yields insights into the effects of parameter and model structure uncertainty. This helps researchers prioritize efforts for further data collection so that model improvements are efficient and helps managers prioritize conservation and management actions. Usually, sensitivity is analyzed by varying one input parameter at a time and observing the influence that variation has over model outcomes. This approach does not account for interactions among parameters. Global sensitivity analysis (GSA) overcomes this limitation by varying several model inputs simultaneously. Then, regression techniques allow measuring the importance of input-parameter uncertainties. In many conservation applications, the goal of demographic modeling is to assess how different scenarios of impact or management cause changes in a population. This is challenging because the uncertainty of input-parameter values can be confounded with the effect of impacts and management actions. We developed a GSA method that separates model outcome uncertainty resulting from parameter uncertainty from that resulting from projected ecological impacts or simulated management actions, effectively separating the 2 main questions that sensitivity analysis asks. We applied this method to assess the effects of predicted sea-level rise on Snowy Plover (Charadrius nivosus). A relatively small number of replicate models (approximately 100) resulted in consistent measures of variable importance when not trying to separate the effects of ecological impacts from parameter uncertainty. However, many more replicate models (approximately 500) were required to separate these effects. These differences are important to consider when using demographic models to estimate ecological impacts of management actions. © 2016 Society for Conservation Biology.
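    The regression step of a GSA can be sketched as follows: vary several inputs simultaneously, run the model, and use standardized regression coefficients as importance measures. The toy projection model and parameter ranges below are invented, not the Snowy Plover analysis.

```python
import numpy as np

rng = np.random.default_rng(4)

# Sample all inputs simultaneously (hypothetical demographic parameters).
n = 1000
survival = rng.uniform(0.70, 0.90, n)
fecundity = rng.uniform(0.8, 1.6, n)
k_capacity = rng.uniform(500, 1500, n)

def final_pop(s, f, K, n0=100, years=20):
    # Toy density-capped projection, not a real PVA.
    pop = n0
    for _ in range(years):
        pop = min(pop * (s + 0.5 * f), K)
    return pop

y = np.array([final_pop(s, f, K) for s, f, K in zip(survival, fecundity, k_capacity)])

# Standardized regression coefficients (SRCs) as importance measures.
X = np.column_stack([survival, fecundity, k_capacity])
Xs = (X - X.mean(0)) / X.std(0)
ys = (y - y.mean()) / y.std()
src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
for name, c in zip(["survival", "fecundity", "K"], src):
    print(f"{name}: SRC = {c:+.2f}")
```

    In this toy model the carrying capacity dominates the output variance; separating impact scenarios from parameter uncertainty, as the paper proposes, would mean running this regression within each scenario.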

  13. Random effects coefficient of determination for mixed and meta-analysis models

    PubMed Central

    Demidenko, Eugene; Sargent, James; Onega, Tracy

    2011-01-01

    The key feature of a mixed model is the presence of random effects. We have developed a coefficient, called the random effects coefficient of determination, R_r^2, that estimates the proportion of the conditional variance of the dependent variable explained by random effects. This coefficient takes values from 0 to 1 and indicates how strong the random effects are. The difference from the earlier suggested fixed effects coefficient of determination is emphasized. If R_r^2 is close to 0, there is weak support for random effects in the model because the reduction of the variance of the dependent variable due to random effects is small; consequently, random effects may be ignored and the model simplifies to standard linear regression. A value of R_r^2 away from 0 indicates evidence of variance reduction in support of the mixed model. If the random effects coefficient of determination is close to 1, the variance of random effects is very large and random effects turn into free fixed effects—the model can be estimated using the dummy variable approach. We derive explicit formulas for R_r^2 in three special cases: the random intercept model, the growth curve model, and the meta-analysis model. Theoretical results are illustrated with three mixed model examples: (1) travel time to the nearest cancer center for women with breast cancer in the U.S., (2) cumulative time watching alcohol related scenes in movies among young U.S. teens, as a risk factor for early drinking onset, and (3) the classic example of the meta-analysis model for combination of 13 studies on tuberculosis vaccine. PMID:23750070
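    For the meta-analysis special case, the strength of the random effects is commonly summarized via the between-study variance and the I^2 statistic; the DerSimonian-Laird sketch below illustrates the idea with hypothetical effect sizes (it is a standard heterogeneity summary, not the paper's R_r^2 formula).

```python
import numpy as np

# Hypothetical study-level log risk ratios and within-study variances.
effect = np.array([-0.9, -1.6, -1.2, -0.3, -0.5])
var = np.array([0.04, 0.08, 0.05, 0.06, 0.09])

w = 1 / var
mu_fixed = np.sum(w * effect) / np.sum(w)          # fixed-effect pooled estimate
Q = np.sum(w * (effect - mu_fixed) ** 2)           # Cochran's Q
k = effect.size

# DerSimonian-Laird between-study variance and the I^2 heterogeneity share.
tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
i2 = max(0.0, (Q - (k - 1)) / Q)
print(f"tau^2 = {tau2:.3f}, I^2 = {i2:.1%}")
```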

  14. Identifying key sources of uncertainty in the modelling of greenhouse gas emissions from wastewater treatment.

    PubMed

    Sweetapple, Christine; Fu, Guangtao; Butler, David

    2013-09-01

    This study investigates sources of uncertainty in the modelling of greenhouse gas emissions from wastewater treatment, through the use of local and global sensitivity analysis tools, and contributes to an in-depth understanding of wastewater treatment modelling by revealing critical parameters and parameter interactions. One-factor-at-a-time sensitivity analysis is used to screen model parameters and identify those with significant individual effects on three performance indicators: total greenhouse gas emissions, effluent quality and operational cost. Sobol's method enables identification of parameters with significant higher order effects and of particular parameter pairs to which model outputs are sensitive. Use of a variance-based global sensitivity analysis tool to investigate parameter interactions enables identification of important parameters not revealed in one-factor-at-a-time sensitivity analysis. These interaction effects have not been considered in previous studies, and considering them thus provides a better understanding of wastewater treatment plant model characterisation. It was found that uncertainty in modelled nitrous oxide emissions is the primary contributor to uncertainty in total greenhouse gas emissions, due largely to the interaction effects of three nitrogen conversion modelling parameters. The higher order effects of these parameters are also shown to be a key source of uncertainty in effluent quality. Copyright © 2013 Elsevier Ltd. All rights reserved.
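    A first-order variance-based index of the kind Sobol's method produces can be estimated crudely by binning: Var over x of E[y|x], divided by Var(y). In the toy model below (not the wastewater plant model) the first-order indices sum to less than 1; the shortfall is exactly the interaction contribution that one-factor-at-a-time screening misses.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy model with an interaction: x2 and x3 act partly through their product.
n = 20000
x1, x2, x3 = rng.uniform(0, 1, (3, n))
y = x1 + 2 * x2 * x3

def first_order(x, y, bins=40):
    # Binning estimator of the first-order index: Var_x(E[y|x]) / Var(y).
    idx = np.digitize(x, np.linspace(0, 1, bins + 1)[1:-1])
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    return cond_means.var() / y.var()

for name, x in [("x1", x1), ("x2", x2), ("x3", x3)]:
    print(f"S_{name} = {first_order(x, y):.2f}")
```

    Analytically each index is about 0.30, so roughly 10% of the output variance is attributable to the x2-x3 interaction; Sobol's method estimates these (and higher-order) indices with purpose-built sampling designs instead of binning.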

  15. Likelihood-Based Random-Effect Meta-Analysis of Binary Events.

    PubMed

    Amatya, Anup; Bhaumik, Dulal K; Normand, Sharon-Lise; Greenhouse, Joel; Kaizar, Eloise; Neelon, Brian; Gibbons, Robert D

    2015-01-01

    Meta-analysis has been used extensively for evaluation of efficacy and safety of medical interventions. Its advantages and utilities are well known. However, recent studies have raised questions about the accuracy of the commonly used moment-based meta-analytic methods in general and for rare binary outcomes in particular. The issue is further complicated for studies with heterogeneous effect sizes. Likelihood-based mixed-effects modeling provides an alternative to moment-based methods such as inverse-variance weighted fixed- and random-effects estimators. In this article, we compare and contrast different mixed-effect modeling strategies in the context of meta-analysis. Their performance in estimation and testing of overall effect and heterogeneity are evaluated when combining results from studies with a binary outcome. Models that allow heterogeneity in both baseline rate and treatment effect across studies have low type I and type II error rates, and their estimates are the least biased among the models considered.

  16. Accounting for heterogeneity in meta-analysis using a multiplicative model-an empirical study.

    PubMed

    Mawdsley, David; Higgins, Julian P T; Sutton, Alex J; Abrams, Keith R

    2017-03-01

    In meta-analysis, the random-effects model is often used to account for heterogeneity. The model assumes that heterogeneity has an additive effect on the variance of effect sizes. An alternative model, which assumes multiplicative heterogeneity, has been little used in the medical statistics community, but is widely used by particle physicists. In this paper, we compare the two models using a random sample of 448 meta-analyses drawn from the Cochrane Database of Systematic Reviews. In general, differences in goodness of fit are modest. The multiplicative model tends to give results that are closer to the null, with a narrower confidence interval. The two models make different assumptions about the outcome of the meta-analysis. In our opinion, the selection of the more appropriate model will often be guided by whether the multiplicative model's assumption of a single effect size is plausible. Copyright © 2016 John Wiley & Sons, Ltd.
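    The two heterogeneity assumptions differ in one line of algebra: the additive model inflates each study's variance by tau^2, while the multiplicative model scales all variances by a common over-dispersion factor phi (the particle physicists' "scale factor"). A sketch with hypothetical data:

```python
import numpy as np

# Hypothetical study effects and within-study variances.
effect = np.array([0.30, 0.10, 0.45, 0.22, 0.05])
var = np.array([0.010, 0.020, 0.015, 0.012, 0.025])

w = 1 / var
mu = np.sum(w * effect) / np.sum(w)        # pooled estimate (same for both models)
Q = np.sum(w * (effect - mu) ** 2)
k = effect.size

# Multiplicative heterogeneity: scale factor phi = Q / (k - 1), floored at 1.
phi = max(1.0, Q / (k - 1))
se_fixed = 1 / np.sqrt(np.sum(w))
se_mult = se_fixed * np.sqrt(phi)          # CI simply widens by sqrt(phi)
print(f"mu = {mu:.3f}, phi = {phi:.2f}, SE(multiplicative) = {se_mult:.3f}")
```

    Note the multiplicative point estimate stays at the fixed-effect value (weights are rescaled uniformly), which is why it tends to sit closer to the null than the additive random-effects estimate.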

  17. Accounting for standard errors of vision-specific latent trait in regression models.

    PubMed

    Wong, Wan Ling; Li, Xiang; Li, Jialiang; Wong, Tien Yin; Cheng, Ching-Yu; Lamoureux, Ecosse L

    2014-07-11

    To demonstrate the effectiveness of the Hierarchical Bayesian (HB) approach in a modeling framework for association effects that accounts for SEs of vision-specific latent traits assessed using Rasch analysis. A systematic literature review was conducted in four major ophthalmic journals to evaluate Rasch analysis performed on vision-specific instruments. The HB approach was used to synthesize the Rasch model and multiple linear regression model for the assessment of the association effects related to vision-specific latent traits. The effectiveness of this novel HB one-stage "joint-analysis" approach, which allows all model parameters to be estimated simultaneously, was compared with the frequently used two-stage "separate-analysis" approach in our simulation study (Rasch analysis followed by traditional statistical analyses without adjustment for SE of the latent trait). Sixty-six reviewed articles performed evaluation and validation of vision-specific instruments using Rasch analysis, and 86.4% (n = 57) performed further statistical analyses on the Rasch-scaled data using traditional statistical methods; none took into consideration SEs of the estimated Rasch-scaled scores. On real data, the two approaches differed in effect size estimates and in the identification of "independent risk factors." Simulation results showed that our proposed HB one-stage "joint-analysis" approach produces greater accuracy (average of 5-fold decrease in bias) with comparable power and precision in estimation of associations when compared with the frequently used two-stage "separate-analysis" procedure, despite accounting for greater uncertainty due to the latent trait. Association analyses of Rasch-scaled patient-reported data typically do not take the SE of the latent trait into account.
The HB one-stage "joint-analysis" is a better approach, producing accurate effect size estimations and information about the independent association of exposure variables with vision-specific latent traits. Copyright 2014 The Association for Research in Vision and Ophthalmology, Inc.

  18. Three novel approaches to structural identifiability analysis in mixed-effects models.

    PubMed

    Janzén, David L I; Jirstrand, Mats; Chappell, Michael J; Evans, Neil D

    2016-05-06

    Structural identifiability is a concept that considers whether the structure of a model together with a set of input-output relations uniquely determines the model parameters. In the mathematical modelling of biological systems, structural identifiability is an important concept since biological interpretations are typically made from the parameter estimates. For a system defined by ordinary differential equations, several methods have been developed to analyse whether the model is structurally identifiable or otherwise. Another well-used modelling framework, which is particularly useful when the experimental data are sparsely sampled and the population variance is of interest, is mixed-effects modelling. However, established identifiability analysis techniques for ordinary differential equations are not directly applicable to such models. In this paper, we present and apply three different methods that can be used to study structural identifiability in mixed-effects models. The first method, called the repeated measurement approach, is based on applying a set of previously established statistical theorems. The second method, called the augmented system approach, is based on augmenting the mixed-effects model to an extended state-space form. The third method, called the Laplace transform mixed-effects extension, is based on considering the moment invariants of the system's transfer function as functions of random variables. To illustrate, compare and contrast the application of the three methods, they are applied to a set of mixed-effects models. Three structural identifiability analysis methods applicable to mixed-effects models have been presented in this paper.
    Although mixed-effects models are widely used, the development of structural identifiability techniques for them has received very little attention; the methods presented in this paper therefore provide a previously unavailable way of handling structural identifiability in mixed-effects models. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
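    A minimal numerical illustration of structural non-identifiability, using a standard one-compartment example rather than anything from the paper: in y = (F*D/V)*exp(-k*t), only the ratio F/V reaches the output, so bioavailability F and volume V cannot be separately identified from input-output data.

```python
import numpy as np

# One-compartment oral-dose output (hypothetical dose and elimination rate).
t = np.linspace(0, 10, 50)
D, k = 100.0, 0.3

def output(F, V):
    # Observed concentration profile; F and V enter only through F/V.
    return (F * D / V) * np.exp(-k * t)

y1 = output(F=0.5, V=10.0)   # two distinct parameter sets...
y2 = output(F=1.0, V=20.0)   # ...identical input-output behaviour
print("max |y1 - y2| =", np.abs(y1 - y2).max())
```

    The methods in the paper extend this kind of question to the mixed-effects setting, where the quantities to identify include the variances of the random effects, not just the fixed parameters.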

  19. Health effects model for nuclear power plant accident consequence analysis. Part I. Introduction, integration, and summary. Part II. Scientific basis for health effects models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, J.S.; Moeller, D.W.; Cooper, D.W.

    1985-07-01

    Analysis of the radiological health effects of nuclear power plant accidents requires models for predicting early health effects, cancers and benign thyroid nodules, and genetic effects. Since the publication of the Reactor Safety Study, additional information on radiological health effects has become available. This report summarizes the efforts of a program designed to provide revised health effects models for nuclear power plant accident consequence modeling. The new models for early effects address four causes of mortality and nine categories of morbidity. The models for early effects are based upon two-parameter Weibull functions. They permit evaluation of the influence of dose protraction and address the issue of variation in radiosensitivity among the population. The piecewise-linear dose-response models used in the Reactor Safety Study to predict cancers and thyroid nodules have been replaced by linear and linear-quadratic models. The new models reflect the most recently reported results of the follow-up of the survivors of the bombings of Hiroshima and Nagasaki and permit analysis of both morbidity and mortality. The new models for genetic effects allow prediction of genetic risks in each of the first five generations after an accident and include information on the relative severity of various classes of genetic effects. The uncertainty in modeling radiological health risks is addressed by providing central, upper, and lower estimates of risks. An approach is outlined for summarizing the health consequences of nuclear power plant accidents. 298 refs., 9 figs., 49 tabs.
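    The two-parameter Weibull dose-response form named above can be sketched as follows; the median dose and shape values are hypothetical, not the report's fitted parameters.

```python
import numpy as np

def p_effect(dose_gy, d50=3.0, shape=5.0):
    """Two-parameter Weibull risk of an early effect at a given dose.

    d50 is the dose at which risk is 50%; shape controls the steepness.
    Both values here are hypothetical.
    """
    return 1 - np.exp(-np.log(2) * (dose_gy / d50) ** shape)

for d in (1.0, 3.0, 6.0):
    print(f"dose {d} Gy -> risk {p_effect(d):.3f}")
```

    The steep sigmoid produced by a large shape parameter is what lets this family represent threshold-like early effects, while dose protraction enters the real models by modifying the effective d50.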

  20. The effects of videotape modeling on staff acquisition of functional analysis methodology.

    PubMed

    Moore, James W; Fisher, Wayne W

    2007-01-01

    Lectures and two types of video modeling were compared to determine their relative effectiveness in training 3 staff members to conduct functional analysis sessions. Video modeling that contained a larger number of therapist exemplars resulted in mastery-level performance eight of the nine times it was introduced, whereas neither lectures nor partial video modeling produced significant improvements in performance. Results demonstrated that video modeling provided an effective training strategy but only when a wide range of exemplars of potential therapist behaviors were depicted in the videotape.

  2. A general framework for the use of logistic regression models in meta-analysis.

    PubMed

    Simmonds, Mark C; Higgins, Julian Pt

    2016-12-01

    Where individual participant data are available for every randomised trial in a meta-analysis of dichotomous event outcomes, "one-stage" random-effects logistic regression models have been proposed as a way to analyse these data. Such models can also be used even when individual participant data are not available and we have only summary contingency table data. One benefit of this one-stage regression model over conventional meta-analysis methods is that it maximises the correct binomial likelihood for the data and so does not require the common assumption that effect estimates are normally distributed. A second benefit of using this model is that it may be applied, with only minor modification, in a range of meta-analytic scenarios, including meta-regression, network meta-analyses and meta-analyses of diagnostic test accuracy. This single model can potentially replace the variety of often complex methods used in these areas. This paper considers, with a range of meta-analysis examples, how random-effects logistic regression models may be used in a number of different types of meta-analyses. This one-stage approach is compared with widely used meta-analysis methods including Bayesian network meta-analysis and the bivariate and hierarchical summary receiver operating characteristic (ROC) models for meta-analyses of diagnostic test accuracy. © The Author(s) 2014.
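    The one-stage idea, maximizing the exact binomial likelihood over summary 2x2 tables, can be sketched with a fixed-effect version: a separate intercept per study and a common log-odds-ratio (a random-effects variant would add a random treatment slope). All counts below are hypothetical.

```python
import numpy as np

# Hypothetical summary 2x2 tables: [control, treatment] events and totals.
events = np.array([[12, 20], [8, 15], [30, 41]])
totals = np.array([[100, 100], [80, 80], [150, 150]])
k = events.shape[0]

# One row per study arm; columns: study dummies + treatment indicator.
X, y, n = [], [], []
for s in range(k):
    for arm in (0, 1):
        X.append([float(s == j) for j in range(k)] + [float(arm)])
        y.append(events[s, arm])
        n.append(totals[s, arm])
X, y, n = np.array(X), np.array(y, float), np.array(n, float)

# IRLS (Newton scoring) maximizing the binomial likelihood directly,
# with no normality assumption on the study-level effect estimates.
beta = np.zeros(k + 1)
for _ in range(30):
    p = 1 / (1 + np.exp(-X @ beta))
    W = n * p * (1 - p)
    beta += np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (y - n * p))

print(f"pooled log-OR = {beta[-1]:.3f}")
```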

  3. The effectiveness of problem-based learning on students’ problem solving ability in vector analysis course

    NASA Astrophysics Data System (ADS)

    Mushlihuddin, R.; Nurafifah; Irvan

    2018-01-01

    Students' low ability in mathematical problem solving points to an ineffective learning process in the classroom. Effective learning affects students' mathematical skills, one of which is problem-solving ability. Problem-solving ability consists of several stages: understanding the problem, planning a solution, carrying out the plan, and re-examining the procedure and the outcome. The purpose of this research was to determine: (1) does the PBL model improve students' mathematical problem-solving ability in a vector analysis course? (2) is the PBL model effective in improving students' mathematical problem-solving skills in vector analysis courses? This research was a quasi-experimental study. Data analysis proceeded through descriptive statistics, a normality prerequisite test, and hypothesis testing using ANCOVA and gain tests. The results showed that: (1) the PBL model improved students' mathematical problem-solving abilities in vector analysis courses; (2) the PBL model was effective in improving students' problem-solving skills in vector analysis courses, with a medium effect category.

  4. Using multiple group modeling to test moderators in meta-analysis.

    PubMed

    Schoemann, Alexander M

    2016-12-01

    Meta-analysis is a popular and flexible analysis that can be fit in many modeling frameworks. Two methods of fitting meta-analyses that are growing in popularity are structural equation modeling (SEM) and multilevel modeling (MLM). By using SEM or MLM to fit a meta-analysis researchers have access to powerful techniques associated with SEM and MLM. This paper details how to use one such technique, multiple group analysis, to test categorical moderators in meta-analysis. In a multiple group meta-analysis a model is fit to each level of the moderator simultaneously. By constraining parameters across groups any model parameter can be tested for equality. Using multiple groups to test for moderators is especially relevant in random-effects meta-analysis where both the mean and the between studies variance of the effect size may be compared across groups. A simulation study and the analysis of a real data set are used to illustrate multiple group modeling with both SEM and MLM. Issues related to multiple group meta-analysis and future directions for research are discussed. Copyright © 2016 John Wiley & Sons, Ltd.
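    In the simplest fixed-effect-within-groups case, the multiple-group comparison of pooled effects reduces to a between-group Q statistic; SEM and MLM implementations generalize this by also allowing the between-study variances to differ. The effect sizes below are hypothetical.

```python
import numpy as np

# Hypothetical study effects, variances, and a categorical moderator
# (e.g. group 0 = published, group 1 = unpublished).
effect = np.array([0.50, 0.42, 0.61, 0.12, 0.05, 0.18])
var = np.array([0.02, 0.03, 0.04, 0.02, 0.05, 0.03])
group = np.array([0, 0, 0, 1, 1, 1])

# Fit a pooled fixed-effect estimate within each moderator level.
w = 1 / var
mus, ses = [], []
for g in (0, 1):
    m = group == g
    mus.append(np.sum(w[m] * effect[m]) / np.sum(w[m]))
    ses.append(1 / np.sqrt(np.sum(w[m])))

# Between-group Q: a chi-square(1) test of equal pooled effects,
# i.e. the constraint a multiple-group model would impose and test.
q_between = (mus[0] - mus[1]) ** 2 / (ses[0] ** 2 + ses[1] ** 2)
print(f"group means: {mus[0]:.2f} vs {mus[1]:.2f}, Q_between = {q_between:.1f}")
```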

  6. Two new methods to fit models for network meta-analysis with random inconsistency effects.

    PubMed

    Law, Martin; Jackson, Dan; Turner, Rebecca; Rhodes, Kirsty; Viechtbauer, Wolfgang

    2016-07-28

    Meta-analysis is a valuable tool for combining evidence from multiple studies. Network meta-analysis is becoming more widely used as a means to compare multiple treatments in the same analysis. However, a network meta-analysis may exhibit inconsistency, whereby the treatment effect estimates do not agree across all trial designs, even after taking between-study heterogeneity into account. We propose two new estimation methods for network meta-analysis models with random inconsistency effects. The model we consider is an extension of the conventional random-effects model for meta-analysis to the network meta-analysis setting and allows for potential inconsistency using random inconsistency effects. Our first new estimation method uses a Bayesian framework with empirically-based prior distributions for both the heterogeneity and the inconsistency variances. We fit the model using importance sampling and thereby avoid some of the difficulties that might be associated with using Markov Chain Monte Carlo (MCMC). However, we confirm the accuracy of our importance sampling method by comparing the results to those obtained using MCMC as the gold standard. The second new estimation method we describe uses a likelihood-based approach, implemented in the metafor package, which can be used to obtain (restricted) maximum-likelihood estimates of the model parameters and profile likelihood confidence intervals of the variance components. We illustrate the application of the methods using two contrasting examples. The first uses all-cause mortality as an outcome, and shows little evidence of between-study heterogeneity or inconsistency. The second uses "ear discharge" as an outcome, and exhibits substantial between-study heterogeneity and inconsistency. Both new estimation methods give results similar to those obtained using MCMC. The extent of heterogeneity and inconsistency should be assessed and reported in any network meta-analysis. 
Our two new methods can be used to fit models for network meta-analysis with random inconsistency effects. They are easily implemented using the accompanying R code in the Additional file 1. Using these estimation methods, the extent of inconsistency can be assessed and reported.
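The importance-sampling idea can be sketched for the simpler pairwise random-effects model: draw candidate heterogeneity variances from a proposal, weight them by the marginal likelihood (with the overall mean integrated out), and average. The data, flat prior, and uniform proposal below are illustrative assumptions, not the empirically-based priors of the paper:

```python
import numpy as np

rng = np.random.default_rng(8)
y = np.array([0.2, 0.5, -0.1, 0.4, 0.3])      # hypothetical study effect estimates
v = np.array([0.04, 0.05, 0.06, 0.04, 0.05])  # within-study variances

def log_marg(tau2):
    """Random-effects log marginal likelihood with mu integrated out (flat prior)."""
    w = 1.0 / (v + tau2)
    mu = np.sum(w * y) / np.sum(w)
    return 0.5 * (np.log(w).sum() - np.log(w.sum()) - np.sum(w * (y - mu) ** 2))

# Importance sampling: flat prior and Uniform(0, 1) proposal for tau^2
tau2 = rng.uniform(0.0, 1.0, 50_000)
logw = np.array([log_marg(t) for t in tau2])
wts = np.exp(logw - logw.max())               # self-normalised importance weights
post_mean = np.sum(wts * tau2) / np.sum(wts)
ess = wts.sum() ** 2 / np.sum(wts ** 2)       # effective sample size diagnostic
print(round(post_mean, 3), int(ess))
```

The effective sample size plays the role of the MCMC convergence check: if it collapses, the proposal is poorly matched to the posterior and the estimate should not be trusted.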

  7. Assessment of environmental impacts part one. Intervention analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hipel, Keith William; Lettenmaier, Dennis P.; McLeod, A. Ian

    The use of intervention analysis as a statistical method of gauging the effects of environmental changes is discussed. The Box-Jenkins model serves as the basis for the intervention analysis methodology. Environmental studies of the Aswan Dam, the South Saskatchewan River, and a forest fire near the Pipers Hole River, Canada, are included as case studies in which intervention analysis was employed. Methods of data collection for intervention analysis are found to have a significant impact on model reliability; effective data collection processes for the Box-Jenkins model are provided. (15 graphs, 27 references, 2 tables)

  8. Cost-effectiveness of unicondylar versus total knee arthroplasty: a Markov model analysis.

    PubMed

    Peersman, Geert; Jak, Wouter; Vandenlangenbergh, Tom; Jans, Christophe; Cartier, Philippe; Fennema, Peter

    2014-01-01

    Unicondylar knee arthroplasty (UKA) is believed to lead to less morbidity and enhanced functional outcomes when compared with total knee arthroplasty (TKA). Conversely, UKA is also associated with a higher revision risk than TKA. In order to further clarify the key differences between these separate procedures, the current study assessing the cost-effectiveness of UKA versus TKA was undertaken. A state-transition Markov model was developed to compare the cost-effectiveness of UKA versus TKA for unicondylar osteoarthritis using a Belgian payer's perspective. The model was designed to include the possibility of two revision procedures. Model estimates were obtained through literature review and revision rates were based on registry data. Threshold analysis and probabilistic sensitivity analysis were performed to assess the model's robustness. UKA was associated with a cost reduction of €2,807 and a utility gain of 0.04 quality-adjusted life years in comparison with TKA. Analysis determined that the model is sensitive to clinical effectiveness, and that a marginal reduction in the clinical performance of UKA would lead to TKA being the more cost-effective solution. UKA yields clear advantages in terms of costs and marginal advantages in terms of health effects, in comparison with TKA. © 2014 Elsevier B.V. All rights reserved.
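A state-transition Markov model of this kind can be sketched in a few lines. All transition probabilities, costs, and utilities below are illustrative assumptions, not the study's Belgian inputs, and this simplified cohort has a single revision state rather than the two revision procedures the authors allow:

```python
import numpy as np

# Illustrative state-transition (Markov) model: states = well, revised, dead
def run_markov(p_rev, p_death, c_index, c_rev, u_well, u_rev, cycles=20, disc=0.03):
    P = np.array([[1 - p_rev - p_death, p_rev, p_death],
                  [0.0, 1 - p_death, p_death],
                  [0.0, 0.0, 1.0]])
    state = np.array([1.0, 0.0, 0.0])     # cohort starts post-surgery in "well"
    cost, qaly = c_index, 0.0
    for t in range(1, cycles + 1):
        new_revisions = state[0] * p_rev  # fraction revised this cycle
        state = state @ P
        d = (1 + disc) ** -t              # discount factor
        cost += d * new_revisions * c_rev
        qaly += d * (state[0] * u_well + state[1] * u_rev)
    return cost, qaly

# Assumed inputs: UKA cheaper index surgery, better utility, higher revision risk
c_uka, q_uka = run_markov(0.015, 0.01, 7000, 12000, 0.80, 0.70)
c_tka, q_tka = run_markov(0.008, 0.01, 9800, 14000, 0.76, 0.66)
print(round(c_tka - c_uka), round(q_uka - q_tka, 2))
```

With these invented numbers UKA dominates (cheaper and more effective), echoing the abstract's direction; the threshold analysis reported above amounts to shrinking the utility advantage until the sign of the incremental result flips.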

  9. The concept of shared mental models in healthcare collaboration.

    PubMed

    McComb, Sara; Simpson, Vicki

    2014-07-01

    To report an analysis of the concept of shared mental models in health care. Shared mental models have been described as facilitators of effective teamwork. The complexity and criticality of the current healthcare system requires shared mental models to enhance safe and effective patient/client care. Yet, the current concept definition in the healthcare literature is vague and, therefore, difficult to apply consistently in research and practice. Concept analysis. Literature for this concept analysis was retrieved from several databases, including CINAHL, PubMed and MEDLINE (EBSCO Interface), for the years 1997-2013. Walker and Avant's approach to concept analysis was employed and, following Paley's guidance, embedded in extant theory from the team literature. Although teamwork and collaboration are discussed frequently in healthcare literature, the concept of shared mental models in that context is not as commonly found but is increasing in appearance. Our concept analysis defines shared mental models as individually held knowledge structures that help team members function collaboratively in their environments and are comprised of the attributes of content, similarity, accuracy and dynamics. This theoretically grounded concept analysis provides a foundation for a middle-range descriptive theory of shared mental models in nursing and health care. Further research concerning the impact of shared mental models in the healthcare setting can result in development and refinement of shared mental models to support effective teamwork and collaboration. © 2013 John Wiley & Sons Ltd.

  10. Models for Analyzing Environmental Issues in the Classroom.

    ERIC Educational Resources Information Center

    Chiras, Daniel D.

    1980-01-01

    Presents several conceptual models dealing with issues in environmental education. Models described are: (1) Population/Resources/Pollution, (2) Cause-and-Effect Analysis, and (3) Ethical Analysis. (CS)

  11. Factor Analysis of Drawings: Application to college student models of the greenhouse effect

    NASA Astrophysics Data System (ADS)

    Libarkin, Julie C.; Thomas, Stephen R.; Ording, Gabriel

    2015-09-01

    Exploratory factor analysis was used to identify models underlying drawings of the greenhouse effect made by over 200 entering university freshmen. Initial content analysis allowed deconstruction of drawings into salient features, with grouping of these features via factor analysis. A resulting 4-factor solution explains 62% of the data variance, suggesting that 4 archetype models of the greenhouse effect dominate thinking within this population. Factor scores, indicating the extent to which each student's drawing aligned with representative models, were compared to performance on conceptual understanding and attitudes measures, demographics, and non-cognitive features of drawings. Student drawings were also compared to drawings made by scientists to ascertain the extent to which models reflect more sophisticated and accurate models. Results indicate that student and scientist drawings share some similarities, most notably the presence of some features of the most sophisticated non-scientific model held among the study population. Prior knowledge, prior attitudes, gender, and non-cognitive components are also predictive of an individual student's model. This work presents a new technique for analyzing drawings, with general implications for the use of drawings in investigating student conceptions.
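The drawing-analysis pipeline (code salient features, then group them by factor analysis) can be sketched as follows; the two archetypes, six binary features, and loadings are invented for illustration, whereas the study extracted four factors from over 200 real coded drawings:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
# Two hypothetical latent "archetype" scores drive two blocks of binary features
arch = rng.normal(size=(n, 2))
load = np.array([[0.9, 0.0], [0.8, 0.1], [0.9, 0.1],
                 [0.0, 0.9], [0.1, 0.8], [0.0, 0.9]])
X = (arch @ load.T + 0.5 * rng.normal(size=(n, 6)) > 0).astype(float)

# Principal-axis style extraction: eigendecomposition of the feature correlations
R = np.corrcoef(X, rowvar=False)
vals = np.linalg.eigvalsh(R)[::-1]          # eigenvalues, descending
explained = vals[:2].sum() / vals.sum()     # variance explained by two factors
print(round(explained, 2))
```

The share of variance captured by the retained factors is the analogue of the 62% reported for the 4-factor solution; a dedicated factor-analysis routine with rotation would refine, not change, the basic decomposition shown here.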

  12. Analysis of genetic effects of nuclear-cytoplasmic interaction on quantitative traits: genetic model for diploid plants.

    PubMed

    Han, Lide; Yang, Jian; Zhu, Jun

    2007-06-01

    A genetic model was proposed for simultaneously analyzing genetic effects of nuclear, cytoplasm, and nuclear-cytoplasmic interaction (NCI) as well as their genotype by environment (GE) interaction for quantitative traits of diploid plants. In the model, the NCI effects were further partitioned into additive and dominance nuclear-cytoplasmic interaction components. Mixed linear model approaches were used for statistical analysis. On the basis of diallel cross designs, Monte Carlo simulations showed that the genetic model was robust for estimating variance components under several situations without specific effects. Random genetic effects were predicted by an adjusted unbiased prediction (AUP) method. Data on four quantitative traits (boll number, lint percentage, fiber length, and micronaire) in Upland cotton (Gossypium hirsutum L.) were analyzed as a worked example to show the effectiveness of the model.

  13. The Effect of Non-Normal Distributions on the Integrated Moving Average Model of Time-Series Analysis.

    ERIC Educational Resources Information Center

    Doerann-George, Judith

    The Integrated Moving Average (IMA) model of time series, and the analysis of intervention effects based on it, assume random shocks which are normally distributed. To determine the robustness of the analysis to violations of this assumption, empirical sampling methods were employed. Samples were generated from three populations: normal,…

  14. The choice of prior distribution for a covariance matrix in multivariate meta-analysis: a simulation study.

    PubMed

    Hurtado Rúa, Sandra M; Mazumdar, Madhu; Strawderman, Robert L

    2015-12-30

    Bayesian meta-analysis is an increasingly important component of clinical research, with multivariate meta-analysis a promising tool for studies with multiple endpoints. Model assumptions, including the choice of priors, are crucial aspects of multivariate Bayesian meta-analysis (MBMA) models. In a given model, two different prior distributions can lead to different inferences about a particular parameter. A simulation study was performed in which the impact of families of prior distributions for the covariance matrix of a multivariate normal random effects MBMA model was analyzed. Inferences about effect sizes were not particularly sensitive to prior choice, but the related covariance estimates were. A few families of prior distributions with small relative biases, tight mean squared errors, and close to nominal coverage for the effect size estimates were identified. Our results demonstrate the need for sensitivity analysis and suggest some guidelines for choosing prior distributions in this class of problems. The MBMA models proposed here are illustrated in a small meta-analysis example from the periodontal field and a medium meta-analysis from the study of stroke. Copyright © 2015 John Wiley & Sons, Ltd. Copyright © 2015 John Wiley & Sons, Ltd.

  15. Integrated Modeling Activities for the James Webb Space Telescope (JWST): Structural-Thermal-Optical Analysis

    NASA Technical Reports Server (NTRS)

    Johnston, John D.; Parrish, Keith; Howard, Joseph M.; Mosier, Gary E.; McGinnis, Mark; Bluth, Marcel; Kim, Kevin; Ha, Hong Q.

    2004-01-01

    This is a continuation of a series of papers on modeling activities for JWST. The structural-thermal-optical, often referred to as "STOP", analysis process is used to predict the effect of thermal distortion on optical performance. The benchmark STOP analysis for JWST assesses the effect of an observatory slew on wavefront error. The paper begins with an overview of multi-disciplinary engineering analysis, or integrated modeling, which is a critical element of the JWST mission. The STOP analysis process is then described. This process consists of the following steps: thermal analysis, structural analysis, and optical analysis. Temperatures predicted using geometric and thermal math models are mapped to the structural finite element model in order to predict thermally-induced deformations. Motions and deformations at optical surfaces are input to optical models, and optical performance is predicted using either an optical ray trace or WFE estimation techniques based on prior ray traces or first order optics. Following the discussion of the analysis process, results are presented based on models representing the design at the time of the System Requirements Review. In addition to baseline performance predictions, sensitivity studies are performed to assess modeling uncertainties. Of particular interest is the sensitivity of optical performance to uncertainties in temperature predictions and variations in metal properties. The paper concludes with a discussion of modeling uncertainty as it pertains to STOP analysis.
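To first order, the STOP chain of mapped models is a composition of linear sensitivities. The hedged sketch below uses invented matrices (temperature changes to surface motions to wavefront error terms) to show the composition, and why a fractional uncertainty in predicted temperatures propagates directly into the predicted WFE of a linear chain:

```python
import numpy as np

rng = np.random.default_rng(3)
n_nodes, n_dof, n_modes = 50, 12, 5
# Invented linear sensitivities: temperatures -> surface motions -> WFE terms
A_therm = rng.normal(0, 1e-3, (n_dof, n_nodes))   # motion per unit temperature change
A_opt = rng.normal(0, 0.5, (n_modes, n_dof))      # WFE sensitivity to surface motions

dT = rng.normal(0, 20.0, n_nodes)                 # slew-induced temperature changes
wfe_terms = A_opt @ (A_therm @ dT)                # composed STOP chain
wfe_rms = np.sqrt(np.sum(wfe_terms ** 2))

# Linearity: a 10% error in predicted temperatures scales the WFE by exactly 10%
wfe_hi = np.sqrt(np.sum((A_opt @ (A_therm @ (1.1 * dT))) ** 2))
print(round(wfe_rms, 3), round(wfe_hi / wfe_rms, 3))
```

Real STOP analysis replaces each matrix with full thermal, structural, and optical models, and nonlinearities (material properties varying with temperature, for instance) break the exact scaling, which is precisely why the sensitivity studies described above are run.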

  16. Incorporating Psychological Predictors of Treatment Response into Health Economic Simulation Models: A Case Study in Type 1 Diabetes.

    PubMed

    Kruger, Jen; Pollard, Daniel; Basarir, Hasan; Thokala, Praveen; Cooke, Debbie; Clark, Marie; Bond, Rod; Heller, Simon; Brennan, Alan

    2015-10-01

    Health economic modeling has paid limited attention to the effects that patients' psychological characteristics have on the effectiveness of treatments. This case study tests 1) the feasibility of incorporating psychological prediction models of treatment response within an economic model of type 1 diabetes, 2) the potential value of providing treatment to a subgroup of patients, and 3) the cost-effectiveness of providing treatment to a subgroup of responders defined using 5 different algorithms. Multiple linear regressions were used to investigate relationships between patients' psychological characteristics and treatment effectiveness. Two psychological prediction models were integrated with a patient-level simulation model of type 1 diabetes. Expected value of individualized care analysis was undertaken. Five different algorithms were used to provide treatment to a subgroup of predicted responders. A cost-effectiveness analysis compared using the algorithms to providing treatment to all patients. The psychological prediction models had low predictive power for treatment effectiveness. Expected value of individualized care results suggested that targeting education at responders could be of value. The cost-effectiveness analysis suggested, for all 5 algorithms, that providing structured education to a subgroup of predicted responders would not be cost-effective. The psychological prediction models tested did not have sufficient predictive power to make targeting treatment cost-effective. The psychological prediction models are simple linear models of psychological behavior. Collection of data on additional covariates could potentially increase statistical power. By collecting data on psychological variables before an intervention, we can construct predictive models of treatment response to interventions. These predictive models can be incorporated into health economic models to investigate more complex service delivery and reimbursement strategies.
© The Author(s) 2015.
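The targeting question in this case study can be caricatured in a few lines: with a weakly predictive psychological score (all numbers below are assumed, not the study's data), the net monetary benefit of treating only predicted responders is not clearly better than treating everyone:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 10_000
# Assumed: a psychological score only weakly predicts the QALY gain from education
score = rng.normal(0, 1, n)
gain = 0.02 + 0.01 * score + rng.normal(0, 0.05, n)   # QALYs gained if treated
cost, wtp = 300.0, 20_000.0                           # cost per patient, £ per QALY

def net_benefit(treated):
    """Population net monetary benefit of treating the selected subgroup."""
    return wtp * gain[treated].sum() - cost * treated.sum()

treat_all = np.ones(n, bool)
responders = score > 0.5            # algorithm: treat predicted responders only
nb_all, nb_resp = net_benefit(treat_all), net_benefit(responders)
print(round(nb_all), round(nb_resp))
```

With a stronger score coefficient the targeted strategy pulls ahead, which is the sense in which predictive power, not the targeting idea itself, limited cost-effectiveness in the study.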

  17. Computational fluid dynamic modelling of cavitation

    NASA Technical Reports Server (NTRS)

    Deshpande, Manish; Feng, Jinzhang; Merkle, Charles L.

    1993-01-01

    Models of sheet cavitation in cryogenic fluids are developed for use in Euler and Navier-Stokes codes. The models are based upon earlier potential-flow models but enable the cavity inception point, length, and shape to be determined as part of the computation. In the present paper, numerical solutions are compared with experimental measurements for both pressure distribution and cavity length. Comparisons between models are also presented. The CFD model provides a relatively simple modification to an existing code to enable cavitation performance predictions to be included. The analysis also has the added ability of incorporating thermodynamic effects of cryogenic fluids into the analysis. Extensions of the current two-dimensional steady state analysis to three dimensions and/or time-dependent flows are, in principle, straightforward, although geometrical issues become more complicated. Linearized models, however, offer promise of providing effective cavitation modeling in three dimensions. This analysis presents good potential for improved understanding of many phenomena associated with cavity flows.

  18. Cost-utility of quadrivalent versus trivalent influenza vaccine in Brazil - comparison of outcomes from different static model types.

    PubMed

    Van Bellinghen, Laure-Anne; Marijam, Alen; Tannus Branco de Araujo, Gabriela; Gomez, Jorge; Van Vlaenderen, Ilse

    Influenza burden in Brazil is considerable, with 4.2-6.4 million cases in 2008 and influenza-like illness responsible for 16.9% of hospitalizations. Cost-effectiveness of influenza vaccination may be assessed by different types of models, with limitations due to data availability, assumptions, and modelling approach. To understand the impact of model complexity, the cost-utility of quadrivalent versus trivalent influenza vaccines in Brazil was estimated using three distinct models: a 1-year decision tree population model with three age groups (FLOU); a more detailed 1-year population model with five age groups (FLORA); and a more complex lifetime multi-cohort Markov model with nine age groups (FLORENCE). Analysis 1 (impact of model structure) compared each model using the same data inputs (i.e., the best available data for FLOU). Analysis 2 (impact of increasing granularity) compared each model populated with the best available data for that model. Using the best data for each model, the discounted cost-utility ratio of quadrivalent versus trivalent influenza vaccine was R$20,428 with FLOU, R$22,768 with FLORA (versus R$20,428 in Analysis 1), and R$19,257 with FLORENCE (versus R$22,490 in Analysis 1) using a lifetime horizon. Conceptual differences between FLORA and FLORENCE meant the same assumption regarding increased all-cause mortality in at-risk individuals had an opposite effect on the incremental cost-effectiveness ratio in Analysis 2 versus Analysis 1, and a proportionally higher number of vaccinated elderly in FLORENCE reduced this ratio in Analysis 2. FLOU provided adequate cost-effectiveness estimates with data in broad age groups. FLORA increased insights (e.g., in healthy versus at-risk, paediatric, respiratory/non-respiratory complications). FLORENCE provided greater insights and precision (e.g., in elderly, costs and complications, lifetime cost-effectiveness).
All three models predicted a cost per quality-adjusted life year gained for quadrivalent versus trivalent influenza vaccine in the range of R$19,257 (FLORENCE) to R$22,768 (FLORA) with the best available data in Brazil (Appendix A). Copyright © 2018 Sociedade Brasileira de Infectologia. Published by Elsevier Editora Ltda. All rights reserved.

  19. Random-Effects Models for Meta-Analytic Structural Equation Modeling: Review, Issues, and Illustrations

    ERIC Educational Resources Information Center

    Cheung, Mike W.-L.; Cheung, Shu Fai

    2016-01-01

    Meta-analytic structural equation modeling (MASEM) combines the techniques of meta-analysis and structural equation modeling for the purpose of synthesizing correlation or covariance matrices and fitting structural equation models on the pooled correlation or covariance matrix. Both fixed-effects and random-effects models can be defined in MASEM.…

  20. Comparison of Two Global Sensitivity Analysis Methods for Hydrologic Modeling over the Columbia River Basin

    NASA Astrophysics Data System (ADS)

    Hameed, M.; Demirel, M. C.; Moradkhani, H.

    2015-12-01

    The Global Sensitivity Analysis (GSA) approach helps identify the effectiveness of model parameters or inputs and thus provides essential information about the model performance. In this study, the effects of the Sacramento Soil Moisture Accounting (SAC-SMA) model parameters, forcing data, and initial conditions are analysed by using two GSA methods: Sobol' and the Fourier Amplitude Sensitivity Test (FAST). The simulations are carried out over five sub-basins within the Columbia River Basin (CRB) for three different periods: one year, four years, and seven years. Four factors are considered and evaluated by using the two sensitivity analysis methods: the simulation length, parameter range, model initial conditions, and the reliability of the global sensitivity analysis methods. The reliability of the sensitivity analysis results is compared based on 1) the agreement between the two sensitivity analysis methods (Sobol' and FAST) in terms of highlighting the same parameters or inputs as the most influential and 2) how the methods cohere in ranking these sensitive parameters under the same conditions (sub-basins and simulation length). The results show coherence between the Sobol' and FAST sensitivity analysis methods. Additionally, it is found that the FAST method is sufficient to evaluate the main effects of the model parameters and inputs. Another conclusion of this study is that the smaller the parameter or initial condition ranges, the more consistent and coherent the results of the two sensitivity analysis methods.
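The Sobol' side of such a comparison can be sketched with the standard pick-freeze (Saltelli-type) estimator of first-order indices. The sketch below uses the well-known Ishigami benchmark function rather than SAC-SMA, whose parameters and forcing data are beyond a short example; FAST would estimate the same indices from frequency-tagged search curves:

```python
import numpy as np

def ishigami(x, a=7.0, b=0.1):
    """Ishigami test function, a standard GSA benchmark."""
    return np.sin(x[:, 0]) + a * np.sin(x[:, 1]) ** 2 + b * x[:, 2] ** 4 * np.sin(x[:, 0])

rng = np.random.default_rng(5)
N, d = 100_000, 3
A = rng.uniform(-np.pi, np.pi, (N, d))     # two independent input samples
B = rng.uniform(-np.pi, np.pi, (N, d))
fA, fB = ishigami(A), ishigami(B)
var = np.var(np.concatenate([fA, fB]))

S1 = []
for i in range(d):                         # pick-freeze estimate of first-order indices
    ABi = A.copy()
    ABi[:, i] = B[:, i]                    # A with column i taken from B
    S1.append(np.mean(fB * (ishigami(ABi) - fA)) / var)
print(np.round(S1, 2))                     # analytic values are about 0.31, 0.44, 0.00
```

The ranking of the indices is what the two methods are compared on in the study; agreement on which inputs dominate, under the same sample budget, is the reliability criterion described above.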

  1. Increasing Effectiveness in Teaching Ethics to Undergraduate Business Students.

    ERIC Educational Resources Information Center

    Lampe, Marc

    1997-01-01

    Traditional approaches to teaching business ethics (philosophical analysis, moral quandaries, executive cases) may not be effective in persuading undergraduates of the importance of ethical behavior. Better techniques include values education, ethical decision-making models, analysis of ethical conflicts, and role modeling. (SK)

  2. [Variable selection methods combined with local linear embedding theory used for optimization of near infrared spectral quantitative models].

    PubMed

    Hao, Yong; Sun, Xu-Dong; Yang, Qiang

    2012-12-01

    A variable selection strategy combined with local linear embedding (LLE) was introduced for the analysis of complex samples by using near infrared spectroscopy (NIRS). Three methods, Monte Carlo uninformative variable elimination (MCUVE), the successive projections algorithm (SPA), and MCUVE combined with SPA, were used for eliminating redundant spectral variables. Partial least squares regression (PLSR) and LLE-PLSR were used for modeling complex samples. The results showed that MCUVE can both extract effective informative variables and improve the precision of models. Compared with PLSR models, LLE-PLSR models can achieve more accurate analysis results. MCUVE combined with LLE-PLSR is an effective modeling method for NIRS quantitative analysis.
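The MCUVE idea, scoring each variable by the stability of its regression coefficient over Monte Carlo subsets and discarding unstable ones, can be sketched as follows; ordinary least squares stands in for the PLSR used with real spectra, and the data are simulated with only two informative variables:

```python
import numpy as np

rng = np.random.default_rng(6)
n, p = 120, 10
X = rng.normal(size=(n, p))
y = 2.0 * X[:, 0] + 1.5 * X[:, 1] + 0.5 * rng.normal(size=n)  # vars 0, 1 informative

# MCUVE-style stability: refit on random subsets, score |mean/std| of coefficients
B = []
for _ in range(500):
    idx = rng.choice(n, size=int(0.8 * n), replace=False)
    coef, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
    B.append(coef)
B = np.array(B)
stability = np.abs(B.mean(axis=0)) / B.std(axis=0)
keep = np.argsort(stability)[::-1][:2]      # retain the most stable variables
print(sorted(keep.tolist()))
```

Uninformative variables have coefficients that hover around zero with high relative variability across subsets, so their stability scores collapse; that is the elimination criterion, independent of the downstream PLSR or LLE-PLSR model.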

  3. A General Model for Estimating and Correcting the Effects of Nonindependence in Meta-Analysis.

    ERIC Educational Resources Information Center

    Strube, Michael J.

    A general model is described which can be used to represent the four common types of meta-analysis: (1) estimation of effect size by combining study outcomes; (2) estimation of effect size by contrasting study outcomes; (3) estimation of statistical significance by combining study outcomes; and (4) estimation of statistical significance by…

  4. Bayesian inference on risk differences: an application to multivariate meta-analysis of adverse events in clinical trials.

    PubMed

    Chen, Yong; Luo, Sheng; Chu, Haitao; Wei, Peng

    2013-05-01

    Multivariate meta-analysis is useful in combining evidence from independent studies which involve several comparisons among groups based on a single outcome. For binary outcomes, the commonly used statistical models for multivariate meta-analysis are multivariate generalized linear mixed effects models which assume risks, after some transformation, follow a multivariate normal distribution with possible correlations. In this article, we consider an alternative model for multivariate meta-analysis where the risks are modeled by the multivariate beta distribution proposed by Sarmanov (1966). This model has several attractive features compared to the conventional multivariate generalized linear mixed effects models, including a simple likelihood function, no need to specify a link function, and a closed-form expression of the distribution functions for study-specific risk differences. We investigate the finite sample performance of this model by simulation studies and illustrate its use with an application to multivariate meta-analysis of adverse events of tricyclic antidepressant treatment in clinical trials.

  5. Instantiating the art of war for effects-based operations

    NASA Astrophysics Data System (ADS)

    Burns, Carla L.

    2002-07-01

    Effects-Based Operations (EBO) is a mindset, a philosophy and an approach for planning, executing and assessing military operations for the effects they produce rather than the targets or even objectives they deal with. An EBO approach strives to provide economy of force, dynamic tasking, and reduced collateral damage. The notion of EBO is not new. Military Commanders certainly have desired effects in mind when conducting military operations. However, to date EBO has been an art of war that lacks automated techniques and tools that enable effects-based analysis and assessment. Modeling and simulation is at the heart of this challenge. The Air Force Research Laboratory (AFRL) EBO Program is developing modeling techniques and corresponding tool capabilities that can be brought to bear against the challenges presented by effects-based analysis and assessment. Effects-based course-of-action development, center of gravity/target system analysis, and wargaming capabilities are being developed and integrated to help give Commanders the information and decision support required to achieve desired national security objectives. This paper presents an introduction to effects-based operations, discusses the benefits of an EBO approach, and focuses on modeling and analysis for effects-based strategy development. An overview of modeling and simulation challenges for EBO is presented, setting the stage for the detailed technical papers in the subject session.

  6. Tutorial on Biostatistics: Linear Regression Analysis of Continuous Correlated Eye Data.

    PubMed

    Ying, Gui-Shuang; Maguire, Maureen G; Glynn, Robert; Rosner, Bernard

    2017-04-01

    To describe and demonstrate appropriate linear regression methods for analyzing correlated continuous eye data. We describe several approaches to regression analysis involving both eyes, including mixed effects and marginal models under various covariance structures to account for inter-eye correlation. We demonstrate, with SAS statistical software, applications in a study comparing baseline refractive error between one eye with choroidal neovascularization (CNV) and the unaffected fellow eye, and in a study determining factors associated with visual field in the elderly. When refractive error from both eyes was analyzed with standard linear regression without accounting for inter-eye correlation (adjusting for demographic and ocular covariates), the difference between eyes with CNV and fellow eyes was 0.15 diopters (D; 95% confidence interval, CI -0.03 to 0.32D, p = 0.10). Using a mixed effects model or a marginal model, the estimated difference was the same but with narrower 95% CI (0.01 to 0.28D, p = 0.03). Standard regression for visual field data from both eyes provided biased estimates of standard error (generally underestimated) and smaller p-values, while analysis of the worse eye provided larger p-values than mixed effects models and marginal models. In research involving both eyes, ignoring inter-eye correlation can lead to invalid inferences. Analysis using only right or left eyes is valid, but decreases power. Worse-eye analysis can provide less power and biased estimates of effect. Mixed effects or marginal models using the eye as the unit of analysis should be used to appropriately account for inter-eye correlation and maximize power and precision.
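The inter-eye correlation issue can be demonstrated with a small simulation: for a within-patient contrast such as CNV eye versus fellow eye, a naive OLS standard error that ignores the pairing is too large, while a cluster-robust (sandwich) estimator with the patient as the cluster recovers the paired-design precision, mirroring the narrower CI reported above. All parameter values are assumptions:

```python
import numpy as np

rng = np.random.default_rng(7)
n, effect = 300, 0.15                      # patients; assumed CNV-vs-fellow difference
subj = rng.normal(0, 1.0, n)               # shared between-eye (patient) component
eyes = subj[:, None] + rng.normal(0, 0.5, (n, 2))
eyes[:, 0] += effect                       # eye 0 = CNV eye, eye 1 = fellow eye

Y = eyes.ravel()                           # [patient0 eye0, patient0 eye1, ...]
X = np.column_stack([np.ones(2 * n), np.tile([1.0, 0.0], n)])
beta = np.linalg.lstsq(X, Y, rcond=None)[0]
resid = Y - X @ beta
bread = np.linalg.inv(X.T @ X)

# Naive OLS standard error: ignores the inter-eye correlation
se_naive = np.sqrt(resid @ resid / (2 * n - 2) * bread[1, 1])

# Cluster-robust (sandwich) standard error with the patient as the cluster
meat = np.zeros((2, 2))
for i in range(n):
    g = X[2 * i:2 * i + 2].T @ resid[2 * i:2 * i + 2]
    meat += np.outer(g, g)
se_cluster = np.sqrt((bread @ meat @ bread)[1, 1])
print(round(beta[1], 2), round(se_naive, 3), round(se_cluster, 3))
```

A marginal (GEE-type) or mixed effects model achieves the same correction model-based rather than empirically; for a covariate shared by both eyes the direction reverses and the naive SE is underestimated, which is the visual-field case in the abstract.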

  7. The spatial impact of neighbouring on the exports activities of COMESA countries by using spatial panel models

    NASA Astrophysics Data System (ADS)

    Hamzalouh, L.; Ismail, M. T.; Rahman, R. A.

    2017-09-01

    In this paper, spatial panel models were used to analyse the pattern of exports of the Common Market for Eastern and Southern Africa (COMESA) countries; the robust Hausman test was used to select the best-fitting model between the spatial fixed effects and the spatial random effects specifications. We also examine the effects of the interactions of the economic statistics of the explanatory variables on the exports of COMESA. Results indicated that the spatial Durbin model with the fixed effects specification should be tested and considered in most cases of this study. After that, the direct and indirect effects among COMESA regions were assessed, and the role of indirect spatial effects in estimating exports was empirically demonstrated. Regarding originality and research value, and to the best of the authors' knowledge, this is the first attempt to examine exports between COMESA and its member countries through spatial panel models using XSMLE, which is a new command for spatial analysis using STATA.

  8. Integrated cost-effectiveness analysis of agri-environmental measures for water quality.

    PubMed

    Balana, Bedru B; Jackson-Blake, Leah; Martin-Ortega, Julia; Dunn, Sarah

    2015-09-15

    This paper presents an application of an integrated methodological approach for identifying cost-effective combinations of agri-environmental measures to achieve water quality targets. The methodological approach involves linking hydro-chemical modelling with the economic costs of mitigation measures. The utility of the approach was explored for the River Dee catchment in North East Scotland, examining the cost-effectiveness of mitigation measures for nitrogen (N) and phosphorus (P) pollutants. In-stream nitrate concentration was modelled using the STREAM-N model and phosphorus using the INCA-P model. Both models were first run for baseline conditions and then their effectiveness for changes in land management was simulated. Costs were based on farm income foregone, capital, and operational expenditures. The cost and effect data were integrated using 'Risk Solver Platform' optimization in Excel to produce the most cost-effective combination of measures by which target nutrient reductions could be attained at a minimum economic cost. The analysis identified a different combination of measures as most cost-effective for each of the two pollutants. An important aspect of this paper is the integration of model-based effectiveness estimates with the economic costs of measures for cost-effectiveness analysis of land and water management options. The methodological approach developed is not limited to the two pollutants and the selected agri-environmental measures considered in the paper; the approach can be adapted to the cost-effectiveness analysis of any catchment-scale environmental management options. Copyright © 2015 Elsevier Ltd. All rights reserved.
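The final optimization step, choosing the cheapest combination of measures that meets both nutrient targets, can be sketched by brute force for a handful of measures. The measure names, costs, and additive effectiveness values below are invented for illustration, not the River Dee estimates, and additivity of effects is itself an assumption:

```python
from itertools import combinations

# Hypothetical measures: (name, annual cost, % N reduction, % P reduction)
measures = [("buffer strips", 40, 8.0, 6.0),
            ("reduced fertiliser", 25, 10.0, 2.0),
            ("wetland creation", 90, 12.0, 9.0),
            ("winter cover crops", 30, 6.0, 1.5),
            ("manure management", 50, 4.0, 7.0)]
target_n, target_p = 20.0, 10.0    # required % reductions (assumed, effects additive)

best = None
for r in range(1, len(measures) + 1):
    for combo in combinations(measures, r):
        cost = sum(m[1] for m in combo)
        red_n = sum(m[2] for m in combo)
        red_p = sum(m[3] for m in combo)
        if red_n >= target_n and red_p >= target_p and (best is None or cost < best[0]):
            best = (cost, [m[0] for m in combo])
print(best)
```

With dozens of measures and interacting effects, enumeration becomes infeasible and a solver (the paper's spreadsheet optimization, or an LP/MIP formulation) takes over, but the objective and constraints are exactly those shown here.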

  9. A Pilot/Vehicle Model Analysis of the Effects of Motion Cues on Harrier Control Tasks.

    DTIC Science & Technology

    1983-09-01

    Bolt Beranek and Newman Inc., Cambridge, MA. ...provided by well-designed platform motion systems, the actual improvement of performance or training effectiveness that results from incorporating these... for the Harrier AV-8B. The effects of providing motion cues via an idealized platform motion system or a g-seat device are predicted with the model, and

  10. From global to local: exploring the relationship between parameters and behaviors in models of electrical excitability.

    PubMed

    Fletcher, Patrick; Bertram, Richard; Tabak, Joel

    2016-06-01

    Models of electrical activity in excitable cells involve nonlinear interactions between many ionic currents. Changing parameters in these models can produce a variety of activity patterns, sometimes with unexpected effects. Furthermore, introducing new currents will have different effects depending on the initial parameter set. In this study we combined global sampling of parameter space and local analysis of representative parameter sets in a pituitary cell model to understand the effects of adding K(+) conductances, which mediate some effects of hormone action on these cells. Global sampling ensured that the effects of introducing K(+) conductances were captured across a wide variety of contexts of model parameters. For each type of K(+) conductance we determined the types of behavioral transition that it evoked. Some transitions were counterintuitive, and may have been missed without the use of global sampling. In general, the wide range of transitions that occurred when the same current was applied to the model cell at different locations in parameter space highlights the challenge of making accurate model predictions in light of cell-to-cell heterogeneity. Finally, we used bifurcation analysis and fast/slow analysis to investigate why specific transitions occur in representative individual models. This approach relies on the use of a graphics processing unit (GPU) to quickly map parameter space to model behavior and identify parameter sets for further analysis. Acceleration with modern low-cost GPUs is particularly well suited to exploring the moderate-sized (5-20 parameter) spaces of excitable cell and signaling models.
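
    A minimal sketch of the "map parameter space to behavior" idea, using the generic FitzHugh-Nagumo excitable-cell model as a stand-in for the authors' pituitary cell model (the parameter values, the swept input current, and the crude spike-counting classifier are all illustrative assumptions, and the sweep runs on the CPU rather than a GPU):

```python
import numpy as np

def spike_count(I, a=0.7, b=0.8, eps=0.08, dt=0.05, T=400.0):
    """Euler-integrate the FitzHugh-Nagumo model and count upward
    crossings of v = 1.0 (a crude 'spiking vs. quiescent' classifier)."""
    v, w = -1.2, -0.6
    count, above = 0, False
    for _ in range(int(T / dt)):
        dv = v - v**3 / 3.0 - w + I
        dw = eps * (v + a - b * w)
        v, w = v + dt * dv, w + dt * dw
        if v > 1.0 and not above:
            count += 1
        above = v > 1.0
    return count

# Global sampling: sweep the input current and record the behaviour of
# each sampled model; transitions show up as changes in spike count.
currents = np.linspace(0.0, 1.0, 21)
counts = [spike_count(I) for I in currents]
```

    Mapping each sampled parameter set to a behavioural label in this way is what makes counterintuitive transitions visible across the whole space rather than only near one hand-picked model.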

  11. A Field-Effect Transistor (FET) model for ASAP

    NASA Technical Reports Server (NTRS)

    Ming, L.

    1965-01-01

    The derivation of the circuitry of a field effect transistor (FET) model, the procedure for adapting the model to the automated statistical analysis program (ASAP), and the results of applying ASAP to this model are described.

  12. Penetration analysis of projectile with inclined concrete target

    NASA Astrophysics Data System (ADS)

    Kim, S. B.; Kim, H. W.; Yoo, Y. H.

    2015-09-01

    This paper presents numerical analysis results of projectile penetration into an inclined concrete target. We applied dynamic material properties of 4340 steel, aluminium and explosive for the projectile body. Dynamic material properties were measured with a static tensile testing machine and Hopkinson pressure bar tests. Moreover, we used three concrete damage models included in LS-DYNA 3D: SOIL_CONCRETE, CSCM (cap model with smooth interaction) and CONCRETE_DAMAGE (K&C concrete). The strain rate effect for concrete material is important for predicting the fracture deformation and shape of the concrete and the penetration depth of projectiles, so the CONCRETE_DAMAGE model with strain rate effect was also applied to the penetration analysis. The analysis results with the CSCM model show good agreement with penetration experimental data. The projectile trace and fracture shapes of the concrete target were compared with experimental data.

  13. The Influence of Study-Level Inference Models and Study Set Size on Coordinate-Based fMRI Meta-Analyses

    PubMed Central

    Bossier, Han; Seurinck, Ruth; Kühn, Simone; Banaschewski, Tobias; Barker, Gareth J.; Bokde, Arun L. W.; Martinot, Jean-Luc; Lemaitre, Herve; Paus, Tomáš; Millenet, Sabina; Moerkerke, Beatrijs

    2018-01-01

    Given the increasing number of neuroimaging studies, there is a growing need to summarize published results. Coordinate-based meta-analyses use the locations of statistically significant local maxima, possibly with the associated effect sizes, to aggregate studies. In this paper, we investigate the influence of key characteristics of a coordinate-based meta-analysis on (1) the balance between false and true positives and (2) the activation reliability of the outcome of a coordinate-based meta-analysis. More specifically, we consider the influence of the chosen group-level model at the study level [fixed effects, ordinary least squares (OLS), or mixed effects models], the type of coordinate-based meta-analysis [Activation Likelihood Estimation (ALE), which uses only peak locations, versus fixed effects and random effects meta-analysis, which take into account both peak location and height], and the number of studies included in the analysis (from 10 to 35). To do this, we apply a resampling scheme to a large dataset (N = 1,400) to create a test condition and compare this with an independent evaluation condition. The test condition corresponds to subsampling participants into studies and combining these using meta-analyses. The evaluation condition corresponds to a high-powered group analysis. We observe the best performance when using mixed effects models in individual studies combined with a random effects meta-analysis. Moreover, the performance increases with the number of studies included in the meta-analysis. When peak height is not taken into consideration, we show that the popular ALE procedure is a good alternative in terms of the balance between type I and II errors. However, it requires more studies compared to other procedures in terms of activation reliability. Finally, we discuss the differences, interpretations, and limitations of our results. PMID:29403344
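
    The random effects meta-analysis of peak effect sizes compared above can be sketched with the standard DerSimonian-Laird estimator; the study effects and within-study variances below are invented for illustration, not taken from the paper's resampled data:

```python
import numpy as np

def dersimonian_laird(y, v):
    """Random-effects meta-analysis of effect estimates y with
    within-study variances v, using the DerSimonian-Laird tau^2."""
    w = 1.0 / v
    y_fixed = np.sum(w * y) / np.sum(w)           # fixed-effect pooled mean
    Q = np.sum(w * (y - y_fixed) ** 2)            # Cochran's heterogeneity Q
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (Q - (len(y) - 1)) / c)       # between-study variance
    w_star = 1.0 / (v + tau2)                     # random-effects weights
    mu = np.sum(w_star * y) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return mu, se, tau2

# Hypothetical peak effect sizes from five studies.
y = np.array([0.30, 0.45, 0.12, 0.60, 0.25])
v = np.array([0.02, 0.03, 0.02, 0.05, 0.01])
mu, se, tau2 = dersimonian_laird(y, v)
```

    Setting tau2 to zero recovers the fixed effects pooled estimate, which is the other peak-height-based procedure the paper evaluates.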

  14. A hierarchical model for regional analysis of population change using Christmas Bird Count data, with application to the American Black Duck

    USGS Publications Warehouse

    Link, W.A.; Sauer, J.R.; Niven, D.K.

    2006-01-01

    Analysis of Christmas Bird Count (CBC) data is complicated by the need to account for variation in effort on counts and to provide summaries over large geographic regions. We describe a hierarchical model for analysis of population change using CBC data that addresses these needs. The effect of effort is modeled parametrically, with parameter values varying among strata as identically distributed random effects. Year and site effects are modeled hierarchically, accommodating large regional variation in number of samples and precision of estimates. The resulting model is complex, but a Bayesian analysis can be conducted using Markov chain Monte Carlo techniques. We analyze CBC data for American Black Ducks (Anas rubripes), a species of considerable management interest that has historically been monitored using winter surveys. Over the interval 1966-2003, Black Duck populations showed distinct regional patterns of population change. The patterns shown by CBC data are similar to those shown by the Midwinter Waterfowl Inventory for the United States.

  15. Parameter Estimation of Actuators for Benchmark Active Control Technology (BACT) Wind Tunnel Model with Analysis of Wear and Aerodynamic Loading Effects

    NASA Technical Reports Server (NTRS)

    Waszak, Martin R.; Fung, Jimmy

    1998-01-01

    This report describes the development of transfer function models for the trailing-edge and upper and lower spoiler actuators of the Benchmark Active Control Technology (BACT) wind tunnel model for application to control system analysis and design. A simple nonlinear least-squares parameter estimation approach is applied to determine transfer function parameters from frequency response data. Unconstrained quasi-Newton minimization of weighted frequency response error was employed to estimate the transfer function parameters. An analysis of the behavior of the actuators over time, using the transfer function models to assess the effects of wear and aerodynamic loading, is also presented. The frequency responses indicate consistent actuator behavior throughout the wind tunnel test and only slight degradation in effectiveness due to aerodynamic hinge loading. The resulting actuator models have been used in design, analysis, and simulation of controllers for the BACT to successfully suppress flutter over a wide range of conditions.
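
    Estimating transfer function parameters from frequency response data by nonlinear least squares can be sketched as below. The second-order actuator form, the frequencies, and the parameter values are invented for illustration (not the BACT actuator data), the residuals are unweighted, and scipy's trust-region least squares stands in for the quasi-Newton minimisation used in the report:

```python
import numpy as np
from scipy.optimize import least_squares

def tf2(params, w):
    """Second-order transfer function K*wn^2 / (s^2 + 2*z*wn*s + wn^2), s = j*w."""
    K, wn, z = params
    s = 1j * w
    return K * wn**2 / (s**2 + 2 * z * wn * s + wn**2)

def residuals(params, w, H_meas):
    r = tf2(params, w) - H_meas
    return np.concatenate([r.real, r.imag])   # stack real and imaginary parts

# Synthetic 'measured' frequency response for K=1.0, wn=25 rad/s, z=0.6.
w = np.linspace(1.0, 100.0, 60)
H_meas = tf2([1.0, 25.0, 0.6], w)

fit = least_squares(residuals, x0=[0.8, 20.0, 0.5], args=(w, H_meas))
K_hat, wn_hat, z_hat = fit.x
```

    With measured (noisy) responses, a frequency-dependent weight on the residuals plays the role of the weighted error criterion mentioned in the report.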

  16. A kinematic model to assess spinal motion during walking.

    PubMed

    Konz, Regina J; Fatone, Stefania; Stine, Rebecca L; Ganju, Aruna; Gard, Steven A; Ondra, Stephen L

    2006-11-15

    A 3-dimensional multi-segment kinematic spine model was developed for noninvasive analysis of spinal motion during walking. Preliminary data from able-bodied ambulators were collected and analyzed using the model. Neither the spine's role during walking nor the effect of surgical spinal stabilization on gait is fully understood. Typically, gait analysis models disregard the spine entirely or regard it as a single rigid structure. Data on regional spinal movements, in conjunction with lower limb data, associated with walking are scarce. KinTrak software (Motion Analysis Corp., Santa Rosa, CA) was used to create a biomechanical model for analysis of 3-dimensional regional spinal movements. Measuring known angles from a mechanical model and comparing them to the calculated angles validated the kinematic model. Spine motion data were collected from 10 able-bodied adults walking at 5 self-selected speeds. These results were compared to data reported in the literature. The uniaxial angles measured on the mechanical model were within 5 degrees of the calculated kinematic model angles, and the coupled angles were within 2 degrees. Regional spine kinematics from able-bodied subjects calculated with this model compared well to data reported by other authors. A multi-segment kinematic spine model has been developed and validated for analysis of spinal motion during walking. By understanding the spine's role during ambulation and the cause-and-effect relationship between spine motion and lower limb motion, preoperative planning may be augmented to restore normal alignment and balance with minimal negative effects on walking.

  17. MIXOR: a computer program for mixed-effects ordinal regression analysis.

    PubMed

    Hedeker, D; Gibbons, R D

    1996-03-01

    MIXOR provides maximum marginal likelihood estimates for mixed-effects ordinal probit, logistic, and complementary log-log regression models. These models can be used for analysis of dichotomous and ordinal outcomes from either a clustered or longitudinal design. For clustered data, the mixed-effects model assumes that data within clusters are dependent. The degree of dependency is jointly estimated with the usual model parameters, thus adjusting for dependence resulting from clustering of the data. Similarly, for longitudinal data, the mixed-effects approach can allow for individual-varying intercepts and slopes across time, and can estimate the degree to which these time-related effects vary in the population of individuals. MIXOR uses marginal maximum likelihood estimation, utilizing a Fisher-scoring solution. For the scoring solution, the Cholesky factor of the random-effects variance-covariance matrix is estimated, along with the effects of model covariates. Examples illustrating usage and features of MIXOR are provided.
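
    The marginal maximum likelihood idea MIXOR implements can be illustrated in its simplest binary case: a random-intercept logistic model with the random effect integrated out by Gauss-Hermite quadrature. The data are simulated, and a generic BFGS optimiser is a simplification of MIXOR's Fisher-scoring solution:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Simulated clustered binary data: 100 clusters of 10 observations,
# true model logit P(y=1) = b0 + b1*x + u_i with u_i ~ N(0, sigma^2).
n_clust, n_obs = 100, 10
b0, b1, sigma = -0.5, 1.0, 1.0
x = rng.normal(size=(n_clust, n_obs))
u = rng.normal(scale=sigma, size=(n_clust, 1))
y = (rng.random((n_clust, n_obs)) < 1.0 / (1.0 + np.exp(-(b0 + b1 * x + u)))).astype(float)

nodes, weights = np.polynomial.hermite.hermgauss(20)   # quadrature rule

def neg_marginal_loglik(theta):
    """Negative marginal log-likelihood, with the random intercept
    integrated out numerically over the quadrature nodes."""
    b0_, b1_, log_s = theta
    u_k = np.sqrt(2.0) * np.exp(log_s) * nodes          # quadrature abscissae
    eta = b0_ + b1_ * x[..., None] + u_k                # (clusters, obs, K)
    ll_obs = y[..., None] * eta - np.logaddexp(0.0, eta)  # Bernoulli log-lik
    ll_clust = ll_obs.sum(axis=1)                       # sum over observations
    m = ll_clust.max(axis=1, keepdims=True)             # stabilised log-sum-exp
    lik = (weights / np.sqrt(np.pi) * np.exp(ll_clust - m)).sum(axis=1)
    return -np.sum(m.squeeze() + np.log(lik))

fit = minimize(neg_marginal_loglik, x0=np.zeros(3), method="BFGS")
b0_hat, b1_hat, sigma_hat = fit.x[0], fit.x[1], np.exp(fit.x[2])
```

    The same construction extends to ordinal outcomes by replacing the Bernoulli likelihood with cumulative-probit or cumulative-logit category probabilities.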

  18. Methods for selecting fixed-effect models for heterogeneous codon evolution, with comments on their application to gene and genome data.

    PubMed

    Bao, Le; Gu, Hong; Dunn, Katherine A; Bielawski, Joseph P

    2007-02-08

    Models of codon evolution have proven useful for investigating the strength and direction of natural selection. In some cases, a priori biological knowledge has been used successfully to model heterogeneous evolutionary dynamics among codon sites. These are called fixed-effect models, and they require that all codon sites are assigned to one of several partitions which are permitted to have independent parameters for selection pressure, evolutionary rate, transition to transversion ratio or codon frequencies. For single gene analysis, partitions might be defined according to protein tertiary structure, and for multiple gene analysis partitions might be defined according to a gene's functional category. Given a set of related fixed-effect models, the task of selecting the model that best fits the data is not trivial. In this study, we implement a set of fixed-effect codon models which allow for different levels of heterogeneity among partitions in the substitution process. We describe strategies for selecting among these models by a backward elimination procedure, Akaike information criterion (AIC) or a corrected Akaike information criterion (AICc). We evaluate the performance of these model selection methods via a simulation study, and make several recommendations for real data analysis. Our simulation study indicates that the backward elimination procedure can provide a reliable method for model selection in this setting. We also demonstrate the utility of these models by application to a single-gene dataset partitioned according to tertiary structure (abalone sperm lysin), and a multi-gene dataset partitioned according to the functional category of the gene (flagellar-related proteins of Listeria). Fixed-effect models have advantages and disadvantages. Fixed-effect models are desirable when data partitions are known to exhibit significant heterogeneity or when a statistical test of such heterogeneity is desired. 
They have the disadvantage of requiring a priori knowledge for partitioning sites. We recommend: (i) selecting models by backward elimination rather than AIC or AICc, (ii) using a stringent cut-off, e.g., p = 0.0001, and (iii) conducting a sensitivity analysis of the results. With thoughtful application, fixed-effect codon models should provide a useful tool for large-scale multi-gene analyses.
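
    The selection criteria compared in the study are easy to state concretely. The log-likelihoods, parameter counts, and sample size below are hypothetical, not fits to the lysin or Listeria data:

```python
import numpy as np
from scipy import stats

def aic(loglik, k):
    """Akaike information criterion."""
    return 2 * k - 2 * loglik

def aicc(loglik, k, n):
    """Small-sample corrected AIC."""
    return aic(loglik, k) + 2 * k * (k + 1) / (n - k - 1)

def lrt_pvalue(loglik_full, loglik_reduced, df):
    """Likelihood-ratio test used at each backward-elimination step."""
    lr = 2 * (loglik_full - loglik_reduced)
    return 1 - stats.chi2.cdf(lr, df)

# Hypothetical fits: a full model and a reduced model with one
# partition-specific parameter merged away.
n = 500
ll_full, k_full = -1234.5, 12
ll_red,  k_red  = -1236.0, 11

p = lrt_pvalue(ll_full, ll_red, df=k_full - k_red)
# With the stringent cut-off recommended above (p = 0.0001),
# p > cut-off means the extra parameter is not justified:
keep_reduced = p > 1e-4
```

    Backward elimination simply repeats this test, dropping the least-supported parameter at each step until every remaining parameter survives the cut-off.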

  19. Statistical model to perform error analysis of curve fits of wind tunnel test data using the techniques of analysis of variance and regression analysis

    NASA Technical Reports Server (NTRS)

    Alston, D. W.

    1981-01-01

    The objective of this research was to design a statistical model that could perform an error analysis of curve fits of wind tunnel test data using analysis of variance and regression analysis techniques. Four related subproblems were defined, and by solving each of these a solution to the general research problem was obtained. The capabilities of the resulting statistical model are considered. The least squares fit is used to determine the nature of the force, moment, and pressure data. The order of the curve fit is increased in order to remove the quadratic effect in the residuals. The analysis of variance is used to determine the magnitude and effect of the error factor associated with the experimental data.

  20. Comparison of composite rotor blade models: A coupled-beam analysis and an MSC/NASTRAN finite-element model

    NASA Technical Reports Server (NTRS)

    Hodges, Robert V.; Nixon, Mark W.; Rehfield, Lawrence W.

    1987-01-01

    A methodology was developed for the structural analysis of composite rotor blades. This coupled-beam analysis is relatively simple to use compared with alternative analysis techniques. The beam analysis was developed for thin-wall single-cell rotor structures and includes the effects of elastic coupling. This paper demonstrates the effectiveness of the new composite-beam analysis method through comparison of its results with those of an established baseline analysis technique. The baseline analysis is an MSC/NASTRAN finite-element model built up from anisotropic shell elements. Deformations are compared for three linear static load cases of centrifugal force at design rotor speed, applied torque, and lift for an ideal rotor in hover. A D-spar designed to twist under axial loading is the subject of the analysis. Results indicate the coupled-beam analysis is well within engineering accuracy.

  1. A Note on Cluster Effects in Latent Class Analysis

    ERIC Educational Resources Information Center

    Kaplan, David; Keller, Bryan

    2011-01-01

    This article examines the effects of clustering in latent class analysis. A comprehensive simulation study is conducted, which begins by specifying a true multilevel latent class model with varying within- and between-cluster sample sizes, varying latent class proportions, and varying intraclass correlations. These models are then estimated under…

  2. Analysis of modeling cumulative noise from simultaneous flights volume 2 : supplemental analysis

    DOT National Transportation Integrated Search

    2012-12-31

    This is the second of two volumes of the report on modeling cumulative noise from simultaneous flights. This volume examines the effect of several modeling input cases on Percent Time Audible results calculated by the Integrated Noise Model. The case...

  3. Integrated Modeling Activities for the James Webb Space Telescope: Structural-Thermal-Optical Analysis

    NASA Technical Reports Server (NTRS)

    Johnston, John D.; Howard, Joseph M.; Mosier, Gary E.; Parrish, Keith A.; McGinnis, Mark A.; Bluth, Marcel; Kim, Kevin; Ha, Kong Q.

    2004-01-01

    The James Webb Space Telescope (JWST) is a large, infrared-optimized space telescope scheduled for launch in 2011. This is a continuation of a series of papers on modeling activities for JWST. The structural-thermal-optical, often referred to as STOP, analysis process is used to predict the effect of thermal distortion on optical performance. The benchmark STOP analysis for JWST assesses the effect of an observatory slew on wavefront error. Temperatures predicted using geometric and thermal math models are mapped to a structural finite element model in order to predict thermally induced deformations. Motions and deformations at optical surfaces are then input to optical models, and optical performance is predicted using either an optical ray trace or a linear optical analysis tool. In addition to baseline performance predictions, a process for performing sensitivity studies to assess modeling uncertainties is described.

  4. Pilot-model analysis and simulation study of effect of control task desired control response

    NASA Technical Reports Server (NTRS)

    Adams, J. J.; Gera, J.; Jaudon, J. B.

    1978-01-01

    A pilot model analysis was performed that relates pilot control compensation, pilot aircraft system response, and aircraft response characteristics for longitudinal control. The results show that a higher aircraft short period frequency is required to achieve superior pilot aircraft system response in an altitude control task than is required in an attitude control task. These results were confirmed by a simulation study of target tracking. It was concluded that the pilot model analysis provides a theoretical basis for determining the effect of control task on pilot opinions.

  5. FRAP Analysis: Accounting for Bleaching during Image Capture

    PubMed Central

    Wu, Jun; Shekhar, Nandini; Lele, Pushkar P.; Lele, Tanmay P.

    2012-01-01

    The analysis of Fluorescence Recovery After Photobleaching (FRAP) experiments involves mathematical modeling of the fluorescence recovery process. An important feature of FRAP experiments that tends to be ignored in the modeling is that there can be a significant loss of fluorescence due to bleaching during image capture. In this paper, we explicitly include the effects of bleaching during image capture in the model for the recovery process, instead of correcting for the effects of bleaching using reference measurements. Using experimental examples, we demonstrate the usefulness of such an approach in FRAP analysis. PMID:22912750
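
    One simple way to build bleaching during image capture directly into the recovery model, in the spirit of the approach above, is to multiply a single-exponential recovery by a per-image loss factor. The functional form, acquisition interval, and parameter values below are illustrative assumptions, not the paper's exact model:

```python
import numpy as np
from scipy.optimize import curve_fit

def frap_with_bleach(n, F0, Finf, k, beta, dt=0.5):
    """Single-exponential FRAP recovery sampled at image n (time n*dt),
    multiplied by exp(-beta*n) to model bleaching during capture."""
    t = n * dt
    return (Finf - (Finf - F0) * np.exp(-k * t)) * np.exp(-beta * n)

# Synthetic image series with true F0=0.2, Finf=1.0, k=0.8 1/s, beta=0.01.
n = np.arange(60)
data = frap_with_bleach(n, 0.2, 1.0, 0.8, 0.01)

popt, _ = curve_fit(frap_with_bleach, n, data, p0=[0.1, 0.8, 0.5, 0.0])
F0_hat, Finf_hat, k_hat, beta_hat = popt
```

    Because beta is estimated jointly with the recovery parameters, no separate reference measurement is needed to correct for acquisition bleaching, which is the point the authors make.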

  6. Meta-analysis using Dirichlet process.

    PubMed

    Muthukumarana, Saman; Tiwari, Ram C

    2016-02-01

    This article develops a Bayesian approach for meta-analysis using the Dirichlet process. The key aspect of the Dirichlet process in meta-analysis is the ability to assess evidence of statistical heterogeneity, or variation in the underlying effects across studies, while relaxing the distributional assumptions. We assume that the study effects are generated from a Dirichlet process. Under a Dirichlet process model, the study effect parameters have support on a discrete space, enabling borrowing of information across studies while facilitating clustering among studies. We illustrate the proposed method by applying it to a dataset from the Program for International Student Assessment on 30 countries. Results from the data analysis, simulation studies, and the log pseudo-marginal likelihood model selection procedure indicate that the Dirichlet process model performs better than conventional alternative methods. © The Author(s) 2012.

  7. Tutorial on Biostatistics: Linear Regression Analysis of Continuous Correlated Eye Data

    PubMed Central

    Ying, Gui-shuang; Maguire, Maureen G; Glynn, Robert; Rosner, Bernard

    2017-01-01

    Purpose To describe and demonstrate appropriate linear regression methods for analyzing correlated continuous eye data. Methods We describe several approaches to regression analysis involving both eyes, including mixed effects and marginal models under various covariance structures to account for inter-eye correlation. We demonstrate, with SAS statistical software, applications in a study comparing baseline refractive error between one eye with choroidal neovascularization (CNV) and the unaffected fellow eye, and in a study determining factors associated with visual field data in the elderly. Results When refractive error from both eyes was analyzed with standard linear regression without accounting for inter-eye correlation (adjusting for demographic and ocular covariates), the difference between eyes with CNV and fellow eyes was 0.15 diopters (D; 95% confidence interval, CI −0.03 to 0.32D, P=0.10). Using a mixed effects model or a marginal model, the estimated difference was the same but with a narrower 95% CI (0.01 to 0.28D, P=0.03). Standard regression for visual field data from both eyes provided biased estimates of standard error (generally underestimated) and smaller P-values, while analysis of the worse eye provided larger P-values than mixed effects models and marginal models. Conclusion In research involving both eyes, ignoring inter-eye correlation can lead to invalid inferences. Analysis using only right or left eyes is valid, but decreases power. Worse-eye analysis can provide less power and biased estimates of effect. Mixed effects or marginal models using the eye as the unit of analysis should be used to appropriately account for inter-eye correlation and maximize power and precision. PMID:28102741

  8. MIXREG: a computer program for mixed-effects regression analysis with autocorrelated errors.

    PubMed

    Hedeker, D; Gibbons, R D

    1996-05-01

    MIXREG is a program that provides estimates for a mixed-effects regression model (MRM) for normally-distributed response data including autocorrelated errors. This model can be used for analysis of unbalanced longitudinal data, where individuals may be measured at a different number of timepoints, or even at different timepoints. Autocorrelated errors of a general form or following an AR(1), MA(1), or ARMA(1,1) form are allowable. This model can also be used for analysis of clustered data, where the mixed-effects model assumes data within clusters are dependent. The degree of dependency is estimated jointly with estimates of the usual model parameters, thus adjusting for clustering. MIXREG uses maximum marginal likelihood estimation, utilizing both the EM algorithm and a Fisher-scoring solution. For the scoring solution, the covariance matrix of the random effects is expressed in its Gaussian decomposition, and the diagonal matrix reparameterized using the exponential transformation. Estimation of the individual random effects is accomplished using an empirical Bayes approach. Examples illustrating usage and features of MIXREG are provided.

  9. Application of clustering analysis in the prediction of photovoltaic power generation based on neural network

    NASA Astrophysics Data System (ADS)

    Cheng, K.; Guo, L. M.; Wang, Y. K.; Zafar, M. T.

    2017-11-01

    In order to select effective samples from many years of PV power generation data and improve the accuracy of the PV power generation forecasting model, this paper studies the application of clustering analysis in this field and establishes a forecasting model based on neural networks. Based on three different types of weather (sunny, cloudy and rainy days), this research screens samples of historical data by the clustering analysis method. After screening, it establishes BP neural network prediction models using the screened data as training data. Then, the six types of photovoltaic power generation prediction models before and after the data screening are compared. Results show that the prediction model combining clustering analysis with BP neural networks is an effective method to improve the precision of photovoltaic power generation forecasting.
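
    The screening step can be sketched with a plain k-means clustering of daily weather features into three groups. The features, their values, and the cluster interpretation are assumptions for illustration; the BP network training itself is omitted:

```python
import numpy as np

rng = np.random.default_rng(2)

def kmeans(X, k=3, iters=50):
    """Plain k-means: assign each day's weather features to a cluster."""
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

# Hypothetical daily features: [irradiance (kWh/m^2), cloud cover fraction].
sunny  = rng.normal([6.0, 0.1], 0.3, size=(40, 2))
cloudy = rng.normal([3.5, 0.6], 0.3, size=(40, 2))
rainy  = rng.normal([1.5, 0.9], 0.3, size=(40, 2))
X = np.vstack([sunny, cloudy, rainy])

labels, centers = kmeans(X)
# Training samples for, e.g., a 'sunny-day' network would be the days
# whose cluster centre has the highest irradiance.
sunny_cluster = np.argmax(centers[:, 0])
train_idx = np.flatnonzero(labels == sunny_cluster)
```

    Training one predictor per cluster, rather than one on all days, is what lets each network specialise on weather conditions with similar generation profiles.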

  10. A preliminary cost-effectiveness analysis of hepatitis E vaccination among pregnant women in epidemic regions.

    PubMed

    Zhao, Yueyuan; Zhang, Xuefeng; Zhu, Fengcai; Jin, Hui; Wang, Bei

    2016-08-02

    Objective To estimate the cost-effectiveness of hepatitis E vaccination among pregnant women in epidemic regions. Methods A decision tree model was constructed to evaluate the cost-effectiveness of 3 hepatitis E virus vaccination strategies from a societal perspective. The model parameters were estimated on the basis of published studies and experts' experience. Sensitivity analysis was used to evaluate the uncertainties of the model. Results Vaccination was more economically effective on the basis of the incremental cost-effectiveness ratio (ICER < 3 times China's per capita gross domestic product per quality-adjusted life year); moreover, screening and vaccination yielded higher QALYs and lower costs compared with universal vaccination. No parameters significantly impacted the ICER in one-way sensitivity analysis, and probabilistic sensitivity analysis also showed screening and vaccination to be the dominant strategy. Conclusion Screening and vaccination is the most economical strategy for pregnant women in epidemic regions; however, further studies are necessary to confirm the efficacy and safety of the hepatitis E vaccines.
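
    The decision rule applied above, comparing the incremental cost-effectiveness ratio with three times per capita GDP and checking for dominated strategies, reduces to a few lines; all costs, QALYs, and the GDP figure are hypothetical, not the model inputs of the study:

```python
def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost-effectiveness ratio of the new strategy."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

gdp_per_capita = 12000.0          # hypothetical figure
threshold = 3 * gdp_per_capita    # WHO-style willingness-to-pay bound

# Strategy A: no vaccination; B: screening + vaccination; C: universal vaccination.
cost = {"A": 1000.0, "B": 1400.0, "C": 2100.0}
qaly = {"A": 20.00, "B": 20.05, "C": 20.04}

icer_b = icer(cost["B"], qaly["B"], cost["A"], qaly["A"])
cost_effective = icer_b < threshold
# C is dominated by B here: higher cost and lower QALYs.
dominated = cost["C"] > cost["B"] and qaly["C"] < qaly["B"]
```

    A dominated strategy is ruled out before any ICER comparison, which is why screening plus vaccination wins in the study despite universal vaccination reaching more people.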

  11. Analyzing Multiple Outcomes in Clinical Research Using Multivariate Multilevel Models

    PubMed Central

    Baldwin, Scott A.; Imel, Zac E.; Braithwaite, Scott R.; Atkins, David C.

    2014-01-01

    Objective Multilevel models have become a standard data analysis approach in intervention research. Although the vast majority of intervention studies involve multiple outcome measures, few studies use multivariate analysis methods. The authors discuss multivariate extensions to the multilevel model that can be used by psychotherapy researchers. Method and Results Using simulated longitudinal treatment data, the authors show how multivariate models extend common univariate growth models and how the multivariate model can be used to examine multivariate hypotheses involving fixed effects (e.g., does the size of the treatment effect differ across outcomes?) and random effects (e.g., is change in one outcome related to change in the other?). An online supplemental appendix provides annotated computer code and simulated example data for implementing a multivariate model. Conclusions Multivariate multilevel models are flexible, powerful models that can enhance clinical research. PMID:24491071

  12. A Comprehensive Meta-Analysis of Triple P-Positive Parenting Program Using Hierarchical Linear Modeling: Effectiveness and Moderating Variables

    ERIC Educational Resources Information Center

    Nowak, Christoph; Heinrichs, Nina

    2008-01-01

    A meta-analysis encompassing all studies evaluating the impact of the Triple P-Positive Parenting Program on parent and child outcome measures was conducted in an effort to identify variables that moderate the program's effectiveness. Hierarchical linear models (HLM) with three levels of data were employed to analyze effect sizes. The results (N =…

  13. Linear mixed-effects modeling approach to FMRI group analysis

    PubMed Central

    Chen, Gang; Saad, Ziad S.; Britton, Jennifer C.; Pine, Daniel S.; Cox, Robert W.

    2013-01-01

    Conventional group analysis is usually performed with Student-type t-tests, regression, or standard AN(C)OVA, in which the variance–covariance matrix is presumed to have a simple structure. Some correction approaches are adopted when assumptions about the covariance structure are violated. However, as experiments are designed with different degrees of sophistication, these traditional methods can become cumbersome, or even be unable to handle the situation at hand. For example, most current FMRI software packages have difficulty analyzing the following scenarios at the group level: (1) taking within-subject variability into account when there are effect estimates from multiple runs or sessions; (2) continuous explanatory variables (covariates) modeling in the presence of a within-subject (repeated measures) factor, multiple subject-grouping (between-subjects) factors, or a mixture of both; (3) subject-specific adjustments in covariate modeling; (4) group analysis with estimation of the hemodynamic response (HDR) function by multiple basis functions; (5) various cases of missing data in longitudinal studies; and (6) group studies involving family members or twins. Here we present a linear mixed-effects modeling (LME) methodology that extends the conventional group analysis approach to handle many complicated cases, including the six prototypes delineated above, whose analyses would otherwise be either difficult or infeasible under traditional frameworks such as AN(C)OVA and the general linear model (GLM). In addition, the strength of the LME framework lies in its flexibility to model and estimate the variance–covariance structures for both random effects and residuals. Intraclass correlation (ICC) values can be easily obtained with an LME model with crossed random effects, even in the presence of confounding fixed effects.
    Simulations of one prototypical scenario indicate that LME modeling keeps a balance between control of false positives and sensitivity for activation detection. The importance of hypothesis formulation is also illustrated in the simulations. Comparisons with alternative group analysis approaches and the limitations of LME are discussed in detail. PMID:23376789

  14. Linear mixed-effects modeling approach to FMRI group analysis.

    PubMed

    Chen, Gang; Saad, Ziad S; Britton, Jennifer C; Pine, Daniel S; Cox, Robert W

    2013-06-01

    Conventional group analysis is usually performed with Student-type t-test, regression, or standard AN(C)OVA in which the variance-covariance matrix is presumed to have a simple structure. Some correction approaches are adopted when assumptions about the covariance structure is violated. However, as experiments are designed with different degrees of sophistication, these traditional methods can become cumbersome, or even be unable to handle the situation at hand. For example, most current FMRI software packages have difficulty analyzing the following scenarios at group level: (1) taking within-subject variability into account when there are effect estimates from multiple runs or sessions; (2) continuous explanatory variables (covariates) modeling in the presence of a within-subject (repeated measures) factor, multiple subject-grouping (between-subjects) factors, or the mixture of both; (3) subject-specific adjustments in covariate modeling; (4) group analysis with estimation of hemodynamic response (HDR) function by multiple basis functions; (5) various cases of missing data in longitudinal studies; and (6) group studies involving family members or twins. Here we present a linear mixed-effects modeling (LME) methodology that extends the conventional group analysis approach to analyze many complicated cases, including the six prototypes delineated above, whose analyses would be otherwise either difficult or unfeasible under traditional frameworks such as AN(C)OVA and general linear model (GLM). In addition, the strength of the LME framework lies in its flexibility to model and estimate the variance-covariance structures for both random effects and residuals. The intraclass correlation (ICC) values can be easily obtained with an LME model with crossed random effects, even at the presence of confounding fixed effects. 
    Simulations of one prototypical scenario indicate that LME modeling strikes a balance between control of false positives and sensitivity of activation detection. The importance of hypothesis formulation is also illustrated in the simulations. Comparisons with alternative group analysis approaches and the limitations of LME are discussed in detail. Published by Elsevier Inc.
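
    The ICC mentioned above can be illustrated without any FMRI machinery. The sketch below is a numpy-only toy, assuming balanced data (equal runs per subject) and using the classical one-way ANOVA moment estimator of ICC rather than an actual LME fit; the subject count, run count, and variance components are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def icc_oneway(data):
    """ICC(1) via the one-way ANOVA moment estimator.
    data: 2-D array, rows = subjects (groups), columns = repeated measures."""
    k, n = data.shape
    grand = data.mean()
    group_means = data.mean(axis=1)
    msb = n * ((group_means - grand) ** 2).sum() / (k - 1)            # between-subject MS
    msw = ((data - group_means[:, None]) ** 2).sum() / (k * (n - 1))  # within-subject MS
    return (msb - msw) / (msb + (n - 1) * msw)

# Simulate 40 subjects x 8 runs: subject random intercepts (sd=1) plus noise (sd=1),
# so the true ICC is 1 / (1 + 1) = 0.5.
subj = rng.normal(0, 1.0, size=(40, 1))      # random effect per subject
y = subj + rng.normal(0, 1.0, size=(40, 8))
print(round(icc_oneway(y), 2))
```

    With equal between-subject and within-subject variances the estimate should land near 0.5; an LME fit would estimate the same two variance components by likelihood instead of moments.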

  15. Dive Distribution and Group Size Parameters for Marine Species Occurring in the U.S. Navy’s Atlantic and Hawaii-Southern California Training and Testing Study Areas

    DTIC Science & Technology

    2017-06-09

    Changes in water temperature affect the behavioral ecology of hawksbill turtles, with nocturnal dive duration increasing as water temperature decreases. An important element of the Navy's comprehensive environmental planning is the acoustic effects analysis executed with the Navy Acoustic Effects Model (NAEMO) software.

  16. An Integrated Analysis of the Physiological Effects of Space Flight: Executive Summary

    NASA Technical Reports Server (NTRS)

    Leonard, J. I.

    1985-01-01

    A large array of models was applied in a unified manner to solve problems in space flight physiology. Mathematical simulation was used as an alternative way of looking at physiological systems and of maximizing the yield from previous space flight experiments. A medical data analysis system was created which consists of an automated database, a computerized biostatistical and data analysis system, and a set of simulation models of physiological systems. Five basic models were employed: (1) a pulsatile cardiovascular model; (2) a respiratory model; (3) a thermoregulatory model; (4) a circulatory, fluid, and electrolyte balance model; and (5) an erythropoiesis regulatory model. Algorithms were provided to perform routine statistical tests, multivariate analysis, nonlinear regression analysis, and autocorrelation analysis. Special-purpose programs were prepared for rank correlation, factor analysis, and the integration of the metabolic balance data.

  17. Mean-field velocity difference model considering the average effect of multi-vehicle interaction

    NASA Astrophysics Data System (ADS)

    Guo, Yan; Xue, Yu; Shi, Yin; Wei, Fang-ping; Lü, Liang-zhong; He, Hong-di

    2018-06-01

    In this paper, a mean-field velocity difference model (MFVD) is proposed to describe the average effect of multi-vehicle interactions on the whole road. The stability condition of the traffic system is obtained by linear stability analysis, the stability of the MFVD model is compared with that of the full velocity difference (FVD) model, and the completeness of the MFVD model is discussed. The mKdV equation is derived from the MFVD model through nonlinear analysis to describe traffic jams in the form of kink-antikink density waves. Numerical simulation is then performed, and the results illustrate that the average effect of multi-vehicle interactions plays an important role in suppressing traffic jams: increasing the strength of the mean-field velocity difference term in the MFVD model rapidly dissipates jams and enhances the stability of the traffic system.
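
    The MFVD equations are not given in this abstract, so the sketch below instead simulates the baseline it extends, the full velocity difference (FVD) car-following model, on a ring road; all parameter values are illustrative. A small perturbation decays when the linear stability condition holds, which is the jam-suppression mechanism discussed above:

```python
import numpy as np

# Illustrative parameters (not from the paper)
N, hstar = 20, 2.5
L = N * hstar              # ring-road length
a, lam = 3.0, 0.5          # sensitivity and velocity-difference gain
dt, steps = 0.05, 2000

def V(h):                  # optimal velocity function (Bando form)
    return np.tanh(h - 2.0) + np.tanh(2.0)

x = np.arange(N) * hstar   # uniform flow ...
x[0] += 0.1                # ... with one vehicle perturbed
v = np.full(N, V(hstar))

def headways(x):
    return (np.roll(x, -1) - x) % L

h0_std = headways(x).std()
for _ in range(steps):
    h = headways(x)
    dv = np.roll(v, -1) - v                       # velocity difference to leader
    v = v + dt * (a * (V(h) - v) + lam * dv)      # FVD acceleration law
    x = (x + dt * v) % L

print(h0_std, headways(x).std())                  # perturbation decays
```

    For this optimal-velocity function the uniform flow is linearly stable roughly when V'(h*) < a/2 + λ, which these parameters satisfy, so the headway fluctuation shrinks instead of growing into a kink-antikink jam.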

  18. Cost-Effectiveness of Procedures for Treatment of Ostium Secundum Atrial Septal Defects Occlusion Comparing Conventional Surgery and Septal Percutaneous Implant

    PubMed Central

    da Costa, Márcia Gisele Santos; Santos, Marisa da Silva; Sarti, Flávia Mori; Senna, Kátia Marie Simões e.; Tura, Bernardo Rangel; Goulart, Marcelo Correia

    2014-01-01

    Objectives The study performs a cost-effectiveness analysis of procedures for atrial septal defect occlusion, comparing conventional surgery to a septal percutaneous implant. Methods An analytical decision model was structured with symmetric branches to estimate the cost-effectiveness ratio between the procedures. The decision tree model was based on evidence gathered through a meta-analysis of the literature and validated by a panel of specialists. The lower number of surgical procedures performed for atrial septal defect occlusion in each branch was taken as the effectiveness outcome. Direct medical costs and probabilities for each event were inserted into the model using data from the Brazilian public sector database system and information extracted from the literature review, applying a micro-costing technique. Sensitivity analysis included price variations of the percutaneous implant. Results The decision model demonstrated that the percutaneous implant was more cost-effective, at a cost of US$8,936.34 with a reduction in the probability of surgery occurrence in 93% of the cases. The probability of atrial septal communication occlusion and the cost of the implant are the determinant factors of the cost-effectiveness ratio. Conclusions The proposed decision model seeks to fill a void in the academic literature and includes the outcomes with the greatest impact on the overall costs of the procedure. Atrial septal defect occlusion using a percutaneous implant reduces physical and psychological distress to patients relative to conventional surgery, which represents intangible costs in the context of economic evaluation. PMID:25302806
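
    The rollback arithmetic of such a decision tree is simple to sketch. The numbers below are hypothetical placeholders, not the study's Brazilian cost data; the effectiveness outcome mirrors the paper's choice, counting a surgery avoided as 1:

```python
# Roll back a decision tree and compute an ICER (all figures hypothetical).
def expected(branches):
    """Expected (cost, effect) of a branch: list of (probability, cost, effect)."""
    ec = sum(p * c for p, c, _ in branches)
    ee = sum(p * e for p, _, e in branches)
    return ec, ee

implant = expected([(0.93, 15000.0, 1.0),            # occlusion achieved, surgery avoided
                    (0.07, 15000.0 + 12000.0, 0.0)]) # implant fails -> surgery follows
surgery = expected([(1.0, 12000.0, 0.0)])            # everyone gets surgery

d_cost = implant[0] - surgery[0]
d_eff = implant[1] - surgery[1]
icer = d_cost / d_eff            # incremental cost per surgery avoided
print(round(icer, 2))
```

    Probabilistic and one-way sensitivity analyses, like the implant-price variation in the abstract, amount to re-running this rollback over a range of inputs.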

  19. Cost-effectiveness of procedures for treatment of ostium secundum atrial septal defects occlusion comparing conventional surgery and septal percutaneous implant.

    PubMed

    da Costa, Márcia Gisele Santos; Santos, Marisa da Silva; Sarti, Flávia Mori; Simões e Senna, Kátia Marie; Tura, Bernardo Rangel; Goulart, Marcelo Correia

    2014-01-01

    The study performs a cost-effectiveness analysis of procedures for atrial septal defect occlusion, comparing conventional surgery to a septal percutaneous implant. An analytical decision model was structured with symmetric branches to estimate the cost-effectiveness ratio between the procedures. The decision tree model was based on evidence gathered through a meta-analysis of the literature and validated by a panel of specialists. The lower number of surgical procedures performed for atrial septal defect occlusion in each branch was taken as the effectiveness outcome. Direct medical costs and probabilities for each event were inserted into the model using data from the Brazilian public sector database system and information extracted from the literature review, applying a micro-costing technique. Sensitivity analysis included price variations of the percutaneous implant. The decision model demonstrated that the percutaneous implant was more cost-effective, at a cost of US$8,936.34 with a reduction in the probability of surgery occurrence in 93% of the cases. The probability of atrial septal communication occlusion and the cost of the implant are the determinant factors of the cost-effectiveness ratio. The proposed decision model seeks to fill a void in the academic literature and includes the outcomes with the greatest impact on the overall costs of the procedure. Atrial septal defect occlusion using a percutaneous implant reduces physical and psychological distress to patients relative to conventional surgery, which represents intangible costs in the context of economic evaluation.

  20. Longitudinal analysis of the strengths and difficulties questionnaire scores of the Millennium Cohort Study children in England using M-quantile random-effects regression.

    PubMed

    Tzavidis, Nikos; Salvati, Nicola; Schmid, Timo; Flouri, Eirini; Midouhas, Emily

    2016-02-01

    Multilevel modelling is a popular approach for longitudinal data analysis. Statistical models conventionally target a parameter at the centre of a distribution. However, when the distribution of the data is asymmetric, modelling other location parameters, e.g. percentiles, may be more informative. We present a new approach, M-quantile random-effects regression, for modelling multilevel data. The proposed method is used for modelling location parameters of the distribution of the strengths and difficulties questionnaire scores of children in England who participate in the Millennium Cohort Study. Quantile mixed models are also considered. The analyses offer insights to child psychologists about the differential effects of risk factors on children's outcomes.

  1. Approximate simulation model for analysis and optimization in engineering system design

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw

    1989-01-01

    Computational support of the engineering design process routinely requires mathematical models of behavior to inform designers of the system response to external stimuli. However, designers also need to know the effect of the changes in design variable values on the system behavior. For large engineering systems, the conventional way of evaluating these effects by repetitive simulation of behavior for perturbed variables is impractical because of excessive cost and inadequate accuracy. An alternative is described based on recently developed system sensitivity analysis that is combined with extrapolation to form a model of design. This design model is complementary to the model of behavior and capable of direct simulation of the effects of design variable changes.
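
    The idea of combining sensitivity analysis with extrapolation can be sketched as a first-order design model. The toy `behavior` function below stands in for an expensive behavior simulation; the finite-difference sensitivities and perturbation sizes are illustrative, not the paper's method details:

```python
import numpy as np

def behavior(dv):
    """Stand-in for an expensive behavior model: response as a function of design vars."""
    return np.array([dv[0] ** 2 + dv[1], np.sin(dv[0]) * dv[1]])

def sensitivities(f, dv, h=1e-6):
    """Finite-difference Jacobian d(response)/d(design variable)."""
    base = f(dv)
    J = np.empty((base.size, dv.size))
    for j in range(dv.size):
        step = dv.copy()
        step[j] += h
        J[:, j] = (f(step) - base) / h
    return base, J

dv0 = np.array([1.0, 2.0])              # baseline design
r0, J = sensitivities(behavior, dv0)    # one sensitivity analysis ...
dv_new = dv0 + np.array([0.05, -0.1])   # ... then cheap extrapolation replaces re-simulation
approx = r0 + J @ (dv_new - dv0)
print(approx, behavior(dv_new))         # close for small design changes
```

    The approximation is accurate only near the baseline design, which is exactly the trade the article describes: a complementary "model of design" valid for modest design-variable changes.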

  2. Multiscale Modeling and Uncertainty Quantification for Nuclear Fuel Performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Estep, Donald; El-Azab, Anter; Pernice, Michael

    2017-03-23

    In this project, we will address the challenges associated with constructing high-fidelity multiscale models of nuclear fuel performance. We (*) propose a novel approach for coupling mesoscale and macroscale models, (*) devise efficient numerical methods for simulating the coupled system, and (*) devise and analyze effective numerical approaches for error and uncertainty quantification for the coupled multiscale system. As an integral part of the project, we will carry out analysis of the effects of upscaling and downscaling, investigate efficient methods for stochastic sensitivity analysis of the individual macroscale and mesoscale models, and carry out a posteriori error analysis for computed results. We will pursue development and implementation of solutions in software used at Idaho National Laboratory on models of interest to the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program.

  3. Effects of Correlated Errors on the Analysis of Space Geodetic Data

    NASA Technical Reports Server (NTRS)

    Romero-Wolf, Andres; Jacobs, C. S.

    2011-01-01

    As thermal errors are reduced, instrumental and troposphere-correlated errors will become increasingly important. Work in progress shows that troposphere covariance error models improve data analysis results. We expect to see stronger effects with higher data rates. Temperature modeling of delay errors may further reduce temporal correlations in the data.

  4. Statistical analysis of the effect of temperature and inlet humidities on the parameters of a semiempirical model of the internal resistance of a polymer electrolyte membrane fuel cell

    NASA Astrophysics Data System (ADS)

    Giner-Sanz, J. J.; Ortega, E. M.; Pérez-Herranz, V.

    2018-03-01

    The internal resistance of a PEM fuel cell depends on the operation conditions and on the current delivered by the cell. This work's goal is to obtain a semiempirical model able to reproduce the effect of the operation current on the internal resistance of an individual cell of a commercial PEM fuel cell stack; and to perform a statistical analysis in order to study the effect of the operation temperature and the inlet humidities on the parameters of the model. First, the internal resistance of the individual fuel cell operating in different operation conditions was experimentally measured for different DC currents, using the high frequency intercept of the impedance spectra. Then, a semiempirical model based on Springer and co-workers' model was proposed. This model is able to successfully reproduce the experimental trends. Subsequently, the curves of resistance versus DC current obtained for different operation conditions were fitted to the semiempirical model, and an analysis of variance (ANOVA) was performed in order to determine which factors have a statistically significant effect on each model parameter. Finally, a response surface method was applied in order to obtain a regression model.

  5. Impacts of Wake Effect and Time Delay on the Dynamic Analysis of Wind Farms Models

    ERIC Educational Resources Information Center

    El-Fouly, Tarek H. M.; El-Saadany, Ehab F.; Salama, Magdy M. A.

    2008-01-01

    This article investigates the impact of properly modeling wake effects and wind speed delays between different rows of wind turbines on the accuracy of dynamic wind farm models. Three modeling scenarios were compared to highlight the impacts of wake effects and wind speed time-delay models. In the first scenario,…

  6. Multivariate analysis of remote LIBS spectra using partial least squares, principal component analysis, and related techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clegg, Samuel M; Barefield, James E; Wiens, Roger C

    2008-01-01

    Quantitative analysis with LIBS traditionally employs calibration curves that are complicated by chemical matrix effects. These chemical matrix effects influence the LIBS plasma and the ratio of elemental composition to elemental emission line intensity. Consequently, LIBS calibration typically requires a priori knowledge of the unknown, so that a series of calibration standards similar to the unknown can be employed. In this paper, three Multivariate Analysis (MVA) techniques are employed to analyze the LIBS spectra of 18 disparate igneous and highly metamorphosed rock samples. Partial Least Squares (PLS) analysis is used to generate a calibration model from which unknown samples can be analyzed. Principal Components Analysis (PCA) and Soft Independent Modeling of Class Analogy (SIMCA) are employed to generate a model and predict the rock type of the samples. These MVA techniques appear to exploit the matrix effects associated with the chemistries of these 18 samples.
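
    Of the three MVA techniques, PCA is the easiest to sketch without external libraries. Below, synthetic "spectra" for two toy rock classes (numpy-only SVD-based PCA; no real LIBS data) separate along the first principal component, which is the kind of class structure SIMCA then models:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic spectra: two classes whose dominant emission lines sit at
# different channels (toy stand-ins for two rock chemistries).
n, p = 30, 100
class_a = rng.normal(0, 0.1, (n, p)); class_a[:, 20] += 1.0   # strong line at ch. 20
class_b = rng.normal(0, 0.1, (n, p)); class_b[:, 70] += 1.0   # strong line at ch. 70
X = np.vstack([class_a, class_b])

# PCA via SVD of the mean-centred data matrix
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T        # project spectra onto the first two PCs

# The two classes land on opposite sides of PC1 (sign is arbitrary)
print(scores[:n, 0].mean(), scores[n:, 0].mean())
```

    PLS would additionally regress the scores against known compositions to build the calibration model the abstract describes.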

  7. Stability analysis of a controlled mechanical system with parametric uncertainties in LuGre friction model

    NASA Astrophysics Data System (ADS)

    Sun, Yun-Hsiang; Sun, Yuming; Wu, Christine Qiong; Sepehri, Nariman

    2018-04-01

    Parameters of a friction model identified for a specific control system are not constant: they vary over time and have a significant effect on control system stability. Although much research has been devoted to stability analysis under parametric uncertainty, less attention has been paid to incorporating a realistic friction model into the analysis. After reviewing the common friction models for controller design, a modified LuGre friction model is selected for the stability analysis in this study. Two parameters of the LuGre model, namely σ0 and σ1, are critical to the demonstration of dynamic friction features, yet their identification is difficult to carry out, resulting in a high level of uncertainty in their values. Aiming to uncover the effect of σ0 and σ1 variations on control system stability, a servomechanism with the modified LuGre friction model is investigated. Two set-point position controllers are synthesised based on the servomechanism model to form two case studies. Through Lyapunov exponents, it is clear that the variation of σ0 and σ1 has a marked effect on the stability of the studied systems and should not be overlooked in the design phase.
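
    The standard LuGre equations make the role of σ0 and σ1 concrete: σ0 is the bristle stiffness and σ1 the bristle damping acting on the internal state z. The sketch below uses illustrative parameter values (not the values identified in the paper) and checks that, at constant sliding speed, the friction force settles to the steady-state value g(v)·sign(v) + σ2·v implied by the model:

```python
import numpy as np

# Illustrative LuGre parameters (not identified from the paper's servomechanism)
sigma0, sigma1, sigma2 = 1e5, 300.0, 0.4   # bristle stiffness, bristle damping, viscous
Fc, Fs, vs = 1.0, 1.5, 0.01                # Coulomb level, stiction level, Stribeck vel.

def g(v):
    """Stribeck curve: friction level as a function of sliding speed."""
    return Fc + (Fs - Fc) * np.exp(-(v / vs) ** 2)

def lugre_step(v, z, dt):
    """One Euler step of the internal bristle state; returns (force, new z)."""
    zdot = v - sigma0 * abs(v) / g(v) * z
    z = z + dt * zdot
    return sigma0 * z + sigma1 * zdot + sigma2 * v, z

# Constant sliding speed: force should settle at g(v)*sign(v) + sigma2*v
v, z, dt = 0.05, 0.0, 1e-6
for _ in range(200000):
    F, z = lugre_step(v, z, dt)
print(F, g(v) * np.sign(v) + sigma2 * v)
```

    Perturbing σ0 or σ1 changes the transient (and hence closed-loop behavior) without moving this steady state, which is why their uncertainty matters for stability rather than for static friction levels.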

  8. Analysis of out-of-plane thermal microactuators

    NASA Astrophysics Data System (ADS)

    Atre, Amarendra

    2006-02-01

    Out-of-plane thermal microactuators find applications in optical switches, where they actuate micromirrors. Accurate analysis of such actuators is beneficial for improving existing designs and constructing more energy-efficient actuators. However, the analysis is complicated by the nonlinear deformation of the thermal actuators along with the temperature-dependent properties of polysilicon. This paper describes the development, modeling issues, and results of a three-dimensional multiphysics nonlinear finite element model of surface-micromachined out-of-plane thermal actuators. The model includes conductive and convective cooling effects and takes into account the effect of the variable air gap on the response of the actuator. The model is implemented to investigate the characteristics of two diverse MUMPs-fabricated out-of-plane thermal actuators. Reasonable agreement is observed between simulated and measured results for the model that considers the influence of the air gap on actuator response. The usefulness of the model is demonstrated by implementing it to observe the effect of actuator geometry variation on steady-state deflection response.

  9. Cost-effectiveness of rivaroxaban for stroke prevention in atrial fibrillation in the Portuguese setting.

    PubMed

    Morais, João; Aguiar, Carlos; McLeod, Euan; Chatzitheofilou, Ismini; Fonseca Santos, Isabel; Pereira, Sónia

    2014-09-01

    To project the long-term cost-effectiveness of treating non-valvular atrial fibrillation (AF) patients for stroke prevention with rivaroxaban compared to warfarin in Portugal. A Markov model was used that included health and treatment states describing the management and consequences of AF and its treatment. The model's time horizon was set at a patient's lifetime and each cycle at three months. The analysis was conducted from a societal perspective and a 5% discount rate was applied to both costs and outcomes. Treatment effect data were obtained from the pivotal phase III ROCKET AF trial. The model was also populated with utility values obtained from the literature and with cost data derived from official Portuguese sources. The outcomes of the model included life-years, quality-adjusted life-years (QALYs), incremental costs, and associated incremental cost-effectiveness ratios (ICERs). Extensive sensitivity analyses were undertaken to further assess the findings of the model. As there is evidence indicating underuse and underprescription of warfarin in Portugal, an additional analysis was performed using a mixed comparator composed of no treatment, aspirin, and warfarin, which better reflects real-world prescribing in Portugal. This cost-effectiveness analysis produced an ICER of €3895/QALY for the base-case analysis (vs. warfarin) and of €6697/QALY for the real-world prescribing analysis (vs. mixed comparator). The findings were robust when tested in sensitivity analyses. The results showed that rivaroxaban may be a cost-effective alternative compared with warfarin or real-world prescribing in Portugal. Copyright © 2014 Sociedade Portuguesa de Cardiologia. Published by Elsevier España. All rights reserved.
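
    The structure of such a Markov cost-effectiveness model can be sketched with a toy three-state cohort (well with AF, post-stroke, dead), quarterly cycles, and 5% discounting as in the abstract. Every transition probability, cost, and utility below is hypothetical, not a ROCKET AF or Portuguese figure:

```python
import numpy as np

# Toy 3-state quarterly Markov cohort model: states = [AF well, post-stroke, dead].
# All inputs are invented for illustration only.
def run(p_stroke, drug_cost_cycle, cycles=120, disc=0.05):
    P = np.array([[1 - p_stroke - 0.005, p_stroke, 0.005],   # well -> well/stroke/dead
                  [0.0, 0.97, 0.03],                         # post-stroke -> stroke/dead
                  [0.0, 0.0, 1.0]])                          # dead is absorbing
    state = np.array([1.0, 0.0, 0.0])
    cost_state = np.array([drug_cost_cycle, 900.0, 0.0])     # cost per cycle in each state
    qaly_state = np.array([0.20, 0.15, 0.0])                 # QALYs accrued per quarter
    cost = qaly = 0.0
    for t in range(cycles):
        d = (1 + disc) ** (-(t / 4.0))    # 5% annual discount, quarterly cycles
        cost += d * state @ cost_state
        qaly += d * state @ qaly_state
        state = state @ P
    return cost, qaly

c_warf, q_warf = run(p_stroke=0.008, drug_cost_cycle=30.0)   # cheaper drug, more strokes
c_riva, q_riva = run(p_stroke=0.006, drug_cost_cycle=250.0)  # dearer drug, fewer strokes
icer = (c_riva - c_warf) / (q_riva - q_warf)                 # incremental cost per QALY
print(round(icer))
```

    Sensitivity analysis in such models consists of re-running this cohort loop while varying the transition probabilities, costs, and utilities over plausible ranges.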

  10. Patient-specific bone modeling and analysis: the role of integration and automation in clinical adoption.

    PubMed

    Zadpoor, Amir A; Weinans, Harrie

    2015-03-18

    Patient-specific analysis of bones is considered an important tool for diagnosis and treatment of skeletal diseases and for clinical research aimed at understanding the etiology of skeletal diseases and the effects of different types of treatment on their progress. In this article, we discuss how integration of several important components enables accurate and cost-effective patient-specific bone analysis, focusing primarily on patient-specific finite element (FE) modeling of bones. First, the different components are briefly reviewed. Then, two important aspects of patient-specific FE modeling, namely integration of modeling components and automation of modeling approaches, are discussed. We conclude with a section on validation of patient-specific modeling results, possible applications of patient-specific modeling procedures, current limitations of the modeling approaches, and possible areas for future research. Copyright © 2014 Elsevier Ltd. All rights reserved.

  11. Effects-based strategy development through center of gravity and target system analysis

    NASA Astrophysics Data System (ADS)

    White, Christopher M.; Prendergast, Michael; Pioch, Nicholas; Jones, Eric K.; Graham, Stephen

    2003-09-01

    This paper describes an approach to effects-based planning in which a strategic-theater-level mission is refined into operational-level and ultimately tactical-level tasks and desired effects, informed by models of the expected enemy response at each level of abstraction. We describe a strategy development system that implements this approach and supports human-in-the-loop development of an effects-based plan. This system consists of plan authoring tools tightly integrated with a suite of center of gravity (COG) and target system analysis tools. A human planner employs the plan authoring tools to develop a hierarchy of tasks and desired effects. Upon invocation, the target system analysis tools use reduced-order models of enemy centers of gravity to select appropriate target set options for the achievement of desired effects, together with associated indicators for each option. The COG analysis tools also provide explicit models of the causal mechanisms linking tasks and desired effects to one another, and suggest appropriate observable indicators to guide ISR planning, execution monitoring, and campaign assessment. We are currently implementing the system described here as part of the AFRL-sponsored Effects Based Operations program.

  12. Hydrogeology, hydrologic effects of development, and simulation of groundwater flow in the Borrego Valley, San Diego County, California

    USGS Publications Warehouse

    Faunt, Claudia C.; Stamos, Christina L.; Flint, Lorraine E.; Wright, Michael T.; Burgess, Matthew K.; Sneed, Michelle; Brandt, Justin; Martin, Peter; Coes, Alissa L.

    2015-11-24

    This report documents and presents (1) an analysis of the conceptual model, (2) a description of the hydrologic features, (3) a compilation and analysis of water-quality data, (4) the measurement and analysis of land subsidence by using geophysical and remote sensing techniques, (5) the development and calibration of a two-dimensional borehole-groundwater-flow model to estimate aquifer hydraulic conductivities, (6) the development and calibration of a three-dimensional (3-D) integrated hydrologic flow model, (7) a water-availability analysis with respect to current climate variability and land use, and (8) potential future management scenarios. The integrated hydrologic model, referred to here as the “Borrego Valley Hydrologic Model” (BVHM), is a tool that can provide results with the accuracy needed for making water-management decisions, although potential future refinements and enhancements could further improve the level of spatial and temporal resolution and model accuracy. Because the model incorporates time-varying inflows and outflows, this tool can be used to evaluate the effects of temporal changes in recharge and pumping and to compare the relative effects of different water-management scenarios on the aquifer system. Overall, the development of the hydrogeologic and hydrologic models, data networks, and hydrologic analysis provides a basis for assessing surface and groundwater availability and potential water-resource management guidelines.

  13. Conclusion of LOD-score analysis for family data generated under two-locus models.

    PubMed

    Dizier, M H; Babron, M C; Clerget-Darpoux, F

    1996-06-01

    The power to detect linkage by the LOD-score method is investigated here for diseases that depend on the effects of two genes. The classical strategy is, first, to detect a major-gene (MG) effect by segregation analysis and, second, to seek linkage with genetic markers by the LOD-score method using the MG parameters. We have already shown that segregation analysis can lead to evidence for an MG effect for many two-locus models, with the estimates of the MG parameters being very different from those of the two genes involved in the disease. We show here that use of these MG parameter estimates in the LOD-score analysis may lead to a failure to detect linkage for some two-locus models. For these models, use of the sib-pair method gives a non-negligible increase in power to detect linkage. The linkage-homogeneity test among subsamples differing in familial disease distribution provides evidence of parameter misspecification when the MG parameters are used. Moreover, for most of the models, use of the MG parameters in LOD-score analysis leads to a large bias in estimation of the recombination fraction and sometimes also to a rejection of linkage for the true recombination fraction. A final important point is that strong evidence of an MG effect, obtained by segregation analysis, does not necessarily imply that linkage will be detected for at least one of the two genes, even with the true parameters and with a close informative marker.
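
    In the simplest setting, fully informative phase-known meioses, the LOD score has a closed form, LOD(θ) = log10[θ^k (1−θ)^(n−k) / 0.5^n], and the sketch below evaluates it on a grid of recombination fractions. The counts n=20 meioses with k=2 recombinants are invented for illustration:

```python
import numpy as np

def lod(theta, n, k):
    """Two-point LOD score for n phase-known informative meioses, k recombinants."""
    return (k * np.log10(theta) + (n - k) * np.log10(1 - theta)
            - n * np.log10(0.5))

thetas = np.linspace(0.01, 0.49, 97)   # grid of candidate recombination fractions
scores = lod(thetas, n=20, k=2)
best = thetas[np.argmax(scores)]
print(best, scores.max())              # MLE near k/n = 0.10; max LOD > 3 => linkage
```

    The bias the abstract describes arises when the penetrance parameters entering the likelihood are the misspecified MG estimates rather than the true two-locus values, shifting where this curve peaks.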

  14. Analytical transmissibility based transfer path analysis for multi-energy-domain systems using four-pole parameter theory

    NASA Astrophysics Data System (ADS)

    Mashayekhi, Mohammad Jalali; Behdinan, Kamran

    2017-10-01

    The increasing demand to minimize undesired vibration and noise levels in several high-tech industries has generated renewed interest in vibration transfer path analysis. Analyzing vibration transfer paths within a system is of crucial importance in designing an effective vibration isolation strategy. Most existing vibration transfer path analysis techniques are empirical and suited to diagnosis and troubleshooting; the lack of an analytical transfer path analysis usable in the design stage is the main motivation behind this research. In this paper, an analytical transfer path analysis based on four-pole theory is proposed for multi-energy-domain systems. The bond graph modeling technique, an effective approach to modeling multi-energy-domain systems, is used to develop the system model. An electro-mechanical system serves as a benchmark example to elucidate the effectiveness of the proposed technique. An algorithm to obtain the equivalent four-pole representation of a dynamical system from the corresponding bond graph model is also presented.
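
    Four-pole parameters cascade by matrix multiplication, which is what makes them convenient for transfer path analysis. The sketch below uses one common force-velocity convention for lumped mass and spring elements; conventions vary in the literature, and the element values here are arbitrary, not from the paper's benchmark:

```python
import numpy as np

def mass(m, w):
    """Four-pole (transmission) matrix of a rigid mass, force-velocity convention."""
    return np.array([[1.0, 1j * w * m], [0.0, 1.0]])

def spring(k, w):
    """Four-pole matrix of a massless spring of stiffness k."""
    return np.array([[1.0, 0.0], [1j * w / k, 1.0]])

def chain(*elems):
    """Cascade elements along the transfer path by matrix multiplication."""
    T = np.eye(2, dtype=complex)
    for e in elems:
        T = T @ e
    return T

w = 2 * np.pi * 50.0                      # evaluate the path at 50 Hz
T = chain(mass(2.0, w), spring(8e4, w), mass(0.5, w))
print(abs(1 / T[0, 0]))                   # blocked-termination force transmissibility
```

    Each element matrix is unimodular, so the cascaded matrix has unit determinant, a standard reciprocity check on a four-pole chain.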

  15. Four photon parametric amplification. [in unbiased Josephson junction

    NASA Technical Reports Server (NTRS)

    Parrish, P. T.; Feldman, M. J.; Ohta, H.; Chiao, R. Y.

    1974-01-01

    An analysis is presented describing four-photon parametric amplification in an unbiased Josephson junction. Central to the theory is the model of the Josephson effect as a nonlinear inductance. Linear, small signal analysis is applied to the two-fluid model of the Josephson junction. The gain, gain-bandwidth product, high frequency limit, and effective noise temperature are calculated for a cavity reflection amplifier. The analysis is extended to multiple (series-connected) junctions and subharmonic pumping.
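
    The nonlinear-inductance model central to the theory follows from the Josephson relations: L_J(φ) = Φ0 / (2π I_c cos φ), so the effective inductance grows as the junction phase φ approaches π/2, which is what the pump modulates in a parametric amplifier. A minimal numeric sketch (the critical current value is illustrative):

```python
import numpy as np

Phi0 = 2.067833848e-15        # magnetic flux quantum (Wb)
Ic = 1e-6                     # junction critical current (A), illustrative value

def L_J(phi):
    """Josephson junction modeled as a phase-dependent nonlinear inductance."""
    return Phi0 / (2 * np.pi * Ic * np.cos(phi))

for phi in (0.0, 0.5, 1.0):   # inductance rises with phase excursion
    print(phi, L_J(phi))
```

    For this Ic the zero-phase inductance is about 0.33 nH; the phase dependence is the nonlinearity that supports four-photon mixing.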

  16. Reliability analysis in interdependent smart grid systems

    NASA Astrophysics Data System (ADS)

    Peng, Hao; Kan, Zhe; Zhao, Dandan; Han, Jianmin; Lu, Jianfeng; Hu, Zhaolong

    2018-06-01

    Complex network theory is a useful way to study many real complex systems. In this paper, a reliability analysis model based on complex network theory is introduced for interdependent smart grid systems. We focus on understanding the structure of smart grid systems and studying the underlying network model, the interactions and relationships among components, and how cascading failures occur in interdependent smart grid systems. We propose a practical model for interdependent smart grid systems using complex network theory. Based on percolation theory, we also study the effect of cascading failures and provide a detailed mathematical analysis of failure propagation in such systems. We analyze the reliability of the proposed model under random attacks or failures by calculating the size of the giant functioning component in interdependent smart grid systems. Our simulation results show that there exists a threshold for the proportion of faulty nodes beyond which the smart grid systems collapse, and we determine the critical values for different system parameters. In this way, the reliability analysis model based on complex network theory can be effectively utilized for anti-attack and protection purposes in interdependent smart grid systems.
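
    The percolation calculation behind such reliability analyses can be sketched on a single Erdős–Rényi network; the paper's coupled interdependent model is more involved, so this is only the single-network building block. Random node failures are applied and the giant component is found with union-find; for this toy graph the collapse threshold sits near occupation probability 1/⟨k⟩:

```python
import random
from collections import Counter

random.seed(3)
N, c = 2000, 4.0                       # nodes and mean degree of an ER random graph
p_edge = c / (N - 1)
edges = [(i, j) for i in range(N) for j in range(i + 1, N)
         if random.random() < p_edge]

def giant_fraction(keep_prob):
    """Fail each node independently; return largest component size / N (union-find)."""
    alive = [random.random() < keep_prob for _ in range(N)]
    parent = list(range(N))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x
    for i, j in edges:
        if alive[i] and alive[j]:
            parent[find(i)] = find(j)
    sizes = Counter(find(i) for i in range(N) if alive[i])
    return max(sizes.values(), default=0) / N

# Above the threshold (1/c = 0.25) a giant component survives; far below it collapses.
print(giant_fraction(0.9), giant_fraction(0.1))
```

    In the interdependent setting studied in the paper, each failure round additionally removes nodes whose counterparts in the coupled network have failed, which makes the collapse abrupt rather than gradual.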

  17. Prediction models for clustered data: comparison of a random intercept and standard regression model

    PubMed Central

    2013-01-01

    Background When study data are clustered, standard regression analysis is considered inappropriate and analytical techniques for clustered data need to be used. For prediction research in which interest is in predictor effects at the patient level, random effect regression models are probably preferred over standard regression analysis. It is well known that random effect parameter estimates and standard logistic regression parameter estimates differ. Here, we compared random effect and standard logistic regression models for their ability to provide accurate predictions. Methods Using an empirical study on 1642 surgical patients at risk of postoperative nausea and vomiting, who were treated by one of 19 anesthesiologists (clusters), we developed prognostic models with either standard or random intercept logistic regression. External validity of these models was assessed in new patients from other anesthesiologists. We supported our results with simulation studies using intra-class correlation coefficients (ICC) of 5%, 15%, or 30%. Standard performance measures and measures adapted for the clustered data structure were estimated. Results The model developed with random effect analysis showed better discrimination than the standard approach if the cluster effects were used for risk prediction (standard c-index of 0.69 versus 0.66). In the external validation set, both models showed similar discrimination (standard c-index 0.68 versus 0.67). The simulation study confirmed these results. For datasets with a high ICC (≥15%), model calibration was adequate in external subjects only if the performance measure used assumed the same data structure as the model development method: standard calibration measures showed good calibration for the standard developed model, while calibration measures adapted to the clustered data structure showed good calibration for the prediction model with random intercept.
Conclusion The models with random intercept discriminate better than the standard model only if the cluster effect is used for predictions. The prediction model with random intercept had good calibration within clusters. PMID:23414436
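
    The discrimination comparison in this abstract hinges on whether cluster effects enter the predictions. The sketch below skips model fitting entirely and scores two linear predictors built from the data-generating coefficients, one ignoring and one including the cluster intercepts; all numbers are synthetic, with the cluster standard deviation chosen so the latent-scale ICC is roughly 15%:

```python
import numpy as np

rng = np.random.default_rng(7)

def c_index(risk, outcome):
    """Concordance: P(risk_i > risk_j | outcome_i=1, outcome_j=0); ties count 1/2."""
    pos, neg = risk[outcome == 1], risk[outcome == 0]
    diff = pos[:, None] - neg[None, :]
    return (diff > 0).mean() + 0.5 * (diff == 0).mean()

# 20 clusters ("anesthesiologists") x 80 patients with cluster random intercepts
n_clus, n_per = 20, 80
u = rng.normal(0, 0.8, n_clus)                  # cluster intercepts (latent ICC ~ 15%)
x = rng.normal(size=(n_clus, n_per))            # one patient-level predictor
logit = -1.0 + 1.0 * x + u[:, None]
y = (rng.random((n_clus, n_per)) < 1 / (1 + np.exp(-logit))).astype(int)

risk_std = (1.0 * x).ravel()                    # "standard model": ignores clusters
risk_ri  = (1.0 * x + u[:, None]).ravel()       # "random intercept": uses cluster effects
print(c_index(risk_std, y.ravel()), c_index(risk_ri, y.ravel()))
```

    The second c-index exceeds the first, mirroring the abstract's 0.69 vs 0.66 pattern; in external clusters the intercepts u are unknown, which is why the advantage disappears at validation.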

  18. Prediction models for clustered data: comparison of a random intercept and standard regression model.

    PubMed

    Bouwmeester, Walter; Twisk, Jos W R; Kappen, Teus H; van Klei, Wilton A; Moons, Karel G M; Vergouwe, Yvonne

    2013-02-15

    When study data are clustered, standard regression analysis is considered inappropriate and analytical techniques for clustered data need to be used. For prediction research in which interest is in predictor effects at the patient level, random effect regression models are probably preferred over standard regression analysis. It is well known that random effect parameter estimates and standard logistic regression parameter estimates differ. Here, we compared random effect and standard logistic regression models for their ability to provide accurate predictions. Using an empirical study on 1642 surgical patients at risk of postoperative nausea and vomiting, who were treated by one of 19 anesthesiologists (clusters), we developed prognostic models with either standard or random intercept logistic regression. External validity of these models was assessed in new patients from other anesthesiologists. We supported our results with simulation studies using intra-class correlation coefficients (ICC) of 5%, 15%, or 30%. Standard performance measures and measures adapted for the clustered data structure were estimated. The model developed with random effect analysis showed better discrimination than the standard approach if the cluster effects were used for risk prediction (standard c-index of 0.69 versus 0.66). In the external validation set, both models showed similar discrimination (standard c-index 0.68 versus 0.67). The simulation study confirmed these results. For datasets with a high ICC (≥15%), model calibration was adequate in external subjects only if the performance measure used assumed the same data structure as the model development method: standard calibration measures showed good calibration for the standard developed model, while calibration measures adapted to the clustered data structure showed good calibration for the prediction model with random intercept.
The models with random intercept discriminate better than the standard model only if the cluster effect is used for predictions. The prediction model with random intercept had good calibration within clusters.
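The standard c-index quoted above (e.g. 0.69 versus 0.66) is the probability that a randomly chosen event patient receives a higher predicted risk than a randomly chosen non-event patient. A minimal dependency-free sketch with made-up risks and outcomes is:

```python
import itertools

def c_index(risks, outcomes):
    """Concordance index: among all (event, non-event) pairs, the fraction
    where the event case received the higher predicted risk.
    Ties in predicted risk count as half-concordant."""
    concordant = 0.0
    pairs = 0
    for (r1, y1), (r2, y2) in itertools.combinations(zip(risks, outcomes), 2):
        if y1 == y2:
            continue  # only pairs with different outcomes are informative
        pairs += 1
        hi, lo = (r1, r2) if y1 > y2 else (r2, r1)  # event risk vs non-event risk
        if hi > lo:
            concordant += 1.0
        elif hi == lo:
            concordant += 0.5
    return concordant / pairs

# toy example: a perfect ranking of two events above two non-events
print(c_index([0.9, 0.8, 0.2, 0.1], [1, 1, 0, 0]))  # → 1.0
```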

  19. Latent Transition Analysis with a Mixture Item Response Theory Measurement Model

    ERIC Educational Resources Information Center

    Cho, Sun-Joo; Cohen, Allan S.; Kim, Seock-Ho; Bottge, Brian

    2010-01-01

    A latent transition analysis (LTA) model was described with a mixture Rasch model (MRM) as the measurement model. Unlike the LTA, which was developed with a latent class measurement model, the LTA-MRM permits within-class variability on the latent variable, making it more useful for measuring treatment effects within latent classes. A simulation…

  20. [Lake eutrophication modeling in considering climatic factors change: a review].

    PubMed

    Su, Jie-Qiong; Wang, Xuan; Yang, Zhi-Feng

    2012-11-01

Climatic factors are considered as the key factors affecting the trophic status and its process in most lakes. Under the background of global climate change, incorporating the variations of climatic factors into lake eutrophication models could provide solid technical support for analyzing the trophic evolution trend of a lake and for decision-making in lake environment management. This paper analyzed the effects of climatic factors such as air temperature, precipitation, sunlight, and atmosphere on lake eutrophication, and summarized the research results on lake eutrophication modeling that considers climatic factor changes, including modeling based on statistical analysis, ecological dynamic analysis, system analysis, and intelligent algorithms. Prospective approaches to improve the accuracy of lake eutrophication modeling under climatic factor changes were put forward, including 1) strengthening the analysis of the mechanisms by which climatic factor changes affect lake trophic status, 2) identifying appropriate simulation models to generate scenarios under proper temporal and spatial scales and resolutions, and 3) integrating climatic factor change simulation, hydrodynamic models, ecological simulation, and intelligent algorithms into a general modeling system to achieve an accurate prediction of lake eutrophication under climatic change.

  1. Generalisability in economic evaluation studies in healthcare: a review and case studies.

    PubMed

    Sculpher, M J; Pang, F S; Manca, A; Drummond, M F; Golder, S; Urdahl, H; Davies, L M; Eastwood, A

    2004-12-01

To review, and to develop further, the methods used to assess and to increase the generalisability of economic evaluation studies. Electronic databases. Methodological studies relating to economic evaluation in healthcare were searched. This included electronic searches of a range of databases, including PREMEDLINE, MEDLINE, EMBASE and EconLit, and manual searches of key journals. The case studies of a decision analytic model involved highlighting specific features of previously published economic studies related to generalisability and location-related variability. The case study involving secondary analysis of cost-effectiveness analyses was based on three economic studies using data from randomised trials. The factor most frequently cited as generating variability in economic results between locations was the unit costs associated with particular resources. In the context of studies based on the analysis of patient-level data, regression analysis has been advocated as a means of looking at variability in economic results across locations. These methods have generally accepted that some components of resource use and outcomes are exchangeable across locations. Recent studies have also explored, in cost-effectiveness analysis, the use of tests of heterogeneity similar to those used in clinical evaluation in trials. The decision analytic model has been the main means by which cost-effectiveness has been adapted from trial to non-trial locations. Most models have focused on changes to the cost side of the analysis, but it is clear that the effectiveness side may also need to be adapted between locations. There have been weaknesses in some aspects of the reporting in applied cost-effectiveness studies. These may limit decision-makers' ability to judge the relevance of a study to their specific situations. The case study demonstrated the potential value of multilevel modelling (MLM). Where clustering exists by location (e.g. 
centre or country), MLM can facilitate correct estimates of the uncertainty in cost-effectiveness results, and also a means of estimating location-specific cost-effectiveness. The review of applied economic studies based on decision analytic models showed that few studies were explicit about their target decision-maker(s)/jurisdictions. The studies in the review generally made more effort to ensure that their cost inputs were specific to their target jurisdiction than their effectiveness parameters. Standard sensitivity analysis was the main way of dealing with uncertainty in the models, although few studies looked explicitly at variability between locations. The modelling case study illustrated how effectiveness and cost data can be made location-specific. In particular, on the effectiveness side, the example showed the separation of location-specific baseline events and pooled estimates of relative treatment effect, where the latter are assumed exchangeable across locations. A large number of factors are mentioned in the literature that might be expected to generate variation in the cost-effectiveness of healthcare interventions across locations. Several papers have demonstrated differences in the volume and cost of resource use between locations, but few studies have looked at variability in outcomes. In applied trial-based cost-effectiveness studies, few studies provide sufficient evidence for decision-makers to establish the relevance or to adjust the results of the study to their location of interest. Very few studies utilised statistical methods formally to assess the variability in results between locations. In applied economic studies based on decision models, most studies either stated their target decision-maker/jurisdiction or provided sufficient information from which this could be inferred. There was a greater tendency to ensure that cost inputs were specific to the target jurisdiction than clinical parameters. 
Methods to assess generalisability and variability in economic evaluation studies have been discussed extensively in the literature relating to both trial-based and modelling studies. Regression-based methods are likely to offer a systematic approach to quantifying variability in patient-level data. In particular, MLM has the potential to facilitate estimates of cost-effectiveness, which both reflect the variation in costs and outcomes between locations and also enable the consistency of cost-effectiveness estimates between locations to be assessed directly. Decision analytic models will retain an important role in adapting the results of cost-effectiveness studies between locations. Recommendations for further research include: the development of methods of evidence synthesis which model the exchangeability of data across locations and allow for the additional uncertainty in this process; assessment of alternative approaches to specifying multilevel models to the analysis of cost-effectiveness data alongside multilocation randomised trials; identification of a range of appropriate covariates relating to locations (e.g. hospitals) in multilevel models; and further assessment of the role of econometric methods (e.g. selection models) for cost-effectiveness analysis alongside observational datasets, and to increase the generalisability of randomised trials.
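The "partial pooling" behind MLM's location-specific cost-effectiveness estimates can be illustrated in a few lines: each centre's estimate is shrunk toward the overall mean in proportion to how noisy it is. The centre means and variance components below are hypothetical, not taken from the review:

```python
def shrunken_mean(cluster_mean, n, grand_mean, tau2, sigma2):
    """Empirical-Bayes (partial-pooling) estimate for one cluster:
    weight the cluster mean by its reliability and the grand mean by the rest.
    tau2 = between-cluster variance, sigma2 = within-cluster variance,
    n = number of patients observed in the cluster."""
    w = tau2 / (tau2 + sigma2 / n)  # reliability of the cluster's own mean
    return w * cluster_mean + (1 - w) * grand_mean

# a small centre (n=5) is pulled strongly toward the pooled estimate,
# a large centre (n=500) keeps nearly its own mean
print(shrunken_mean(10.0, 5, 6.0, 1.0, 4.0))    # ≈ 8.22
print(shrunken_mean(10.0, 500, 6.0, 1.0, 4.0))  # ≈ 9.97
```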

  2. Microarray Meta-Analysis Identifies Acute Lung Injury Biomarkers in Donor Lungs That Predict Development of Primary Graft Failure in Recipients

    PubMed Central

    Haitsma, Jack J.; Furmli, Suleiman; Masoom, Hussain; Liu, Mingyao; Imai, Yumiko; Slutsky, Arthur S.; Beyene, Joseph; Greenwood, Celia M. T.; dos Santos, Claudia

    2012-01-01

Objectives To perform a meta-analysis of gene expression microarray data from animal studies of lung injury, and to identify an injury-specific gene expression signature capable of predicting the development of lung injury in humans. Methods We performed a microarray meta-analysis using 77 microarray chips across six platforms, two species and different animal lung injury models exposed to lung injury with and without mechanical ventilation. Individual gene chips were classified and grouped based on the strategy used to induce lung injury. Effect size (change in gene expression) was calculated between non-injurious and injurious conditions, comparing two main strategies to pool chips: (1) one-hit and (2) two-hit lung injury models. A random effects model was used to integrate individual effect sizes calculated from each experiment. Classification models were built using the gene expression signatures generated by the meta-analysis to predict the development of lung injury in human lung transplant recipients. Results Two injury-specific lists of differentially expressed genes generated from our meta-analysis of lung injury models were validated using external data sets and prospective data from animal models of ventilator-induced lung injury (VILI). Pathway analysis of gene sets revealed that both new and previously implicated VILI-related pathways are enriched with differentially regulated genes. A classification model based on gene expression signatures identified in animal models of lung injury predicted development of primary graft failure (PGF) in lung transplant recipients with greater than 80% accuracy based on injury profiles from transplant donors. We also found that better classifier performance can be achieved by using meta-analysis to identify differentially expressed genes than by using single-study differential analysis. Conclusion Taken together, our data suggest that microarray analysis of gene expression data allows for the detection of “injury” gene predictors that can classify lung injury samples and identify patients at risk for clinically relevant lung injury complications. PMID:23071521
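The random-effects integration of per-experiment effect sizes is commonly done with the DerSimonian-Laird estimator; the abstract does not name the exact estimator used, so the following dependency-free sketch is illustrative rather than the authors' method:

```python
def dersimonian_laird(effects, variances):
    """Random-effects pooling of per-study effect sizes (DerSimonian-Laird).
    Returns (pooled effect, tau^2 = between-study variance estimate)."""
    w = [1.0 / v for v in variances]                          # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))  # Cochran's Q
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                             # truncated at zero
    w_re = [1.0 / (v + tau2) for v in variances]              # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    return pooled, tau2

# three hypothetical per-experiment effect sizes with sampling variances
pooled, tau2 = dersimonian_laird([0.5, 0.7, 1.2], [0.04, 0.05, 0.06])
print(pooled, tau2)
```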

  3. A theoretical study of non-adiabatic surface effects for a model in the NTF cryogenic wind tunnel

    NASA Technical Reports Server (NTRS)

    Macha, J. M.; Pare, L. A.; Landrum, D. B.

    1985-01-01

    A theoretical analysis was made of the severity and effect of nonadiabatic surface conditions for a model in the NTF cryogenic wind tunnel. The nonadiabatic condition arises from heaters that are used to maintain a constant thermal environment for instrumentation internal to the model. The analysis was made for several axi-symmetric representations of a fuselage cavity, using a finite element heat conduction code. Potential flow and boundary layer codes were used to calculate the convection condition for the exterior surface of the model. The results of the steady state analysis show that it is possible to maintain the surface temperature very near the adiabatic value, with the judicious use of insulating material. Even for the most severe nonadiabatic condition studied, the effects on skin friction drag and displacement thickness were only marginally significant. The thermal analysis also provided an estimate of the power required to maintain a specified cavity temperature.

  4. Multivariate Longitudinal Analysis with Bivariate Correlation Test.

    PubMed

    Adjakossa, Eric Houngla; Sadissou, Ibrahim; Hounkonnou, Mahouton Norbert; Nuel, Gregory

    2016-01-01

In the context of multivariate multilevel data analysis, this paper focuses on the multivariate linear mixed-effects model, including all the correlations between the random effects when the dimensional residual terms are assumed uncorrelated. Using the EM algorithm, we derive more general expressions for the model's parameter estimators. These estimators can be used in the framework of multivariate longitudinal data analysis as well as in the more general context of the analysis of multivariate multilevel data. By using a likelihood ratio test, we test the significance of the correlations between the random effects of two dependent variables of the model, in order to investigate whether or not it is useful to model these dependent variables jointly. Simulation studies are done to assess both the parameter recovery performance of the EM estimators and the power of the test. Using two empirical data sets, of longitudinal multivariate type and multivariate multilevel type, respectively, the usefulness of the test is illustrated.
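Once the two models are fitted, the likelihood ratio test itself is a one-liner on the maximized log-likelihoods. A sketch with hypothetical log-likelihood values (df = 1 for a single correlation parameter; the closed-form p-value below uses the chi-square distribution with 1 degree of freedom):

```python
import math

def chi2_sf_df1(x):
    """Survival function of chi-square with 1 df: P(X > x) = erfc(sqrt(x/2))."""
    return math.erfc(math.sqrt(x / 2.0))

def likelihood_ratio_test(loglik_full, loglik_reduced, df=1):
    """LRT statistic 2*(ll_full - ll_reduced); closed-form p-value for df=1 only."""
    stat = 2.0 * (loglik_full - loglik_reduced)
    if df != 1:
        raise NotImplementedError("closed-form p-value implemented for df=1")
    return stat, chi2_sf_df1(stat)

# hypothetical log-likelihoods: joint model with vs without the
# cross-outcome random-effect correlation
stat, p = likelihood_ratio_test(-1520.3, -1524.1)
print(stat, p)
```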

  5. Using Visual Analysis to Evaluate and Refine Multilevel Models of Single-Case Studies

    ERIC Educational Resources Information Center

    Baek, Eun Kyeng; Petit-Bois, Merlande; Van den Noortgate, Wim; Beretvas, S. Natasha; Ferron, John M.

    2016-01-01

    In special education, multilevel models of single-case research have been used as a method of estimating treatment effects over time and across individuals. Although multilevel models can accurately summarize the effect, it is known that if the model is misspecified, inferences about the effects can be biased. Concern with the potential for model…

  6. Decision science and cervical cancer.

    PubMed

    Cantor, Scott B; Fahs, Marianne C; Mandelblatt, Jeanne S; Myers, Evan R; Sanders, Gillian D

    2003-11-01

    Mathematical modeling is an effective tool for guiding cervical cancer screening, diagnosis, and treatment decisions for patients and policymakers. This article describes the use of mathematical modeling as outlined in five presentations from the Decision Science and Cervical Cancer session of the Second International Conference on Cervical Cancer held at The University of Texas M. D. Anderson Cancer Center, April 11-14, 2002. The authors provide an overview of mathematical modeling, especially decision analysis and cost-effectiveness analysis, and examples of how it can be used for clinical decision making regarding the prevention, diagnosis, and treatment of cervical cancer. Included are applications as well as theory regarding decision science and cervical cancer. Mathematical modeling can answer such questions as the optimal frequency for screening, the optimal age to stop screening, and the optimal way to diagnose cervical cancer. Results from one mathematical model demonstrated that a vaccine against high-risk strains of human papillomavirus was a cost-effective use of resources, and discussion of another model demonstrated the importance of collecting direct non-health care costs and time costs for cost-effectiveness analysis. Research presented indicated that care must be taken when applying the results of population-wide, cost-effectiveness analyses to reduce health disparities. Mathematical modeling can encompass a variety of theoretical and applied issues regarding decision science and cervical cancer. The ultimate objective of using decision-analytic and cost-effectiveness models is to identify ways to improve women's health at an economically reasonable cost. Copyright 2003 American Cancer Society.

  7. Sparse Group Penalized Integrative Analysis of Multiple Cancer Prognosis Datasets

    PubMed Central

    Liu, Jin; Huang, Jian; Xie, Yang; Ma, Shuangge

    2014-01-01

SUMMARY In cancer research, high-throughput profiling studies have been extensively conducted, searching for markers associated with prognosis. Because of the “large d, small n” characteristic, results generated from the analysis of a single dataset can be unsatisfactory. Recent studies have shown that integrative analysis, which simultaneously analyzes multiple datasets, can be more effective than single-dataset analysis and classic meta-analysis. In most existing integrative analyses, the homogeneity model has been assumed, which postulates that different datasets share the same set of markers. Several approaches have been designed to reinforce this assumption. In practice, different datasets may differ in terms of patient selection criteria, profiling techniques, and many other aspects. Such differences may make the homogeneity model too restrictive. In this study, we assume the heterogeneity model, under which different datasets are allowed to have different sets of markers. With multiple cancer prognosis datasets, we adopt the AFT (accelerated failure time) model to describe survival. This model may have the lowest computational cost among popular semiparametric survival models. For marker selection, we adopt a sparse group MCP (minimax concave penalty) approach. This approach has an intuitive formulation and can be computed using an effective group coordinate descent algorithm. A simulation study shows that it outperforms existing approaches under both the homogeneity and heterogeneity models. Data analysis further demonstrates the merit of the heterogeneity model and the proposed approach. PMID:23938111
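The scalar MCP penalty that the sparse group approach builds on can be written directly from its definition. This sketch shows only the penalty function, with a conventional default of γ = 3, not the group coordinate descent solver:

```python
def mcp(t, lam, gamma=3.0):
    """Minimax concave penalty (MCP) for one coefficient t:
    behaves like the lasso penalty lam*|t| near zero, then levels off
    at the constant gamma*lam^2/2, so large coefficients are not shrunk."""
    a = abs(t)
    if a <= gamma * lam:
        return lam * a - a * a / (2.0 * gamma)
    return 0.5 * gamma * lam * lam

# near zero the penalty tracks lam*|t|; beyond gamma*lam it is flat
print(mcp(0.1, lam=1.0))  # ≈ 0.0983
print(mcp(5.0, lam=1.0))  # flat region: 1.5
```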

  8. Variable-intercept panel model for deformation zoning of a super-high arch dam.

    PubMed

    Shi, Zhongwen; Gu, Chongshi; Qin, Dong

    2016-01-01

    This study determines dam deformation similarity indexes based on an analysis of deformation zoning features and panel data clustering theory, with comprehensive consideration to the actual deformation law of super-high arch dams and the spatial-temporal features of dam deformation. Measurement methods of these indexes are studied. Based on the established deformation similarity criteria, the principle used to determine the number of dam deformation zones is constructed through entropy weight method. This study proposes the deformation zoning method for super-high arch dams and the implementation steps, analyzes the effect of special influencing factors of different dam zones on the deformation, introduces dummy variables that represent the special effect of dam deformation, and establishes a variable-intercept panel model for deformation zoning of super-high arch dams. Based on different patterns of the special effect in the variable-intercept panel model, two panel analysis models were established to monitor fixed and random effects of dam deformation. Hausman test method of model selection and model effectiveness assessment method are discussed. Finally, the effectiveness of established models is verified through a case study.
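The entropy weight method used above to weight the deformation similarity indexes can be sketched generically: indicators whose values vary more across alternatives carry more information and receive larger weights. This is the standard formulation, not necessarily the paper's exact implementation:

```python
import math

def entropy_weights(matrix):
    """Entropy weight method.
    matrix[i][j] = value of indicator j for alternative i (all positive).
    Returns normalized weights, one per indicator."""
    m, n = len(matrix), len(matrix[0])
    k = 1.0 / math.log(m)
    divergences = []
    for j in range(n):
        col = [matrix[i][j] for i in range(m)]
        total = sum(col)
        p = [v / total for v in col]
        e = -k * sum(pi * math.log(pi) for pi in p if pi > 0)  # entropy in [0, 1]
        divergences.append(1.0 - e)  # low entropy -> high information -> high weight
    s = sum(divergences)
    return [d / s for d in divergences]

# the first indicator is constant across alternatives, so the second one,
# which varies strongly, receives (nearly) all the weight
print(entropy_weights([[1.0, 1.0], [1.0, 9.0], [1.0, 90.0]]))  # ≈ [0.0, 1.0]
```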

  9. Model authoring system for fail safe analysis

    NASA Technical Reports Server (NTRS)

    Sikora, Scott E.

    1990-01-01

The Model Authoring System is a prototype software application for generating fault tree analyses and failure mode and effects analyses for circuit designs. Utilizing established artificial intelligence and expert system techniques, the circuits are modeled as a frame-based knowledge base in an expert system shell, which allows the use of object-oriented programming and an inference engine. The behavior of the circuit is then captured through IF-THEN rules, which are then searched to generate either a graphical fault tree analysis or a failure modes and effects analysis. Sophisticated authoring techniques allow the circuit to be easily modeled, permit its behavior to be quickly defined, and provide abstraction features to deal with complexity.
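The rule search can be illustrated with a toy forward-chaining propagation over IF-THEN failure rules. The component names are invented, and this is not the Model Authoring System's actual representation, just a sketch of the general technique:

```python
def propagate_failures(rules, initial):
    """Naive forward chaining over IF-THEN failure rules.
    rules maps a failed component to the failures it induces downstream.
    Returns the full set of failures reachable from the initial ones."""
    failed = set(initial)
    frontier = list(initial)
    while frontier:
        cause = frontier.pop()
        for effect in rules.get(cause, []):
            if effect not in failed:
                failed.add(effect)
                frontier.append(effect)
    return failed

# toy circuit: a shorted regulator drops the supply bus, which then
# takes down both loads fed from that bus
rules = {"regulator_short": ["bus_undervoltage"],
         "bus_undervoltage": ["sensor_loss", "actuator_loss"]}
print(sorted(propagate_failures(rules, ["regulator_short"])))
# → ['actuator_loss', 'bus_undervoltage', 'regulator_short', 'sensor_loss']
```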

  10. A mixed-effects regression model for longitudinal multivariate ordinal data.

    PubMed

    Liu, Li C; Hedeker, Donald

    2006-03-01

    A mixed-effects item response theory model that allows for three-level multivariate ordinal outcomes and accommodates multiple random subject effects is proposed for analysis of multivariate ordinal outcomes in longitudinal studies. This model allows for the estimation of different item factor loadings (item discrimination parameters) for the multiple outcomes. The covariates in the model do not have to follow the proportional odds assumption and can be at any level. Assuming either a probit or logistic response function, maximum marginal likelihood estimation is proposed utilizing multidimensional Gauss-Hermite quadrature for integration of the random effects. An iterative Fisher scoring solution, which provides standard errors for all model parameters, is used. An analysis of a longitudinal substance use data set, where four items of substance use behavior (cigarette use, alcohol use, marijuana use, and getting drunk or high) are repeatedly measured over time, is used to illustrate application of the proposed model.
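The Gauss-Hermite quadrature mentioned above replaces the intractable integral over a random effect with a weighted sum at a few nodes. A simplified binary-response analogue (a random-intercept logistic model rather than the paper's multivariate ordinal model; the 5-point nodes and weights are the standard tabulated values, hardcoded to keep the sketch dependency-free):

```python
import math

# 5-point Gauss-Hermite rule for the weight function exp(-x^2)
GH_NODES = [-2.0201828704560856, -0.9585724646138185, 0.0,
            0.9585724646138185, 2.0201828704560856]
GH_WEIGHTS = [0.019953242059045913, 0.3936193231522412, 0.9453087204829419,
              0.3936193231522412, 0.019953242059045913]

def marginal_prob(eta, sigma):
    """Marginal P(y=1) in a random-intercept logistic model:
    integrate the logistic response over theta ~ N(0, sigma^2) using
    Gauss-Hermite quadrature (change of variable theta = sqrt(2)*sigma*x)."""
    total = 0.0
    for x, w in zip(GH_NODES, GH_WEIGHTS):
        theta = math.sqrt(2.0) * sigma * x
        total += w / (1.0 + math.exp(-(eta + theta)))
    return total / math.sqrt(math.pi)

# with sigma = 0 the quadrature reduces to the plain logistic probability
print(marginal_prob(0.5, 0.0))  # ≈ 0.6225
```

Increasing sigma pulls the marginal probability toward 0.5, the familiar attenuation of fixed effects when a random intercept is integrated out.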

  11. A web-based portfolio model as the students' final assignment: Dealing with the development of higher education trend

    NASA Astrophysics Data System (ADS)

    Utanto, Yuli; Widhanarto, Ghanis Putra; Maretta, Yoris Adi

    2017-03-01

This study aims to develop a web-based portfolio model. The model developed in this study could reveal the effectiveness of the new model in experiments conducted with research respondents in the department of curriculum and educational technology, FIP Unnes. In particular, the further research objectives to be achieved through this development research are: (1) describing the process of implementing a portfolio in a web-based model; (2) assessing the effectiveness of the web-based portfolio model for the final assignment, especially in Web-Based Learning courses. This is research and development (R & D); Borg and Gall (2008: 589) state that "educational research and development (R & D) is a process used to develop and validate educational production". The series of research and development steps carried out started with exploration and conceptual studies, followed by testing and evaluation, and then implementation. For the data analysis, the techniques used were simple descriptive analysis and analysis of learning completeness, followed by prerequisite tests for normality and homogeneity before a t-test. Based on the data analysis, it was concluded that: (1) a web-based portfolio model can be applied to the learning process in higher education; (2) the web-based portfolio model is effective: in the field trial with the large group of respondents, 24 participants (92.3%) reached mastery learning (a score of 60 or above). The implication of this development research is that subsequent researchers can use the development model from this study as a guideline to be applied to other subjects.

  12. Discrete effect on the halfway bounce-back boundary condition of multiple-relaxation-time lattice Boltzmann model for convection-diffusion equations.

    PubMed

    Cui, Shuqi; Hong, Ning; Shi, Baochang; Chai, Zhenhua

    2016-04-01

In this paper, we focus on the multiple-relaxation-time (MRT) lattice Boltzmann model for two-dimensional convection-diffusion equations (CDEs), and analyze the discrete effect on the halfway bounce-back (HBB) boundary condition (sometimes simply called the bounce-back boundary condition) of the MRT model, where three different discrete velocity models are considered. We first present a theoretical analysis of the discrete effect of the HBB boundary condition for simple problems with a parabolic distribution in the x or y direction, and a numerical slip proportional to the second order of the lattice spacing is observed at the boundary, which means that the MRT model has a second-order convergence rate in space. The theoretical analysis also shows that the numerical slip can be eliminated in the MRT model by tuning the free relaxation parameter corresponding to the second-order moment, while it cannot be removed in the single-relaxation-time model or the Bhatnagar-Gross-Krook model unless the relaxation parameter related to the diffusion coefficient is set to a special value. We then perform some simulations to confirm our theoretical results, and find that the numerical results are consistent with our theoretical analysis. Finally, we would also like to point out that the present analysis can be extended to other boundary conditions of lattice Boltzmann models for CDEs.
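For orientation, the plain BGK scheme with halfway bounce-back can be sketched in one dimension. This toy solver only demonstrates the boundary treatment and mass conservation, not the paper's MRT model or its slip analysis:

```python
def lbm_diffusion_1d(conc, tau, steps):
    """D1Q3 BGK lattice Boltzmann solver for 1-D diffusion with halfway
    bounce-back (zero-flux) walls at both ends. Diffusivity in lattice
    units is D = (tau - 0.5) / 3."""
    w = [2.0 / 3.0, 1.0 / 6.0, 1.0 / 6.0]  # weights for velocities 0, +1, -1
    n = len(conc)
    f = [[w[k] * c for c in conc] for k in range(3)]  # start at equilibrium
    for _ in range(steps):
        # collision: relax each population toward equilibrium w[k] * rho
        rho = [f[0][j] + f[1][j] + f[2][j] for j in range(n)]
        for k in range(3):
            for j in range(n):
                f[k][j] += (w[k] * rho[j] - f[k][j]) / tau
        # streaming, with halfway bounce-back at both walls:
        # populations that would leave the domain re-enter with reversed velocity
        right = [0.0] * n
        left = [0.0] * n
        for j in range(1, n):
            right[j] = f[1][j - 1]
        for j in range(n - 1):
            left[j] = f[2][j + 1]
        right[0] = f[2][0]          # left wall reflects the outgoing population
        left[n - 1] = f[1][n - 1]   # right wall likewise
        f[1], f[2] = right, left
    return [f[0][j] + f[1][j] + f[2][j] for j in range(n)]

# a unit pulse spreads out and, with zero-flux walls, relaxes to uniform
print(lbm_diffusion_1d([0.0, 0.0, 1.0, 0.0, 0.0], tau=1.0, steps=200))  # five values ≈ 0.2
```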

  13. Dealing with Feeling: A Meta-Analysis of the Effectiveness of Strategies Derived from the Process Model of Emotion Regulation

    ERIC Educational Resources Information Center

    Webb, Thomas L.; Miles, Eleanor; Sheeran, Paschal

    2012-01-01

    The present meta-analysis investigated the effectiveness of strategies derived from the process model of emotion regulation in modifying emotional outcomes as indexed by experiential, behavioral, and physiological measures. A systematic search of the literature identified 306 experimental comparisons of different emotion regulation (ER)…

  14. Bias and Precision of Measures of Association for a Fixed-Effect Multivariate Analysis of Variance Model

    ERIC Educational Resources Information Center

    Kim, Soyoung; Olejnik, Stephen

    2005-01-01

    The sampling distributions of five popular measures of association with and without two bias adjusting methods were examined for the single factor fixed-effects multivariate analysis of variance model. The number of groups, sample sizes, number of outcomes, and the strength of association were manipulated. The results indicate that all five…

  15. Equations for hydraulic conductivity estimation from particle size distribution: A dimensional analysis

    NASA Astrophysics Data System (ADS)

    Wang, Ji-Peng; François, Bertrand; Lambert, Pierre

    2017-09-01

Estimating hydraulic conductivity from particle size distribution (PSD) is an important issue for various engineering problems. Classical models such as the Hazen, Beyer, and Kozeny-Carman models usually regard the grain diameter at 10% passing (d10) as an effective grain size, and the effects of particle size uniformity (in the Beyer model) or porosity (in the Kozeny-Carman model) are sometimes embedded. This technical note applies dimensional analysis (Buckingham's π theorem) to analyze the relationship between hydraulic conductivity and particle size distribution (PSD). The porosity is regarded as a variable dependent on the grain size distribution in unconsolidated conditions. The analysis indicates that the coefficient of grain size uniformity and a dimensionless group representing the gravity effect, which is proportional to the mean grain volume, are the two main determinative parameters for estimating hydraulic conductivity. Regression analysis is then carried out on a database comprising 431 samples collected from different depositional environments, and new equations are developed for hydraulic conductivity estimation. The new equation, validated on specimens beyond the database, shows improved prediction compared with the classic models.
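As a concrete instance of the classical models being compared, the Kozeny-Carman estimate takes a few lines. The water properties assume roughly 20 °C; the note's new regression equations are not reproduced here:

```python
def kozeny_carman_K(d10_m, porosity, g=9.81, nu=1.0e-6):
    """Classic Kozeny-Carman estimate of hydraulic conductivity (m/s),
    with the 10%-passing grain diameter d10 (in metres) as effective size:
        K = (g / nu) * n^3 / (1 - n)^2 * d10^2 / 180
    g = gravity (m/s^2), nu = kinematic viscosity of water (~1e-6 m^2/s)."""
    n = porosity
    return (g / nu) * (n ** 3 / (1.0 - n) ** 2) * (d10_m ** 2) / 180.0

# medium sand: d10 = 0.2 mm, porosity 0.35
print(kozeny_carman_K(0.2e-3, 0.35))  # ≈ 2.2e-4 m/s
```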

  16. Future costs in cost effectiveness analysis.

    PubMed

    Lee, Robert H

    2008-07-01

This paper resolves several controversies in CEA. Generalizing [Garber, A.M., Phelps, C.E., 1997. Economic foundations of cost-effectiveness analysis. Journal of Health Economics 16 (1), 1-31], the paper shows that accounting for unrelated future costs distorts decision making. After replicating the quite different conclusion of [Meltzer, D., 1997. Accounting for future costs in medical cost-effectiveness analysis. Journal of Health Economics 16 (1), 33-64] that unrelated future costs should be included in CEA, the paper shows that Meltzer's findings result from modeling the budget constraint as an annuity, which is problematic. The paper also shows that related costs should be included in CEA. This holds for a variety of models, including a health maximization model. CEA should treat costs in the manner recommended by Garber and Phelps.
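Whichever costs are counted, the decision quantity in CEA is the incremental cost-effectiveness ratio; a minimal sketch with hypothetical cost and QALY figures is:

```python
def icer(cost_new, effect_new, cost_old, effect_old):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of
    health effect (e.g. per QALY gained) of the new strategy vs the old."""
    d_cost = cost_new - cost_old
    d_effect = effect_new - effect_old
    if d_effect <= 0:
        raise ValueError("new strategy is not more effective; ICER not meaningful here")
    return d_cost / d_effect

# hypothetical: $12,000 extra cost buys 0.3 extra QALYs
print(icer(52_000, 8.3, 40_000, 8.0))  # ≈ 40000 per QALY
```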

  17. Geospatial Analysis of Atmospheric Haze Effect by Source and Sink Landscape

    NASA Astrophysics Data System (ADS)

    Yu, T.; Xu, K.; Yuan, Z.

    2017-09-01

Based on a geospatial analysis model, this paper analyzes the relationship between the landscape patterns of source and sink in urban areas and atmospheric haze pollution. Firstly, the classification result and aerosol optical thickness (AOD) of Wuhan are divided into a number of square grids with a side length of 6 km, and the category-level landscape indices (PLAND, PD, COHESION, LPI, FRAC_MN) and the AOD of each grid are calculated. Then the source and sink landscapes of atmospheric haze pollution are selected based on an analysis of the correlation between the landscape indices and AOD. Next, to make the following analysis more efficient, the previously selected indices are screened using the correlation coefficients between them. Finally, because of the spatial dependency and spatial heterogeneity of the data used in this paper, a spatial autoregressive model and a geographically weighted regression (GWR) model are used to analyze the atmospheric haze effect of the source and sink landscape at the global and local levels. The results show that the source landscape of atmospheric haze pollution is building land, and the sink landscapes are shrub and woodland. PLAND, PD and COHESION are suitable for describing the atmospheric haze effect of the source and sink landscape. Comparing these models, the fitting effect of the SLM, SEM and GWR models is significantly better than that of the OLS model, and the SLM model is superior to the SEM model here. Although the fit of the GWR model is poorer than that of the SLM, the degree to which the influencing factors affect atmospheric haze in different areas can be expressed more clearly. 
From the analysis results of these models, the following conclusions can be drawn: reducing the proportion of source landscape area and increasing its degree of fragmentation could cut down aerosol optical thickness; distributing the source and sink landscapes evenly and interspersedly could effectively reduce aerosol optical thickness, which represents atmospheric haze pollution; and for Wuhan City, slightly adjusting the built-up area and planning the non-built-up areas reasonably can reduce atmospheric haze pollution.

  18. Modelling the Effects of Information Campaigns Using Agent-Based Simulation

    DTIC Science & Technology

    2006-04-01

individual i (±1). The incorporation of media effects into Equation (1) results in a social impact model of the...that minority opinions often survived in a social margin [17]. Nevertheless, compared to the situation where there is no media effect in the simulation...analysis presented in this paper combines word-of-mouth communication and mass media broadcasting into a single line of analysis. The effects of

  19. Effects of Instructional Design with Mental Model Analysis on Learning.

    ERIC Educational Resources Information Center

    Hong, Eunsook

    This paper presents a model for systematic instructional design that includes mental model analysis together with the procedures used in developing computer-based instructional materials in the area of statistical hypothesis testing. The instructional design model is based on the premise that the objective for learning is to achieve expert-like…

  20. Ultrasonic density measurement cell design and simulation of non-ideal effects.

    PubMed

    Higuti, Ricardo Tokio; Buiochi, Flávio; Adamowski, Júlio Cezar; de Espinosa, Francisco Montero

    2006-07-01

This paper presents a theoretical analysis of a density measurement cell using a one-dimensional model composed of acoustic and electroacoustic transmission lines in order to simulate non-ideal effects. The model is implemented using matrix operations, and is used to design the cell considering its geometry, the materials used in sensor assembly, the range of liquid sample properties, and the signal analysis techniques. The sensor performance in non-ideal conditions is studied, considering the thicknesses of the adhesive and metallization layers, and the effect of liquid-sample residue that can remain on the sample chamber surfaces. These layers are taken into account in the model, and their effects are compensated to reduce the error in density measurement. The results show the contribution of the residue layer thickness to the density error and its behavior when two signal analysis methods are used.
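The matrix implementation of such a transmission-line model chains one 2x2 transfer matrix per layer. A simplified lossless sketch follows; the material values are illustrative, not the paper's cell parameters:

```python
import math
import cmath

def layer_matrix(freq_hz, thickness_m, density, speed):
    """2x2 transfer matrix of one lossless acoustic layer, relating the
    acoustic field quantities on its two faces. Z = rho * c is the
    characteristic acoustic impedance of the layer material."""
    k = 2.0 * math.pi * freq_hz / speed  # wavenumber in the layer
    z = density * speed
    kl = k * thickness_m
    return [[cmath.cos(kl), 1j * z * cmath.sin(kl)],
            [1j * cmath.sin(kl) / z, cmath.cos(kl)]]

def chain(ms):
    """Multiply the per-layer 2x2 matrices in propagation order."""
    out = [[1, 0], [0, 1]]
    for m in ms:
        out = [[out[0][0] * m[0][0] + out[0][1] * m[1][0],
                out[0][0] * m[0][1] + out[0][1] * m[1][1]],
               [out[1][0] * m[0][0] + out[1][1] * m[1][0],
                out[1][0] * m[0][1] + out[1][1] * m[1][1]]]
    return out

# metallization + adhesive as two thin layers at 1 MHz (illustrative values)
m = chain([layer_matrix(1e6, 10e-6, 2700, 6320),   # aluminium film
           layer_matrix(1e6, 5e-6, 1100, 2000)])   # adhesive
print(m[0][0] * m[1][1] - m[0][1] * m[1][0])  # ≈ (1+0j): lossless-reciprocity check
```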

  1. Assessing model uncertainty using hexavalent chromium and ...

    EPA Pesticide Factsheets

    Introduction: The National Research Council recommended quantitative evaluation of uncertainty in effect estimates for risk assessment. This analysis considers uncertainty across model forms and model parameterizations, with hexavalent chromium [Cr(VI)] and lung cancer mortality as an example. The objective is to characterize model uncertainty by evaluating the variance in estimates across several epidemiologic analyses. Methods: This analysis compared 7 publications analyzing two different chromate production sites, in Ohio and Maryland. The Ohio cohort consisted of 482 workers employed from 1940-72, while the Maryland site employed 2,357 workers from 1950-74. Cox and Poisson models were the only model forms considered by the study authors to assess the effect of Cr(VI) on lung cancer mortality. All models adjusted for smoking and included a 5-year exposure lag; however, other latency periods and model covariates such as age and race were also considered. Published effect estimates were standardized to the same units and normalized by their variances to produce a standardized metric for comparing variability in estimates across and within model forms. A total of 7 similarly parameterized analyses were considered across model forms, and 23 analyses with alternative parameterizations were considered within model form (14 Cox; 9 Poisson). Results: Across the Cox and Poisson model forms, adjusted cumulative exposure coefficients for the 7 similar analyses ranged from 2.47

  2. Analysis and compensation for the effect of the catheter position on image intensities in intravascular optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Liu, Shengnan; Eggermont, Jeroen; Wolterbeek, Ron; Broersen, Alexander; Busk, Carol A. G. R.; Precht, Helle; Lelieveldt, Boudewijn P. F.; Dijkstra, Jouke

    2016-12-01

    Intravascular optical coherence tomography (IVOCT) is an imaging technique used to analyze the underlying cause of cardiovascular disease. Because a catheter is used during imaging, the intensities can be affected by the catheter position. This work aims to analyze the effect of the catheter position on IVOCT image intensities and to propose a compensation method that minimizes this effect in order to improve the visualization and the automatic analysis of IVOCT images. The effect of catheter position is modeled with respect to the distance between the catheter and the arterial wall (distance-dependent factor) and the incident angle onto the arterial wall (angle-dependent factor). A light transmission model incorporating both factors is introduced. On the basis of this model, the interaction effect of both factors is estimated with a hierarchical multivariate linear regression model. Statistical analysis shows that IVOCT intensities are significantly affected by both factors (p<0.001): as either factor increases, the intensity decreases. This effect differs between pullbacks. The regression results were used to compensate for this effect. Experiments show that the proposed compensation method can improve the performance of automatic bioresorbable vascular scaffold strut detection.

  3. Modeling and Analysis of Process Parameters for Evaluating Shrinkage Problems During Plastic Injection Molding of a DVD-ROM Cover

    NASA Astrophysics Data System (ADS)

    Öktem, H.

    2012-01-01

    Plastic injection molding plays a key role in the production of high-quality plastic parts. Shrinkage is one of the most significant problems of a plastic part in terms of quality in the plastic injection molding. This article focuses on the study of the modeling and analysis of the effects of process parameters on the shrinkage by evaluating the quality of the plastic part of a DVD-ROM cover made with Acrylonitrile Butadiene Styrene (ABS) polymer material. An effective regression model was developed to determine the mathematical relationship between the process parameters (mold temperature, melt temperature, injection pressure, injection time, and cooling time) and the volumetric shrinkage by utilizing the analysis data. Finite element (FE) analyses designed by Taguchi (L27) orthogonal arrays were run in the Moldflow simulation program. Analysis of variance (ANOVA) was then performed to check the adequacy of the regression model and to determine the effect of the process parameters on the shrinkage. Experiments were conducted to control the accuracy of the regression model with the FE analyses obtained from Moldflow. The results show that the regression model agrees very well with the FE analyses and the experiments. From this, it can be concluded that this study succeeded in modeling the shrinkage problem in our application.
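
    As an illustrative sketch of the regression-adequacy workflow described above (fit shrinkage against the five process parameters, then check the fit), here is a minimal example; the coefficients, parameter ranges, and noise level are entirely hypothetical and do not come from the study:

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical ranges: mold temp, melt temp, injection pressure, injection time, cooling time
low, high = [40, 200, 50, 1, 10], [80, 280, 120, 5, 30]
X = rng.uniform(low, high, size=(27, 5))            # 27 runs, as in an L27 orthogonal array
true_beta = np.array([0.01, 0.02, -0.005, 0.1, -0.03])
shrink = X @ true_beta + rng.normal(0, 0.1, 27)     # simulated volumetric shrinkage

Xd = np.column_stack([np.ones(27), X])              # design matrix with intercept
beta, *_ = np.linalg.lstsq(Xd, shrink, rcond=None)  # fitted regression coefficients
pred = Xd @ beta
ss_res = ((shrink - pred) ** 2).sum()
ss_tot = ((shrink - shrink.mean()) ** 2).sum()
r2 = 1 - ss_res / ss_tot                            # adequacy check (the role ANOVA plays in the study)
```

    A high R² here only says the linear form fits the simulated data well; the study's ANOVA additionally apportions the variance among the individual process parameters.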

  4. Analysis of dengue fever risk using geostatistics model in bone regency

    NASA Astrophysics Data System (ADS)

    Amran, Stang, Mallongi, Anwar

    2017-03-01

    The aim of this research is to analyze dengue fever risk using a geostatistics model in Bone Regency. Risk levels of dengue fever are denoted by the parameter of a binomial distribution. The effects of temperature, rainfall, elevation, and larva abundance are investigated through the geostatistics model, with a Bayesian hierarchical method used in the estimation process. Using dengue fever data from eleven locations, this research shows that temperature and rainfall have a significant effect on dengue fever risk in Bone Regency.

  5. A critical examination of the validity of simplified models for radiant heat transfer analysis.

    NASA Technical Reports Server (NTRS)

    Toor, J. S.; Viskanta, R.

    1972-01-01

    The directional effects of the simplified models are examined by comparing experimental data with predictions based on simple and more detailed models for the radiation characteristics of surfaces. Analytical results indicate that the constant-property diffuse and specular models do not yield upper and lower bounds on the local radiant heat flux. In general, the constant-property specular analysis yields higher values of irradiation than the constant-property diffuse analysis. A diffuse surface in the enclosure appears to destroy the effect of specularity of the other surfaces. Semigray and gray analyses predict the irradiation reasonably well, provided that the directional properties and the specularity of the surfaces are taken into account. The uniform and nonuniform radiosity diffuse models are in satisfactory agreement with each other.

  6. A fractal model of effective stress of porous media and the analysis of influence factors

    NASA Astrophysics Data System (ADS)

    Li, Wei; Zhao, Huan; Li, Siqi; Sun, Wenfeng; Wang, Lei; Li, Bing

    2018-03-01

    The basic concept of effective stress describes the characteristics of fluid and solid interaction in porous media. In this paper, based on the theory of fractal geometry, a fractal model was built to analyze the relationship between the microstructure and the effective stress of porous media. From the microscopic point of view, the influence of effective stress on pore structure of porous media was demonstrated. Theoretical analysis and experimental results show that: (i) the fractal model of effective stress can be used to describe the relationship between effective stress and the microstructure of porous media; (ii) a linear increase in the effective stress leads to exponential increases in fractal dimension, porosity and pore number of the porous media, and causes a decreasing trend in the average pore radius.

  7. Understanding identifiability as a crucial step in uncertainty assessment

    NASA Astrophysics Data System (ADS)

    Jakeman, A. J.; Guillaume, J. H. A.; Hill, M. C.; Seo, L.

    2016-12-01

    The topic of identifiability analysis offers concepts and approaches to identify why unique model parameter values cannot be identified, and can suggest possible responses that either increase uniqueness or help to understand the effect of non-uniqueness on predictions. Identifiability analysis typically involves evaluation of the model equations and the parameter estimation process. Non-identifiability can have a number of undesirable effects. In terms of model parameters these effects include: parameters not being estimated uniquely even with ideal data; wildly different values being returned for different initialisations of a parameter optimisation algorithm; and parameters not being physically meaningful in a model attempting to represent a process. This presentation illustrates some of the drastic consequences of ignoring model identifiability analysis. It argues for a more cogent framework and use of identifiability analysis as a way of understanding model limitations and systematically learning about sources of uncertainty and their importance. The presentation specifically distinguishes between five sources of parameter non-uniqueness (and hence uncertainty) within the modelling process, pragmatically capturing key distinctions within existing identifiability literature. It enumerates many of the various approaches discussed in the literature. Admittedly, improving identifiability is often non-trivial. It requires thorough understanding of the cause of non-identifiability, and the time, knowledge and resources to collect or select new data, modify model structures or objective functions, or improve conditioning. But ignoring these problems is not a viable solution. Even simple approaches such as fixing parameter values or naively using a different model structure may have significant impacts on results which are too often overlooked because identifiability analysis is neglected.

  8. Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model: A Web-based program designed to evaluate the cost-effectiveness of disease management programs in heart failure.

    PubMed

    Reed, Shelby D; Neilson, Matthew P; Gardner, Matthew; Li, Yanhong; Briggs, Andrew H; Polsky, Daniel E; Graham, Felicia L; Bowers, Margaret T; Paul, Sara C; Granger, Bradi B; Schulman, Kevin A; Whellan, David J; Riegel, Barbara; Levy, Wayne C

    2015-11-01

    Heart failure disease management programs can influence medical resource use and quality-adjusted survival. Because projecting long-term costs and survival is challenging, a consistent and valid approach to extrapolating short-term outcomes would be valuable. We developed the Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model, a Web-based simulation tool designed to integrate data on demographic, clinical, and laboratory characteristics; use of evidence-based medications; and costs to generate predicted outcomes. Survival projections are based on a modified Seattle Heart Failure Model. Projections of resource use and quality of life are modeled using relationships with time-varying Seattle Heart Failure Model scores. The model can be used to evaluate parallel-group and single-cohort study designs and hypothetical programs. Simulations consist of 10,000 pairs of virtual cohorts used to generate estimates of resource use, costs, survival, and incremental cost-effectiveness ratios from user inputs. The model demonstrated acceptable internal and external validity in replicating resource use, costs, and survival estimates from 3 clinical trials. Simulations to evaluate the cost-effectiveness of heart failure disease management programs across 3 scenarios demonstrate how the model can be used to design a program in which short-term improvements in functioning and use of evidence-based treatments are sufficient to demonstrate good long-term value to the health care system. The Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model provides researchers and providers with a tool for conducting long-term cost-effectiveness analyses of disease management programs in heart failure. Copyright © 2015 Elsevier Inc. All rights reserved.

  9. An elastic-plastic fracture mechanics analysis of weld-toe surface cracks in fillet welded T-butt joint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fu, B.

    1994-12-31

    This paper describes an elastic-plastic fracture mechanics (EPFM) study of shallow weld-toe cracks. Two limiting crack configurations, a plane-strain edge crack and a semi-circular surface crack in a fillet welded T-butt plate joint, were analyzed using the finite element method. Crack depths ranging from 2 to 40% of the plate thickness were considered. The elastic-plastic analysis, assuming a power-law hardening relationship and the Mises yield criterion, was based on incremental plasticity theory. The applied tension and bending loads were monotonically increased to a level causing relatively large-scale yielding at the crack tip. The effects of weld-notch geometry and ductile material modeling on prediction of the fracture mechanics characterizing parameter were assessed. It was found that the weld-notch effect decreases and the effect of material modeling increases as crack depth increases. Material modeling is less important than geometric modeling in the analysis of very shallow cracks, but is more important for relatively deeper cracks, e.g., crack depths greater than 20% of the thickness. The effect of material modeling can be assessed using a simplified structural model. Weld magnification factors derived under linear elastic conditions can be applied to EPFM characterization.

  10. Evaluation of the effectiveness of laser in situ keratomileusis and photorefractive keratectomy for myopia: a meta-analysis.

    PubMed

    Yang, Xin-Jun; Yan, Hong-Tao; Nakahori, Yutaka

    2003-08-01

    To evaluate the effectiveness of laser in situ keratomileusis (LASIK) and photorefractive keratectomy (PRK) for correcting myopia. Study selection, data extraction, and quality assessment were performed independently by two of the authors. Summary odds ratios and 95% confidence intervals were calculated with the DerSimonian and Laird random-effects model and the Mantel-Haenszel fixed-effects model. All calculations were based on intention-to-treat and per-protocol analyses. Five hundred and eighty eyes (476 patients) from 5 randomized controlled trials were included in this study. At ≥6 months of follow-up, by the random-effects model, the pooled odds ratios (OR, for LASIK vs. PRK) of postoperative uncorrected visual acuity (UCVA) of 20/20 or better for all trials were 1.31 (95% CI = 0.77-2.22) by per-protocol analysis and 1.18 (95% CI = 0.74-1.88) by intention-to-treat analysis. For the refractive outcome, the pooled OR of postoperative spherical equivalent refraction within ±0.5 diopter (D) of emmetropia did not show statistical significance: the ORs were 0.75 (95% CI = 0.48-1.18) by per-protocol analysis and 0.70 (95% CI = 0.47-1.04) by intention-to-treat analysis. LASIK and PRK were found to be similarly effective for the correction of myopia from -1.5 to -15.0 D at more than 6 months of follow-up.
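
    The random-effects pooling named above follows the standard DerSimonian and Laird formulas, which can be sketched as follows; the input log odds ratios and standard errors here are hypothetical, not the trial data:

```python
import math

def dersimonian_laird(log_ors, ses):
    """Pool per-study log odds ratios with the DerSimonian-Laird random-effects model."""
    w = [1.0 / se**2 for se in ses]                  # inverse-variance (fixed-effect) weights
    sw = sum(w)
    fixed = sum(wi * y for wi, y in zip(w, log_ors)) / sw
    q = sum(wi * (y - fixed)**2 for wi, y in zip(w, log_ors))  # Cochran's Q heterogeneity statistic
    c = sw - sum(wi**2 for wi in w) / sw
    tau2 = max(0.0, (q - (len(log_ors) - 1)) / c)    # between-study variance estimate
    w_re = [1.0 / (se**2 + tau2) for se in ses]      # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_re, log_ors)) / sum(w_re)
    se_p = math.sqrt(1.0 / sum(w_re))
    return (math.exp(pooled),                        # pooled OR and 95% CI on the OR scale
            math.exp(pooled - 1.96 * se_p),
            math.exp(pooled + 1.96 * se_p))

# Three hypothetical studies: log ORs and their standard errors
or_pooled, or_lo, or_hi = dersimonian_laird([0.10, 0.40, 0.20], [0.20, 0.25, 0.30])
```

    When the estimated between-study variance tau² is zero, the random-effects weights collapse to the fixed-effect (inverse-variance) weights, which is why the two models in the abstract can give similar pooled ORs.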

  11. Linking Air Quality and Watershed Models for Environmental Assessments: Analysis of the Effects of Model-Specific Precipitation Estimates on Calculated Water Flux

    EPA Science Inventory

    Directly linking air quality and watershed models could provide an effective method for estimating spatially-explicit inputs of atmospheric contaminants to watershed biogeochemical models. However, to adequately link air and watershed models for wet deposition estimates, each mod...

  12. An effective convolutional neural network model for Chinese sentiment analysis

    NASA Astrophysics Data System (ADS)

    Zhang, Yu; Chen, Mengdong; Liu, Lianzhong; Wang, Yadong

    2017-06-01

    Microblogging is becoming increasingly popular, and people are accustomed to expressing their opinions on Twitter, Facebook, and Sina Weibo. Sentiment analysis of microblogs has received significant attention both in academia and in industry, but Chinese microblog analysis still requires considerable further work. In recent years CNNs have also been used for NLP tasks and have achieved good results. However, these methods ignore the effective use of the large number of existing sentiment resources. To address this, we propose a Lexicon-based Sentiment Convolutional Neural Network (LSCNN) model focused on sentiment analysis of Weibo, which combines two CNNs, trained individually on sentiment features and on word embeddings, at the fully connected hidden layer. The experimental results show that our model outperforms a CNN model using only word-embedding features on the microblog sentiment analysis task.
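
    As a toy illustration of the core CNN-for-text operations such a model builds on (1-D convolution over token windows followed by max-over-time pooling, with a word-embedding channel and a lexicon-feature channel concatenated at a hidden layer), here is a minimal NumPy sketch; all dimensions and inputs are hypothetical, and this is not the LSCNN implementation:

```python
import numpy as np

rng = np.random.default_rng(1)
seq = rng.normal(size=(7, 50))        # 7 tokens, 50-dim word embeddings (hypothetical)
lex = rng.normal(size=(7, 3))         # 3 lexicon-based sentiment features per token

def conv_maxpool(x, n_filters=4, width=3, rng=rng):
    """1-D convolution over token windows followed by max-over-time pooling."""
    W = rng.normal(size=(n_filters, width * x.shape[1]))              # random filter weights
    windows = np.stack([x[i:i + width].ravel()                        # sliding token windows
                        for i in range(len(x) - width + 1)])
    feats = np.maximum(0, windows @ W.T)                              # ReLU feature maps
    return feats.max(axis=0)                                          # max-over-time pooling

# Concatenating both channels mimics joining the two CNNs at a hidden layer
hidden = np.concatenate([conv_maxpool(seq), conv_maxpool(lex)])
```

    In a trained model the filter weights would be learned and the hidden vector fed to a classifier; the sketch only shows how the two feature channels meet at the joint hidden layer.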

  13. FMCSA safety program effectiveness measurement : carrier intervention effectiveness model, version 1.0 : [analysis brief].

    DOT National Transportation Integrated Search

    2015-01-01

    The Carrier Intervention Effectiveness Model (CIEM) : provides the Federal Motor Carrier Safety : Administration (FMCSA) with a tool for measuring : the safety benefits of carrier interventions conducted : under the Compliance, Safety, Accountability...

  14. [An ADAA model and its analysis method for agronomic traits based on the double-cross mating design].

    PubMed

    Xu, Z C; Zhu, J

    2000-01-01

    Based on the double-cross mating design and the principles of Cockerham's general genetic model, a genetic model with additive, dominance, and epistatic effects (the ADAA model) was proposed for the analysis of agronomic traits. Components of genetic effects were derived for different generations. Monte Carlo simulation was conducted to analyze the ADAA model and its reduced AD model using different generations. It was shown that genetic variance components could be estimated without bias by the MINQUE(1) method and genetic effects could be predicted effectively by the AUP method; at least three generations (parent, single-cross F1, and double-cross F1) were necessary for analyzing the ADAA model, while only two generations (parent and double-cross F1) were sufficient for the reduced AD model. When epistatic effects were taken into account, a new approach for predicting the heterosis of agronomic traits of double-crosses was given on the basis of unbiased prediction of the genotypic merits of parents and their crosses. In addition, genotype × environment interaction effects and interaction heterosis due to G × E interaction were discussed briefly.

  15. Cost-effectiveness of oral agents in relapsing-remitting multiple sclerosis compared to interferon-based therapy in Saudi Arabia.

    PubMed

    Alsaqa'aby, Mai F; Vaidya, Varun; Khreis, Noura; Khairallah, Thamer Al; Al-Jedai, Ahmed H

    2017-01-01

    Promising clinical and humanistic outcomes are associated with the use of new oral agents in the treatment of relapsing-remitting multiple sclerosis (RRMS). This is the first cost-effectiveness study comparing these medications in Saudi Arabia. We aimed to compare the cost-effectiveness of fingolimod, teriflunomide, dimethyl fumarate, and interferon (IFN) β-1a products (Avonex and Rebif) as first-line therapies in the treatment of patients with RRMS from a Saudi payer perspective. Design: cohort simulation (Markov) model. Setting: tertiary care hospital. A hypothetical cohort of 1000 Saudi RRMS patients was assumed to enter a Markov model with a time horizon of 20 years and an annual cycle length. The model was developed based on the Expanded Disability Status Scale (EDSS) to evaluate the cost-effectiveness of the five disease-modifying drugs (DMDs) from a healthcare system perspective. Data on EDSS progression and relapse rates were obtained from the literature; cost data were obtained from King Faisal Specialist Hospital and Research Centre, Riyadh, Saudi Arabia. Results were expressed as incremental cost-effectiveness ratios (ICERs) and net monetary benefits (NMB) in Saudi riyals and converted to equivalent $US. The base-case willingness-to-pay (WTP) threshold was assumed to be $100,000 (SAR 375,000). One-way sensitivity analysis and probabilistic sensitivity analysis were conducted to test the robustness of the model. Outcome measures were ICERs and NMB. The base-case analysis showed Rebif to be the optimal therapy at a WTP threshold of $100,000. Avonex had the lowest ICER, $337,282/QALY, when compared to Rebif. One-way sensitivity analysis demonstrated that the results were sensitive to the utility weights of health states three and four and to the cost of Rebif. None of the DMDs were found to be cost-effective in the treatment of RRMS at a WTP threshold of $100,000 in this analysis; the DMDs would only be cost-effective at a WTP above $300,000. The current analysis did not reflect Saudi population preferences in the valuation of health states and did not consider the societal perspective in terms of cost.
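
    The cohort-simulation logic behind such a Markov cost-effectiveness model can be sketched in a few lines. The states, transition probabilities, costs, and utilities below are entirely hypothetical stand-ins, not the study's EDSS-based inputs:

```python
import numpy as np

def markov_ce(P, costs, utilities, start, cycles=20, disc=0.03):
    """Run a cohort through a Markov model; return discounted total cost and QALYs per patient."""
    dist = np.asarray(start, float)            # state-occupancy distribution
    total_cost = total_qaly = 0.0
    for t in range(cycles):
        d = 1 / (1 + disc) ** t                # annual discount factor
        total_cost += d * dist @ costs
        total_qaly += d * dist @ utilities
        dist = dist @ P                        # advance one annual cycle
    return total_cost, total_qaly

# Three illustrative states: mild, severe, dead (all numbers hypothetical)
P_ref = np.array([[0.85, 0.12, 0.03], [0.00, 0.90, 0.10], [0, 0, 1]])
P_new = np.array([[0.90, 0.08, 0.02], [0.00, 0.92, 0.08], [0, 0, 1]])
start = [1.0, 0.0, 0.0]
util = np.array([0.80, 0.45, 0.0])
c_ref, q_ref = markov_ce(P_ref, np.array([5000, 20000, 0]), util, start)
c_new, q_new = markov_ce(P_new, np.array([30000, 40000, 0]), util, start)
icer = (c_new - c_ref) / (q_new - q_ref)       # incremental cost per QALY gained
```

    Comparing the ICER to a willingness-to-pay threshold (e.g., $100,000/QALY in the study) is what determines whether the new therapy counts as cost-effective.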

  16. MDR-TB patients in KwaZulu-Natal, South Africa: Cost-effectiveness of 5 models of care

    PubMed Central

    Wallengren, Kristina; Reddy, Tarylee; Besada, Donela; Brust, James C. M.; Voce, Anna; Desai, Harsha; Ngozo, Jacqueline; Radebe, Zanele; Master, Iqbal; Padayatchi, Nesri; Daviaud, Emmanuelle

    2018-01-01

    Background: South Africa has a high burden of MDR-TB, and to provide accessible treatment the government has introduced different models of care. We report the most cost-effective model after comparing cost per patient successfully treated across 5 models of care: a centralized hospital, two district hospitals, and community-based care through clinics or mobile injection teams. Methods: In an observational study, five cohorts were followed prospectively. The cost analysis adopted a provider perspective, and the economic cost per patient successfully treated was calculated based on country protocols and the length of treatment per patient per model of care. Logistic regression was used to calculate propensity score weights to compare pairs of treatment groups while adjusting for baseline imbalances between groups. Propensity-score-weighted costs and treatment success rates were used in the ICER analysis. Sensitivity analysis focused on varying treatment success and length of hospitalization within each model. Results: Of 1,038 MDR-TB patients, 75% were HIV-infected and 56% were successfully treated. The cost per successfully treated patient was 3 to 4.5 times lower in the community-based models with no hospitalization. Overall, the mobile model was the most cost-effective. Conclusion: Reducing the length of hospitalization and following community-based models of care improves the affordability of MDR-TB treatment without compromising its effectiveness. PMID:29668748

  17. Martian rampart crater ejecta - Experiments and analysis of melt-water interaction

    NASA Technical Reports Server (NTRS)

    Wohletz, K. H.; Sheridan, M. F.

    1983-01-01

    The possible effects of explosive water vaporization on ejecta emplacement after impact into a wet target are described. A general model is formulated from analysis of Viking imagery of Mars and experimental vapor explosions as well as consideration of fluidized particulate transport and lobate volcanic deposits. The discussed model contends that as target water content increases, the effects of vapor expansion due to impact increasingly modify the ballistic flow field during crater excavation. This modification results in transport by gravity-driven surface flowage, and is similar to that of atmospheric drag effects on ejecta modelled by Schultz and Gault (1979).

  18. Finite element modeling and analysis of tires

    NASA Technical Reports Server (NTRS)

    Noor, A. K.; Andersen, C. M.

    1983-01-01

    Predicting the response of tires under various loading conditions using finite element technology is addressed. Some recent advances in finite element technology with high potential for application to tire modeling problems are reviewed, and the analysis and modeling needs for tires are identified. Topics include: reduction methods for large-scale nonlinear analysis, with particular emphasis on the treatment of combined loads and of displacement-dependent and nonconservative loadings; the development of simple and efficient mixed finite element models for shell analysis, the identification of equivalent mixed and purely displacement models, and the determination of the advantages of using mixed models; and effective computational models for large-rotation nonlinear problems, based on a total Lagrangian description of the deformation.

  19. Child-Centered Play Therapy in the Schools: Review and Meta-Analysis

    ERIC Educational Resources Information Center

    Ray, Dee C.; Armstrong, Stephen A.; Balkin, Richard S.; Jayne, Kimberly M.

    2015-01-01

    The authors conducted a meta-analysis and systematic review that examined 23 studies evaluating the effectiveness of child centered play therapy (CCPT) conducted in elementary schools. Meta-analysis results were explored using a random effects model for mean difference and mean gain effect size estimates. Results revealed statistically significant…

  20. Scientific analysis is essential to assess biofuel policy effects: in response to the paper by Kim and Dale on "Indirect land use change for biofuels: Testing predictions and improving analytical methodologies"

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kline, Keith L; Oladosu, Gbadebo A; Dale, Virginia H

    2011-01-01

    Vigorous debate on the effects of biofuels derives largely from the changes in land use estimated using economic models designed mainly for the analysis of agricultural trade and markets. The models referenced for land-use change (LUC) analysis in the U.S. Environmental Protection Agency Final Rule on the Renewable Fuel Standard include GTAP, FAPRI-CARD, and FASOM. To address bioenergy impacts, these models were expanded and modified to facilitate simulations of hypothesized LUC. However, even when models use similar basic assumptions and data, the range of LUC results can vary by ten-fold or more. While the market dynamics simulated in these models include processes that are important in estimating the effects of biofuel policies, the models have not been validated for estimating land-use changes and employ crucial assumptions and simplifications that contradict empirical evidence.

  1. Seven Modeling Perspectives on Teaching and Learning: Some Interrelations and Cognitive Effects

    ERIC Educational Resources Information Center

    Easley, J. A., Jr.

    1977-01-01

    The categories of models associated with the seven perspectives are designated as combinatorial models, sampling models, cybernetic models, game models, critical thinking models, ordinary language analysis models, and dynamic structural models. (DAG)

  2. Building a generalized distributed system model

    NASA Technical Reports Server (NTRS)

    Mukkamala, Ravi

    1991-01-01

    A number of topics related to building a generalized distributed system model are discussed. The effects of distributed database modeling on evaluation of transaction rollbacks, the measurement of effects of distributed database models on transaction availability measures, and a performance analysis of static locking in replicated distributed database systems are covered.

  3. Comparison of methods for the analysis of relatively simple mediation models.

    PubMed

    Rijnhart, Judith J M; Twisk, Jos W R; Chinapaw, Mai J M; de Boer, Michiel R; Heymans, Martijn W

    2017-09-01

    Statistical mediation analysis is an often-used method in trials to unravel the pathways underlying the effect of an intervention on a particular outcome variable. Over the years, several methods have been proposed, such as ordinary least squares (OLS) regression, structural equation modeling (SEM), and the potential outcomes framework. Most applied researchers do not know that these methods are mathematically equivalent when applied to mediation models with a continuous mediator and outcome variable. Therefore, the aim of this paper was to demonstrate the similarities between OLS regression, SEM, and the potential outcomes framework in three mediation models: 1) a crude model, 2) a confounder-adjusted model, and 3) a model with an interaction term for exposure-mediator interaction. We performed a secondary data analysis of a randomized controlled trial that included 546 schoolchildren; in our data example, the mediator and outcome variable were both continuous. We compared the estimates of the total, direct, and indirect effects, the proportion mediated, and 95% confidence intervals (CIs) for the indirect effect across OLS regression, SEM, and the potential outcomes framework. The three approaches yielded the same effect estimates in the crude mediation model, the confounder-adjusted mediation model, and the mediation model with an interaction term for exposure-mediator interaction. Since OLS regression, SEM, and the potential outcomes framework yield the same results in these three mediation models with a continuous mediator and outcome variable, researchers can continue using the method that is most convenient to them.
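
    The product-of-coefficients logic of OLS-based mediation, and the exact decomposition total effect = direct effect + indirect effect that holds with a continuous mediator and outcome, can be illustrated on simulated data. All values below are simulated, not the trial's; only the sample size of 546 echoes the study:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 546
x = rng.normal(size=n)                       # exposure (e.g., intervention score)
m = 0.5 * x + rng.normal(size=n)             # mediator, true a-path = 0.5
y = 0.4 * m + 0.3 * x + rng.normal(size=n)   # outcome, true b-path = 0.4, direct effect = 0.3

def ols(X, y):
    """Least-squares coefficients, intercept column prepended."""
    X = np.column_stack([np.ones(len(X)), X])
    return np.linalg.lstsq(X, y, rcond=None)[0]

a = ols(x.reshape(-1, 1), m)[1]              # exposure -> mediator
bc = ols(np.column_stack([m, x]), y)         # mediator + exposure -> outcome
b, c_prime = bc[1], bc[2]                    # b-path and direct effect c'
c_total = ols(x.reshape(-1, 1), y)[1]        # total effect c
indirect = a * b                             # product-of-coefficients indirect effect
prop_mediated = indirect / c_total
```

    With a single continuous mediator and intercepts in every regression, c = c' + a*b holds exactly in-sample, which is one face of the equivalence the paper demonstrates across frameworks.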

  4. A Multilevel Analysis of Phase II of the Louisiana School Effectiveness Study.

    ERIC Educational Resources Information Center

    Kennedy, Eugene; And Others

    This paper presents findings of a study that used conventional modeling strategies (student- and school-level) and a new multilevel modeling strategy, Hierarchical Linear Modeling, to investigate school effects on student-achievement outcomes for data collected as part of Phase 2 of the Louisiana School Effectiveness Study. The purpose was to…

  5. Conclusion of LOD-score analysis for family data generated under two-locus models.

    PubMed Central

    Dizier, M. H.; Babron, M. C.; Clerget-Darpoux, F.

    1996-01-01

    The power to detect linkage by the LOD-score method is investigated here for diseases that depend on the effects of two genes. The classical strategy is, first, to detect a major-gene (MG) effect by segregation analysis and, second, to search for linkage with genetic markers by the LOD-score method using the MG parameters. We have already shown that segregation analysis can provide evidence for an MG effect under many two-locus models, with the estimates of the MG parameters being very different from those of the two genes involved in the disease. We show here that use of these MG parameter estimates in the LOD-score analysis may lead to a failure to detect linkage for some two-locus models. For these models, use of the sib-pair method gives a non-negligible increase in power to detect linkage. The linkage-homogeneity test among subsamples differing in familial disease distribution provides evidence of parameter misspecification when the MG parameters are used. Moreover, for most of the models, use of the MG parameters in LOD-score analysis leads to a large bias in the estimation of the recombination fraction and sometimes also to a rejection of linkage at the true recombination fraction. A final important point is that strong evidence of an MG effect, obtained by segregation analysis, does not necessarily imply that linkage will be detected for at least one of the two genes, even with the true parameters and with a close informative marker. PMID:8651311
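
    The LOD score itself is a standard log10 likelihood-ratio statistic. For the simplest case of r recombinants among n phase-known meioses it reduces to the sketch below; the counts are hypothetical, and this toy ignores the trait-model (MG parameter) complications the paper is about:

```python
import math

def lod(r, n, theta):
    """LOD score for r recombinants out of n phase-known meioses at recombination fraction theta."""
    like = theta**r * (1 - theta)**(n - r)   # binomial likelihood (up to a constant factor)
    return math.log10(like / 0.5**n)         # ratio against no linkage (theta = 0.5)

# The LOD curve peaks at the maximum-likelihood estimate theta = r/n
scores = {t: lod(2, 20, t) for t in (0.05, 0.1, 0.2, 0.3)}
```

    Misspecified trait parameters, as studied in the paper, distort the likelihood and hence shift or flatten this curve, which is how linkage can be missed or the recombination fraction biased.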

  6. A Meta-Analysis of Dunn and Dunn Model Correlational Research with Adult Populations

    ERIC Educational Resources Information Center

    Mangino, Christine

    2004-01-01

    The purpose of this investigation was to conduct a quantitative synthesis of correlational research that focused on the Dunn and Dunn Learning-Style Model and was concerned with adult populations. A total of 8,661 participants from the 47 original investigations provided 386 individual effect sizes for this meta-analysis. The mean effect size was…

  7. COMBATXXI: Usage and Analysis at TACOM

    DTIC Science & Technology

    2011-06-20

    Briefing slides covering the organization, its equipment, and its customers, followed by an overview of the Combined Arms Analysis Tool for the 21st Century (COMBATXXI), a model developed jointly by TRAC-White Sands Missile Range (WSMR) and the Marine Corps Combat Development Command.

  8. Fully Bayesian Estimation of Data from Single Case Designs

    ERIC Educational Resources Information Center

    Rindskopf, David

    2013-01-01

    Single case designs (SCDs) generally consist of a small number of short time series in two or more phases. The analysis of SCDs statistically fits in the framework of a multilevel model, or hierarchical model. The usual analysis does not take into account the uncertainty in the estimation of the random effects. This not only has an effect on the…

  9. Chromaticity effects on head-tail instabilities for broadband impedance using two particle model, Vlasov analysis, and simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chin, Yong Ho; Chao, Alexander Wu; Blaskiewicz, Michael M.

    Effects of the chromaticity on head-tail instabilities for broadband impedances are comprehensively studied, using the two particle model, the Vlasov analysis and computer simulations. We show both in the two particle model and the Vlasov analysis with the trapezoidal (semiconstant) wake model that we can derive universal contour plots for the growth factor as a function of the two dimensionless parameters: the wakefield strength, Υ, and the difference of the betatron phase advances between the head and the tail, χ. They reveal how the chromaticity affects strong head-tail instabilities and excites head-tail instabilities. We also apply the LEP (Large Electron-Positron Collider) broadband resonator model to the Vlasov approach and find that the results are in very good agreement with those of the trapezoidal wake model. The theoretical findings are also reinforced by the simulation results. In conclusion, the trapezoidal wake model turns out to be a very useful tool since it significantly simplifies the time domain analysis and provides well-behaved impedance at the same time.

  10. Chromaticity effects on head-tail instabilities for broadband impedance using two particle model, Vlasov analysis, and simulations

    DOE PAGES

    Chin, Yong Ho; Chao, Alexander Wu; Blaskiewicz, Michael M.; ...

    2017-07-28

    Effects of the chromaticity on head-tail instabilities for broadband impedances are comprehensively studied, using the two particle model, the Vlasov analysis and computer simulations. We show both in the two particle model and the Vlasov analysis with the trapezoidal (semiconstant) wake model that we can derive universal contour plots for the growth factor as a function of the two dimensionless parameters: the wakefield strength, Υ, and the difference of the betatron phase advances between the head and the tail, χ. They reveal how the chromaticity affects strong head-tail instabilities and excites head-tail instabilities. We also apply the LEP (Large Electron-Positron Collider) broadband resonator model to the Vlasov approach and find that the results are in very good agreement with those of the trapezoidal wake model. The theoretical findings are also reinforced by the simulation results. In conclusion, the trapezoidal wake model turns out to be a very useful tool since it significantly simplifies the time domain analysis and provides well-behaved impedance at the same time.

  11. FMCSA safety program effectiveness measurement: Carrier Intervention Effectiveness Model, version 1.1, analysis brief.

    DOT National Transportation Integrated Search

    2016-11-01

    The Carrier Intervention Effectiveness Model (CIEM) provides the Federal Motor Carrier Safety Administration (FMCSA) with a tool for measuring the safety benefits of carrier interventions conducted under the Compliance, Safety, Accountability (CSA) e...

  12. An uncertainty analysis of wildfire modeling [Chapter 13]

    Treesearch

    Karin Riley; Matthew Thompson

    2017-01-01

    Before fire models can be understood, evaluated, and effectively applied to support decision making, model-based uncertainties must be analyzed. In this chapter, we identify and classify sources of uncertainty using an established analytical framework, and summarize results graphically in an uncertainty matrix. Our analysis facilitates characterization of the...

  13. A Noncentral "t" Regression Model for Meta-Analysis

    ERIC Educational Resources Information Center

    Camilli, Gregory; de la Torre, Jimmy; Chiu, Chia-Yi

    2010-01-01

    In this article, three multilevel models for meta-analysis are examined. Hedges and Olkin suggested that effect sizes follow a noncentral "t" distribution and proposed several approximate methods. Raudenbush and Bryk further refined this model; however, this procedure is based on a normal approximation. In the current research literature, this…

  14. Effect Sizes for Growth-Modeling Analysis for Controlled Clinical Trials in the Same Metric as for Classical Analysis

    PubMed Central

    Feingold, Alan

    2009-01-01

    The use of growth-modeling analysis (GMA)--including Hierarchical Linear Models, Latent Growth Models, and General Estimating Equations--to evaluate interventions in psychology, psychiatry, and prevention science has grown rapidly over the last decade. However, an effect size associated with the difference between the trajectories of the intervention and control groups that captures the treatment effect is rarely reported. This article first reviews two classes of formulas for effect sizes associated with classical repeated-measures designs that use the standard deviation of either change scores or raw scores for the denominator. It then broadens the scope to subsume GMA, and demonstrates that the independent groups, within-subjects, pretest-posttest control-group, and GMA designs all estimate the same effect size when the standard deviation of raw scores is uniformly used. Finally, it is shown that the correct effect size for treatment efficacy in GMA--the difference between the estimated means of the two groups at end of study (determined from the coefficient for the slope difference and length of study) divided by the baseline standard deviation--is not reported in clinical trials. PMID:19271847
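
The effect size the article recommends can be computed directly from GMA output; a minimal sketch with hypothetical trial numbers:

```python
def gma_effect_size(slope_diff, study_length, baseline_sd):
    """Growth-model effect size in the classical metric: the
    model-estimated between-group difference at end of study
    (slope difference per unit time x study duration), scaled
    by the baseline (raw-score) standard deviation."""
    return slope_diff * study_length / baseline_sd

# Hypothetical trial: trajectories diverge by 0.5 points/month
# over a 6-month study; baseline SD of 10 points.
d = gma_effect_size(0.5, 6, 10.0)
print(d)  # 0.3
```

Using the raw-score baseline SD (rather than a change-score SD) is what puts this value in the same metric as classical between-group designs.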

  15. Displacement-based back-analysis of the model parameters of the Nuozhadu high earth-rockfill dam.

    PubMed

    Wu, Yongkang; Yuan, Huina; Zhang, Bingyin; Zhang, Zongliang; Yu, Yuzhen

    2014-01-01

    The parameters of the constitutive model, the creep model, and the wetting model of materials of the Nuozhadu high earth-rockfill dam were back-analyzed together based on field monitoring displacement data by employing an intelligent back-analysis method. In this method, an artificial neural network is used as a substitute for time-consuming finite element analysis, and an evolutionary algorithm is applied for both network training and parameter optimization. To avoid simultaneous back-analysis of many parameters, the model parameters of the three main dam materials are decoupled and back-analyzed separately in a particular order. Displacement back-analyses were performed at different stages of the construction period, with and without considering the creep and wetting deformations. Good agreement between the numerical results and the monitoring data was obtained for most observation points, which implies that the back-analysis method and decoupling method are effective for solving complex problems with multiple models and parameters. The comparison of calculation results based on different sets of back-analyzed model parameters indicates the necessity of taking the effects of creep and wetting into consideration in the numerical analyses of high earth-rockfill dams. With the resulting model parameters, the stress and deformation distributions at completion are predicted and analyzed.

  16. Many multicenter trials had few events per center, requiring analysis via random-effects models or GEEs.

    PubMed

    Kahan, Brennan C; Harhay, Michael O

    2015-12-01

    Adjustment for center in multicenter trials is recommended when there are between-center differences or when randomization has been stratified by center. However, common methods of analysis (such as fixed-effects, Mantel-Haenszel, or stratified Cox models) often require a large number of patients or events per center to perform well. We reviewed 206 multicenter randomized trials published in four general medical journals to assess the average number of patients and events per center and determine whether appropriate methods of analysis were used in trials with few patients or events per center. The median number of events per center/treatment arm combination for trials using a binary or survival outcome was 3 (interquartile range, 1-10). Sixteen percent of trials had less than 1 event per center/treatment combination, 50% fewer than 3, and 63% fewer than 5. Of the trials which adjusted for center using a method of analysis which requires a large number of events per center, 6% had less than 1 event per center-treatment combination, 25% fewer than 3, and 50% fewer than 5. Methods of analysis that allow for few events per center, such as random-effects models or generalized estimating equations (GEEs), were rarely used. Many multicenter trials contain few events per center. Adjustment for center using random-effects models or GEE with model-based (non-robust) standard errors may be beneficial in these scenarios. Copyright © 2015 Elsevier Inc. All rights reserved.
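
The review's recommendation can be turned into a crude decision helper: tabulate events per center/arm cell and, when the median is small, prefer random-effects models or GEE with model-based standard errors. The counts and threshold below are illustrative, not from the review:

```python
from statistics import median

def events_per_cell(events):
    """events: dict mapping (center, arm) -> number of outcome events.
    Returns the median events per center/treatment-arm combination."""
    return median(events.values())

def suggest_method(events, threshold=5):
    """Rule of thumb from the review: with few events per cell,
    fixed-effects, Mantel-Haenszel, or stratified Cox models may
    perform poorly; random-effects models or GEE with model-based
    (non-robust) standard errors are better suited."""
    if events_per_cell(events) < threshold:
        return "random-effects model or GEE (model-based SEs)"
    return "fixed-effects / stratified adjustment is reasonable"

# Hypothetical multicenter trial: 4 centers x 2 arms.
counts = {("A", 0): 1, ("A", 1): 2, ("B", 0): 0, ("B", 1): 3,
          ("C", 0): 2, ("C", 1): 1, ("D", 0): 4, ("D", 1): 2}
print(suggest_method(counts))
```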

  17. Extension of the Haseman-Elston regression model to longitudinal data.

    PubMed

    Won, Sungho; Elston, Robert C; Park, Taesung

    2006-01-01

    We propose an extension to longitudinal data of the Haseman and Elston regression method for linkage analysis. The proposed model is a mixed model having several random effects. As response variable, we investigate the sibship sample mean corrected cross-product (smHE) and the BLUP-mean corrected cross-product (pmHE), comparing them with the original squared difference (oHE), the overall mean corrected cross-product (rHE), and the weighted average of the squared difference and the squared mean-corrected sum (wHE). The proposed model allows for the correlation structure of longitudinal data. Also, the model can test for gene × time interaction to discover genetic variation over time. The model was applied in an analysis of the Genetic Analysis Workshop 13 (GAW13) simulated dataset for a quantitative trait simulating systolic blood pressure. Independence models did not preserve the test sizes, while the mixed models with both family and sib-pair random effects tended to preserve size well. Copyright 2006 S. Karger AG, Basel.
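
The original squared-difference regression (oHE) that this model extends can be sketched on simulated sib-pair data; all numbers below are illustrative, not GAW13 data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate 500 sib pairs: pi_hat is the estimated proportion of alleles
# shared IBD at the marker; under linkage, the squared trait difference
# shrinks as IBD sharing increases (stylized linear relationship).
n = 500
pi_hat = rng.choice([0.0, 0.5, 1.0], size=n, p=[0.25, 0.5, 0.25])
sq_diff = 2.0 - 1.2 * pi_hat + rng.normal(0.0, 0.3, size=n)

# Original Haseman-Elston (oHE): regress the squared sib-pair trait
# difference on IBD sharing; a negative slope suggests linkage.
X = np.column_stack([np.ones(n), pi_hat])
beta, *_ = np.linalg.lstsq(X, sq_diff, rcond=None)
print(beta[1])  # negative slope -> evidence of linkage
```

The longitudinal extension replaces this single ordinary regression with a mixed model carrying family and sib-pair random effects.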

  18. Analytical Modelling of the Effects of Different Gas Turbine Cooling Techniques on Engine Performance

    NASA Astrophysics Data System (ADS)

    Uysal, Selcuk Can

    In this research, MATLAB Simulink® was used to develop a cooled engine model for industrial gas turbines and aero-engines. The model consists of uncooled on-design, mean-line turbomachinery design and a cooled off-design analysis in order to evaluate the engine performance parameters by using operating conditions, polytropic efficiencies, material information and cooling system details. The cooling analysis algorithm involves a second-law analysis to calculate losses from the cooling technique applied. The model is used in a sensitivity analysis that evaluates the impacts of variations in metal Biot number, thermal barrier coating Biot number, film cooling effectiveness, internal cooling effectiveness and maximum allowable blade temperature on main engine performance parameters of aero and industrial gas turbine engines. The model is subsequently used to analyze the relative performance impact of employing Anti-Vortex Film Cooling holes (AVH) by means of data obtained for these holes by Detached Eddy Simulation-CFD techniques that are valid for engine-like turbulence intensity conditions. Cooled blade configurations with AVH and other different external cooling techniques were used in a performance comparison study. (Abstract shortened by ProQuest.)

  19. Global sensitivity analysis of a local water balance model predicting evaporation, water yield and drought

    NASA Astrophysics Data System (ADS)

    Speich, Matthias; Zappa, Massimiliano; Lischke, Heike

    2017-04-01

    Evaporation and transpiration affect both catchment water yield and the growing conditions for vegetation. They are driven by climate, but also depend on vegetation, soil and land surface properties. In hydrological and land surface models, these properties may be included as constant parameters, or as state variables. Often, little is known about the effect of these variables on model outputs. In the present study, the effect of surface properties on evaporation was assessed in a global sensitivity analysis. To this effect, we developed a simple local water balance model combining state-of-the-art process formulations for evaporation, transpiration and soil water balance. The model is vertically one-dimensional, and the relative simplicity of its process formulations makes it suitable for integration in a spatially distributed model at regional scale. The main model outputs are annual total evaporation (TE, i.e. the sum of transpiration, soil evaporation and interception), and a drought index (DI), which is based on the ratio of actual and potential transpiration. This index represents the growing conditions for forest trees. The sensitivity analysis was conducted in two steps. First, a screening analysis was applied to identify unimportant parameters out of an initial set of 19 parameters. In a second step, a statistical meta-model was applied to a sample of 800 model runs, in which the values of the important parameters were varied. Parameter effect and interactions were analyzed with effects plots. The model was driven with forcing data from ten meteorological stations in Switzerland, representing a wide range of precipitation regimes across a strong temperature gradient. Of the 19 original parameters, eight were identified as important in the screening analysis. Both steps highlighted the importance of Plant Available Water Capacity (AWC) and Leaf Area Index (LAI). However, their effect varies greatly across stations. 
For example, while a transition from a sparse to a closed forest canopy has almost no effect on annual TE at warm and dry sites, it increases TE by up to 100 mm/year at cold-humid and warm-humid sites. Further parameters of importance describe infiltration, as well as canopy resistance and its response to environmental variables. This study offers insights for future development of hydrological and ecohydrological models. First, it shows that although local water balance is primarily controlled by climate, the vegetation and soil parameters may have a large impact on the outputs. Second, it indicates that modeling studies should prioritize a realistic parameterization of LAI and AWC, while other parameters may be set to fixed values. Third, it illustrates to which extent parameter effect and interactions depend on local climate.
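
The screening step can be illustrated with Morris-style one-at-a-time elementary effects; the toy model below stands in for the water-balance model and is not the authors' code:

```python
import numpy as np

rng = np.random.default_rng(42)

def model(x):
    """Toy stand-in for the water-balance model: the output depends
    strongly on x[0] and x[1] (think AWC and LAI), weakly on x[2]."""
    return 3.0 * x[0] + 2.0 * x[1] ** 2 + 0.01 * x[2]

def elementary_effects(model, n_params, n_traj=50, delta=0.1):
    """Morris-style screening: at random base points in [0,1]^k,
    perturb one parameter at a time and record |dy/dx|. Returns
    mu_star, the mean absolute elementary effect per parameter."""
    effects = np.zeros((n_traj, n_params))
    for t in range(n_traj):
        base = rng.uniform(0.0, 1.0 - delta, size=n_params)
        y0 = model(base)
        for j in range(n_params):
            pert = base.copy()
            pert[j] += delta
            effects[t, j] = abs(model(pert) - y0) / delta
    return effects.mean(axis=0)

mu_star = elementary_effects(model, 3)
print(mu_star)  # x0 and x1 screen as important, x2 as negligible
```

Parameters with negligible mu_star are fixed at nominal values, and only the important ones go on to the meta-model step, mirroring the two-stage design described above.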

  20. Load Model Verification, Validation and Calibration Framework by Statistical Analysis on Field Data

    NASA Astrophysics Data System (ADS)

    Jiao, Xiangqing; Liao, Yuan; Nguyen, Thai

    2017-11-01

    Accurate load models are critical for power system analysis and operation. A large amount of research work has been done on load modeling. Most of the existing research focuses on developing load models, while little has been done on developing formal load model verification and validation (V&V) methodologies or procedures. Most of the existing load model validation is based on qualitative rather than quantitative analysis. In addition, not all aspects of model V&V problem have been addressed by the existing approaches. To complement the existing methods, this paper proposes a novel load model verification and validation framework that can systematically and more comprehensively examine load model's effectiveness and accuracy. Statistical analysis, instead of visual check, quantifies the load model's accuracy, and provides a confidence level of the developed load model for model users. The analysis results can also be used to calibrate load models. The proposed framework can be used as a guidance to systematically examine load models for utility engineers and researchers. The proposed method is demonstrated through analysis of field measurements collected from a utility system.

  1. Assessments of higher-order ionospheric effects on GPS coordinate time series: A case study of CMONOC with longer time series

    NASA Astrophysics Data System (ADS)

    Jiang, Weiping; Deng, Liansheng; Zhou, Xiaohui; Ma, Yifang

    2014-05-01

    Higher-order ionospheric (HIO) corrections are proposed to become a standard part of precise GPS data analysis. In this study, we investigate the impact of HIO corrections on coordinate time series by re-processing GPS data from the Crustal Movement Observation Network of China (CMONOC). Nearly 13 years of data are used in three processing runs: (a) run NO, without HIO corrections; (b) run IG, in which both second- and third-order corrections are modeled with the magnetic field given by the International Geomagnetic Reference Field 11 (IGRF11); and (c) run ID, the same as IG but using a dipole magnetic model. Both spectral analysis and noise analysis are adopted to investigate these effects. Results show that, for CMONOC stations, HIO corrections bring an overall improvement. After the corrections are applied, the noise amplitudes decrease, with the white noise amplitudes showing the more remarkable variation. Low-latitude sites are more affected, and the impact varies across coordinate components. An analysis of stacked periodograms shows a good match between the seasonal amplitudes and the HIO corrections, indicating that the observed variations in the coordinate time series are related to HIO effects. HIO delays partially explain the seasonal amplitudes in the coordinate time series, especially for the U component. The annual amplitudes of all components decrease for over one-half of the selected CMONOC sites, and the semi-annual amplitudes are affected even more strongly. However, when the dipole model is used, the results are not as good as with the IGRF model: dipole-based corrections increase the noise amplitudes and can generate false periodic signals, introducing larger residuals and noise rather than effective improvements.

  2. Model Uncertainties for Valencia RPA Effect for MINERvA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gran, Richard

    2017-05-08

    This technical note describes the application of the Valencia RPA multi-nucleon effect and its uncertainty to QE reactions from the GENIE neutrino event generator. The analysis of MINERvA neutrino data in Rodrigues et al. PRL 116 071802 (2016) makes clear the need for an RPA suppression, especially at very low momentum and energy transfer. That published analysis does not constrain the magnitude of the effect; it only tests models with and without the effect against the data. Other MINERvA analyses need an expression of the model uncertainty in the RPA effect. A well-described uncertainty can be used for systematics for unfolding, for model errors in the analysis of non-QE samples, and as input for fitting exercises for model testing or constraining backgrounds. This prescription takes uncertainties on the parameters in the Valencia RPA model and adds a (not-as-tight) constraint from muon capture data. For MINERvA we apply it as a 2D ($$q_0$$, $$q_3$$) weight to GENIE events, in lieu of generating full beyond-Fermi-gas quasielastic events. Because it is a weight, it can be applied to the generated and fully Geant4-simulated events used in analysis without a special GENIE sample. For some limited uses, it could be cast as a 1D $Q^2$ weight without much trouble. This procedure is a suitable starting point for NOvA and DUNE, where the energy dependence is modest, but probably not adequate for T2K or MicroBooNE.

  3. Meta-analysis of diagnostic accuracy studies accounting for disease prevalence: alternative parameterizations and model selection.

    PubMed

    Chu, Haitao; Nie, Lei; Cole, Stephen R; Poole, Charles

    2009-08-15

    In a meta-analysis of diagnostic accuracy studies, the sensitivities and specificities of a diagnostic test may depend on the disease prevalence since the severity and definition of disease may differ from study to study due to the design and the population considered. In this paper, we extend the bivariate nonlinear random effects model on sensitivities and specificities to jointly model the disease prevalence, sensitivities and specificities using trivariate nonlinear random-effects models. Furthermore, as an alternative parameterization, we also propose jointly modeling the test prevalence and the predictive values, which reflect the clinical utility of a diagnostic test. These models allow investigators to study the complex relationship among the disease prevalence, sensitivities and specificities; or among test prevalence and the predictive values, which can reveal hidden information about test performance. We illustrate the proposed two approaches by reanalyzing the data from a meta-analysis of radiological evaluation of lymph node metastases in patients with cervical cancer and a simulation study. The latter illustrates the importance of carefully choosing an appropriate normality assumption for the disease prevalence, sensitivities and specificities, or the test prevalence and the predictive values. In practice, it is recommended to use model selection techniques to identify a best-fitting model for making statistical inference. In summary, the proposed trivariate random effects models are novel and can be very useful in practice for meta-analysis of diagnostic accuracy studies. Copyright 2009 John Wiley & Sons, Ltd.

  4. Effective model development of internal auditors in the village financial institution

    NASA Astrophysics Data System (ADS)

    Arsana, I. M. M.; Sugiarta, I. N.

    2018-01-01

    Designing an effective audit system is complex and challenging, and a focus is needed on examining how internal audit drives improvement in three core performance dimensions: ethicality, efficiency, and effectiveness. The research problem is how to design a model, and supporting instruments, for effective supervision of the Village Credit Institution; the research objective is to produce that design. Data were collected through interviews, observation, and questionnaires. Qualitative data were converted into quantitative form as scale values, with each variable classified into five levels on a Likert scale. The data were analyzed descriptively to establish the level of supervision, and Structural Equation Modeling (SEM) was used to identify internal and external factors, so that the supervision model could be designed from the descriptive analysis. The result of the research is a design for a model, and supporting instruments, for effective supervision of the Village Credit Institution. The model is supported by three subsystems: an institutional subsystem, which yields a supervisory body for the Village Credit Institution; a standardization and working-procedure subsystem, which yields standard operating procedures for its supervisors; and an education and training subsystem, which yields professional supervisors.

  5. Effects of additional data on Bayesian clustering.

    PubMed

    Yamazaki, Keisuke

    2017-10-01

    Hierarchical probabilistic models, such as mixture models, are used for cluster analysis. These models have two types of variables: observable and latent. In cluster analysis, the latent variable is estimated, and it is expected that additional information will improve the accuracy of the estimation of the latent variable. Many proposed learning methods are able to use additional data; these include semi-supervised learning and transfer learning. However, from a statistical point of view, a complex probabilistic model that encompasses both the initial and additional data might be less accurate due to having a higher-dimensional parameter. The present paper presents a theoretical analysis of the accuracy of such a model and clarifies which factor has the greatest effect on its accuracy, the advantages of obtaining additional data, and the disadvantages of increasing the complexity. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Space station interior noise analysis program

    NASA Technical Reports Server (NTRS)

    Stusnick, E.; Burn, M.

    1987-01-01

    Documentation is provided for a microcomputer program which was developed to evaluate the effect of the vibroacoustic environment on speech communication inside a space station. The program, entitled Space Station Interior Noise Analysis Program (SSINAP), combines a Statistical Energy Analysis (SEA) prediction of sound and vibration levels within the space station with a speech intelligibility model based on the Modulation Transfer Function and the Speech Transmission Index (MTF/STI). The SEA model provides an effective analysis tool for predicting the acoustic environment based on proposed space station design. The MTF/STI model provides a method for evaluating speech communication in the relatively reverberant and potentially noisy environments that are likely to occur in space stations. The combination of these two models provides a powerful analysis tool for optimizing the acoustic design of space stations from the point of view of speech communications. The mathematical algorithms used in SSINAP are presented to implement the SEA and MTF/STI models. An appendix provides an explanation of the operation of the program along with details of the program structure and code.

  7. A complete graphical criterion for the adjustment formula in mediation analysis.

    PubMed

    Shpitser, Ilya; VanderWeele, Tyler J

    2011-03-04

    Various assumptions have been used in the literature to identify natural direct and indirect effects in mediation analysis. These effects are of interest because they allow for effect decomposition of a total effect into a direct and indirect effect even in the presence of interactions or non-linear models. In this paper, we consider the relation and interpretation of various identification assumptions in terms of causal diagrams interpreted as a set of non-parametric structural equations. We show that for such causal diagrams, two sets of assumptions for identification that have been described in the literature are in fact equivalent in the sense that if either set of assumptions holds for all models inducing a particular causal diagram, then the other set of assumptions will also hold for all models inducing that diagram. We moreover build on prior work concerning a complete graphical identification criterion for covariate adjustment for total effects to provide a complete graphical criterion for using covariate adjustment to identify natural direct and indirect effects. Finally, we show that this criterion is equivalent to the two sets of independence assumptions used previously for mediation analysis.
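
In the simplest linear structural-equation case with no exposure-mediator interaction, the decomposition reduces to the product-of-coefficients rule; a numpy sketch on simulated data (all coefficients hypothetical):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000

# Linear SEM with no unmeasured confounding: A -> M -> Y plus a
# direct A -> Y path.
a = rng.binomial(1, 0.5, n).astype(float)
m = 0.8 * a + rng.normal(0, 1, n)            # mediator model
y = 0.5 * a + 1.5 * m + rng.normal(0, 1, n)  # outcome model

def ols(X, y):
    """Least-squares fit with an intercept; returns coefficients."""
    return np.linalg.lstsq(np.column_stack([np.ones(len(y)), X]), y,
                           rcond=None)[0]

alpha = ols(a, m)[1]                         # effect of A on M
_, theta_a, theta_m = ols(np.column_stack([a, m]), y)

nde = theta_a          # natural direct effect
nie = alpha * theta_m  # natural indirect effect (product of coefficients)
print(round(nde, 2), round(nie, 2), round(nde + nie, 2))  # ~0.5, ~1.2, ~1.7
```

The graphical criterion in the paper governs when covariate adjustment licenses exactly this kind of decomposition; with interactions or non-linear models the counterfactual definitions still apply, but the product rule does not.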

  8. A Comparison of the Effectiveness of Two Design Methodologies in a Secondary School Setting.

    ERIC Educational Resources Information Center

    Cannizzaro, Brenton; Boughton, Doug

    1998-01-01

    Examines the effectiveness of the analysis-synthesis and generator-conjecture-analysis models of design education. Concludes that the generator-conjecture-analysis design method produced student design product of a slightly higher standard than the analysis-synthesis design method. Discusses the findings in more detail and considers implications.…

  9. A quantitative analysis of the F18 flight control system

    NASA Technical Reports Server (NTRS)

    Doyle, Stacy A.; Dugan, Joanne B.; Patterson-Hine, Ann

    1993-01-01

    This paper presents an informal quantitative analysis of the F18 flight control system (FCS). The analysis technique combines a coverage model with a fault tree model. To demonstrate the method's extensive capabilities, we replace the fault tree with a digraph model of the F18 FCS, the only model available to us. The substitution shows that while digraphs have primarily been used for qualitative analysis, they can also be used for quantitative analysis. Based on our assumptions and the particular failure rates assigned to the F18 FCS components, we show that coverage does have a significant effect on the system's reliability and thus it is important to include coverage in the reliability analysis.
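
The effect of coverage on reliability can be illustrated with a minimal duplex-system Markov model (a generic sketch, not the F18 digraph model itself); the rates below are hypothetical:

```python
def duplex_reliability(lam, c, t_end, dt=0.01):
    """Reliability of a duplex (two-component parallel) system with
    coverage factor c: the first of two failures (total rate 2*lam)
    is handled successfully with probability c, while an uncovered
    failure brings the whole system down. State probabilities
    (both up, one up, failed) integrated with forward Euler."""
    p2, p1, pf = 1.0, 0.0, 0.0
    for _ in range(int(t_end / dt)):
        d2 = -2.0 * lam * p2
        d1 = 2.0 * lam * c * p2 - lam * p1
        df = 2.0 * lam * (1.0 - c) * p2 + lam * p1
        p2, p1, pf = p2 + d2 * dt, p1 + d1 * dt, pf + df * dt
    return p2 + p1

# Hypothetical failure rate of 1e-3 per hour over a 1000-hour mission.
perfect = duplex_reliability(1e-3, 1.0, 1000.0)
partial = duplex_reliability(1e-3, 0.95, 1000.0)
print(round(perfect, 4), round(partial, 4))
```

Even a 5% coverage shortfall visibly lowers mission reliability, which is why including coverage in the analysis matters.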

  10. COBRA ATD minefield detection model initial performance analysis

    NASA Astrophysics Data System (ADS)

    Holmes, V. Todd; Kenton, Arthur C.; Hilton, Russell J.; Witherspoon, Ned H.; Holloway, John H., Jr.

    2000-08-01

    A statistical performance analysis of the USMC Coastal Battlefield Reconnaissance and Analysis (COBRA) Minefield Detection (MFD) Model has been performed in support of the COBRA ATD Program under execution by the Naval Surface Warfare Center/Dahlgren Division/Coastal Systems Station. This analysis uses the Veridian ERIM International MFD model from the COBRA Sensor Performance Evaluation and Computational Tools for Research Analysis modeling toolbox and a collection of multispectral mine detection algorithm response distributions for mines and minelike clutter objects. These mine detection response distributions were generated from actual COBRA ATD test missions over littoral zone minefields. This analysis serves to validate both the utility and effectiveness of the COBRA MFD Model as a predictive MFD performance tool. COBRA ATD minefield detection model algorithm performance results based on a simulated baseline minefield detection scenario are presented, as well as results of an MFD model algorithm parametric sensitivity study.

  11. Cause-Effect Analysis: Improvement of a First Year Engineering Students' Calculus Teaching Model

    ERIC Educational Resources Information Center

    van der Hoff, Quay; Harding, Ansie

    2017-01-01

    This study focuses on the mathematics department at a South African university and in particular on teaching of calculus to first year engineering students. The paper reports on a cause-effect analysis, often used for business improvement. The cause-effect analysis indicates that there are many factors that impact on secondary school teaching of…

  12. Meta-Analysis of Planetarium Efficacy Research

    ERIC Educational Resources Information Center

    Brazell, Bruce D.; Espinoza, Sue

    2009-01-01

    In this study, the instructional effectiveness of the planetarium in astronomy education was explored through a meta-analysis of 19 studies. This analysis resulted in a heterogeneous distribution of 24 effect sizes with a mean of +0.28, p less than 0.05. The variability in this distribution was not fully explained under a fixed effect model. As a…
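
The fixed-vs-random-effects distinction the abstract raises can be sketched with DerSimonian-Laird random-effects pooling; the six effect sizes below are hypothetical, not the study's 24:

```python
import numpy as np

def dersimonian_laird(d, var):
    """Random-effects meta-analysis via DerSimonian-Laird.
    d: study effect sizes; var: their within-study variances.
    Returns (pooled mean, tau^2, I^2)."""
    d, var = np.asarray(d, float), np.asarray(var, float)
    w = 1.0 / var                       # fixed-effect weights
    mean_fe = np.sum(w * d) / np.sum(w)
    k = len(d)
    q = np.sum(w * (d - mean_fe) ** 2)  # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)  # between-study variance
    i2 = max(0.0, (q - (k - 1)) / q) if q > 0 else 0.0
    w_re = 1.0 / (var + tau2)           # random-effects weights
    mean_re = np.sum(w_re * d) / np.sum(w_re)
    return mean_re, tau2, i2

# Hypothetical effect sizes and within-study variances for six studies.
d = [0.10, 0.35, 0.28, 0.60, -0.05, 0.45]
v = [0.02, 0.03, 0.01, 0.04, 0.02, 0.03]
mean_re, tau2, i2 = dersimonian_laird(d, v)
print(round(mean_re, 3), round(tau2, 3), round(i2, 2))
```

When I² is substantial, as in the heterogeneous distribution reported here, tau² is positive and the random-effects pooled mean differs from the fixed-effect one.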

  13. Pharmacokinetic and pharmacodynamic analysis comparing diverse effects of detomidine, medetomidine, and dexmedetomidine in the horse: a population analysis.

    PubMed

    Grimsrud, K N; Ait-Oudhia, S; Durbin-Johnson, B P; Rocke, D M; Mama, K R; Rezende, M L; Stanley, S D; Jusko, W J

    2015-02-01

    The present study characterizes the pharmacokinetic (PK) and pharmacodynamic (PD) relationships of the α2-adrenergic receptor agonists detomidine (DET), medetomidine (MED) and dexmedetomidine (DEX) in parallel groups of horses from in vivo data after single bolus doses. Head height (HH), heart rate (HR), and blood glucose concentrations were measured over 6 h. Compartmental PK and minimal physiologically based PK (mPBPK) models were applied and incorporated into basic and extended indirect response models (IRM). Population PK/PD analysis was conducted using the Monolix software implementing the stochastic approximation expectation maximization algorithm. Marked reductions in HH and HR were found. The drug concentrations required to obtain inhibition at half-maximal effect (IC50) were approximately four times larger for DET than MED and DEX for both HH and HR. These effects were not gender dependent. Medetomidine had a greater influence on the increase in glucose concentration than DEX. The developed models demonstrate the use of mechanistic and mPBPK/PD models for the analysis of clinically obtainable in vivo data. © 2014 John Wiley & Sons Ltd.

  14. Pharmacokinetic and pharmacodynamic analysis comparing diverse effects of detomidine, medetomidine, and dexmedetomidine in the horse: a population analysis

    PubMed Central

    Grimsrud, K. N.; Ait-Oudhia, S.; Durbin-Johnson, B. P.; Rocke, D. M.; Mama, K. R.; Rezende, M. L.; Stanley, S. D.; Jusko, W. J.

    2014-01-01

    The present study characterizes the pharmacokinetic (PK) and pharmacodynamic (PD) relationships of the α2-adrenergic receptor agonists detomidine (DET), medetomidine (MED) and dexmedetomidine (DEX) in parallel groups of horses from in vivo data after single bolus doses. Head height (HH), heart rate (HR), and blood glucose concentrations were measured over 6 h. Compartmental PK and minimal physiologically based PK (mPBPK) models were applied and incorporated into basic and extended indirect response models (IRM). Population PK/PD analysis was conducted using the Monolix software implementing the stochastic approximation expectation maximization algorithm. Marked reductions in HH and HR were found. The drug concentrations required to obtain inhibition at half-maximal effect (IC50) were approximately four times larger for DET than MED and DEX for both HH and HR. These effects were not gender dependent. Medetomidine had a greater influence on the increase in glucose concentration than DEX. The developed models demonstrate the use of mechanistic and mPBPK/PD models for the analysis of clinically obtainable in vivo data. PMID:25073816

  15. Quasielastic charged-current neutrino scattering in the scaling model with relativistic effective mass

    NASA Astrophysics Data System (ADS)

    Ruiz Simo, I.; Martinez-Consentino, V. L.; Amaro, J. E.; Ruiz Arriola, E.

    2018-06-01

    We use a recent scaling analysis of the quasielastic electron scattering data from 12C to predict the quasielastic charge-changing neutrino scattering cross sections within an uncertainty band. We use a scaling function extracted from a selection of the (e,e') cross section data, and an effective nucleon mass inspired by the relativistic mean-field model of nuclear matter. The corresponding superscaling analysis with relativistic effective mass (SuSAM*) describes a large amount of the electron data lying inside a phenomenological quasielastic band. The effective mass incorporates the enhancement of the transverse current produced by the relativistic mean field. The scaling function incorporates nuclear effects beyond the impulse approximation, in particular meson-exchange currents and short-range correlations producing tails in the scaling function. Besides its simplicity, this model describes the neutrino data about as well as other, more sophisticated nuclear models.

  16. Predicting Pilot Performance in Off-Nominal Conditions: A Meta-Analysis and Model Validation

    NASA Technical Reports Server (NTRS)

    Wickens, C.D.; Hooey, B.L.; Gore, B.F.; Sebok, A.; Koenecke, C.; Salud, E.

    2009-01-01

    Pilot response to off-nominal (very rare) events represents a critical component to understanding the safety of next generation airspace technology and procedures. We describe a meta-analysis designed to integrate the existing data regarding pilot accuracy of detecting rare, unexpected events such as runway incursions in realistic flight simulations. Thirty-five studies were identified and pilot responses were categorized by expectancy, event location, and whether the pilot was flying with a highway-in-the-sky display. All three dichotomies produced large, significant effects on event miss rate. A model of human attention and noticing, N-SEEV, was then used to predict event noticing performance as a function of event salience and expectancy, and retinal eccentricity. Eccentricity is predicted from steady state scanning by the SEEV model of attention allocation. The model was used to predict miss rates for the expectancy, location and highway-in-the-sky (HITS) effects identified in the meta-analysis. The correlation between model-predicted results and data from the meta-analysis was 0.72.

  17. Twice random, once mixed: applying mixed models to simultaneously analyze random effects of language and participants.

    PubMed

    Janssen, Dirk P

    2012-03-01

    Psychologists, psycholinguists, and other researchers using language stimuli have been struggling for more than 30 years with the problem of how to analyze experimental data that contain two crossed random effects (items and participants). The classical analysis of variance does not apply; alternatives have been proposed but have failed to catch on, and a statistically unsatisfactory procedure of using two approximations (known as F(1) and F(2)) has become the standard. A simple and elegant solution using mixed model analysis has been available for 15 years, and recent improvements in statistical software have made mixed-model analysis widely available. The aim of this article is to increase the use of mixed models by giving a concise practical introduction and by giving clear directions for undertaking the analysis in the most popular statistical packages. The article also introduces the DJMIXED add-on package for SPSS, which makes entering the models and reporting their results as straightforward as possible.
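    The F(1)/F(2) procedure criticized in this record can be sketched numerically. The simulation below is hypothetical (participant/item variance components and the 25 ms condition effect are invented): F(1) squares a paired t statistic over by-participant means, F(2) over by-item means, which is exactly the dual-analysis workaround that crossed-random-effects mixed models replace.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_subj, n_item = 20, 16

    # Simulated reaction times with crossed random effects (all values invented)
    subj_re = rng.normal(0, 30, n_subj)[:, None]            # participant intercepts
    item_re = rng.normal(0, 20, n_item)[None, :]            # item intercepts
    cond_effect = 25.0                                      # true condition effect (ms)
    noise = lambda: rng.normal(0, 40, (n_subj, n_item))
    rt_a = 500 + subj_re + item_re + noise()                # condition A
    rt_b = 500 + cond_effect + subj_re + item_re + noise()  # condition B

    def paired_t(x, y):
        """Paired t statistic over matched means."""
        d = y - x
        return d.mean() / (d.std(ddof=1) / np.sqrt(len(d)))

    # F1: by-participant analysis (average over items); F2: by-item analysis
    f1 = paired_t(rt_a.mean(axis=1), rt_b.mean(axis=1)) ** 2
    f2 = paired_t(rt_a.mean(axis=0), rt_b.mean(axis=0)) ** 2
    ```

    A mixed model would instead fit both random effects in a single analysis; the point of the sketch is that F(1) and F(2) each ignore one source of random variation.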

  18. Gait characterization in golden retriever muscular dystrophy dogs using linear discriminant analysis.

    PubMed

    Fraysse, Bodvaël; Barthélémy, Inès; Qannari, El Mostafa; Rouger, Karl; Thorin, Chantal; Blot, Stéphane; Le Guiner, Caroline; Chérel, Yan; Hogrel, Jean-Yves

    2017-04-12

    Accelerometric analysis of gait abnormalities in golden retriever muscular dystrophy (GRMD) dogs is of limited sensitivity, and produces highly complex data. The use of discriminant analysis may enable simpler and more sensitive evaluation of treatment benefits in this important preclinical model. Accelerometry was performed twice monthly between the ages of 2 and 12 months on 8 healthy and 20 GRMD dogs. Seven accelerometric parameters were analysed using linear discriminant analysis (LDA). Manipulation of the dependent and independent variables produced three distinct models. The ability of each model to detect gait alterations and their pattern change with age was tested using a leave-one-out cross-validation approach. Selecting genotype (healthy or GRMD) as the dependent variable resulted in a model (Model 1) allowing good discrimination between the gait phenotypes of GRMD and healthy dogs. However, this model was not sufficiently representative of the disease progression. In Model 2, age in months was added as a supplementary dependent variable (GRMD_2 to GRMD_12 and Healthy_2 to Healthy_9.5), resulting in a high overall misclassification rate (83.2%). To improve accuracy, a third model (Model 3) was created in which age was also included as an explanatory variable. This resulted in an overall misclassification rate lower than 12%. Model 3 was evaluated using blinded data pertaining to 81 healthy and GRMD dogs. In all but one case, the model correctly matched gait phenotype to the actual genotype. Finally, we used Model 3 to reanalyse data from a previous study regarding the effects of immunosuppressive treatments on muscular dystrophy in GRMD dogs. Our model identified a significant effect of immunosuppressive treatments on gait quality, corroborating the original findings, with the added advantages of direct statistical analysis, greater sensitivity, and a more comprehensible data representation.
Gait analysis using LDA allows for improved analysis of accelerometry data by applying a decision-making analysis approach to the evaluation of preclinical treatment benefits in GRMD dogs.
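    The core pipeline of this record, LDA classification validated by leave-one-out cross-validation, can be sketched with scikit-learn. The data below are synthetic stand-ins (7 made-up accelerometric features, an invented group separation), not the study's measurements.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import LeaveOneOut, cross_val_score

    rng = np.random.default_rng(1)

    # Synthetic accelerometric parameters: 7 features per dog, two genotypes
    # (group sizes mirror the record: 8 healthy, 20 GRMD; feature values invented)
    healthy = rng.normal(0.0, 1.0, (8, 7))
    grmd = rng.normal(1.5, 1.0, (20, 7))      # shifted gait profile
    X = np.vstack([healthy, grmd])
    y = np.array([0] * 8 + [1] * 20)

    # Leave-one-out cross-validated LDA, mirroring the validation scheme
    acc = cross_val_score(LinearDiscriminantAnalysis(), X, y,
                          cv=LeaveOneOut()).mean()
    ```

    Leave-one-out is a natural choice at this sample size: each dog is classified by a model trained on all the others, so every observation contributes to the misclassification-rate estimate.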

  19. Clinical and multiple gene expression variables in survival analysis of breast cancer: Analysis with the hypertabastic survival model

    PubMed Central

    2012-01-01

    Background We explore the benefits of applying a new proportional hazard model to analyze survival of breast cancer patients. As a parametric model, the hypertabastic survival model offers a closer fit to experimental data than Cox regression, and furthermore provides explicit survival and hazard functions which can be used as additional tools in the survival analysis. In addition, one of our main concerns is utilization of multiple gene expression variables. Our analysis treats the important issue of interaction of different gene signatures in the survival analysis. Methods The hypertabastic proportional hazards model was applied in survival analysis of breast cancer patients. This model was compared, using statistical measures of goodness of fit, with models based on the semi-parametric Cox proportional hazards model and the parametric log-logistic and Weibull models. The explicit functions for hazard and survival were then used to analyze the dynamic behavior of hazard and survival functions. Results The hypertabastic model provided the best fit among all the models considered. Use of multiple gene expression variables also provided a considerable improvement in the goodness of fit of the model, as compared to use of only one. By utilizing the explicit survival and hazard functions provided by the model, we were able to determine the magnitude of the maximum rate of increase in hazard, and the maximum rate of decrease in survival, as well as the times when these occurred. We explore the influence of each gene expression variable on these extrema. Furthermore, in the cases of continuous gene expression variables, represented by a measure of correlation, we were able to investigate the dynamics with respect to changes in gene expression. Conclusions We observed that use of three different gene signatures in the model provided a greater combined effect and allowed us to assess the relative importance of each in determination of outcome in this data set. 
These results point to the potential to combine gene signatures to greater effect in cases where each gene signature represents some distinct aspect of the cancer biology. Furthermore, we conclude that the hypertabastic survival model can be an effective survival analysis tool for breast cancer patients. PMID:23241496

  20. Seventh Grade Students' Mental Models of the Greenhouse Effect

    ERIC Educational Resources Information Center

    Shepardson, Daniel P.; Choi, Soyoung; Niyogi, Dev; Charusombat, Umarporn

    2011-01-01

    This constructivist study investigates 225 student drawings and explanations from three different schools in the Midwest of the US, to identify seventh grade students' mental models of the greenhouse effect. Five distinct mental models were derived from an inductive analysis of the content of the students' drawings and explanations: Model 1, a…

  1. Robustness of Value-Added Analysis of School Effectiveness. Research Report. ETS RR-08-22

    ERIC Educational Resources Information Center

    Braun, Henry; Qu, Yanxuan

    2008-01-01

    This paper reports on a study conducted to investigate the consistency of the results between 2 approaches to estimating school effectiveness through value-added modeling. Estimates of school effects from the layered model employing item response theory (IRT) scaled data are compared to estimates derived from a discrete growth model based on the…

  2. Interpretable inference on the mixed effect model with the Box-Cox transformation.

    PubMed

    Maruo, K; Yamaguchi, Y; Noma, H; Gosho, M

    2017-07-10

    We derived results for inference on parameters of the marginal model of the mixed effect model with the Box-Cox transformation based on the asymptotic theory approach. We also provided a robust variance estimator of the maximum likelihood estimator of the parameters of this model in consideration of model misspecification. Using these results, we developed an inference procedure for the difference of the model median between treatment groups at the specified occasion in the context of mixed effects models for repeated measures analysis for randomized clinical trials, which provided interpretable estimates of the treatment effect. Simulation studies showed that our proposed method controlled the type I error of the statistical test for the model median difference in almost all situations and had moderate or high power compared with existing methods. We illustrated our method with cluster of differentiation 4 (CD4) data in an AIDS clinical trial, where the interpretability of the analysis results based on our proposed method is demonstrated. Copyright © 2017 John Wiley & Sons, Ltd.
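    The Box-Cox machinery behind this record can be illustrated in a few lines of SciPy: estimate λ by maximum likelihood, transform a skewed outcome toward normality, and back-transform the mean of the transformed data to obtain a model median on the original scale. The lognormal data below are synthetic, chosen only because their true median (e¹ ≈ 2.72) is known.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    y = rng.lognormal(mean=1.0, sigma=0.6, size=500)   # right-skewed outcome

    # Estimate lambda by maximum likelihood and transform toward normality
    z, lam = stats.boxcox(y)

    def inv_boxcox(z, lam):
        """Back-transform to the original scale (lambda != 0 branch)."""
        return np.power(lam * z + 1.0, 1.0 / lam) if lam != 0 else np.exp(z)

    # Because the transform is monotone, the back-transformed mean of z
    # estimates the model median of y on the original scale
    model_median = inv_boxcox(z.mean(), lam)
    ```

    This is the reason the record reports a model *median* difference rather than a mean difference: the median survives the monotone back-transformation with a clean interpretation, while the mean does not.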

  3. Noise model for low-frequency through-the-Earth communication

    NASA Astrophysics Data System (ADS)

    Raab, Frederick H.

    2010-12-01

    Analysis and simulation of through-the-Earth communication links and signal processing techniques require a more complete noise model than is needed for the analysis of conventional communication systems. This paper presents a multicomponent noise model that includes impulsive characteristics, direction-of-arrival characteristics, and effects of local geology. The noise model is derived from theoretical considerations and confirmed by field tests.

  4. The Effect of Learning Cycle Models on Achievement of Students: A Meta-Analysis Study

    ERIC Educational Resources Information Center

    Sarac, Hakan

    2018-01-01

    In the study, a meta-analysis was conducted to determine the effect of the use of the learning cycle model on the achievements of the students. Doctorate and master theses, made between 2007 and 2016, were searched using the keywords in Turkish and English. As a result of the screening, a total of 123 dissertations, which used learning cycle…

  5. An analysis of USSPACECOM's space surveillance network sensor tasking methodology

    NASA Astrophysics Data System (ADS)

    Berger, Jeff M.; Moles, Joseph B.; Wilsey, David G.

    1992-12-01

    This study provides the basis for the development of a cost/benefit assessment model to determine the effects of alterations to the Space Surveillance Network (SSN) on orbital element (OE) set accuracy. It provides a review of current methods used by NORAD and the SSN to gather and process observations, an alternative to the current Gabbard classification method, and the development of a model to determine the effects of observation rate and correction interval on OE set accuracy. The proposed classification scheme is based on satellite J2 perturbations. Specifically, classes were established based on mean motion, eccentricity, and inclination, since J2 perturbation effects are functions of only these elements. Model development began by creating representative sensor observations using a highly accurate orbital propagation model. These observations were compared to predicted observations generated using the NORAD Simplified General Perturbation (SGP4) model and differentially corrected using a Bayes sequential estimation algorithm. A 10-run Monte Carlo analysis was performed using this model on 12 satellites using 16 different observation rate/correction interval combinations. An ANOVA and confidence interval analysis of the results show that this model does demonstrate the differences in steady state position error based on varying observation rate and correction interval.

  6. Performance analysis of junctionless double gate VeSFET considering the effects of thermal variation - An explicit 2D analytical model

    NASA Astrophysics Data System (ADS)

    Chaudhary, Tarun; Khanna, Gargi

    2017-03-01

    The purpose of this paper is to explore the junctionless double gate vertical slit field effect transistor (JLDG VeSFET) with reduced short channel effects and to develop an analytical threshold voltage model for the device considering the impact of thermal variations for the very first time. The model has been derived by solving the 2D Poisson's equation, and the effects of variation in temperature on various electrical parameters of the device, such as Rout, drain current, mobility, subthreshold slope and DIBL, have been studied and described in the paper. The model provides deep physical insight into the device behavior and is also very helpful in contributing to the design space exploration for the JLDG VeSFET. The proposed model is verified against simulative analysis at different radii of the device, and good agreement between the analytical model and simulation results has been observed.

  7. Publication bias and the limited strength model of self-control: has the evidence for ego depletion been overestimated?

    PubMed

    Carter, Evan C; McCullough, Michael E

    2014-01-01

    Few models of self-control have generated as much scientific interest as has the limited strength model. One of the entailments of this model, the depletion effect, is the expectation that acts of self-control will be less effective when they follow prior acts of self-control. Results from a previous meta-analysis concluded that the depletion effect is robust and medium in magnitude (d = 0.62). However, when we applied methods for estimating and correcting for small-study effects (such as publication bias) to the data from this previous meta-analysis effort, we found very strong signals of publication bias, along with an indication that the depletion effect is actually no different from zero. We conclude that until greater certainty about the size of the depletion effect can be established, circumspection about the existence of this phenomenon is warranted, and that rather than elaborating on the model, research efforts should focus on establishing whether the basic effect exists. We argue that the evidence for the depletion effect is a useful case study for illustrating the dangers of small-study effects as well as some of the possible tools for mitigating their influence in psychological science.
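    The small-study-effects correction this record applies can be sketched with one regression. The sketch below implements a precision-effect test (PET) style estimator, one of several related tools for the job, not necessarily the exact method of the paper: weighted least squares of effect size on standard error, where a nonzero slope signals small-study effects and the intercept estimates the bias-corrected effect.

    ```python
    import numpy as np

    def pet_estimate(effects, se):
        """Precision-effect test (PET): WLS regression of effect sizes on
        their standard errors. The intercept estimates the effect corrected
        for small-study effects (e.g. publication bias); the slope measures
        how strongly smaller studies report larger effects."""
        effects = np.asarray(effects, dtype=float)
        se = np.asarray(se, dtype=float)
        X = np.column_stack([np.ones_like(se), se])
        w = 1.0 / se ** 2                      # inverse-variance weights
        # Weighted least squares: beta = (X'WX)^-1 X'Wy
        xtwx = X.T @ (w[:, None] * X)
        beta = np.linalg.solve(xtwx, X.T @ (w * effects))
        return beta[0], beta[1]                # (corrected effect, bias signal)
    ```

    When the observed effects are driven entirely by study size, the intercept falls to zero, which is the qualitative pattern the record reports for the depletion effect.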

  8. Functional linear models for association analysis of quantitative traits.

    PubMed

    Fan, Ruzong; Wang, Yifan; Mills, James L; Wilson, Alexander F; Bailey-Wilson, Joan E; Xiong, Momiao

    2013-11-01

    Functional linear models are developed in this paper for testing associations between quantitative traits and genetic variants, which can be rare variants or common variants or the combination of the two. By treating multiple genetic variants of an individual in a human population as a realization of a stochastic process, the genome of an individual in a chromosome region is a continuum of sequence data rather than discrete observations. The genome of an individual is viewed as a stochastic function that contains both linkage and linkage disequilibrium (LD) information of the genetic markers. By using techniques of functional data analysis, both fixed and mixed effect functional linear models are built to test the association between quantitative traits and genetic variants adjusting for covariates. After extensive simulation analysis, it is shown that the F-distributed tests of the proposed fixed effect functional linear models have higher power than those of the sequence kernel association test (SKAT) and its optimal unified test (SKAT-O) for three scenarios in most cases: (1) the causal variants are all rare, (2) the causal variants are both rare and common, and (3) the causal variants are common. The superior performance of the fixed effect functional linear models is most likely due to their optimal utilization of both genetic linkage and LD information of multiple genetic variants in a genome and similarity among different individuals, while SKAT and SKAT-O only model the similarities and pairwise LD but do not model linkage and higher order LD information sufficiently. In addition, the proposed fixed effect models generate accurate type I error rates in simulation studies. We also show that the functional kernel score tests of the proposed mixed effect functional linear models are preferable in candidate gene analysis and small sample problems. The methods are applied to analyze three biochemical traits in data from the Trinity Students Study.
© 2013 WILEY PERIODICALS, INC.

  9. Analysis of longwave radiation for the Earth-atmosphere system

    NASA Technical Reports Server (NTRS)

    Tiwari, S. N.; Venuru, C. S.; Subramanian, S. V.

    1983-01-01

    Accurate radiative transfer models are used to determine the upwelling atmospheric radiance and net radiative flux in the entire longwave spectral range. The validity of the quasi-random band model is established by comparing the results of this model with those of line-by-line formulations and with available theoretical and experimental results. Existing radiative transfer models and computer codes are modified to include various surface and atmospheric effects (surface reflection, nonequilibrium radiation, and cloud effects). The program is used to evaluate the radiative flux in clear atmosphere, provide sensitivity analysis of upwelling radiance in the presence of clouds, and determine the effects of various climatological parameters on the upwelling radiation and anisotropic function. Homogeneous and nonhomogeneous gas emissivities can also be evaluated under different conditions.

  10. The effects of differentiation of self, adult attachment, and sexual communication on sexual and marital satisfaction: a path analysis.

    PubMed

    Timm, Tina M; Keiley, Margaret K

    2011-01-01

    This article explores the relations among differentiation of self, adult attachment, sexual communication, sexual satisfaction, and marital satisfaction, in a path analysis model. In a sample of 205 married adults, the path analysis results indicated that (a) differentiation of self had no direct effect on marital or sexual satisfaction, although it was significantly related to sexual communication; (b) adult attachment had a direct effect on marital satisfaction, but not on sexual satisfaction; (c) sexual communication is a mediating variable; (d) sexual communication was positively related to sexual satisfaction and marital satisfaction; and (e) no gender differences existed in the model.
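    The mediation structure described in finding (c) can be sketched with two ordinary regressions: the indirect effect is the product of the path from predictor to mediator (a) and the path from mediator to outcome controlling for the predictor (b). The variable names and coefficients below are hypothetical stand-ins for the study's constructs, and the data are simulated.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 205   # sample size matching the record

    # Simulated full-mediation structure (all coefficients invented):
    # differentiation -> communication -> satisfaction, no direct path
    differentiation = rng.normal(size=n)
    communication = 0.5 * differentiation + rng.normal(size=n)
    satisfaction = 0.6 * communication + rng.normal(size=n)

    def ols(y, *xs):
        """Least-squares coefficients for y ~ 1 + xs."""
        X = np.column_stack([np.ones(len(y)), *xs])
        return np.linalg.lstsq(X, y, rcond=None)[0]

    a = ols(communication, differentiation)[1]                # path a
    b = ols(satisfaction, differentiation, communication)[2]  # path b (controls for X)
    direct = ols(satisfaction, differentiation, communication)[1]
    indirect = a * b                                          # mediated effect
    ```

    Under this simulated structure the direct path is near zero while the indirect path a·b is substantial, which is the pattern the record describes for sexual communication as a mediating variable.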

  11. Effect of practical training on the learning motivation profile of Japanese pharmacy students using structural equation modeling.

    PubMed

    Yamamura, Shigeo; Takehira, Rieko

    2017-01-01

    To establish a model of Japanese pharmacy students' learning motivation profile and investigate the effects of pharmaceutical practical training programs on their learning motivation. The Science Motivation Questionnaire II was administered to pharmacy students in their 4th (before practical training), 5th (before practical training at clinical sites), and 6th (after all practical training) years of study at Josai International University in April 2016. Factor analysis and multiple-group structural equation modeling were conducted for data analysis. A total of 165 students participated. The learning motivation profile was modeled with 4 factors (intrinsic, career, self-determination, and grade motivation), and the most effective learning motivation was grade motivation. In the multiple-group analysis, the fit of the model with the data was acceptable, and the estimated mean value of the factor of 'self-determination' in the learning motivation profile increased after the practical training programs (P = 0.048, Cohen's d = 0.43). Practical training programs in a 6-year course were effective for increasing learning motivation, based on 'self-determination', among Japanese pharmacy students. The results suggest that practical training programs are meaningful not only for providing clinical experience but also for raising learning motivation.

  12. Meta-Analysis in Higher Education: An Illustrative Example Using Hierarchical Linear Modeling

    ERIC Educational Resources Information Center

    Denson, Nida; Seltzer, Michael H.

    2011-01-01

    The purpose of this article is to provide higher education researchers with an illustrative example of meta-analysis utilizing hierarchical linear modeling (HLM). This article demonstrates the step-by-step process of meta-analysis using a recently-published study examining the effects of curricular and co-curricular diversity activities on racial…

  13. A Study on Standard Competition with Network Effect Based on Evolutionary Game Model

    NASA Astrophysics Data System (ADS)

    Wang, Ye; Wang, Bingdong; Li, Kangning

    Owing to the spread of networks in modern society, standard competition with network effects has taken on new significance. This paper studies the impact of network effects on standard competition; it proceeds from introduction through model setup and equilibrium analysis to conclusions. Starting from a well-structured evolutionary game model, the analysis is then extended to a dynamic setting. The article shows, both theoretically and empirically, that whether a standard can lead the market depends on the utility it would bring, and the author also discusses advisable strategies revolving around the two factors of initial position and border break.
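    The dynamic analysis described above can be sketched with replicator dynamics, the standard workhorse of evolutionary game models. Everything below is a toy illustration (the payoff structure, utilities, and network-effect strength are invented, not taken from the paper): each standard's payoff is its stand-alone utility plus a network effect proportional to its adoption share, and adoption evolves by the replicator equation.

    ```python
    import numpy as np

    def replicator_share(u0, net_effect, x0=0.5, steps=3000, dt=0.01):
        """Replicator dynamics for two competing standards.

        u0: stand-alone utilities (u_A, u_B); net_effect: strength of the
        network externality; x: adoption share of standard A.
        Payoff of each standard = stand-alone utility + network effect
        times that standard's current adoption share.
        """
        x = x0
        for _ in range(steps):
            pay_a = u0[0] + net_effect * x           # payoff to adopting A
            pay_b = u0[1] + net_effect * (1 - x)     # payoff to adopting B
            mean_pay = x * pay_a + (1 - x) * pay_b
            x += dt * x * (pay_a - mean_pay)         # replicator equation (Euler step)
        return x
    ```

    With equal utilities the split equilibrium persists, while even a small stand-alone advantage is amplified by the network effect until one standard dominates, which is the "utility determines whether a standard leads the market" conclusion of the record.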

  14. Continuum modeling of three-dimensional truss-like space structures

    NASA Technical Reports Server (NTRS)

    Nayfeh, A. H.; Hefzy, M. S.

    1978-01-01

    A mathematical and computational analysis capability has been developed for calculating the effective mechanical properties of three-dimensional periodic truss-like structures. Two models are studied in detail. The first, called the octetruss model, is a three-dimensional extension of a two-dimensional model, and the second is a cubic model. Symmetry considerations are employed as a first step to show that the specific octetruss model has four independent constants and that the cubic model has two. The actual values of these constants are determined by averaging the contributions of each rod element to the overall structure stiffness. The individual rod member contribution to the overall stiffness is obtained by a three-dimensional coordinate transformation. The analysis shows that the effective three-dimensional elastic properties of both models are relatively close to each other.

  15. Causal Mediation Analysis for the Cox Proportional Hazards Model with a Smooth Baseline Hazard Estimator.

    PubMed

    Wang, Wei; Albert, Jeffrey M

    2017-08-01

    An important problem within the social, behavioral, and health sciences is how to partition an exposure effect (e.g. treatment or risk factor) among specific pathway effects and to quantify the importance of each pathway. Mediation analysis based on the potential outcomes framework is an important tool to address this problem and we consider the estimation of mediation effects for the proportional hazards model in this paper. We give precise definitions of the total effect, natural indirect effect, and natural direct effect in terms of the survival probability, hazard function, and restricted mean survival time within the standard two-stage mediation framework. To estimate the mediation effects on different scales, we propose a mediation formula approach in which simple parametric models (fractional polynomials or restricted cubic splines) are utilized to approximate the baseline log cumulative hazard function. Simulation study results demonstrate low bias of the mediation effect estimators and close-to-nominal coverage probability of the confidence intervals for a wide range of complex hazard shapes. We apply this method to the Jackson Heart Study data and conduct sensitivity analysis to assess the impact on the mediation effects inference when the no unmeasured mediator-outcome confounding assumption is violated.

  16. Developing a model for effective leadership in healthcare: a concept mapping approach.

    PubMed

    Hargett, Charles William; Doty, Joseph P; Hauck, Jennifer N; Webb, Allison Mb; Cook, Steven H; Tsipis, Nicholas E; Neumann, Julie A; Andolsek, Kathryn M; Taylor, Dean C

    2017-01-01

    Despite increasing awareness of the importance of leadership in healthcare, our understanding of the competencies of effective leadership remains limited. We used a concept mapping approach (a blend of qualitative and quantitative analysis of group processes to produce a visual composite of the group's ideas) to identify stakeholders' mental model of effective healthcare leadership, clarifying the underlying structure and importance of leadership competencies. Literature review, focus groups, and consensus meetings were used to derive a representative set of healthcare leadership competency statements. Study participants subsequently sorted and rank-ordered these statements based on their perceived importance in contributing to effective healthcare leadership in real-world settings. Hierarchical cluster analysis of individual sortings was used to develop a coherent model of effective leadership in healthcare. A diverse group of 92 faculty and trainees individually rank-sorted 33 leadership competency statements. The highest rated statements were "Acting with Personal Integrity", "Communicating Effectively", "Acting with Professional Ethical Values", "Pursuing Excellence", "Building and Maintaining Relationships", and "Thinking Critically". Combining the results from hierarchical cluster analysis with our qualitative data led to a healthcare leadership model based on the core principle of Patient Centeredness and the core competencies of Integrity, Teamwork, Critical Thinking, Emotional Intelligence, and Selfless Service. Using a mixed qualitative-quantitative approach, we developed a graphical representation of a shared leadership model derived in the healthcare setting. This model may enhance learning, teaching, and patient care in this important area, as well as guide future research.

  17. Models for residential-and commercial-sector energy conservation analysis: Applications, limitations, and future potential

    NASA Astrophysics Data System (ADS)

    Cole, H. E.; Fuller, R. E.

    1980-09-01

    Four of the major models used by DOE for energy conservation analyses in the residential and commercial building sectors are reviewed and critically analyzed to determine how these models can serve as tools for DOE and its Conservation Policy Office in evaluating and quantifying their policy and program requirements. The most effective role for each model in addressing future issues of buildings energy conservation policy and analysis is assessed. The four models covered are: Oak Ridge Residential Energy Model; Micro Analysis of Transfers to Households/Comprehensive Human Resources Data System (MATH/CHRDS) Model; Oak Ridge Commercial Energy Model; and Brookhaven Buildings Energy Conservation Optimization Model (BECOM).

  18. Analytic uncertainty and sensitivity analysis of models with input correlations

    NASA Astrophysics Data System (ADS)

    Zhu, Yueying; Wang, Qiuping A.; Li, Wei; Cai, Xu

    2018-03-01

    Probabilistic uncertainty analysis is a common means of evaluating mathematical models. In mathematical modeling, the uncertainty in input variables is specified through distribution laws. Its contribution to the uncertainty in the model response is usually analyzed by assuming that input variables are independent of each other. However, correlated parameters often arise in practical applications. In the present paper, an analytic method is developed for the uncertainty and sensitivity analysis of models in the presence of input correlations. With this method, it is straightforward to identify the importance of the independence and correlations of input variables in determining the model response. This allows one to decide whether or not the input correlations should be considered in practice. Numerical examples demonstrate the effectiveness and validity of our analytic method in the analysis of general models. A practical application of the method to the uncertainty and sensitivity analysis of a deterministic HIV model is also presented.
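    As a minimal illustration of why input correlations matter, the sketch below compares the analytic variance of a linear model under an independence assumption against the analytic variance with the correlation term included, and checks both against Monte Carlo sampling. The model and all numbers are invented for illustration; this is not the paper's method or its HIV model.

```python
import numpy as np

# Hypothetical linear model y = a*x1 + b*x2 (not the paper's HIV model):
# analytic output variance with and without an input correlation term.
a, b = 2.0, -1.0
s1, s2 = 0.5, 0.8          # input standard deviations
rho = 0.6                  # input correlation coefficient

var_indep = a**2 * s1**2 + b**2 * s2**2            # independence assumed
var_corr = var_indep + 2 * a * b * rho * s1 * s2   # correlation included

# Monte Carlo check with correlated Gaussian inputs
rng = np.random.default_rng(0)
cov = [[s1**2, rho * s1 * s2], [rho * s1 * s2, s2**2]]
x = rng.multivariate_normal([0.0, 0.0], cov, size=200_000)
y = a * x[:, 0] + b * x[:, 1]

print(var_indep, var_corr, y.var())
```

    Here ignoring the correlation overstates the output variance (1.64 vs. 0.68), which is the kind of discrepancy that tells the analyst the correlations must be kept.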

  19. Two-field analysis of no-scale supergravity inflation

    DOE PAGES

    Ellis, John; Garcia, Marcos A. G.; Nanopoulos, Dimitri V.; ...

    2015-01-08

    Since the building-blocks of supersymmetric models include chiral superfields containing pairs of effective scalar fields, a two-field approach is particularly appropriate for models of inflation based on supergravity. In this paper, we generalize the two-field analysis of the inflationary power spectrum to supergravity models with arbitrary Kähler potential. We show how two-field effects in the context of no-scale supergravity can alter the model predictions for the scalar spectral index n s and the tensor-to-scalar ratio r, yielding results that interpolate between the Planck-friendly Starobinsky model and BICEP2-friendly predictions. In particular, we show that two-field effects in a chaotic no-scale inflation model with a quadratic potential are capable of reducing r to very small values << 0.1. Here, we also calculate the non-Gaussianity measure f NL, finding that it is well below the current experimental sensitivity.

  20. Penalized discriminant analysis for the detection of wild-grown and cultivated Ganoderma lucidum using Fourier transform infrared spectroscopy

    NASA Astrophysics Data System (ADS)

    Zhu, Ying; Tan, Tuck Lee

    2016-04-01

    An effective and simple analytical method using Fourier transform infrared (FTIR) spectroscopy to distinguish wild-grown high-quality Ganoderma lucidum (G. lucidum) from cultivated ones is of essential importance for its quality assurance and medicinal value estimation. Commonly used chemical and analytical methods using the full spectrum are not effective enough for the detection and interpretation due to the complex composition of the herbal medicine. In this study, two penalized discriminant analysis models, penalized linear discriminant analysis (PLDA) and elastic net (Elnet), using FTIR spectroscopy have been explored for the purpose of discrimination and interpretation. The classification performances of the two penalized models have been compared with two widely used multivariate methods, principal component discriminant analysis (PCDA) and partial least squares discriminant analysis (PLSDA). The Elnet model, involving a combination of L1 and L2 norm penalties, enabled an automatic selection of a small number of informative spectral absorption bands and gave an excellent classification accuracy of 99% for discrimination between spectra of wild-grown and cultivated G. lucidum. Its classification performance was superior to that of the PLDA model in a pure L1 setting, and it outperformed the PCDA and PLSDA models using the full spectrum. The effective selection of informative spectral features leads to a substantial reduction in model complexity and an improvement in classification accuracy, and it is particularly helpful for the quantitative interpretation of the major chemical constituents of G. lucidum regarding its anti-cancer effects.

  1. Selection of higher order regression models in the analysis of multi-factorial transcription data.

    PubMed

    Prazeres da Costa, Olivia; Hoffman, Arthur; Rey, Johannes W; Mansmann, Ulrich; Buch, Thorsten; Tresch, Achim

    2014-01-01

    Many studies examine gene expression data that has been obtained under the influence of multiple factors, such as genetic background, environmental conditions, or exposure to diseases. The interplay of multiple factors may lead to effect modification and confounding. Higher order linear regression models can account for these effects. We present a new methodology for linear model selection and apply it to microarray data of bone marrow-derived macrophages. This experiment investigates the influence of three variable factors: the genetic background of the mice from which the macrophages were obtained, Yersinia enterocolitica infection (two strains, and a mock control), and treatment/non-treatment with interferon-γ. We set up four different linear regression models in a hierarchical order. We introduce the eruption plot as a new practical tool for model selection complementary to global testing. It visually compares the size and significance of effect estimates between two nested models. Using this methodology we were able to select the most appropriate model by keeping only relevant factors showing additional explanatory power. Application to experimental data allowed us to qualify the interaction of factors as either neutral (no interaction), alleviating (co-occurring effects are weaker than expected from the single effects), or aggravating (stronger than expected). We find a biologically meaningful gene cluster of putative C2TA target genes that appear to be co-regulated with MHC class II genes. We introduced the eruption plot as a tool for visual model comparison to identify relevant higher order interactions in the analysis of expression data obtained under the influence of multiple factors. We conclude that model selection in higher order linear regression models should generally be performed for the analysis of multi-factorial microarray data.
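    A minimal numerical sketch of the nested-model comparison the authors describe: fit a main-effects model and a model that adds an interaction term, then test the added term with a partial F-test. The two-factor design and effect sizes below are invented for illustration, not the macrophage data.

```python
import numpy as np
from scipy import stats

# Toy two-factor expression data (hypothetical, not the macrophage set):
# compare a main-effects model against one adding an interaction term.
rng = np.random.default_rng(1)
genotype = np.repeat([0, 1], 20)                 # factor 1, 40 samples
infection = np.tile(np.repeat([0, 1], 10), 2)    # factor 2, balanced
y = (1.0 + 0.5 * genotype + 0.8 * infection
     + 1.5 * genotype * infection + rng.normal(0, 0.3, 40))

X0 = np.column_stack([np.ones(40), genotype, infection])   # reduced model
X1 = np.column_stack([X0, genotype * infection])           # full model

rss = lambda X: np.sum((y - X @ np.linalg.lstsq(X, y, rcond=None)[0])**2)
rss0, rss1 = rss(X0), rss(X1)
df_extra, df_resid = 1, 40 - X1.shape[1]
F = (rss0 - rss1) / df_extra / (rss1 / df_resid)
p = stats.f.sf(F, df_extra, df_resid)
print(F, p)   # a large F favours keeping the interaction term
```

    The same reduced-vs-full comparison, applied gene by gene, is what an eruption plot visualizes: the change in effect size against its significance between the two nested models.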

  2. To sling or not to sling at time of abdominal sacrocolpopexy: a cost-effectiveness analysis.

    PubMed

    Richardson, Monica L; Elliott, Christopher S; Shaw, Jonathan G; Comiter, Craig V; Chen, Bertha; Sokol, Eric R

    2013-10-01

    We compare the cost-effectiveness of 3 strategies for the use of a mid urethral sling to prevent occult stress urinary incontinence in patients undergoing abdominal sacrocolpopexy. Using decision analysis modeling we compared cost-effectiveness during a 1-year postoperative period of 3 treatment approaches including 1) abdominal sacrocolpopexy alone with deferred option for mid urethral sling, 2) abdominal sacrocolpopexy with universal concomitant mid urethral sling and 3) preoperative urodynamic study for selective mid urethral sling. Using published data we modeled probabilities of stress urinary incontinence after abdominal sacrocolpopexy with or without mid urethral sling, the predictive value of urodynamic study to detect occult stress urinary incontinence and the likelihood of complications after mid urethral sling. Costs were derived from Medicare 2010 reimbursement rates. The main outcome modeled was incremental cost-effectiveness ratio per quality adjusted life-years gained. In addition to base case analysis, 1-way sensitivity analyses were performed. In our model, universally performing mid urethral sling at abdominal sacrocolpopexy was the most cost-effective approach with an incremental cost per quality adjusted life-year gained of $2,867 compared to abdominal sacrocolpopexy alone. Preoperative urodynamic study was more costly and less effective than universally performing intraoperative mid urethral sling. The cost-effectiveness of abdominal sacrocolpopexy plus mid urethral sling was robust to sensitivity analysis with a cost-effectiveness ratio consistently below $20,000 per quality adjusted life-year. Universal concomitant mid urethral sling is the most cost-effective prophylaxis strategy for occult stress urinary incontinence in women undergoing abdominal sacrocolpopexy. The use of preoperative urodynamic study to guide mid urethral sling placement at abdominal sacrocolpopexy is not cost-effective. 
Copyright © 2013 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
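    The core arithmetic of such a decision analysis is the incremental cost-effectiveness ratio (ICER): the extra cost of a strategy divided by the extra quality-adjusted life-years it buys. The sketch below shows the computation on invented cost and QALY figures; they are placeholders, not the paper's model inputs.

```python
# Minimal ICER sketch for comparing two strategies over a 1-year horizon.
# Costs and QALYs are illustrative placeholders, not the paper's values.
strategies = {
    "ASC alone":           {"cost": 12000.0, "qaly": 0.80},
    "ASC + universal MUS": {"cost": 12344.0, "qaly": 0.92},
}
base, alt = strategies["ASC alone"], strategies["ASC + universal MUS"]

icer = (alt["cost"] - base["cost"]) / (alt["qaly"] - base["qaly"])
print(f"ICER: ${icer:,.0f} per QALY gained")
```

    A strategy is then judged cost-effective when its ICER falls below a chosen willingness-to-pay threshold, and a strategy that is both more costly and less effective than an alternative (as preoperative urodynamics was here) is said to be dominated.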

  3. Thermodynamic modeling of transcription: sensitivity analysis differentiates biological mechanism from mathematical model-induced effects.

    PubMed

    Dresch, Jacqueline M; Liu, Xiaozhou; Arnosti, David N; Ay, Ahmet

    2010-10-24

    Quantitative models of gene expression generate parameter values that can shed light on biological features such as transcription factor activity, cooperativity, and local effects of repressors. An important element in such investigations is sensitivity analysis, which determines how strongly a model's output reacts to variations in parameter values. Parameters of low sensitivity may not be accurately estimated, leading to unwarranted conclusions. Low sensitivity may reflect the nature of the biological data, or it may be a result of the model structure. Here, we focus on the analysis of thermodynamic models, which have been used extensively to analyze gene transcription. Extracted parameter values have been interpreted biologically, but until now little attention has been given to parameter sensitivity in this context. We apply local and global sensitivity analyses to two recent transcriptional models to determine the sensitivity of individual parameters. We show that in one case, values for repressor efficiencies are very sensitive, while values for protein cooperativities are not, and provide insights into how these differential sensitivities stem from both biological effects and the structure of the applied models. In a second case, we demonstrate that parameters that were thought to prove the system's dependence on activator-activator cooperativity are relatively insensitive. We show that there are numerous parameter sets that do not satisfy the relationships proffered as the optimal solutions, indicating that structural differences between the two types of transcriptional enhancers analyzed may not be as simple as altered activator cooperativity. Our results emphasize the need for sensitivity analysis to examine model construction and forms of biological data used for modeling transcriptional processes, in order to determine the significance of estimated parameter values for thermodynamic models. Knowledge of parameter sensitivities can provide the necessary context to determine how modeling results should be interpreted in biological systems.
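    A hedged sketch of the kind of local sensitivity analysis described: normalized one-at-a-time derivatives of a toy Hill-type transcription model, estimated by central differences. The model form, parameter names, and values are illustrative assumptions, not the thermodynamic models analyzed in the paper.

```python
# Local (one-at-a-time) sensitivity of a toy thermodynamic-style
# expression model; all parameters and values are illustrative.
def expression(K=1.0, n=2.0, r=0.3, a=1.5):
    activation = a**n / (K**n + a**n)   # Hill-type activator occupancy
    return activation * (1.0 - r)       # repressor efficiency scales output

def local_sensitivity(param, h=1e-6):
    """Normalized sensitivity (dE/dp) * p / E via central differences."""
    base = {"K": 1.0, "n": 2.0, "r": 0.3}
    up, dn = dict(base), dict(base)
    up[param] += h
    dn[param] -= h
    dEdp = (expression(**up) - expression(**dn)) / (2 * h)
    return dEdp * base[param] / expression(**base)

for p in ("K", "n", "r"):
    print(p, round(local_sensitivity(p), 4))
```

    A parameter with a normalized sensitivity near zero at the fitted point is exactly the kind the abstract warns about: its estimated value is poorly constrained by the data and should not be over-interpreted.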

  4. Analysis of point-to-point lung motion with full inspiration and expiration CT data using non-linear optimization method: optimal geometric assumption model for the effective registration algorithm

    NASA Astrophysics Data System (ADS)

    Kim, Namkug; Seo, Joon Beom; Heo, Jeong Nam; Kang, Suk-Ho

    2007-03-01

    This study was conducted to develop a simple model for more robust lung registration of volumetric CT data, which is essential for various clinical lung analysis applications, including lung nodule matching in follow-up CT studies and semi-quantitative assessment of lung perfusion. The purpose of this study is to find the most effective reference point and geometric model based on lung motion analysis from CT data sets obtained at full inspiration (In.) and expiration (Ex.). Ten pairs of CT data sets from normal subjects obtained at full In. and Ex. were used in this study. Two radiologists were requested to draw 20 points representing the subpleural point of the central axis in each segment. The apex, hilar point, and center of inertia (COI) of each unilateral lung were proposed as reference points. To evaluate the optimal expansion point, non-linear optimization without constraints was employed. The objective function is the sum of distances from the optimal point x to the lines formed by the corresponding points between In. and Ex. Using the non-linear optimization, the optimal point was evaluated and compared between reference points. The average distance between the optimal point and each line segment revealed that the balloon model was more suitable for explaining lung expansion. This lung motion analysis, based on vector analysis and non-linear optimization, shows that a balloon model centered on the center of inertia of the lung is the most effective geometric model for explaining lung expansion during breathing.
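    The optimization step described above can be sketched as follows: find the point minimizing the summed distances to a set of 3-D lines, each line joining corresponding inspiration and expiration landmarks. The synthetic landmark data below stand in for the CT measurements.

```python
import numpy as np
from scipy.optimize import minimize

# Sketch of the objective: find the point x minimizing summed distances
# to 3-D lines through landmark pairs. Data are synthetic, not CT points.
rng = np.random.default_rng(2)
center = np.array([10.0, 20.0, 30.0])          # "true" expansion focus
points = center + rng.normal(0, 5, (20, 3))    # expiration landmarks
dirs = points - center                          # lines radiate from center
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)

def total_distance(x):
    d = points - x
    # distance from x to each line = norm of the component of d
    # perpendicular to the line's direction vector
    perp = d - (d * dirs).sum(axis=1, keepdims=True) * dirs
    return np.linalg.norm(perp, axis=1).sum()

res = minimize(total_distance, x0=points.mean(axis=0), method="Nelder-Mead",
               options={"xatol": 1e-6, "fatol": 1e-9, "maxiter": 5000})
print(res.x)   # converges toward the common intersection point
```

    Since the synthetic lines all pass through `center`, the minimizer recovers it; with real landmark lines that only approximately intersect, the residual objective value itself measures how well a single-focus "balloon" expansion model fits.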

  5. Responsive copolymers for enhanced petroleum recovery. Quarterly technical progress report, June 23--September 21, 1994

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCormick, C.; Hester, R.

    Summaries are given of the technical progress on three tasks of this project. Task 1, Monomer and polymer synthesis, discusses the preparation of 1(7-aminoheptyloxymethyl)naphthalene and poly(maleic anhydride-alt-ethyl vinyl ether). Task 2, Characterization of molecular structure, discusses terpolymer solution preparation, UV analysis, fluorescence analysis, low angle laser light scattering, and viscometry; the paper also discusses the effects of hydrophobic groups, the effect of pH, the effect of electrolyte addition, and photophysical studies. Task 3, Solution properties, describes the factorial experimental design for characterizing polymer solutions by light scattering, the light scattering test model, the orthogonal factorial test design, linear regression in coded space, confidence levels for coded-space test model coefficients, coefficients of the real-space test model, and surface analysis of the model equations.

  6. Stochastic Simulation Tool for Aerospace Structural Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Norman F.; Moore, David F.

    2006-01-01

    Stochastic simulation refers to incorporating the effects of design tolerances and uncertainties into the design analysis model and then determining their influence on the design. A high-level evaluation of one such stochastic simulation tool, the MSC.Robust Design tool by MSC.Software Corporation, has been conducted. This tool allows structural analysts to interrogate their structural design based on their mathematical description of the design problem using finite element analysis methods, and it leverages the analyst's prior investment in finite element model development of a particular design. The original finite element model is treated as the baseline structural analysis model for the stochastic simulations that are to be performed. A Monte Carlo approach is used by MSC.Robust Design to determine the effects of scatter in design input variables on response output parameters. The tool was not designed to provide a probabilistic assessment, but to assist engineers in understanding cause and effect. It is driven by a graphical user interface and retains the engineer-in-the-loop strategy for design evaluation and improvement. The application problem for the evaluation is a two-dimensional shell finite element model of a Space Shuttle wing leading-edge panel under re-entry aerodynamic loading. MSC.Robust Design adds value to the analysis effort by rapidly identifying the design input variables whose variability most influences the response output parameters.
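    A bare-bones version of the Monte Carlo idea described, using an illustrative cantilever deflection formula rather than the Shuttle wing leading-edge model: sample scatter in the design inputs, propagate it through the response function, and rank inputs by their influence on the output.

```python
import numpy as np

# Monte Carlo tolerance study in the spirit of the tool described.
# The response is an illustrative cantilever tip deflection, not the
# finite element panel model; all tolerances are invented.
rng = np.random.default_rng(3)
n = 50_000
L = rng.normal(1.00, 0.01, n)      # beam length, m (1% scatter)
E = rng.normal(70e9, 2e9, n)       # modulus, Pa
t = rng.normal(5e-3, 0.2e-3, n)    # thickness, m (response goes as t**-3)
P = 100.0                           # fixed tip load, N
w = 0.05                            # fixed width, m

I = w * t**3 / 12.0                 # second moment of area
defl = P * L**3 / (3.0 * E * I)     # tip deflection

# crude influence ranking: correlation of each input with the response
for name, x in [("L", L), ("E", E), ("t", t)]:
    print(name, round(np.corrcoef(x, defl)[0, 1], 3))
```

    Because deflection scales as t**-3, the thickness scatter dominates the output variability, which is the cause-and-effect insight such a tool is meant to surface for the engineer-in-the-loop.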

  7. An Analysis of Fixed Wing Tactical Airlifter Characteristics Using an Intra-Theater Airlift Computer Model

    DTIC Science & Technology

    1991-09-01

    Excerpted contents: Selection of an Experimental Design; Selection of Variables; Defining Measures of Effectiveness; Specification of Required Number of Replications; Modification of Scenario Files; Analysis of the Main Effects of a Two-Level Factorial Design; Analysis of the Interaction Effects of a Two-Level Factorial Design; Yates' Algorithm.

  8. Probabilistic Material Strength Degradation Model for Inconel 718 Components Subjected to High Temperature, Mechanical Fatigue, Creep and Thermal Fatigue Effects

    NASA Technical Reports Server (NTRS)

    Bast, Callie Corinne Scheidt

    1994-01-01

    This thesis presents the on-going development of methodology for a probabilistic material strength degradation model. The probabilistic model, in the form of a postulated randomized multifactor equation, provides for quantification of uncertainty in the lifetime material strength of aerospace propulsion system components subjected to a number of diverse random effects. This model is embodied in the computer program entitled PROMISS, which can include up to eighteen different effects. Presently, the model includes four effects that typically reduce lifetime strength: high temperature, mechanical fatigue, creep, and thermal fatigue. Statistical analysis was conducted on experimental Inconel 718 data obtained from the open literature. This analysis provided regression parameters for use as the model's empirical material constants, thus calibrating the model specifically for Inconel 718. Model calibration was carried out for four variables, namely, high temperature, mechanical fatigue, creep, and thermal fatigue. Methodology to estimate standard deviations of these material constants for input into the probabilistic material strength model was developed. Using the current version of PROMISS, entitled PROMISS93, a sensitivity study for the combined effects of mechanical fatigue, creep, and thermal fatigue was performed. Results, in the form of cumulative distribution functions, illustrated the sensitivity of lifetime strength to any current value of an effect. In addition, verification studies comparing a combination of mechanical fatigue and high temperature effects by model to the combination by experiment were conducted. Thus, for Inconel 718, the basic model assumption of independence between effects was evaluated. Results from this limited verification study strongly supported this assumption.
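    A hedged sketch of a randomized multiplicative multifactor strength equation of the PROMISS form, in which each effect contributes a term ((A_u - A)/(A_u - A_0))**a and randomness in the effect levels and empirical constants yields a distribution of lifetime strength. The constants and distributions below are invented for illustration, not the calibrated Inconel 718 values.

```python
import numpy as np

# Monte Carlo on a randomized multiplicative multifactor strength model.
# Each factor degrades strength as ((A_u - A) / (A_u - A_0))**a, where
# A is the current effect level, A_u its ultimate value, A_0 a reference,
# and a an empirical constant. All numbers below are invented.
rng = np.random.default_rng(4)
n = 100_000

def factor(A, A_u, A_0, a):
    return ((A_u - A) / (A_u - A_0))**a

temp = factor(rng.normal(900, 25, n),             # temperature effect
              A_u=1300, A_0=20, a=rng.normal(0.20, 0.02, n))
fatigue = factor(rng.normal(1e4, 1e3, n),         # mechanical fatigue cycles
                 A_u=1e6, A_0=1e2, a=rng.normal(0.15, 0.02, n))

ratio = temp * fatigue        # lifetime strength / reference strength
median = np.sort(ratio)[n // 2]
print(round(median, 3))       # median remaining strength fraction
```

    Sorting the sampled ratios gives exactly the kind of cumulative distribution function of lifetime strength that the thesis reports, and the multiplicative form embodies the independence-between-effects assumption the verification study tested.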

  9. Bayesian Covariate Selection in Mixed-Effects Models For Longitudinal Shape Analysis

    PubMed Central

    Muralidharan, Prasanna; Fishbaugh, James; Kim, Eun Young; Johnson, Hans J.; Paulsen, Jane S.; Gerig, Guido; Fletcher, P. Thomas

    2016-01-01

    The goal of longitudinal shape analysis is to understand how anatomical shape changes over time, in response to biological processes, including growth, aging, or disease. In many imaging studies, it is also critical to understand how these shape changes are affected by other factors, such as sex, disease diagnosis, IQ, etc. Current approaches to longitudinal shape analysis have focused on modeling age-related shape changes, but have not included the ability to handle covariates. In this paper, we present a novel Bayesian mixed-effects shape model that incorporates simultaneous relationships between longitudinal shape data and multiple predictors or covariates to the model. Moreover, we place an Automatic Relevance Determination (ARD) prior on the parameters, that lets us automatically select which covariates are most relevant to the model based on observed data. We evaluate our proposed model and inference procedure on a longitudinal study of Huntington's disease from PREDICT-HD. We first show the utility of the ARD prior for model selection in a univariate modeling of striatal volume, and next we apply the full high-dimensional longitudinal shape model to putamen shapes. PMID:28090246

  10. Regression analysis using dependent Polya trees.

    PubMed

    Schörgendorfer, Angela; Branscum, Adam J

    2013-11-30

    Many commonly used models for linear regression analysis force overly simplistic shape and scale constraints on the residual structure of data. We propose a semiparametric Bayesian model for regression analysis that produces data-driven inference by using a new type of dependent Polya tree prior to model arbitrary residual distributions that are allowed to evolve across increasing levels of an ordinal covariate (e.g., time, in repeated measurement studies). By modeling residual distributions at consecutive covariate levels or time points using separate, but dependent Polya tree priors, distributional information is pooled while allowing for broad pliability to accommodate many types of changing residual distributions. We can use the proposed dependent residual structure in a wide range of regression settings, including fixed-effects and mixed-effects linear and nonlinear models for cross-sectional, prospective, and repeated measurement data. A simulation study illustrates the flexibility of our novel semiparametric regression model to accurately capture evolving residual distributions. In an application to immune development data on immunoglobulin G antibodies in children, our new model outperforms several contemporary semiparametric regression models based on a predictive model selection criterion. Copyright © 2013 John Wiley & Sons, Ltd.

  11. Aurally-adequate time-frequency analysis for scattered sound in auditoria

    NASA Astrophysics Data System (ADS)

    Norris, Molly K.; Xiang, Ning; Kleiner, Mendel

    2005-04-01

    The goal of this work was to apply an aurally-adequate time-frequency analysis technique to the analysis of sound scattering effects in auditoria. Time-frequency representations were developed that take binaural hearing into account, with a specific implementation of the interaural cross-correlation process. A model of the human auditory system was implemented in the MATLAB platform based on two previous models [A. Härmä and K. Palomäki, HUTear, Espoo, Finland; and M. A. Akeroyd, A Binaural Cross-correlogram Toolbox for MATLAB (2001), University of Sussex, Brighton]. The model stages include proper frequency selectivity, the conversion of the mechanical motion of the basilar membrane to neural impulses, and binaural hearing effects. The model was then used in the analysis of room impulse responses with varying scattering characteristics. This paper discusses the analysis results using simulated and measured room impulse responses. [Work supported by the Frank H. and Eva B. Buck Foundation.]

  12. Cost-effectiveness of orthoptic screening in kindergarten: a decision-analytic model.

    PubMed

    König, H H; Barry, J C; Leidl, R; Zrenner, E

    2000-06-01

    The purpose of this study was to analyze the cost-effectiveness of orthoptic screening for amblyopia in kindergarten. A decision-analytic model was used. In this model all kindergarten children in Germany aged 3 years were examined by an orthoptist. Children with positive screening results were referred to an ophthalmologist for diagnosis. The number of newly diagnosed cases of amblyopia, amblyogenic non-obvious strabismus and amblyogenic refractive errors was used as the measure of effectiveness. Direct costs were measured from a third-party payer perspective. Data for model parameters were obtained from the literature and from our own measurements in kindergartens. A base analysis was performed using median parameter values. The influence of uncertain parameters was tested in sensitivity analyses. According to the base analysis, the cost of one orthoptic screening test was 7.87 euro. One ophthalmologic examination cost 36.40 euro. The total cost of the screening program in all kindergartens was 3.1 million euro. A total of 4,261 new cases would be detected. The cost-effectiveness ratio was 727 euro per case detected. Sensitivity analysis showed considerable influence of the prevalence rate of target conditions and of the specificity of the orthoptic examination on the cost-effectiveness ratio. This analysis provides information which is useful for discussion about the implementation of orthoptic screening and for planning a field study.
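    The decision-analytic arithmetic can be sketched as a one-stage screen-then-confirm tree yielding cost per detected case, with prevalence varied as in the sensitivity analysis. Parameter values below are illustrative assumptions chosen to be of the same order as the figures quoted above, not the study's actual inputs.

```python
# Screen-then-confirm decision tree: cost per newly detected case.
# All parameter values are illustrative, not the study's inputs.
def cost_per_case(n_children, prevalence, sensitivity, specificity,
                  cost_screen, cost_exam):
    diseased = n_children * prevalence
    true_pos = diseased * sensitivity               # referred and confirmed
    false_pos = (n_children - diseased) * (1 - specificity)  # referred in error
    total_cost = (n_children * cost_screen          # everyone is screened
                  + (true_pos + false_pos) * cost_exam)  # referrals examined
    return total_cost / true_pos

base = cost_per_case(300_000, 0.02, 0.80, 0.95, 7.87, 36.40)
low_prev = cost_per_case(300_000, 0.01, 0.80, 0.95, 7.87, 36.40)
print(round(base), round(low_prev))
```

    Halving the assumed prevalence roughly doubles the cost per detected case, which is why the abstract singles out prevalence (along with specificity) as the parameter driving the cost-effectiveness ratio.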

  13. FMCSA Safety Program Effectiveness Measurement: Carrier Intervention Effectiveness Model, Version 1.1-Report for FY 2014 Interventions - Analysis Brief

    DOT National Transportation Integrated Search

    2018-04-01

    The Carrier Intervention Effectiveness Model (CIEM) provides the Federal Motor Carrier Safety Administration (FMCSA) with a tool for measuring the safety benefits of carrier interventions conducted under the Compliance, Safety, Accountability (CSA) e...

  14. FMCSA safety program effectiveness measurement : carrier intervention effectiveness model, version 1.1 - report for FY 2013 interventions : analysis brief

    DOT National Transportation Integrated Search

    2017-04-01

    The Carrier Intervention Effectiveness Model (CIEM) provides the Federal Motor Carrier Safety Administration (FMCSA) with a tool for measuring the safety benefits of carrier interventions conducted under the Compliance, Safety, Accountability (CSA) e...

  15. Isolating the anthropogenic component of Arctic warming

    DOE PAGES

    Chylek, Petr; Hengartner, Nicholas; Lesins, Glen; ...

    2014-05-28

    Structural equation modeling is used in statistical applications as both confirmatory and exploratory modeling to test models and to suggest the most plausible explanation for a relationship between the independent and the dependent variables. Although structural analysis cannot prove causation, it can suggest the most plausible set of factors that influence the observed variable. Here, we apply structural model analysis to the annual mean Arctic surface air temperature from 1900 to 2012 to find the most effective set of predictors and to isolate the anthropogenic component of the recent Arctic warming by subtracting the effects of natural forcing and variability from the observed temperature. We find that anthropogenic greenhouse gas and aerosol radiative forcing and the Atlantic Multidecadal Oscillation internal mode dominate Arctic temperature variability. Finally, our structural model analysis of observational data suggests that about half of the recent Arctic warming of 0.64 K/decade may have anthropogenic causes.
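    A toy version of the regression decomposition described: regress a temperature series on anthropogenic and natural predictors, then read off the anthropogenic component from the fitted coefficients. The series below are synthetic stand-ins for the observational records, with invented coefficients.

```python
import numpy as np

# Synthetic stand-ins for the 1900-2012 predictors (not the real records):
# a smooth anthropogenic forcing proxy and a multidecadal oscillation proxy.
rng = np.random.default_rng(5)
years = np.arange(1900, 2013)
ghg = (years - 1900) / 112.0                   # anthropogenic forcing proxy
amo = np.sin(2 * np.pi * (years - 1900) / 65)  # AMO-like internal mode proxy
temp = 0.8 * ghg + 0.3 * amo + rng.normal(0, 0.1, years.size)

# Least-squares fit, then isolate the anthropogenic component
X = np.column_stack([np.ones_like(ghg), ghg, amo])
beta, *_ = np.linalg.lstsq(X, temp, rcond=None)
anthro = beta[1] * ghg                         # fitted anthropogenic part
residual_warming = temp - beta[0] - beta[2] * amo  # natural mode removed
print(round(beta[1], 2))                       # recovers the planted ~0.8
```

    Subtracting the fitted natural terms from the observed series, as in the last line above, is the arithmetic behind the paper's attribution of roughly half the recent warming trend to anthropogenic causes.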

  16. Aircraft High-Lift Aerodynamic Analysis Using a Surface-Vorticity Solver

    NASA Technical Reports Server (NTRS)

    Olson, Erik D.; Albertson, Cindy W.

    2016-01-01

    This study extends an existing semi-empirical approach to high-lift analysis by examining its effectiveness for use with a three-dimensional aerodynamic analysis method. The aircraft high-lift geometry is modeled in Vehicle Sketch Pad (OpenVSP) using a newly-developed set of techniques for building a three-dimensional model of the high-lift geometry, and for controlling flap deflections using scripted parameter linking. Analysis of the low-speed aerodynamics is performed in FlightStream, a novel surface-vorticity solver that is expected to be substantially more robust and stable compared to pressure-based potential-flow solvers and less sensitive to surface perturbations. The calculated lift curve and drag polar are modified by an empirical lift-effectiveness factor that takes into account the effects of viscosity that are not captured in the potential-flow solution. Analysis results are validated against wind-tunnel data for The Energy-Efficient Transport AR12 low-speed wind-tunnel model, a 12-foot, full-span aircraft configuration with a supercritical wing, full-span slats, and part-span double-slotted flaps.

  17. Social comparison and perceived breach of psychological contract: their effects on burnout in a multigroup analysis.

    PubMed

    Cantisano, Gabriela Topa; Domínguez, J Francisco Morales; García, J Luis Caeiro

    2007-05-01

    This study focuses on the mediator role of social comparison in the relationship between perceived breach of psychological contract and burnout. A previous model showing the hypothesized effects of perceived breach on burnout, both direct and mediated, is proposed. The final model reached an optimal fit to the data and was confirmed through multigroup analysis using a sample of Spanish teachers (N = 401) belonging to preprimary, primary, and secondary schools. Multigroup analyses showed that the model fit all groups adequately.
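    The mediated path can be quantified with the standard product-of-coefficients approach, sketched below on simulated data for the hypothesized chain (perceived breach -> social comparison -> burnout); the path values are invented and this is not the paper's SEM fit.

```python
import numpy as np

# Product-of-coefficients mediation sketch on simulated data.
# Planted paths: a = 0.5 (breach -> comparison), b = 0.6 (comparison ->
# burnout), c' = 0.2 (direct effect). These values are invented.
rng = np.random.default_rng(6)
n = 2000
breach = rng.normal(size=n)
comparison = 0.5 * breach + rng.normal(scale=0.8, size=n)
burnout = 0.6 * comparison + 0.2 * breach + rng.normal(scale=0.8, size=n)

def coefs(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

# a path: mediator regressed on predictor
a = coefs(np.column_stack([np.ones(n), breach]), comparison)[1]
# b path: outcome regressed on mediator, controlling for predictor
b = coefs(np.column_stack([np.ones(n), comparison, breach]), burnout)[1]
indirect = a * b
print(round(indirect, 2))   # ≈ 0.5 * 0.6 = 0.30
```

    In the multigroup analysis, the same indirect effect is estimated separately in each teacher subsample (preprimary, primary, secondary) and the model fit is compared across groups.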

  18. "Home Made" Model to Study the Greenhouse Effect and Global Warming

    ERIC Educational Resources Information Center

    Onorato, P.; Mascheretti, P.; DeAmbrosis, A.

    2011-01-01

    In this paper a simplified two-parameter model of the greenhouse effect on the Earth is developed, starting from the well known two-layer model. It allows both the analysis of the temperatures of the inner planets, by focusing on the role of the greenhouse effect, and a comparison between the temperatures the planets should have in the absence of…

  19. A COST-EFFECTIVENESS MODEL FOR THE ANALYSIS OF TITLE I ESEA PROJECT PROPOSALS, PART I-VII.

    ERIC Educational Resources Information Center

    ABT, CLARK C.

    SEVEN SEPARATE REPORTS DESCRIBE AN OVERVIEW OF A COST-EFFECTIVENESS MODEL AND FIVE SUBMODELS FOR EVALUATING THE EFFECTIVENESS OF ELEMENTARY AND SECONDARY ACT TITLE I PROPOSALS. THE DESIGN FOR THE MODEL ATTEMPTS A QUANTITATIVE DESCRIPTION OF EDUCATION SYSTEMS WHICH MAY BE PROGRAMED AS A COMPUTER SIMULATION TO INDICATE THE IMPACT OF A TITLE I…

  20. Direct and Indirect Effects of Parental Influence upon Adolescent Alcohol Use: A Structural Equation Modeling Analysis

    ERIC Educational Resources Information Center

    Kim, Young-Mi; Neff, James Alan

    2010-01-01

    A model incorporating the direct and indirect effects of parental monitoring on adolescent alcohol use was evaluated by applying structural equation modeling (SEM) techniques to data on 4,765 tenth-graders in the 2001 Monitoring the Future Study. Analyses indicated good fit of hypothesized measurement and structural models. Analyses supported both…

  1. Correspondence between Traditional Models of Functional Analysis and a Functional Analysis of Manding Behavior

    ERIC Educational Resources Information Center

    LaRue, Robert H.; Sloman, Kimberly N.; Weiss, Mary Jane; Delmolino, Lara; Hansford, Amy; Szalony, Jill; Madigan, Ryan; Lambright, Nathan M.

    2011-01-01

    Functional analysis procedures have been effectively used to determine the maintaining variables for challenging behavior and subsequently develop effective interventions. However, fear of evoking dangerous topographies of maladaptive behavior and concerns for reinforcing infrequent maladaptive behavior present challenges for people working in…

  2. Alternative Measures of Between-Study Heterogeneity in Meta-Analysis: Reducing the Impact of Outlying Studies

    PubMed Central

    Lin, Lifeng; Chu, Haitao; Hodges, James S.

    2016-01-01

    Meta-analysis has become a widely used tool to combine results from independent studies. The collected studies are homogeneous if they share a common underlying true effect size; otherwise, they are heterogeneous. A fixed-effect model is customarily used when the studies are deemed homogeneous, while a random-effects model is used for heterogeneous studies. Assessing heterogeneity in meta-analysis is critical for model selection and decision making. Ideally, if heterogeneity is present, it should permeate the entire collection of studies, instead of being limited to a small number of outlying studies. Outliers can have great impact on conventional measures of heterogeneity and the conclusions of a meta-analysis. However, no widely accepted guidelines exist for handling outliers. This article proposes several new heterogeneity measures. In the presence of outliers, the proposed measures are less affected than the conventional ones. The performance of the proposed and conventional heterogeneity measures is compared theoretically, by studying their asymptotic properties, and empirically, using simulations and case studies. PMID:27167143
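    For reference, the conventional measures that the article's proposals are compared against, Cochran's Q and Higgins' I^2, can be computed as below; the toy data include one outlying study to show how strongly a single outlier inflates both.

```python
import numpy as np

# Conventional heterogeneity measures on toy meta-analysis data:
# effect estimates y with within-study variances v; the last study
# is a deliberate outlier.
y = np.array([0.10, 0.12, 0.08, 0.11, 0.95])
v = np.array([0.010, 0.020, 0.015, 0.012, 0.020])

w = 1.0 / v                                   # inverse-variance weights
mu_fixed = np.sum(w * y) / np.sum(w)          # fixed-effect pooled estimate
Q = np.sum(w * (y - mu_fixed)**2)             # Cochran's Q
df = y.size - 1
I2 = max(0.0, (Q - df) / Q) * 100             # Higgins' I^2, percent
print(round(Q, 1), round(I2, 1))
```

    Dropping the fifth study collapses Q to near its degrees of freedom and I^2 toward zero, which illustrates the article's point: with conventional measures, apparent heterogeneity can be an artifact of one outlying study rather than a property of the whole collection.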

  3. Multivariate Longitudinal Analysis with Bivariate Correlation Test

    PubMed Central

    Adjakossa, Eric Houngla; Sadissou, Ibrahim; Hounkonnou, Mahouton Norbert; Nuel, Gregory

    2016-01-01

    In the context of multivariate multilevel data analysis, this paper focuses on the multivariate linear mixed-effects model, including all the correlations between the random effects when the dimensional residual terms are assumed uncorrelated. Using the EM algorithm, we suggest more general expressions of the model's parameter estimators. These estimators can be used in the framework of multivariate longitudinal data analysis as well as in the more general context of the analysis of multivariate multilevel data. By using a likelihood ratio test, we test the significance of the correlations between the random effects of two dependent variables of the model, in order to investigate whether or not it is useful to model these dependent variables jointly. Simulation studies are done to assess both the parameter recovery performance of the EM estimators and the power of the test. Using two empirical data sets, of longitudinal multivariate and multivariate multilevel type respectively, the usefulness of the test is illustrated. PMID:27537692
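
    The joint-modelling decision described rests on a likelihood ratio test; for a single tested parameter (e.g. one random-effect correlation) the statistic 2(ℓ_full − ℓ_null) is compared to a χ²(1) distribution, which needs no special library since the χ²(1) survival function is erfc(√(x/2)). A generic sketch (the log-likelihood values below are made up; in practice they come from fitting the two mixed models with and without the correlation):

```python
import math

def lrt_pvalue_1df(loglik_null, loglik_full):
    """Likelihood ratio test with one extra parameter:
    2*(ll_full - ll_null) ~ chi-squared(1) under the null.
    chi2(1) survival function: P(X > x) = erfc(sqrt(x / 2))."""
    stat = 2.0 * (loglik_full - loglik_null)
    p = math.erfc(math.sqrt(max(stat, 0.0) / 2.0))
    return stat, p

# hypothetical maximised log-likelihoods of the two nested models
stat, p = lrt_pvalue_1df(loglik_null=-100.0, loglik_full=-98.0795)
print(round(stat, 3), round(p, 3))  # → 3.841 0.05
```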

  4. Atmospheric model development in support of SEASAT. Volume 1: Summary of findings

    NASA Technical Reports Server (NTRS)

    Kesel, P. G.

    1977-01-01

    Atmospheric analysis and prediction models of varying (grid) resolution were developed. The models were tested using real observational data for the purpose of assessing the impact of grid resolution on short-range numerical weather prediction. The discretionary model procedures were examined so that the computational viability of SEASAT data might be enhanced during the conduct of (future) sensitivity tests. The analysis effort covers: (1) examining the procedures for allowing data to influence the analysis; (2) examining the effects of varying the weights in the analysis procedure; (3) testing and implementing procedures for solving the minimization equation in an optimal way; (4) describing the impact of grid resolution on analysis; and (5) devising and implementing numerous practical solutions to general analysis problems.

  5. Design and analysis of forward and reverse models for predicting defect accumulation, defect energetics, and irradiation conditions

    DOE PAGES

    Stewart, James A.; Kohnert, Aaron A.; Capolungo, Laurent; ...

    2018-03-06

    The complexity of radiation effects in a material's microstructure makes developing predictive models a difficult task. In principle, a complete list of all possible reactions between defect species being considered can be used to elucidate damage evolution mechanisms and its associated impact on microstructure evolution. However, a central limitation is that many models use a limited and incomplete catalog of defect energetics and associated reactions. Even for a given model, estimating its input parameters remains a challenge, especially for complex material systems. Here, we present a computational analysis to identify the extent to which defect accumulation, energetics, and irradiation conditions can be determined via forward and reverse regression models constructed and trained from large data sets produced by cluster dynamics simulations. A global sensitivity analysis, via Sobol' indices, concisely characterizes parameter sensitivity and demonstrates how this can be connected to variability in defect evolution. Based on this analysis and depending on the definition of what constitutes the input and output spaces, forward and reverse regression models are constructed and allow for the direct calculation of defect accumulation, defect energetics, and irradiation conditions. Here, this computational analysis, exercised on a simplified cluster dynamics model, demonstrates the ability to design predictive surrogate and reduced-order models, and provides guidelines for improving model predictions within the context of forward and reverse engineering of mathematical models for radiation effects in a material's microstructure.

  6. Design and analysis of forward and reverse models for predicting defect accumulation, defect energetics, and irradiation conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, James A.; Kohnert, Aaron A.; Capolungo, Laurent

    The complexity of radiation effects in a material's microstructure makes developing predictive models a difficult task. In principle, a complete list of all possible reactions between defect species being considered can be used to elucidate damage evolution mechanisms and its associated impact on microstructure evolution. However, a central limitation is that many models use a limited and incomplete catalog of defect energetics and associated reactions. Even for a given model, estimating its input parameters remains a challenge, especially for complex material systems. Here, we present a computational analysis to identify the extent to which defect accumulation, energetics, and irradiation conditions can be determined via forward and reverse regression models constructed and trained from large data sets produced by cluster dynamics simulations. A global sensitivity analysis, via Sobol' indices, concisely characterizes parameter sensitivity and demonstrates how this can be connected to variability in defect evolution. Based on this analysis and depending on the definition of what constitutes the input and output spaces, forward and reverse regression models are constructed and allow for the direct calculation of defect accumulation, defect energetics, and irradiation conditions. Here, this computational analysis, exercised on a simplified cluster dynamics model, demonstrates the ability to design predictive surrogate and reduced-order models, and provides guidelines for improving model predictions within the context of forward and reverse engineering of mathematical models for radiation effects in a material's microstructure.

  7. Error Analysis for High Resolution Topography with Bi-Static Single-Pass SAR Interferometry

    NASA Technical Reports Server (NTRS)

    Muellerschoen, Ronald J.; Chen, Curtis W.; Hensley, Scott; Rodriguez, Ernesto

    2006-01-01

    We present a flow-down error analysis from the radar system to topographic height errors for bi-static single-pass SAR interferometry for a satellite tandem pair. Because orbital dynamics cause the baseline length and baseline orientation to evolve spatially and temporally, the height accuracy of the system is modeled as a function of the spacecraft position and ground location. Vector sensitivity equations of height and the planar error components due to metrology, media effects, and radar system errors are derived and evaluated globally for a baseline mission. Included in the model are terrain effects that contribute to layover and shadow, and slope effects on height errors. The analysis also accounts for non-overlapping spectra and the non-overlapping bandwidth due to differences between the two platforms' viewing geometries. The model is applied to a 514 km altitude, 97.4 degree inclination tandem satellite mission with a 300 m baseline separation and X-band SAR. Results from our model indicate that global DTED level 3 can be achieved.

  8. A Novel Ontology Approach to Support Design for Reliability considering Environmental Effects

    PubMed Central

    Sun, Bo; Li, Yu; Ye, Tianyuan

    2015-01-01

    Environmental effects are not considered sufficiently in product design, and the reliability problems they cause are very prominent. This paper proposes a method to apply an ontology approach in product design, so that environmental effects knowledge can be reused during product reliability design and analysis. First, the relationship between environmental effects and product reliability is analyzed. Then an environmental effects ontology is designed to describe environmental effects domain knowledge, and related concepts of environmental effects are formally defined using the ontology approach. This model can be applied to organize environmental effects knowledge in different environments. Finally, rubber seals used in a subhumid acid-rain environment are taken as an example to illustrate application of the ontological model to reliability design and analysis. PMID:25821857

  9. A novel ontology approach to support design for reliability considering environmental effects.

    PubMed

    Sun, Bo; Li, Yu; Ye, Tianyuan; Ren, Yi

    2015-01-01

    Environmental effects are not considered sufficiently in product design, and the reliability problems they cause are very prominent. This paper proposes a method to apply an ontology approach in product design, so that environmental effects knowledge can be reused during product reliability design and analysis. First, the relationship between environmental effects and product reliability is analyzed. Then an environmental effects ontology is designed to describe environmental effects domain knowledge, and related concepts of environmental effects are formally defined using the ontology approach. This model can be applied to organize environmental effects knowledge in different environments. Finally, rubber seals used in a subhumid acid-rain environment are taken as an example to illustrate application of the ontological model to reliability design and analysis.

  10. Functional Data Analysis in NTCP Modeling: A New Method to Explore the Radiation Dose-Volume Effects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benadjaoud, Mohamed Amine, E-mail: mohamedamine.benadjaoud@gustaveroussy.fr; Université Paris sud, Le Kremlin-Bicêtre; Institut Gustave Roussy, Villejuif

    2014-11-01

    Purpose/Objective(s): To describe a novel method to explore radiation dose-volume effects. Functional data analysis is used to investigate the information contained in differential dose-volume histograms. The method is applied to the normal tissue complication probability modeling of rectal bleeding (RB) for patients irradiated in the prostatic bed by 3-dimensional conformal radiation therapy. Methods and Materials: Kernel density estimation was used to estimate the individual probability density functions from each of the 141 rectum differential dose-volume histograms. Functional principal component analysis was performed on the estimated probability density functions to explore the variation modes in the dose distribution. The functional principal components were then tested for association with RB using logistic regression adapted to functional covariates (FLR). For comparison, 3 other normal tissue complication probability models were considered: the Lyman-Kutcher-Burman model, logistic model based on standard dosimetric parameters (LM), and logistic model based on multivariate principal component analysis (PCA). Results: The incidence rate of grade ≥2 RB was 14%. V65Gy was the most predictive factor for the LM (P=.058). The best fit for the Lyman-Kutcher-Burman model was obtained with n=0.12, m = 0.17, and TD50 = 72.6 Gy. In PCA and FLR, the components that describe the interdependence between the relative volumes exposed at intermediate and high doses were the most correlated to the complication. The FLR parameter function leads to a better understanding of the volume effect by including the treatment specificity in the delivered mechanistic information. For RB grade ≥2, patients with advanced age are significantly at risk (odds ratio, 1.123; 95% confidence interval, 1.03-1.22), and the fits of the LM, PCA, and functional principal component analysis models are significantly improved by including this clinical factor. Conclusion: Functional data analysis provides an attractive method for flexibly estimating the dose-volume effect for normal tissues in external radiation therapy.
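
    The Lyman-Kutcher-Burman fit reported above (n=0.12, m=0.17, TD50=72.6 Gy) is compact enough to sketch: reduce the differential DVH to a generalized equivalent uniform dose, then apply a probit. The two-bin DVH in the usage line is purely illustrative, not from the paper:

```python
import math

def geud(doses, volumes, n):
    """Generalized equivalent uniform dose from a differential DVH
    (doses in Gy, volumes as fractions; n is the volume parameter)."""
    total = sum(volumes)
    return sum(v / total * d ** (1.0 / n) for d, v in zip(doses, volumes)) ** n

def lkb_ntcp(doses, volumes, n, m, td50):
    """LKB NTCP: standard normal CDF of t = (gEUD - TD50) / (m * TD50)."""
    t = (geud(doses, volumes, n) - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# hypothetical DVH: half the rectal volume at 60 Gy, half at 70 Gy
p = lkb_ntcp([60.0, 70.0], [0.5, 0.5], n=0.12, m=0.17, td50=72.6)
```

    A small n drives the gEUD toward the maximum dose, reflecting serial-organ behaviour such as rectal bleeding.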

  11. A ferrofluid based energy harvester: Computational modeling, analysis, and experimental validation

    NASA Astrophysics Data System (ADS)

    Liu, Qi; Alazemi, Saad F.; Daqaq, Mohammed F.; Li, Gang

    2018-03-01

    A computational model is described and implemented in this work to analyze the performance of a ferrofluid-based electromagnetic energy harvester. The energy harvester converts ambient vibratory energy into an electromotive force through a sloshing motion of a ferrofluid. The computational model solves the coupled Maxwell's equations and Navier-Stokes equations for the dynamic behavior of the magnetic field and fluid motion. The model is validated against experimental results for eight different configurations of the system. The validated model is then employed to study the underlying mechanisms that determine the electromotive force of the energy harvester. Furthermore, computational analysis is performed to test the effect of several modeling aspects, such as three-dimensional effects, surface tension, and the type of ferrofluid-magnetic field coupling, on the accuracy of the model prediction.

  12. The revelation effect: A meta-analytic test of hypotheses.

    PubMed

    Aßfalg, André; Bernstein, Daniel M; Hockley, William

    2017-12-01

    Judgments can depend on the activity directly preceding them. An example is the revelation effect whereby participants are more likely to claim that a stimulus is familiar after a preceding task, such as solving an anagram, than without a preceding task. We test conflicting predictions of four revelation-effect hypotheses in a meta-analysis of 26 years of revelation-effect research. The hypotheses' predictions refer to three subject areas: (1) the basis of judgments that are subject to the revelation effect (recollection vs. familiarity vs. fluency), (2) the degree of similarity between the task and test item, and (3) the difficulty of the preceding task. We use a hierarchical multivariate meta-analysis to account for dependent effect sizes and variance in experimental procedures. We test the revelation-effect hypotheses with a model selection procedure, where each model corresponds to a prediction of a revelation-effect hypothesis. We further quantify the amount of evidence for one model compared to another with Bayes factors. The results of this analysis suggest that none of the extant revelation-effect hypotheses can fully account for the data. The general vagueness of revelation-effect hypotheses and the scarcity of data were the major limiting factors in our analyses, emphasizing the need for formalized theories and further research into the puzzling revelation effect.

  13. THE INFLUENCE OF MODEL TIME STEP ON THE RELATIVE SENSITIVITY OF POPULATION GROWTH TO SURVIVAL, GROWTH AND REPRODUCTION

    EPA Science Inventory

    Matrix population models are often used to extrapolate from life stage-specific stressor effects on survival and reproduction to population-level effects. Demographic elasticity analysis of a matrix model allows an evaluation of the relative sensitivity of population growth rate ...
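
    The elasticity analysis mentioned can be sketched numerically: find the dominant eigenvalue λ (the population growth rate) of the projection matrix by power iteration, then form e_ij = (a_ij/λ)·∂λ/∂a_ij by finite differences. The two-stage projection matrix below is hypothetical, not from the EPA work:

```python
def growth_rate(A, iters=200):
    """Dominant eigenvalue (population growth rate) by power iteration,
    valid for a nonnegative primitive projection matrix."""
    n = len(A)
    v = [1.0] * n
    lam = 1.0
    for _ in range(iters):
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(w)                 # at convergence, A v = lam v with max(v) = 1
        v = [x / lam for x in w]
    return lam

def elasticities(A, h=1e-6):
    """e_ij = (a_ij / lambda) * d(lambda)/d(a_ij), by central differences."""
    lam = growth_rate(A)
    E = []
    for i in range(len(A)):
        row = []
        for j in range(len(A)):
            if A[i][j] == 0.0:
                row.append(0.0)      # zero entries contribute zero elasticity
                continue
            Ap = [r[:] for r in A]
            Am = [r[:] for r in A]
            Ap[i][j] += h
            Am[i][j] -= h
            dlam = (growth_rate(Ap) - growth_rate(Am)) / (2 * h)
            row.append(A[i][j] / lam * dlam)
        E.append(row)
    return E

# hypothetical 2-stage matrix: row 0 = fecundities, row 1 = survival rates
A = [[0.0, 1.5],
     [0.5, 0.8]]
```

    Elasticities of the dominant eigenvalue always sum to 1, which makes a handy correctness check on the implementation.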

  14. Learning Molecular Behaviour May Improve Student Explanatory Models of the Greenhouse Effect

    ERIC Educational Resources Information Center

    Harris, Sara E.; Gold, Anne U.

    2018-01-01

    We assessed undergraduates' representations of the greenhouse effect, based on student-generated concept sketches, before and after a 30-min constructivist lesson. Principal component analysis of features in student sketches revealed seven distinct and coherent explanatory models including a new "Molecular Details" model. After the…

  15. Linear Instability Analysis of non-uniform Bubbly Mixing layer with Two-Fluid model

    NASA Astrophysics Data System (ADS)

    Sharma, Subash; Chetty, Krishna; Lopez de Bertodano, Martin

    We examine the inviscid instability of a non-uniform adiabatic bubbly shear layer with a Two-Fluid model. The Two-Fluid model is made well-posed with closure relations for the interfacial forces. First, a characteristic analysis is carried out to study the well-posedness of the model over a range of void fractions, with interfacial forces for virtual mass, interfacial drag, and interfacial pressure. A dispersion analysis then allows us to obtain the growth rate and wavelength. Then, the well-posed two-fluid model is solved using CFD to validate the results obtained with the linear stability analysis. The effect of the void fraction and the distribution profile on stability is analyzed.

  16. Box truss analysis and technology development. Task 1: Mesh analysis and control

    NASA Technical Reports Server (NTRS)

    Bachtell, E. E.; Bettadapur, S. S.; Coyner, J. V.

    1985-01-01

    An analytical tool was developed to model, analyze and predict RF performance of box truss antennas with reflective mesh surfaces. The analysis system is unique in that it integrates custom written programs for cord tied mesh surfaces, thereby drastically reducing the cost of analysis. The analysis system is capable of determining the RF performance of antennas under any type of manufacturing or operating environment by integrating together the various disciplines of design, finite element analysis, surface best fit analysis and RF analysis. The Integrated Mesh Analysis System consists of six separate programs: The Mesh Tie System Model Generator, The Loadcase Generator, The Model Optimizer, The Model Solver, The Surface Topography Solver and The RF Performance Solver. Additionally, a study using the mesh analysis system was performed to determine the effect of on orbit calibration, i.e., surface adjustment, on a typical box truss antenna.

  17. Long non-coding RNA CCAT1 as a diagnostic and prognostic molecular marker in various cancers: a meta-analysis.

    PubMed

    Zhang, Zhihui; Xie, Haibiao; Liang, Daqiang; Huang, Lanbing; Liang, Feiguo; Qi, Qiang; Yang, Xinjian

    2018-05-04

    Long non-coding RNA colon cancer-associated transcript-1 (CCAT1) is newly found to be related to the diagnosis and prognosis of cancer. This meta-analysis was performed to investigate the relationship between CCAT1 expression and clinical parameters, including survival condition, lymph node metastasis and tumor node metastasis grade. The primary literature was collected through initial search criteria from electronic databases, including PubMed, OVID Evidence-Based Medicine Reviews and others (up to May 12, 2017). Eligible studies were identified and selected by the inclusion and exclusion criteria. Data were extracted and pooled as hazard ratios (HRs) for the assessment of overall survival; subgroup analyses were prespecified based on digestive tract cancer or others. Analysis of different CCAT1 expression related to lymph node metastasis or tumor node metastasis grade was conducted. Risk of bias was assessed by the Newcastle-Ottawa Scale. 9 studies were included. This meta-analysis showed that high CCAT1 expression level was related to poor overall survival; the pooled HR was 2.42 (95% confidence interval, CI: 1.86-3.16; P < 0.001; fixed-effects model), similarly in the cancer type subgroups: digestive tract cancer (HR, 2.42; 95% CI, 1.79-3.29; P < 0.001; fixed-effects model) and others (HR, 2.42; 95% CI, 1.42-4.13; P = 0.001; fixed-effects model). The analysis showed that high CCAT1 was strongly related to positive lymph node metastasis (odds ratio, OR: 3.24; 95% CI, 2.04-5.16; P < 0.001; fixed-effects model) and high tumor node metastasis stage (OR, 3.87; 95% CI, 2.53-5.92; P < 0.001; fixed-effects model). In conclusion, this meta-analysis revealed that CCAT1 had potential as a diagnostic and prognostic biomarker in various cancers.
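
    The fixed-effects pooling used throughout this meta-analysis is inverse-variance weighting of log hazard ratios, with each study's standard error recovered from its reported 95% CI. A sketch with made-up study values (not the paper's data):

```python
import math

def pool_hr_fixed(hrs, ci_lows, ci_highs):
    """Fixed-effects (inverse-variance) pooling of hazard ratios.
    SE of each log-HR is recovered as (ln(upper) - ln(lower)) / (2 * 1.96)."""
    log_hrs = [math.log(h) for h in hrs]
    ses = [(math.log(u) - math.log(l)) / (2 * 1.96)
           for l, u in zip(ci_lows, ci_highs)]
    weights = [1.0 / se ** 2 for se in ses]
    pooled = sum(w * y for w, y in zip(weights, log_hrs)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    # back-transform the pooled log-HR and its 95% CI
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se_pooled),
            math.exp(pooled + 1.96 * se_pooled))

# two hypothetical studies reporting HR (95% CI)
hr, lo, hi = pool_hr_fixed([2.1, 2.8], [1.4, 1.7], [3.15, 4.61])
```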

  18. Dynaflow User’s Guide

    DTIC Science & Technology

    1988-11-01

    …stress soil model will provide a tool for such analysis of waterfront structures. To understand the significance of liquefaction, it is important to note… Implementing this effective stress soil model into a finite element computer program would allow analysis of soil and structure together.

  19. Effective moisture penetration depth model for residential buildings: Sensitivity analysis and guidance on model inputs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woods, Jason; Winkler, Jon

    Moisture buffering of building materials has a significant impact on the building's indoor humidity, and building energy simulations need to model this buffering to accurately predict the humidity. Researchers requiring a simple moisture-buffering approach typically rely on the effective-capacitance model, which has been shown to be a poor predictor of actual indoor humidity. This paper describes an alternative two-layer effective moisture penetration depth (EMPD) model and its inputs. While this model has been used previously, there is a need to understand the sensitivity of this model to uncertain inputs. In this paper, we use the moisture-adsorbent materials exposed to the interior air: drywall, wood, and carpet. We use a global sensitivity analysis to determine which inputs are most influential and how the model's prediction capability degrades due to uncertainty in these inputs. We then compare the model's humidity prediction with measured data from five houses, which shows that this model, and a set of simple inputs, can give reasonable prediction of the indoor humidity.

  20. Effective moisture penetration depth model for residential buildings: Sensitivity analysis and guidance on model inputs

    DOE PAGES

    Woods, Jason; Winkler, Jon

    2018-01-31

    Moisture buffering of building materials has a significant impact on the building's indoor humidity, and building energy simulations need to model this buffering to accurately predict the humidity. Researchers requiring a simple moisture-buffering approach typically rely on the effective-capacitance model, which has been shown to be a poor predictor of actual indoor humidity. This paper describes an alternative two-layer effective moisture penetration depth (EMPD) model and its inputs. While this model has been used previously, there is a need to understand the sensitivity of this model to uncertain inputs. In this paper, we use the moisture-adsorbent materials exposed to the interior air: drywall, wood, and carpet. We use a global sensitivity analysis to determine which inputs are most influential and how the model's prediction capability degrades due to uncertainty in these inputs. We then compare the model's humidity prediction with measured data from five houses, which shows that this model, and a set of simple inputs, can give reasonable prediction of the indoor humidity.

  1. A model for sequential decoding overflow due to a noisy carrier reference. [communication performance prediction

    NASA Technical Reports Server (NTRS)

    Layland, J. W.

    1974-01-01

    An approximate analysis of the effect of a noisy carrier reference on the performance of sequential decoding is presented. The analysis uses previously developed techniques for evaluating noisy reference performance for medium-rate uncoded communications adapted to sequential decoding for data rates of 8 to 2048 bits/s. In estimating the 10^-4 deletion probability thresholds for Helios, the model agrees with experimental data to within the experimental tolerances. The computational problem involved in sequential decoding, carrier loop effects, the main characteristics of the medium-rate model, modeled decoding performance, and perspectives on future work are discussed.

  2. Modeling Longitudinal Data with Generalized Additive Models: Applications to Single-Case Designs

    ERIC Educational Resources Information Center

    Sullivan, Kristynn J.; Shadish, William R.

    2013-01-01

    Single case designs (SCDs) are short time series that assess intervention effects by measuring units repeatedly over time both in the presence and absence of treatment. For a variety of reasons, interest in the statistical analysis and meta-analysis of these designs has been growing in recent years. This paper proposes modeling SCD data with…

  3. A recessive genetic model and runs of homozygosity in major depressive disorder

    PubMed Central

    Power, Robert A.; Keller, Matthew C.; Ripke, Stephan; Abdellaoui, Abdel; Wray, Naomi R.; Sullivan, Patrick F; Breen, Gerome

    2014-01-01

    Genome-wide association studies (GWASs) of major depressive disorder (MDD) have yet to identify variants that surpass the threshold for genome-wide significance. A recent study reported that runs of homozygosity (ROH) are associated with schizophrenia, reflecting a novel genetic risk factor resulting from increased parental relatedness and recessive genetic effects. Here we undertake an analysis of ROH for MDD using the 9,238 MDD cases and 9,521 controls reported in a recent mega-analysis of 9 GWAS. Since evidence for association with ROH could reflect a recessive mode of action at loci, we also conducted a genome-wide association analysis under a recessive model. The genome-wide association analysis using a recessive model found no significant associations. Our analysis of ROH suggested that there was significant heterogeneity of effect across studies (p=0.001), associated with genotyping platform and country of origin. The results of the ROH analysis show that differences across studies can lead to conflicting systematic genome-wide differences between cases and controls that are unaccounted for by traditional covariates. They highlight the sensitivity of the ROH method to spurious associations, and the need to carefully control for potential confounds in such analyses. We found no strong evidence for a recessive model underlying MDD. PMID:24482242

  4. Studies in astronomical time series analysis. I - Modeling random processes in the time domain

    NASA Technical Reports Server (NTRS)

    Scargle, J. D.

    1981-01-01

    Several random process models in the time domain are defined and discussed. Attention is given to the moving average model, the autoregressive model, and relationships between and combinations of these models. Consideration is then given to methods for investigating pulse structure, procedures of model construction, computational methods, and numerical experiments. A FORTRAN algorithm of time series analysis has been developed which is relatively stable numerically. Results of test cases are given to study the effect of adding noise and of different distributions for the pulse amplitudes. A preliminary analysis of the light curve of the quasar 3C 273 is considered as an example.
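
    Of the processes described, the autoregressive model is the simplest to experiment with; an AR(1) simulation and its empirical lag-1 autocorrelation, in pure Python with illustrative parameters:

```python
import random

def simulate_ar1(phi, sigma, n, seed=0):
    """AR(1) process: x[t] = phi * x[t-1] + e[t], with e ~ N(0, sigma^2)."""
    rng = random.Random(seed)
    x = [0.0]
    for _ in range(n - 1):
        x.append(phi * x[-1] + rng.gauss(0.0, sigma))
    return x

def lag1_autocorr(x):
    """Sample lag-1 autocorrelation of a series."""
    m = sum(x) / len(x)
    num = sum((a - m) * (b - m) for a, b in zip(x, x[1:]))
    den = sum((a - m) ** 2 for a in x)
    return num / den

x = simulate_ar1(phi=0.8, sigma=1.0, n=5000)
# for a long series the estimate sits near the true phi of 0.8
```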

  5. Valid statistical approaches for analyzing Sholl data: Mixed effects versus simple linear models.

    PubMed

    Wilson, Machelle D; Sethi, Sunjay; Lein, Pamela J; Keil, Kimberly P

    2017-03-01

    The Sholl technique is widely used to quantify dendritic morphology. Data from such studies, which typically sample multiple neurons per animal, are often analyzed using simple linear models. However, simple linear models fail to account for intra-class correlation that occurs with clustered data, which can lead to faulty inferences. Mixed effects models account for intra-class correlation that occurs with clustered data; thus, these models more accurately estimate the standard deviation of the parameter estimate, which produces more accurate p-values. While mixed models are not new, their use in neuroscience has lagged behind their use in other disciplines. A review of the published literature illustrates common mistakes in analyses of Sholl data. Analysis of Sholl data collected from Golgi-stained pyramidal neurons in the hippocampus of male and female mice using both simple linear and mixed effects models demonstrates that the p-values and standard deviations obtained using the simple linear models are biased downwards and lead to erroneous rejection of the null hypothesis in some analyses. The mixed effects approach more accurately models the true variability in the data set, which leads to correct inference. Mixed effects models avoid faulty inference in Sholl analysis of data sampled from multiple neurons per animal by accounting for intra-class correlation. Given the widespread practice in neuroscience of obtaining multiple measurements per subject, there is a critical need to apply mixed effects models more widely.
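
    The core problem, intra-class correlation biasing standard errors downward when neurons from the same animal are treated as independent, is easy to demonstrate with a small simulation (the animal/neuron counts and variance components below are hypothetical):

```python
import random
import statistics

def simulate_ses(n_animals=8, n_per=10, animal_sd=1.0, neuron_sd=0.5, seed=1):
    """Clustered data: neurons within an animal share an animal-level effect.
    Returns the naive SE (all neurons treated as independent) and the
    cluster-aware SE (animal means used as the unit of analysis)."""
    rng = random.Random(seed)
    animals = []
    for _ in range(n_animals):
        u = rng.gauss(0.0, animal_sd)   # animal-level random effect
        animals.append([u + rng.gauss(0.0, neuron_sd) for _ in range(n_per)])
    flat = [y for a in animals for y in a]
    naive_se = statistics.stdev(flat) / len(flat) ** 0.5
    means = [statistics.mean(a) for a in animals]
    cluster_se = statistics.stdev(means) / n_animals ** 0.5
    return naive_se, cluster_se

naive, cluster = simulate_ses()
# naive < cluster: ignoring the clustering understates uncertainty,
# which is what drives the downward-biased p-values the paper reports
```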

  6. Effect of analysis parameters on non-linear implicit finite element analysis of marine corroded steel plate

    NASA Astrophysics Data System (ADS)

    Islam, Muhammad Rabiul; Sakib-Ul-Alam, Md.; Nazat, Kazi Kaarima; Hassan, M. Munir

    2017-12-01

    FEA results depend greatly on analysis parameters. The MSC NASTRAN nonlinear implicit analysis code has been used in large-deformation finite element analysis of a pitted marine SM490A steel rectangular plate. The effect of two types of actual pit shapes on structural integrity parameters has been analyzed. For 3-D modeling, a proposed method for simulating the pitted surface with a probabilistic corrosion model has been used. The result has been verified against an empirical formula derived from finite element analyses of steel surfaces generated with different pit data, where the analyses were carried out with the LS-DYNA 971 code. In both solvers, an elasto-plastic material has been used, for which an arbitrary stress-strain curve can be defined. In the latter, the material model is based on the J2 flow theory with isotropic hardening, where a radial return algorithm is used. The comparison shows good agreement between the two results, which ensures successful simulation with comparatively less energy and time.

  7. Flutter parametric studies of cantilevered twin-engine transport type wing with and without winglet. Volume 2: Transonic and density effect investigations

    NASA Technical Reports Server (NTRS)

    Bhatia, K. G.; Nagaraja, K. S.

    1984-01-01

    Flutter characteristics of a cantilevered high-aspect-ratio wing with winglet were investigated. The configuration represented a current-technology, twin-engine airplane. Compressibility effects through transonic Mach numbers and a wide range of mass-density ratios were evaluated on a low-speed and a high-speed model. Four flutter mechanisms were obtained from tests and analyses of various combinations of configuration parameters. It is shown that the coupling between wing-tip vertical and chordwise motions has a significant effect under some conditions. It is concluded that, for the flutter model configurations studied, winglet-related flutter is amenable to conventional flutter analysis techniques. The low-speed model and high-speed model flutter results are described.

  8. Integrating FMEA in a Model-Driven Methodology

    NASA Astrophysics Data System (ADS)

    Scippacercola, Fabio; Pietrantuono, Roberto; Russo, Stefano; Esper, Alexandre; Silva, Nuno

    2016-08-01

    Failure Mode and Effects Analysis (FMEA) is a well-known technique for evaluating the effects of potential failures of components of a system. FMEA demands engineering methods and tools able to support the time-consuming tasks of the analyst. We propose to make FMEA part of the design of a critical system, by integrating it into a model-driven methodology. We show how to conduct the analysis of failure modes, propagation and effects from SysML design models, by means of custom diagrams, which we name FMEA Diagrams. They offer an additional view of the system, tailored to FMEA goals. The enriched model can then be exploited to automatically generate the FMEA worksheet and to conduct qualitative and quantitative analyses. We present a case study from a real-world project.
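
    A generated FMEA worksheet typically ranks failure modes by Risk Priority Number (severity × occurrence × detection). A minimal sketch of that ranking step with made-up entries (the paper's own worksheet fields may differ):

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    component: str
    mode: str
    severity: int    # 1-10, consequence of the failure
    occurrence: int  # 1-10, likelihood of the failure
    detection: int   # 1-10, where 10 = hardest to detect

    @property
    def rpn(self) -> int:
        """Risk Priority Number used to prioritise corrective actions."""
        return self.severity * self.occurrence * self.detection

modes = [
    FailureMode("valve", "stuck closed", severity=9, occurrence=3, detection=4),
    FailureMode("sensor", "drift", severity=5, occurrence=6, detection=7),
]
ranked = sorted(modes, key=lambda fm: fm.rpn, reverse=True)
print([(fm.component, fm.rpn) for fm in ranked])  # → [('sensor', 210), ('valve', 108)]
```

    Note that a frequent, hard-to-detect failure can out-rank a more severe one, which is why RPN ranking is usually reviewed alongside severity alone.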

  9. Systematic review, network meta-analysis and economic evaluation of biological therapy for the management of active psoriatic arthritis.

    PubMed

    Cawson, Matthew Richard; Mitchell, Stephen Andrew; Knight, Chris; Wildey, Henry; Spurden, Dean; Bird, Alex; Orme, Michelle Elaine

    2014-01-20

An updated economic evaluation was conducted to compare the cost-effectiveness of the four tumour necrosis factor (TNF)-α inhibitors adalimumab, etanercept, golimumab and infliximab in active, progressive psoriatic arthritis (PsA) where response to standard treatment has been inadequate. A systematic review was conducted to identify relevant, recently published studies and the new trial data were synthesised, via a Bayesian network meta-analysis (NMA), to estimate the relative efficacy of the TNF-α inhibitors in terms of Psoriatic Arthritis Response Criteria (PsARC) response, Health Assessment Questionnaire (HAQ) scores and Psoriasis Area and Severity Index (PASI). A previously developed economic model was updated with the new meta-analysis results and current cost data. The model was adapted to delineate patients by PASI 50%, 75% and 90% response rates to differentiate between psoriasis outcomes. All four licensed TNF-α inhibitors were significantly more effective than placebo in achieving PsARC response in patients with active PsA. Adalimumab, etanercept and infliximab were significantly more effective than placebo in improving HAQ scores in patients who had achieved a PsARC response and in improving HAQ scores in PsARC non-responders. In an analysis using 1,000 model simulations, on average etanercept was the most cost-effective treatment and, at the National Institute for Health and Care Excellence willingness-to-pay threshold of £20,000 to £30,000, etanercept was the preferred option. The economic analysis agrees with the conclusions from the previous models, in that biologics are shown to be cost-effective for treating patients with active PsA compared with the conventional management strategy. In particular, etanercept is cost-effective compared with the other biologic treatments.

  10. Network meta-analysis, electrical networks and graph theory.

    PubMed

    Rücker, Gerta

    2012-12-01

Network meta-analysis is an active field of research in clinical biostatistics. It aims to combine information from all randomized comparisons among a set of treatments for a given medical condition. We show how graph-theoretical methods can be applied to network meta-analysis. A meta-analytic graph consists of vertices (treatments) and edges (randomized comparisons). We illustrate the correspondence between meta-analytic networks and electrical networks, where variance corresponds to resistance, treatment effects to voltage, and weighted treatment effects to current flows. Based thereon, we then show that graph-theoretical methods that have been routinely applied to electrical networks also work well in network meta-analysis. In more detail, the resulting consistent treatment effects induced in the edges can be estimated via the Moore-Penrose pseudoinverse of the Laplacian matrix. Moreover, the variances of the treatment effects are estimated in analogy to electrical effective resistances. It is shown that this method, being computationally simple, leads to the usual fixed effect model estimate when applied to pairwise meta-analysis and is consistent with published results when applied to network meta-analysis examples from the literature. Moreover, problems of heterogeneity and inconsistency, random effects modeling and including multi-armed trials are addressed. Copyright © 2012 John Wiley & Sons, Ltd.
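The electrical-network correspondence described above can be sketched in a few lines of linear algebra. The three-treatment network below and all its numbers are invented for illustration (they are not from the paper); the construction via the Moore-Penrose pseudoinverse of the Laplacian follows the approach the abstract describes.

```python
import numpy as np

# Hypothetical three-treatment network (A, B, C); effects and weights are
# invented. Convention: an edge i->j records y = mu_j - mu_i.
B = np.array([[-1.0,  1.0, 0.0],   # A vs B
              [ 0.0, -1.0, 1.0],   # B vs C
              [-1.0,  0.0, 1.0]])  # A vs C
y = np.array([1.0, 0.5, 1.8])      # observed direct effects (e.g. log ORs)
W = np.diag([1.0, 1.0, 1.0])       # inverse-variance weights

L = B.T @ W @ B                    # graph Laplacian of the network (singular)
L_plus = np.linalg.pinv(L)         # Moore-Penrose pseudoinverse
mu = L_plus @ B.T @ W @ y          # treatment-level effects (sum to zero)
y_hat = B @ mu                     # consistent effects induced in the edges

# Variance of the A-vs-C network estimate, in analogy to the effective
# electrical resistance between nodes A and C.
resistance_AC = L_plus[0, 0] + L_plus[2, 2] - 2.0 * L_plus[0, 2]
```

For this triangle the consistent A-vs-C effect (1.7) blends the direct estimate (1.8) with the indirect A-B-C path (1.0 + 0.5), and the effective resistance of 2/3 reproduces the parallel-resistor rule for resistances 1 and 2.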

  11. Cost-effectiveness of population based BRCA testing with varying Ashkenazi Jewish ancestry.

    PubMed

    Manchanda, Ranjit; Patel, Shreeya; Antoniou, Antonis C; Levy-Lahad, Ephrat; Turnbull, Clare; Evans, D Gareth; Hopper, John L; Macinnis, Robert J; Menon, Usha; Jacobs, Ian; Legood, Rosa

    2017-11-01

Population-based BRCA1/BRCA2 testing has been found to be cost-effective compared with family history-based testing in Ashkenazi-Jewish women >30 years old with 4 Ashkenazi-Jewish grandparents. However, individuals may have 1, 2, or 3 Ashkenazi-Jewish grandparents, and cost-effectiveness data are lacking at these lower BRCA prevalence estimates. We present an updated cost-effectiveness analysis of population BRCA1/BRCA2 testing for women with 1, 2, and 3 Ashkenazi-Jewish grandparents. Lifetime costs and effects of population and family history-based testing were compared with the use of a decision analysis model; 56% of BRCA carriers are missed by family history criteria alone. Analyses were conducted for United Kingdom and United States populations. Model parameters were obtained from the Genetic Cancer Prediction through Population Screening trial and the published literature. Model parameters and BRCA population prevalence for individuals with 3, 2, or 1 Ashkenazi-Jewish grandparent were adjusted for the relative frequency of BRCA mutations in the Ashkenazi-Jewish and general populations. Incremental cost-effectiveness ratios were calculated for all Ashkenazi-Jewish grandparent scenarios. Costs, along with outcomes, were discounted at 3.5%. The time horizon of the analysis is "life-time," and the perspective is "payer." Probabilistic sensitivity analysis evaluated model uncertainty. Population testing for BRCA mutations is cost-saving in Ashkenazi-Jewish women with 2, 3, or 4 grandparents (22-33 days of life gained) in the United Kingdom and with 1, 2, 3, or 4 grandparents (12-26 days of life gained) in the United States, respectively. It is also extremely cost-effective in women in the United Kingdom with just 1 Ashkenazi-Jewish grandparent, with an incremental cost-effectiveness ratio of £863 per quality-adjusted life-year and 15 days of life gained.
Results show that population testing remains cost-effective at the £20,000-£30,000 per quality-adjusted life-year and $100,000 per quality-adjusted life-year willingness-to-pay thresholds for all 4 Ashkenazi-Jewish grandparent scenarios, with ≥95% of simulations found to be cost-effective on probabilistic sensitivity analysis. Population testing remains cost-effective in the absence of a reduction in breast cancer risk from oophorectomy and at lower risk-reducing mastectomy (13%) or risk-reducing salpingo-oophorectomy (20%) rates. Population testing for BRCA mutations with varying levels of Ashkenazi-Jewish ancestry is cost-effective in the United Kingdom and the United States. These results support population testing in Ashkenazi-Jewish women with 1-4 Ashkenazi-Jewish grandparents. Copyright © 2017 Elsevier Inc. All rights reserved.
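The discounting and ICER arithmetic underlying analyses like this one can be sketched briefly. All cost and QALY streams below are invented placeholders, not the study's data; only the 3.5% annual discount rate comes from the abstract.

```python
# Present value of a yearly stream, discounted at `rate` (3.5% per year here).
def discounted(stream, rate=0.035):
    return sum(x / (1.0 + rate) ** t for t, x in enumerate(stream))

# Illustrative per-year costs and QALYs for two hypothetical strategies.
cost_population = [1500.0, 100.0, 100.0]   # population testing strategy
cost_family_hx  = [1000.0, 200.0, 200.0]   # family history-based strategy
qaly_population = [0.95, 0.94, 0.93]
qaly_family_hx  = [0.94, 0.93, 0.92]

delta_cost = discounted(cost_population) - discounted(cost_family_hx)
delta_qaly = discounted(qaly_population) - discounted(qaly_family_hx)
icer = delta_cost / delta_qaly   # incremental cost per QALY gained
```

The resulting ratio is then judged against a willingness-to-pay threshold such as the £20,000-£30,000 or $100,000 per-QALY figures cited above; a strategy that costs less while yielding more QALYs is cost-saving and dominates outright.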

  12. Bayesian analysis of longitudinal dyadic data with informative missing data using a dyadic shared-parameter model.

    PubMed

    Ahn, Jaeil; Morita, Satoshi; Wang, Wenyi; Yuan, Ying

    2017-01-01

    Analyzing longitudinal dyadic data is a challenging task due to the complicated correlations from repeated measurements and within-dyad interdependence, as well as potentially informative (or non-ignorable) missing data. We propose a dyadic shared-parameter model to analyze longitudinal dyadic data with ordinal outcomes and informative intermittent missing data and dropouts. We model the longitudinal measurement process using a proportional odds model, which accommodates the within-dyad interdependence using the concept of the actor-partner interdependence effects, as well as dyad-specific random effects. We model informative dropouts and intermittent missing data using a transition model, which shares the same set of random effects as the longitudinal measurement model. We evaluate the performance of the proposed method through extensive simulation studies. As our approach relies on some untestable assumptions on the missing data mechanism, we perform sensitivity analyses to evaluate how the analysis results change when the missing data mechanism is misspecified. We demonstrate our method using a longitudinal dyadic study of metastatic breast cancer.

  13. Network meta-analysis of multiple outcome measures accounting for borrowing of information across outcomes.

    PubMed

    Achana, Felix A; Cooper, Nicola J; Bujkiewicz, Sylwia; Hubbard, Stephanie J; Kendrick, Denise; Jones, David R; Sutton, Alex J

    2014-07-21

Network meta-analysis (NMA) enables simultaneous comparison of multiple treatments while preserving randomisation. When summarising evidence to inform an economic evaluation, it is important that the analysis accurately reflects the dependency structure within the data, as correlations between outcomes may have implications for estimating the net benefit associated with treatment. A multivariate NMA offers a framework for evaluating multiple treatments across multiple outcome measures while accounting for the correlation structure between outcomes. The standard NMA model is extended to multiple outcome settings in two stages. In the first stage, information is borrowed across outcomes as well as across studies through modelling the within-study and between-study correlation structure. In the second stage, we make use of the additional assumption that intervention effects are exchangeable between outcomes to predict effect estimates for all outcomes, including effect estimates on outcomes where evidence is either sparse or the treatment had not been considered by any one of the studies included in the analysis. We apply the methods to binary outcome data from a systematic review evaluating the effectiveness of nine home safety interventions on uptake of three poisoning prevention practices (safe storage of medicines, safe storage of other household products, and possession of a poison control centre telephone number) in households with children. Analyses are conducted in WinBUGS using Markov Chain Monte Carlo (MCMC) simulations. Univariate and the first stage multivariate models produced broadly similar point estimates of intervention effects but the uncertainty around the multivariate estimates varied depending on the prior distribution specified for the between-study covariance structure. 
The second stage multivariate analyses produced more precise effect estimates while enabling intervention effects to be predicted for all outcomes, including intervention effects on outcomes not directly considered by the studies included in the analysis. Accounting for the dependency between outcomes in a multivariate meta-analysis may or may not improve the precision of effect estimates from a network meta-analysis compared to analysing each outcome separately.

  14. The effects of teacher anxiety and modeling on the acquisition of a science teaching skill and concomitant student performance

    NASA Astrophysics Data System (ADS)

    Koran, John J., Jr.; Koran, Mary Lou

In a study designed to explore the effects of teacher anxiety and modeling on acquisition of a science teaching skill and concomitant student performance, 69 preservice secondary teachers and 295 eighth grade students were randomly assigned to microteaching sessions. Prior to microteaching, teachers were given an anxiety test, then randomly assigned to one of three treatments: a transcript model, a protocol model, or a control condition. Subsequently both teacher and student performance was assessed using written and behavioral measures. Analysis of variance indicated that subjects in the two modeling treatments significantly exceeded performance of control group subjects on all measures of the dependent variable, with the protocol model being generally superior to the transcript model. The differential effects of the modeling treatments were further reflected in student performance. Regression analysis of aptitude-treatment interactions indicated that teacher anxiety scores interacted significantly with instructional treatments, with high anxiety teachers performing best in the protocol modeling treatment. Again, this interaction was reflected in student performance, where students taught by highly anxious teachers performed significantly better when their teachers had received the protocol model. These results were discussed in terms of teacher concerns and a memory model of the effects of anxiety on performance.

  15. The impact of structural uncertainty on cost-effectiveness models for adjuvant endocrine breast cancer treatments: the need for disease-specific model standardization and improved guidance.

    PubMed

    Frederix, Gerardus W J; van Hasselt, Johan G C; Schellens, Jan H M; Hövels, Anke M; Raaijmakers, Jan A M; Huitema, Alwin D R; Severens, Johan L

    2014-01-01

Structural uncertainty relates to differences in model structure and parameterization. For many published health economic analyses in oncology, substantial differences in model structure exist, leading to differences in analysis outcomes and potentially impacting decision-making processes. The objectives of this analysis were (1) to identify differences in model structure and parameterization for cost-effectiveness analyses (CEAs) comparing tamoxifen and anastrozole for adjuvant breast cancer (ABC) treatment; and (2) to quantify the impact of these differences on analysis outcome metrics. The analysis consisted of four steps: (1) review of the literature for identification of eligible CEAs; (2) definition and implementation of a base model structure, which included the core structural components for all identified CEAs; (3) definition and implementation of changes or additions in the base model structure or parameterization; and (4) quantification of the impact of changes in model structure or parameterization on the analysis outcome metrics life-years gained (LYG), incremental costs (IC) and the incremental cost-effectiveness ratio (ICER). Eleven CEA analyses comparing anastrozole and tamoxifen as ABC treatment were identified. The base model consisted of the following health states: (1) on treatment; (2) off treatment; (3) local recurrence; (4) metastatic disease; (5) death due to breast cancer; and (6) death due to other causes. The base model estimates of anastrozole versus tamoxifen for the LYG, IC and ICER were 0.263 years, €3,647 and €13,868/LYG, respectively. In the published models that were evaluated, differences in model structure included the addition of different recurrence health states, and associated transition rates were identified. Differences in parameterization were related to the incidences of recurrence, local recurrence to metastatic disease, and metastatic disease to death. 
The separate impact of these model components on the LYG ranged from 0.207 to 0.356 years, while incremental costs ranged from €3,490 to €3,714 and ICERs ranged from €9,804/LYG to €17,966/LYG. When we re-analyzed the published CEAs in our framework by including their respective model properties, the LYG ranged from 0.207 to 0.383 years, IC ranged from €3,556 to €3,731 and ICERs ranged from €9,683/LYG to €17,570/LYG. Differences in model structure and parameterization lead to substantial differences in analysis outcome metrics. This analysis supports the need for more guidance regarding structural uncertainty and the use of standardized disease-specific models for health economic analyses of adjuvant endocrine breast cancer therapies. The developed approach in the current analysis could potentially serve as a template for further evaluations of structural uncertainty and development of disease-specific models.
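A Markov cohort trace over a base structure like the one described in this record can be sketched in a few lines. The states and cycle length are simplified and every transition probability below is invented for illustration; changing which recurrence states exist, or these probabilities, is precisely the kind of structural choice the analysis shows can move LYG and ICER substantially.

```python
import numpy as np

# States: [on/off treatment, local recurrence, metastatic, death].
# Rows are current state, columns next state; one-year cycles assumed.
P = np.array([
    [0.90, 0.06, 0.02, 0.02],
    [0.00, 0.80, 0.15, 0.05],
    [0.00, 0.00, 0.70, 0.30],
    [0.00, 0.00, 0.00, 1.00],   # death is absorbing
])
state = np.array([1.0, 0.0, 0.0, 0.0])   # whole cohort starts on treatment
life_years = 0.0
for _ in range(40):                      # 40 annual cycles
    life_years += state[:3].sum()        # alive states accrue person-years
    state = state @ P
```

Running the trace for two treatments with slightly different transition matrices, then differencing discounted life-years and costs, yields the LYG and ICER metrics compared in the analysis.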

  16. Links between physical fitness and cardiovascular reactivity and recovery to psychological stressors: A meta-analysis.

    PubMed

    Forcier, Kathleen; Stroud, Laura R; Papandonatos, George D; Hitsman, Brian; Reiches, Meredith; Krishnamoorthy, Jenelle; Niaura, Raymond

    2006-11-01

    A meta-analysis of published studies with adult human participants was conducted to evaluate whether physical fitness attenuates cardiovascular reactivity and improves recovery from acute psychological stressors. Thirty-three studies met selection criteria; 18 were included in recovery analyses. Effect sizes and moderator influences were calculated by using meta-analysis software. A fixed effects model was fit initially; however, between-studies heterogeneity could not be explained even after inclusion of moderators. Therefore, to account for residual heterogeneity, a random effects model was estimated. Under this model, fit individuals showed significantly attenuated heart rate and systolic blood pressure reactivity and a trend toward attenuated diastolic blood pressure reactivity. Fit individuals also showed faster heart rate recovery, but there were no significant differences in systolic blood pressure or diastolic blood pressure recovery. No significant moderators emerged. Results have important implications for elucidating mechanisms underlying effects of fitness on cardiovascular disease and suggest that fitness may be an important confound in studies of stress reactivity. Copyright 2006 APA, all rights reserved.
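The fixed-to-random-effects switch the authors describe corresponds to standard inverse-variance pooling with the DerSimonian-Laird between-study variance estimator. A minimal sketch follows; the effect sizes and variances are invented and bear no relation to the fitness data.

```python
import numpy as np

# Invented per-study effects (e.g. standardized mean differences) and
# within-study variances.
y = np.array([-0.60, -0.05, -0.55, -0.10, -0.35])
v = np.array([ 0.04,  0.02,  0.09,  0.03,  0.05])

w = 1.0 / v                                   # inverse-variance weights
fixed = np.sum(w * y) / np.sum(w)             # fixed effect pooled estimate

Q = np.sum(w * (y - fixed) ** 2)              # Cochran's heterogeneity Q
df = len(y) - 1
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - df) / c)                 # DerSimonian-Laird tau^2

w_re = 1.0 / (v + tau2)                       # random effects weights
random_eff = np.sum(w_re * y) / np.sum(w_re)  # random effects pooled estimate
se_re = np.sqrt(1.0 / np.sum(w_re))           # its (wider) standard error
```

When Q exceeds its degrees of freedom, tau^2 is positive and the random effects standard error exceeds the fixed effect one: this is exactly the residual heterogeneity the abstract says motivated moving to a random effects model.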

  17. Job demands, burnout, and engagement among nurses: A multi-level analysis of ORCAB data investigating the moderating effect of teamwork

    PubMed Central

    Montgomery, Anthony; Spânu, Florina; Băban, Adriana; Panagopoulou, Efharis

    2015-01-01

According to the Job Demands-Resources (JD-R) model, burnout and engagement are psychological reactions that develop when individual characteristics interact with work characteristics. This study tests the JD-R model using multilevel analysis to examine the main and moderating effects of teamwork effectiveness among 1156 nurses in 93 departments from seven European countries. Workload and emotional and organizational demands were positively associated with emotional exhaustion and depersonalization, and negatively with vigor. Emotional and organizational demands were negatively associated with dedication. Teamwork effectiveness was positively associated with engagement. We found no evidence for a moderating effect of teamwork effectiveness in reducing individual perceptions of demands. PMID:26877971

  18. Job demands, burnout, and engagement among nurses: A multi-level analysis of ORCAB data investigating the moderating effect of teamwork.

    PubMed

    Montgomery, Anthony; Spânu, Florina; Băban, Adriana; Panagopoulou, Efharis

    2015-09-01

According to the Job Demands-Resources (JD-R) model, burnout and engagement are psychological reactions that develop when individual characteristics interact with work characteristics. This study tests the JD-R model using multilevel analysis to examine the main and moderating effects of teamwork effectiveness among 1156 nurses in 93 departments from seven European countries. Workload and emotional and organizational demands were positively associated with emotional exhaustion and depersonalization, and negatively with vigor. Emotional and organizational demands were negatively associated with dedication. Teamwork effectiveness was positively associated with engagement. We found no evidence for a moderating effect of teamwork effectiveness in reducing individual perceptions of demands.

  19. Evaluation of Aeroservoelastic Effects on Flutter

    NASA Technical Reports Server (NTRS)

Nagaraja, K. S.; Kraft, Raymond; Felt, Larry

    1998-01-01

The HSCT Flight Controls Group is developing a longitudinal control law, known as Gamma-dot / V, for the NASA HSR program. Currently, this control law is based on a quasi-steady aeroelastic (QSAE) model of the vehicle. This control law was implemented into the p-k flutter analysis process for closed loop aeroservoelastic analysis. The available flexible models, developed for the TCA aeroelastic analysis, were used to assess the effect of control laws on flutter at several different Mach numbers and mass conditions. Significant structures and flight control system interaction was observed during the initial assessment. Figures 1 and 2 present a summary of the effect of total closed loop gain and phase on flutter mechanisms, based on ideal sensors and real sensors, for the Mach 0.95 and mass M02 condition. Control laws based on ideal sensors gave rise to increased coupling between the rigid body short period mode and the first symmetric elastic mode. This reduced the stability margins for the first elastic mode, which did not meet the required 6 dB gain margin. The effect of "real" sensors significantly increased the structures and control system interactions. This caused the elastic modes to be highly unstable throughout most of the flight envelope. State-space models were developed for several conditions, and then a MATLAB program was used for the aeroservoelastic stability analysis. These results provided an independent verification of the p-k flutter analysis findings. Good overall agreement was observed between the p-k flutter analysis and state-space model results for both damping and frequency comparisons. These results are also included in this document.

  20. Advancement of Analysis Method for Electromagnetic Screening Effect of Mountain Tunnel

    NASA Astrophysics Data System (ADS)

    Okutani, Tamio; Nakamura, Nobuyuki; Terada, Natsuki; Fukuda, Mitsuyoshi; Tate, Yutaka; Inada, Satoshi; Itoh, Hidenori; Wakao, Shinji

In this paper we report on an advanced analysis method for the electromagnetic screening effect of mountain tunnels, based on a multiple conductor circuit model. On A.C. electrified railways, managing the influence of electromagnetic induction caused by feeding circuits is a major issue. Tunnels are said to have a screening effect that reduces electromagnetic induction because a large amount of steel is used in the tunnels. But recently the screening effect is less expected because the New Austrian Tunneling Method (NATM), in which the amount of steel used is less than in conventional methods, has been adopted as the standard method for constructing mountain tunnels. So we measured and analyzed the actual screening effect of mountain tunnels constructed with NATM. In the process of the analysis we have developed a method to analyze the screening effect more precisely. In this method we can adequately model the tunnel structure as part of the multiple conductor circuit.

  1. Hierarchical Bayes approach for subgroup analysis.

    PubMed

    Hsu, Yu-Yi; Zalkikar, Jyoti; Tiwari, Ram C

    2017-01-01

In clinical data analysis, both treatment effect estimation and consistency assessment are important for a better understanding of the drug efficacy for the benefit of subjects in individual subgroups. The linear mixed-effects model has been used for subgroup analysis to describe treatment differences among subgroups with great flexibility. The hierarchical Bayes approach has been applied to the linear mixed-effects model to derive the posterior distributions of overall and subgroup treatment effects. In this article, we discuss the prior selection for variance components in hierarchical Bayes, estimation and decision making for the overall treatment effect, as well as consistency assessment of the treatment effects across the subgroups based on the posterior predictive p-value. Decision procedures are suggested using either the posterior probability or the Bayes factor. These decision procedures and their properties are illustrated using a simulated example with normally distributed response and repeated measurements.
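The pull of subgroup estimates toward the overall effect under a hierarchical prior can be illustrated with a normal-normal shrinkage sketch. The estimates, sampling variances, and between-subgroup variance below are all invented, and a known-tau^2 empirical-Bayes form stands in for the full posterior computation described in the article.

```python
import numpy as np

# Invented subgroup treatment-effect estimates and their sampling variances.
est  = np.array([0.8, 0.3, 0.5, -0.1])
se2  = np.array([0.09, 0.04, 0.06, 0.16])
tau2 = 0.05   # assumed between-subgroup variance (a prior/plug-in choice)

# Precision-weighted overall treatment effect.
overall = np.sum(est / (se2 + tau2)) / np.sum(1.0 / (se2 + tau2))

# Shrinkage factor: the weight each subgroup's own data receives.
shrink = tau2 / (tau2 + se2)
posterior_mean = shrink * est + (1.0 - shrink) * overall
```

Noisier subgroups (larger se2) are shrunk harder toward the overall effect, which is the behavior consistency assessment across subgroups is designed to probe.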

  2. Nonlinear Poisson Equation for Heterogeneous Media

    PubMed Central

    Hu, Langhua; Wei, Guo-Wei

    2012-01-01

The Poisson equation is a widely accepted model for electrostatic analysis. However, the Poisson equation is derived based on electric polarizations in a linear, isotropic, and homogeneous dielectric medium. This article introduces a nonlinear Poisson equation to take into consideration hyperpolarization effects due to intensive charges and possible nonlinear, anisotropic, and heterogeneous media. The variational principle is utilized to derive the nonlinear Poisson model from an electrostatic energy functional. To apply the proposed nonlinear Poisson equation to solvation analysis, we also construct a nonpolar solvation energy functional based on the nonlinear Poisson equation by using geometric measure theory. At a fixed temperature, the proposed nonlinear Poisson theory is extensively validated by the electrostatic analysis of the Kirkwood model and a set of 20 proteins, and by the solvation analysis of a set of 17 small molecules whose experimental measurements are also available for comparison. Moreover, the nonlinear Poisson equation is further applied to the solvation analysis of 21 compounds at different temperatures. Numerical results are compared to theoretical predictions, experimental measurements, and those obtained from other theoretical methods in the literature. A good agreement between our results and experimental data as well as theoretical results suggests that the proposed nonlinear Poisson model is a potentially useful model for electrostatic analysis involving hyperpolarization effects. PMID:22947937

  3. Immortal time bias in observational studies of time-to-event outcomes.

    PubMed

    Jones, Mark; Fowler, Robert

    2016-12-01

    The purpose of the study is to show, through simulation and example, the magnitude and direction of immortal time bias when an inappropriate analysis is used. We compare 4 methods of analysis for observational studies of time-to-event outcomes: logistic regression, standard Cox model, landmark analysis, and time-dependent Cox model using an example data set of patients critically ill with influenza and a simulation study. For the example data set, logistic regression, standard Cox model, and landmark analysis all showed some evidence that treatment with oseltamivir provides protection from mortality in patients critically ill with influenza. However, when the time-dependent nature of treatment exposure is taken account of using a time-dependent Cox model, there is no longer evidence of a protective effect of treatment. The simulation study showed that, under various scenarios, the time-dependent Cox model consistently provides unbiased treatment effect estimates, whereas standard Cox model leads to bias in favor of treatment. Logistic regression and landmark analysis may also lead to bias. To minimize the risk of immortal time bias in observational studies of survival outcomes, we strongly suggest time-dependent exposures be included as time-dependent variables in hazard-based analyses. Copyright © 2016 Elsevier Inc. All rights reserved.
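The bias mechanism this record describes can be demonstrated with a few lines of simulation. The distributions and rates below are arbitrary illustrative choices; the point is that a treatment with no effect by construction appears protective when "ever treated" is classified at baseline.

```python
import numpy as np

# Null simulation: treatment has NO effect on survival, yet baseline
# classification of exposure manufactures an apparent benefit.
rng = np.random.default_rng(0)
n = 100_000
death_time = rng.exponential(10.0, n)   # survival time; no treatment effect
rx_time = rng.exponential(5.0, n)       # time at which treatment would start

treated = rx_time < death_time          # must survive to rx_time to be "treated"
naive_treated = death_time[treated].mean()
naive_untreated = death_time[~treated].mean()
naive_gap = naive_treated - naive_untreated   # spurious "benefit"

# Correct handling counts pre-treatment person-time as unexposed. Here the
# post-treatment residual survival is again exponential with mean 10 (by
# memorylessness), matching the untreated hazard: the true null effect.
residual_mean = (death_time - rx_time)[treated].mean()
```

The naive treated/untreated gap is large and entirely an artifact of the immortal person-time before treatment, which is what a time-dependent exposure variable in a Cox model removes.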

  4. An Analysis of the US Army Advanced Management Program.

    DTIC Science & Technology

    1986-05-04

Management Principles, Motivation and Control, Organizational Effectiveness, Strategies of Change, Environmental Issues - Social Environment in the 1980's... Organizational Design, Characteristics of Outstanding Organizations, Characteristics of Effective Leaders, Models and Strategies of Planned... Organizational Effectiveness: Strategy/Environmental Analysis - social, legal, economic, technological, etc. Organizational Behavior - Human

  5. Comprehensive School Reform and Achievement: A Meta-Analysis

    ERIC Educational Resources Information Center

    Borman, Geoffrey D.; Hewes, Gina M.; Overman, Laura T.; Brown, Shelly

    2003-01-01

    This meta-analysis reviews research on the achievement effects of comprehensive school reform (CSR) and summarizes the specific effects of 29 widely implemented models. There are limitations on the overall quantity and quality of the research base, but the overall effects of CSR appear promising. The combined quantity, quality, and statistical…

  6. A UML Profile for State Analysis

    NASA Technical Reports Server (NTRS)

    Murray, Alex; Rasmussen, Robert

    2010-01-01

    State Analysis is a systems engineering methodology for the specification and design of control systems, developed at the Jet Propulsion Laboratory. The methodology emphasizes an analysis of the system under control in terms of States and their properties and behaviors and their effects on each other, a clear separation of the control system from the controlled system, cognizance in the control system of the controlled system's State, goal-based control built on constraining the controlled system's States, and disciplined techniques for State discovery and characterization. State Analysis (SA) introduces two key diagram types: State Effects and Goal Network diagrams. The team at JPL developed a tool for performing State Analysis. The tool includes a drawing capability, backed by a database that supports the diagram types and the organization of the elements of the SA models. But the tool does not support the usual activities of software engineering and design - a disadvantage, since systems to which State Analysis can be applied tend to be very software-intensive. This motivated the work described in this paper: the development of a preliminary Unified Modeling Language (UML) profile for State Analysis. Having this profile would enable systems engineers to specify a system using the methods and graphical language of State Analysis, which is easily linked with a larger system model in SysML (Systems Modeling Language), while also giving software engineers engaged in implementing the specified control system immediate access to and use of the SA model, in the same language, UML, used for other software design. That is, a State Analysis profile would serve as a shared modeling bridge between system and software models for the behavior aspects of the system. This paper begins with an overview of State Analysis and its underpinnings, followed by an overview of the mapping of SA constructs to the UML metamodel. 
It then delves into the details of these mappings and the constraints associated with them. Finally, we give an example of the use of the profile for expressing an example SA model.

  7. Statistical Analysis of Atmospheric Forecast Model Accuracy - A Focus on Multiple Atmospheric Variables and Location-Based Analysis

    DTIC Science & Technology

    2014-04-01

The Weather Research and Forecasting (WRF) model is a numerical weather prediction system designed for operational forecasting and atmospheric research. This report examined WRF model... WRF, weather research and forecasting, atmospheric effects... and Forecasting (WRF) model. The authors would also like to thank Ms. Sherry Larson, STS Systems Integration, LLC, ARL Technical Publishing Branch

  8. Practical Application of Model-based Programming and State-based Architecture to Space Missions

    NASA Technical Reports Server (NTRS)

    Horvath, Gregory; Ingham, Michel; Chung, Seung; Martin, Oliver; Williams, Brian

    2006-01-01

A viewgraph presentation to develop models from systems engineers that accomplish mission objectives and manage the health of the system is shown. The topics include: 1) Overview; 2) Motivation; 3) Objective/Vision; 4) Approach; 5) Background: The Mission Data System; 6) Background: State-based Control Architecture System; 7) Background: State Analysis; 8) Overview of State Analysis; 9) Background: MDS Software Frameworks; 10) Background: Model-based Programming; 11) Background: Titan Model-based Executive; 12) Model-based Execution Architecture; 13) Compatibility Analysis of MDS and Titan Architectures; 14) Integrating Model-based Programming and Execution into the Architecture; 15) State Analysis and Modeling; 16) IMU Subsystem State Effects Diagram; 17) Titan Subsystem Model: IMU Health; 18) Integrating Model-based Programming and Execution into the Software IMU; 19) Testing Program; 20) Computationally Tractable State Estimation & Fault Diagnosis; 21) Diagnostic Algorithm Performance; 22) Integration and Test Issues; 23) Demonstrated Benefits; and 24) Next Steps

  9. Modeling and Analysis of Multidiscipline Research Teams at NASA Langley Research Center: A Systems Thinking Approach

    NASA Technical Reports Server (NTRS)

    Waszak, Martin R.; Barthelemy, Jean-Francois; Jones, Kenneth M.; Silcox, Richard J.; Silva, Walter A.; Nowaczyk, Ronald H.

    1998-01-01

    Multidisciplinary analysis and design is inherently a team activity due to the variety of required expertise and knowledge. As a team activity, multidisciplinary research cannot escape the issues that affect all teams. The level of technical diversity required to perform multidisciplinary analysis and design makes the teaming aspects even more important. A study was conducted at the NASA Langley Research Center to develop a model of multidiscipline teams that can be used to help understand their dynamics and identify key factors that influence their effectiveness. The study sought to apply the elements of systems thinking to better understand the factors, both generic and Langley-specific, that influence the effectiveness of multidiscipline teams. The model of multidiscipline research teams developed during this study has been valuable in identifying means to enhance team effectiveness, recognize and avoid problem behaviors, and provide guidance for forming and coordinating multidiscipline teams.

  10. Bubbles in Sediments

    DTIC Science & Technology

    1997-09-30

    modeled as either an effective fluid, an effective viscoelastic solid, or a saturated poroelastic medium. The analysis included only the breathing mode... separated for each model. Finally, if a sediment is modeled by Biot theory, which describes wave propagation in a saturated poroelastic medium, then two... theory to sediment acoustics. The predicted resonance behavior under each model is distinct, so an optical extinction measurement may provide an

  11. Evaluation of grid generation technologies from an applied perspective

    NASA Technical Reports Server (NTRS)

    Hufford, Gary S.; Harrand, Vincent J.; Patel, Bhavin C.; Mitchell, Curtis R.

    1995-01-01

    An analysis of the grid generation process from the point of view of an applied CFD engineer is given. Issues addressed include geometric modeling, structured grid generation, unstructured grid generation, hybrid grid generation and use of virtual parts libraries in large parametric analysis projects. The analysis is geared towards comparing the effective turnaround time for specific grid generation and CFD projects. The conclusion was made that a single grid generation methodology is not universally suited to all CFD applications, due to limitations in both grid generation and flow solver technology. A new geometric modeling and grid generation tool, CFD-GEOM, is introduced to effectively integrate the geometric modeling process with the various grid generation methodologies, including structured, unstructured, and hybrid procedures. The full integration of geometric modeling and grid generation allows implementation of extremely efficient updating procedures, a necessary requirement for large parametric analysis projects. The concept of using virtual parts libraries in conjunction with hybrid grids for large parametric analysis projects is also introduced to improve the efficiency of the applied CFD engineer.

  12. Seismic damage analysis of the outlet piers of arch dams using the finite element sub-model method

    NASA Astrophysics Data System (ADS)

    Song, Liangfeng; Wu, Mingxin; Wang, Jinting; Xu, Yanjie

    2016-09-01

    This study aims to analyze seismic damage of reinforced outlet piers of arch dams by the nonlinear finite element (FE) sub-model method. First, the dam-foundation system is modeled and analyzed, in which the effects of infinite foundation, contraction joints, and nonlinear concrete are taken into account. The detailed structures of the outlet pier are then simulated with a refined FE model in the sub-model analysis. In this way the damage mechanism of the plain (unreinforced) outlet pier is analyzed, and the effects of two reinforcement measures (i.e., post-tensioned anchor cables and reinforcing bar) on the dynamic damage to the outlet pier are investigated comprehensively. Results show that the plain pier is damaged severely by strong earthquakes while implementation of post-tensioned anchor cables strengthens the pier effectively. In addition, radiation damping strongly alleviates seismic damage to the piers.

  13. System analysis for the Huntsville Operational Support Center distributed computer system

    NASA Technical Reports Server (NTRS)

    Ingels, F. M.; Mauldin, J.

    1984-01-01

    The Huntsville Operations Support Center (HOSC) is a distributed computer system used to provide real time data acquisition, analysis and display during NASA space missions and to perform simulation and study activities during non-mission times. The primary purpose is to provide a HOSC system simulation model that is used to investigate the effects of various HOSC system configurations. Such a model would be valuable in planning the future growth of HOSC and in ascertaining the effects of data rate variations, update table broadcasting and smart display terminal data requirements on the HOSC HYPERchannel network system. A simulation model was developed in PASCAL, and results of the simulation model for various system configurations were obtained. A tutorial of the model and the results of simulation runs are presented. Some very high data rate situations were simulated to observe the effects of the HYPERchannel switchover from contention to priority mode under high channel loading.
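
A back-of-envelope queueing calculation illustrates the kind of loading estimate such a network simulation explores. The M/M/1 figures below are purely illustrative (they are not the HOSC model or its measured rates); the point is how mean time in system, W = 1/(mu - lambda), blows up as channel loading approaches capacity.

```python
# Illustrative M/M/1 sketch (hypothetical rates, not the HOSC simulation):
# mean time in system W = 1/(mu - lam) for Poisson arrivals at rate lam
# and exponential service at rate mu.
def mean_time_in_system(lam, mu):
    if lam >= mu:
        raise ValueError("utilization >= 1: queue is unstable")
    return 1.0 / (mu - lam)

low_load = mean_time_in_system(lam=100.0, mu=500.0)    # messages/s
high_load = mean_time_in_system(lam=450.0, mu=500.0)   # near saturation
```

Raising the arrival rate from 20% to 90% of capacity multiplies the mean delay eightfold here, which is why very-high-data-rate scenarios merit explicit simulation.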

  14. Kinetic analysis of the effects of target structure on siRNA efficiency

    NASA Astrophysics Data System (ADS)

    Chen, Jiawen; Zhang, Wenbing

    2012-12-01

    RNAi efficiency for target cleavage and protein expression is related to the target structure. Considering the RNA-induced silencing complex (RISC) as a multiple turnover enzyme, we investigated the effect of target mRNA structure on siRNA efficiency with kinetic analysis. The 4-step model was used to study the target cleavage kinetic process: hybridization nucleation at an accessible target site, RISC-mRNA hybrid elongation along with mRNA target structure melting, target cleavage, and enzyme reactivation. In this model, terms accounting for target accessibility, stability, and seed and nucleation site effects are all included. The results agree well with those of experiments that have made differing arguments about structural effects on siRNA efficiency, and they show that siRNA efficiency is influenced by the combined factors of target accessibility, stability, and seed effects. To study off-target effects, a simple model of one siRNA binding to two mRNA targets was designed and used to discuss the possibility of diminishing off-target effects by adjusting the siRNA concentration.
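
A minimal sketch of the multiple-turnover picture described above, with hypothetical rate constants (this is a reduced two-state caricature, not the authors' fitted 4-step model): RISC is treated as a Michaelis-Menten enzyme, and target secondary structure is folded into an effective Michaelis constant, since a closed site makes nucleation and elongation harder.

```python
# Illustrative reduction of RISC-mediated cleavage kinetics: an accessibility
# factor (0, 1] scales the effective Km; all numbers are hypothetical.
def cleavage_timecourse(accessibility, kcat=0.1, km0=50.0, risc=1.0,
                        t0=100.0, dt=0.01, steps=50000):
    """Euler integration of d[T]/dt = -kcat*[RISC]*[T] / (Km_eff + [T])."""
    km_eff = km0 / accessibility   # hypothetical: open sites lower Km_eff
    target = t0
    for _ in range(steps):
        target -= dt * kcat * risc * target / (km_eff + target)
    return target                  # remaining uncleaved target

open_target = cleavage_timecourse(accessibility=1.0)    # accessible site
closed_target = cleavage_timecourse(accessibility=0.1)  # structured site
```

Under these assumptions the structured target is cleaved markedly more slowly, the qualitative behaviour the kinetic analysis formalises.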

  15. Counseling Outcomes from 1990 to 2008 for School-Age Youth with Depression: A Meta-Analysis

    ERIC Educational Resources Information Center

    Erford, Bradley T.; Erford, Breann M.; Lattanzi, Gina; Weller, Janet; Schein, Hallie; Wolf, Emily; Hughes, Meredith; Darrow, Jenna; Savin-Murphy, Janet; Peacock, Elizabeth

    2011-01-01

    Clinical trials exploring the effectiveness of counseling and psychotherapy in treatment of depression in school-age youth composed this meta-analysis. Results were synthesized using a random effects model for mean difference and mean gain effect size estimates. No effects of moderating variables were evident. Counseling and psychotherapy are…

  16. Variation in Estimated Ozone-Related Health Impacts of Climate Change due to Modeling Choices and Assumptions

    PubMed Central

    Post, Ellen S.; Grambsch, Anne; Weaver, Chris; Morefield, Philip; Leung, Lai-Yung; Nolte, Christopher G.; Adams, Peter; Liang, Xin-Zhong; Zhu, Jin-Hong; Mahoney, Hardee

    2012-01-01

    Background: Future climate change may cause air quality degradation via climate-induced changes in meteorology, atmospheric chemistry, and emissions into the air. Few studies have explicitly modeled the potential relationships between climate change, air quality, and human health, and fewer still have investigated the sensitivity of estimates to the underlying modeling choices. Objectives: Our goal was to assess the sensitivity of estimated ozone-related human health impacts of climate change to key modeling choices. Methods: Our analysis included seven modeling systems in which a climate change model is linked to an air quality model, five population projections, and multiple concentration–response functions. Using the U.S. Environmental Protection Agency’s (EPA’s) Environmental Benefits Mapping and Analysis Program (BenMAP), we estimated future ozone (O3)-related health effects in the United States attributable to simulated climate change between the years 2000 and approximately 2050, given each combination of modeling choices. Health effects and concentration–response functions were chosen to match those used in the U.S. EPA’s 2008 Regulatory Impact Analysis of the National Ambient Air Quality Standards for O3. Results: Different combinations of methodological choices produced a range of estimates of national O3-related mortality from roughly 600 deaths avoided as a result of climate change to 2,500 deaths attributable to climate change (although the large majority produced increases in mortality). The choice of the climate change and the air quality model reflected the greatest source of uncertainty, with the other modeling choices having lesser but still substantial effects. Conclusions: Our results highlight the need to use an ensemble approach, instead of relying on any one set of modeling choices, to assess the potential risks associated with O3-related human health effects resulting from climate change. PMID:22796531
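
The core arithmetic behind tools like BenMAP is a concentration-response calculation. The sketch below uses a standard log-linear form with hypothetical inputs (the coefficients are illustrative, not EPA's): a positive ozone change attributes deaths to climate change, a negative change yields deaths avoided, matching the sign convention of the range reported above.

```python
import math

# Log-linear concentration-response sketch (illustrative coefficients):
# attributable deaths for an ozone change delta_c, baseline mortality
# rate y0, and exposed population.
def attributable_deaths(y0, population, beta, delta_c):
    # Relative risk: RR = exp(beta * delta_c)
    # Attributable fraction: AF = 1 - 1/RR = 1 - exp(-beta * delta_c)
    return y0 * population * (1.0 - math.exp(-beta * delta_c))

# Hypothetical inputs: 0.8% baseline mortality, 1 million people,
# beta = 0.0004 per ppb, +2 ppb climate-driven ozone increase.
extra_deaths = attributable_deaths(0.008, 1_000_000, 0.0004, 2.0)
avoided = attributable_deaths(0.008, 1_000_000, 0.0004, -2.0)  # negative
```

Varying the climate/air-quality model changes delta_c, and varying the epidemiological study changes beta, which is how the modeling choices propagate into the wide mortality range the study reports.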

  17. Incorporating Probability Models of Complex Test Structures to Perform Technology Independent FPGA Single Event Upset Analysis

    NASA Technical Reports Server (NTRS)

    Berg, M. D.; Kim, H. S.; Friendlich, M. A.; Perez, C. E.; Seidlick, C. M.; LaBel, K. A.

    2011-01-01

    We present SEU test and analysis of the Microsemi ProASIC3 FPGA. SEU Probability models are incorporated for device evaluation. Included is a comparison to the RTAXS FPGA illustrating the effectiveness of the overall testing methodology.

  18. Using the Expedition Leader Style Analysis.

    ERIC Educational Resources Information Center

    Phipps, Maurice L.; Phipps, Cynthia A.

    The Expedition Leader Style Analysis (ELSA) is an inventory designed to measure leadership style adaptability and effectiveness in terms of the situational leadership model. Situational leadership arose from the Experiential Leadership Education model, which is used in business and management, by replacing management jargon and phrases with…

  19. Community Water System Regionalization and Stakeholder Implications: Estimating Effects to Consumers and Purveyors (PREPRINT)

    DTIC Science & Technology

    2011-01-01

    gallon. The data are cross sectional and a Breusch-Pagan test finds that heteroscedasticity is a problem. To correct for it, the analysis re... heteroscedasticity after a fixed effect model uses a Breusch and Pagan Lagrange multiplier test (Baum, 2006a). After a random effects model the test is a... The data originate from 33 CWSs over 13 years so the next step is to test for CWS specific effects. The FE model in the table presents
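
The Breusch-Pagan test mentioned above is simple enough to sketch directly: regress the squared OLS residuals on the regressors and compare LM = n·R² with a chi-squared critical value. The implementation and data below are illustrative (in practice one would use a statistics package); the synthetic example just shows the test firing on errors whose variance grows with the regressor.

```python
import numpy as np

# Hand-rolled Breusch-Pagan LM statistic (sketch): LM = n * R^2 from the
# auxiliary regression of squared residuals on the regressors; compare
# with a chi-squared critical value (3.84 at 1 df, 5%).
def breusch_pagan_lm(X, y):
    Xc = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Xc, y, rcond=None)
    u2 = (y - Xc @ beta) ** 2                   # squared OLS residuals
    gamma, *_ = np.linalg.lstsq(Xc, u2, rcond=None)
    fitted = Xc @ gamma
    r2 = 1 - np.sum((u2 - fitted) ** 2) / np.sum((u2 - u2.mean()) ** 2)
    return len(y) * r2

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 500)
y_het = 1 + 2 * x + rng.normal(0, 0.2 + x, 500)  # error s.d. grows with x
y_hom = 1 + 2 * x + rng.normal(0, 1.0, 500)      # constant error variance
lm_het = breusch_pagan_lm(x, y_het)
lm_hom = breusch_pagan_lm(x, y_hom)
```

A large LM on the heteroscedastic data (and a small one on the homoscedastic data) is exactly the pattern that motivates the correction described in the record.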

  20. Linear analysis of a force reflective teleoperator

    NASA Technical Reports Server (NTRS)

    Biggers, Klaus B.; Jacobsen, Stephen C.; Davis, Clark C.

    1989-01-01

    Complex force reflective teleoperation systems are often very difficult to analyze due to the large number of components and control loops involved. One mode of a force reflective teleoperator is described. An analysis of the performance of the system based on a linear analysis of the general full order model is presented. Reduced order models are derived and correlated with the full order models. Basic effects of force feedback and position feedback are examined and the effects of time delays between the master and slave are studied. The results show that with symmetrical position-position control of teleoperators, a basic trade off must be made between the intersystem stiffness of the teleoperator, and the impedance felt by the operator in free space.

  1. A cost-effectiveness analysis of typhoid fever vaccines in US military personnel.

    PubMed

    Warren, T A; Finder, S F; Brier, K L; Ries, A J; Weber, M P; Miller, M R; Potyk, R P; Reeves, C S; Moran, E L; Tornow, J J

    1996-11-01

    Typhoid fever has been a problem for military personnel throughout history. A cost-effectiveness analysis of typhoid fever vaccines from the perspective of the US military was performed. Currently, 3 vaccine preparations are available in the US: an oral live Ty21a whole cell vaccine; a single-dose parenteral, cell subunit vaccine; and a 2-dose parenteral heat-phenol killed, whole cell vaccine. This analysis assumed all vaccinees were US military personnel. Two pharmacoeconomic models were developed, one for personnel who have not yet been deployed, and the other for personnel who are deployed to an area endemic for typhoid fever. Drug acquisition, administration, adverse effect and lost work costs, as well as the costs associated with typhoid fever, were included in this analysis. Unique military issues, typhoid fever attack rates, vaccine efficacy, and compliance with each vaccine's dosage regimen were also considered. A sensitivity analysis was performed to test the robustness of the models. Typhoid fever immunisation is not cost-effective for US military personnel unless they are considered imminently deployable or are deployed. The most cost-effective vaccine for US military personnel is the single-dose, cell subunit parenteral vaccine.
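
The structure of such a pharmacoeconomic comparison can be sketched as an expected-cost decision model. All inputs below are hypothetical placeholders (the paper's actual costs, attack rates, efficacies and compliance values are not reproduced here); the sketch only shows how efficacy and compliance combine into expected cost and cost per case averted.

```python
# Decision-model sketch with hypothetical per-person inputs.
def expected_cost(vaccine_cost, attack_rate, efficacy, compliance, case_cost):
    effective_protection = efficacy * compliance
    return vaccine_cost + attack_rate * (1 - effective_protection) * case_cost

def cases_averted(attack_rate, efficacy, compliance):
    return attack_rate * efficacy * compliance

# Hypothetical deployed-personnel scenario (per person):
no_vax = expected_cost(0.0, 0.01, 0.0, 0.0, 5000.0)
oral   = expected_cost(30.0, 0.01, 0.70, 0.60, 5000.0)   # lower compliance
single = expected_cost(20.0, 0.01, 0.70, 0.95, 5000.0)   # one dose

# Incremental cost per case averted vs no vaccination (negative = cost saving):
icer_single = (single - no_vax) / cases_averted(0.01, 0.70, 0.95)
```

With these toy numbers the single-dose strategy dominates (lower cost, more cases averted), illustrating why dosage compliance can drive the ranking as much as efficacy does.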

  2. Penalized discriminant analysis for the detection of wild-grown and cultivated Ganoderma lucidum using Fourier transform infrared spectroscopy.

    PubMed

    Zhu, Ying; Tan, Tuck Lee

    2016-04-15

    An effective and simple analytical method using Fourier transform infrared (FTIR) spectroscopy to distinguish wild-grown high-quality Ganoderma lucidum (G. lucidum) from its cultivated counterpart is of essential importance for its quality assurance and medicinal value estimation. Commonly used chemical and analytical methods using the full spectrum are not effective enough for detection and interpretation due to the complex system of the herbal medicine. In this study, two penalized discriminant analysis models, penalized linear discriminant analysis (PLDA) and elastic net (Elnet), using FTIR spectroscopy have been explored for the purpose of discrimination and interpretation. The classification performances of the two penalized models have been compared with two widely used multivariate methods, principal component discriminant analysis (PCDA) and partial least squares discriminant analysis (PLSDA). The Elnet model involving a combination of L1 and L2 norm penalties enabled an automatic selection of a small number of informative spectral absorption bands and gave an excellent classification accuracy of 99% for discrimination between spectra of wild-grown and cultivated G. lucidum. Its classification performance was superior to that of the PLDA model in a pure L1 setting and outperformed the PCDA and PLSDA models using the full wavelength range. The well-performed selection of informative spectral features leads to substantial reduction in model complexity and improvement of classification accuracy, and it is particularly helpful for the quantitative interpretations of the major chemical constituents of G. lucidum regarding its anti-cancer effects. Copyright © 2016 Elsevier B.V. All rights reserved.
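
The band-selection behaviour of a combined L1/L2 penalty can be sketched with a generic elastic-net classifier on synthetic "spectra". This is a stand-in for the setting above, not the paper's data, preprocessing or tuning: 100 pseudo-wavenumber features of which only the first 5 carry class information, so the L1 component should zero out most uninformative bands.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic stand-in for FTIR spectra: 300 samples x 100 features,
# only the first 5 features informative for the class label.
rng = np.random.default_rng(42)
X = rng.normal(size=(300, 100))
y = (X[:, :5].sum(axis=1) > 0).astype(int)

# Elastic net = combined L1 (sparsity/band selection) and L2 (grouping).
clf = LogisticRegression(penalty="elasticnet", solver="saga",
                         l1_ratio=0.7, C=0.5, max_iter=5000)
clf.fit(X, y)

n_selected = int(np.sum(np.abs(clf.coef_) > 1e-8))  # retained "bands"
accuracy = clf.score(X, y)
```

The retained-coefficient count being well below 100 is the elastic-net analogue of selecting a small number of informative absorption bands.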

  3. Modeling of rolling element bearing mechanics. Theoretical manual

    NASA Technical Reports Server (NTRS)

    Merchant, David H.; Greenhill, Lyn M.

    1994-01-01

    This report documents the theoretical basis for the Rolling Element Bearing Analysis System (REBANS) analysis code which determines the quasistatic response to external loads or displacement of three types of high-speed rolling element bearings: angular contact ball bearings; duplex angular contact ball bearings; and cylindrical roller bearings. The model includes the effects of bearing ring and support structure flexibility. It is comprised of two main programs: the Preprocessor for Bearing Analysis (PREBAN) which creates the input files for the main analysis program; and Flexibility Enhanced Rolling Element Bearing Analysis (FEREBA), the main analysis program. A companion report addresses the input instructions for and features of the computer codes. REBANS extends the capabilities of the SHABERTH (Shaft and Bearing Thermal Analysis) code to include race and housing flexibility, including such effects as dead band and preload springs.

  4. Comparing model-based and model-free analysis methods for QUASAR arterial spin labeling perfusion quantification.

    PubMed

    Chappell, Michael A; Woolrich, Mark W; Petersen, Esben T; Golay, Xavier; Payne, Stephen J

    2013-05-01

    Amongst the various implementations of arterial spin labeling MRI methods for quantifying cerebral perfusion, the QUASAR method is unique. By using a combination of labeling with and without flow suppression gradients, the QUASAR method offers the separation of macrovascular and tissue signals. This permits local arterial input functions to be defined and "model-free" analysis, using numerical deconvolution, to be used. However, it remains unclear whether arterial spin labeling data are best treated using model-free or model-based analysis. This work provides a critical comparison of these two approaches for QUASAR arterial spin labeling in the healthy brain. An existing two-component (arterial and tissue) model was extended to the mixed flow suppression scheme of QUASAR to provide an optimal model-based analysis. The model-based analysis was extended to incorporate dispersion of the labeled bolus, generally regarded as the major source of discrepancy between the two analysis approaches. Model-free and model-based analyses were compared for perfusion quantification including absolute measurements, uncertainty estimation, and spatial variation in cerebral blood flow estimates. Major sources of discrepancies between model-free and model-based analysis were attributed to the effects of dispersion and the degree to which the two methods can separate macrovascular and tissue signal. Copyright © 2012 Wiley Periodicals, Inc.
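
"Model-free" quantification rests on numerical deconvolution: the tissue curve is the arterial input function (AIF) convolved with a scaled residue function, so inverting the convolution matrix recovers perfusion. The noise-free sketch below uses a truncated-SVD pseudo-inverse with entirely illustrative curves and numbers (not the QUASAR acquisition or its regularisation settings).

```python
import numpy as np

# Illustrative deconvolution: tissue curve c = f * (AIF conv R), discretised
# as a lower-triangular convolution matrix A, then inverted via SVD.
dt = 0.3
t = np.arange(20) * dt
f_true = 0.9                              # "perfusion" scale factor
residue = np.exp(-t / 1.5)                # true residue function R(t)
aif = (t + dt) * np.exp(-(t + dt) / 0.8)  # hypothetical gamma-like AIF

A = dt * np.array([[aif[i - j] if i >= j else 0.0
                    for j in range(20)] for i in range(20)])
c = f_true * (A @ residue)                # simulated tissue curve

# SVD pseudo-inverse; with noisy data a larger truncation threshold
# would regularise the inversion.
U, s, Vt = np.linalg.svd(A)
s_inv = np.where(s > 1e-8 * s.max(), 1.0 / s, 0.0)
fR = Vt.T @ (s_inv * (U.T @ c))           # recovered f * R(t)
flow_est = fR.max()                       # perfusion estimate = max of f*R
```

In this idealised noise-free case the estimate matches the true scale factor; the model-free vs model-based discrepancies discussed above arise once dispersion and macrovascular contamination distort the effective AIF.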

  5. Effect of practical training on the learning motivation profile of Japanese pharmacy students using structural equation modeling

    PubMed Central

    2017-01-01

    Purpose: To establish a model of Japanese pharmacy students’ learning motivation profile and investigate the effects of pharmaceutical practical training programs on their learning motivation. Methods: The Science Motivation Questionnaire II was administered to pharmacy students in their 4th (before practical training), 5th (before practical training at clinical sites), and 6th (after all practical training) years of study at Josai International University in April 2016. Factor analysis and multiple-group structural equation modeling were conducted for data analysis. Results: A total of 165 students participated. The learning motivation profile was modeled with 4 factors (intrinsic, career, self-determination, and grade motivation), and the most effective learning motivation was grade motivation. In the multiple-group analysis, the fit of the model with the data was acceptable, and the estimated mean value of the factor of ‘self-determination’ in the learning motivation profile increased after the practical training programs (P = 0.048, Cohen’s d = 0.43). Conclusion: Practical training programs in a 6-year course were effective for increasing learning motivation, based on ‘self-determination’ among Japanese pharmacy students. The results suggest that practical training programs are meaningful not only for providing clinical experience but also for raising learning motivation. PMID:28167812

  6. Integrated modeling approach for optimal management of water, energy and food security nexus

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaodong; Vesselinov, Velimir V.

    2017-03-01

    Water, energy and food (WEF) are inextricably interrelated. Effective planning and management of limited WEF resources to meet current and future socioeconomic demands for sustainable development is challenging. WEF production/delivery may also produce environmental impacts; as a result, greenhouse-gas emission control will impact WEF nexus management as well. Nexus management for WEF security necessitates integrated tools for predictive analysis that are capable of identifying the tradeoffs among various sectors and of generating cost-effective planning and management strategies and policies. To address these needs, we have developed an integrated model analysis framework and tool called WEFO. WEFO provides a multi-period socioeconomic model for predicting how to satisfy WEF demands based on model inputs representing production costs, socioeconomic demands, and environmental controls. WEFO is applied to quantitatively analyze the interrelationships and trade-offs among system components including energy supply, electricity generation, water supply-demand, and food production, as well as mitigation of environmental impacts. WEFO is demonstrated to solve a hypothetical nexus management problem consistent with real-world management scenarios. Model parameters are analyzed using global sensitivity analysis and their effects on total system cost are quantified. The obtained results demonstrate how these types of analyses can be helpful for decision-makers and stakeholders to make cost-effective decisions for optimal WEF management.
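
At its simplest, this kind of nexus trade-off is a constrained cost-minimisation. The toy linear program below is far simpler than WEFO and uses entirely hypothetical coefficients: choose a generation mix meeting an electricity demand at least cost, subject to a water-withdrawal budget and an emissions cap.

```python
from scipy.optimize import linprog

# Toy nexus allocation (hypothetical numbers): x1 = coal GWh, x2 = solar GWh.
cost = [50.0, 70.0]          # $k per GWh (coal cheaper)
A_ub = [[2.0, 0.1],          # water use per GWh (ML): water budget row
        [0.9, 0.0]]          # CO2 per GWh (kt): emissions cap row
b_ub = [150.0, 60.0]         # water budget (ML), emissions cap (kt)
A_eq = [[1.0, 1.0]]          # total generation
b_eq = [100.0]               # electricity demand (GWh)

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None), (0, None)], method="highs")
```

Here the emissions cap, not the water budget, binds: cheap coal is used up to 60/0.9 ≈ 66.7 GWh and solar covers the rest, giving a minimum cost of about 5666.7. Tightening either environmental control shifts the mix, which is the trade-off structure WEFO explores at scale.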

  7. An investigation of difficulties experienced by students developing unified modelling language (UML) class and sequence diagrams

    NASA Astrophysics Data System (ADS)

    Sien, Ven Yu

    2011-12-01

    Object-oriented analysis and design (OOAD) is not an easy subject to learn. There are many challenges confronting students when studying OOAD. Students have particular difficulty abstracting real-world problems within the context of OOAD. They are unable to effectively build object-oriented (OO) models from the problem domain because they essentially do not know "what" to model. This article investigates the difficulties and misconceptions undergraduate students have with analysing systems using unified modelling language analysis class and sequence diagrams. These models were chosen because they represent important static and dynamic aspects of the software system under development. The results of this study will help students produce effective OO models and help software engineering lecturers design learning materials and approaches for introductory OOAD courses.

  8. Investigating Experimental Effects within the Framework of Structural Equation Modeling: An Example with Effects on Both Error Scores and Reaction Times

    ERIC Educational Resources Information Center

    Schweizer, Karl

    2008-01-01

    Structural equation modeling provides the framework for investigating experimental effects on the basis of variances and covariances in repeated measurements. A special type of confirmatory factor analysis as part of this framework enables the appropriate representation of the experimental effect and the separation of experimental and…

  9. Estimation of Spatial Dynamic Nonparametric Durbin Models with Fixed Effects

    ERIC Educational Resources Information Center

    Qian, Minghui; Hu, Ridong; Chen, Jianwei

    2016-01-01

    Spatial panel data models have been widely studied and applied in both scientific and social science disciplines, especially in the analysis of spatial influence. In this paper, we consider the spatial dynamic nonparametric Durbin model (SDNDM) with fixed effects, which takes the nonlinear factors into account based on the spatial dynamic panel…

  10. Effect of the Implicit Combinatorial Model on Combinatorial Reasoning in Secondary School Pupils.

    ERIC Educational Resources Information Center

    Batanero, Carmen; And Others

    1997-01-01

    Elementary combinatorial problems may be classified into three different combinatorial models: (1) selection; (2) partition; and (3) distribution. The main goal of this research was to determine the effect of the implicit combinatorial model on pupils' combinatorial reasoning before and after instruction. Gives an analysis of variance of the…

  11. Achievement Emotions and Academic Performance: Longitudinal Models of Reciprocal Effects

    ERIC Educational Resources Information Center

    Pekrun, Reinhard; Lichtenfeld, Stephanie; Marsh, Herbert W.; Murayama, Kou; Goetz, Thomas

    2017-01-01

    A reciprocal effects model linking emotion and achievement over time is proposed. The model was tested using five annual waves of the Project for the Analysis of Learning and Achievement in Mathematics (PALMA) longitudinal study, which investigated adolescents' development in mathematics (Grades 5-9; N = 3,425 German students; mean starting…

  12. Effect of temperature on microbial growth rate - thermodynamic analysis, the Arrhenius and Eyring-Polanyi connection

    USDA-ARS?s Scientific Manuscript database

    The objective of this work is to develop a new thermodynamic mathematical model for evaluating the effect of temperature on the rate of microbial growth. The new mathematical model is derived by combining the Arrhenius equation and the Eyring-Polanyi transition theory. The new model, suitable for ...
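
The Eyring-Polanyi form referred to above can be written down directly. The sketch below uses hypothetical activation parameters (the manuscript's fitted model is not reproduced here) to show the characteristic temperature sensitivity it produces.

```python
import math

R = 8.314            # gas constant, J/(mol*K)
KB = 1.380649e-23    # Boltzmann constant, J/K
H = 6.62607e-34      # Planck constant, J*s

# Eyring-Polanyi rate sketch with hypothetical activation enthalpy/entropy:
# k(T) = (kB*T/h) * exp(dS/R) * exp(-dH/(R*T)).
def eyring_rate(temp_k, dh_act=60e3, ds_act=-50.0):
    return (KB * temp_k / H) * math.exp(ds_act / R) * math.exp(-dh_act / (R * temp_k))

k25 = eyring_rate(298.15)
k35 = eyring_rate(308.15)
q10 = k35 / k25   # fold change in rate per 10 K increase
```

With a 60 kJ/mol activation enthalpy the rate roughly doubles per 10 K, the familiar Q10 ≈ 2-3 behaviour of microbial growth in its permissive range; the Arrhenius pre-exponential is replaced by the temperature-dependent kB·T/h factor.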

  13. Models and techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1978-01-01

    Progress in the development of system models and techniques for the formulation and evaluation of aircraft computer system effectiveness is reported. Topics covered include: analysis of functional dependence: a prototype software package, METAPHOR, developed to aid the evaluation of performability; and a comprehensive performability modeling and evaluation exercise involving the SIFT computer.

  14. Versatility of Cooperative Transcriptional Activation: A Thermodynamical Modeling Analysis for Greater-Than-Additive and Less-Than-Additive Effects

    PubMed Central

    Frank, Till D.; Carmody, Aimée M.; Kholodenko, Boris N.

    2012-01-01

    We derive a statistical model of transcriptional activation using equilibrium thermodynamics of chemical reactions. We examine to what extent this statistical model predicts synergy effects of cooperative activation of gene expression. We determine parameter domains in which greater-than-additive and less-than-additive effects are predicted for cooperative regulation by two activators. We show that the statistical approach can be used to identify different causes of synergistic greater-than-additive effects: nonlinearities of the thermostatistical transcriptional machinery and three-body interactions between RNA polymerase and two activators. In particular, our model-based analysis suggests that at low transcription factor concentrations cooperative activation cannot yield synergistic greater-than-additive effects, i.e., DNA transcription can only exhibit less-than-additive effects. Accordingly, transcriptional activity turns from synergistic greater-than-additive responses at relatively high transcription factor concentrations into less-than-additive responses at relatively low concentrations. In addition, two types of re-entrant phenomena are predicted. First, our analysis predicts that under particular circumstances transcriptional activity will feature a sequence of less-than-additive, greater-than-additive, and eventually less-than-additive effects when for fixed activator concentrations the regulatory impact of activators on the binding of RNA polymerase to the promoter increases from weak, to moderate, to strong. Second, for appropriate promoter conditions when activator concentrations are increased then the aforementioned re-entrant sequence of less-than-additive, greater-than-additive, and less-than-additive effects is predicted as well. 
Finally, our model-based analysis suggests that even for weak activators that individually induce only negligible increases in promoter activity, promoter activity can exhibit greater-than-additive responses when transcription factors and RNA polymerase interact by means of three-body interactions. Overall, we show that versatility of transcriptional activation is brought about by nonlinearities of transcriptional response functions and interactions between transcription factors, RNA polymerase and DNA. PMID:22506020
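
The statistical-weight recipe underlying this analysis is compact enough to sketch. The weights below are hypothetical (not the paper's fitted parameters) but follow the general equilibrium-thermodynamics construction: each promoter state gets a Boltzmann weight, activators multiply the polymerase-bound weights by interaction factors w1, w2, and a three-body factor w12.

```python
# Thermodynamic-weight sketch of cooperative activation by two activators.
# States without polymerase: empty, A1, A2, A1+A2; states with polymerase
# carry the interaction factors. All weights are illustrative.
def promoter_activity(p, a1, a2, w1, w2, w12):
    """Activity ~ probability that RNA polymerase occupies the promoter."""
    z_off = 1 + a1 + a2 + a1 * a2
    z_on = p * (1 + w1 * a1 + w2 * a2 + w12 * w1 * w2 * a1 * a2)
    return z_on / (z_off + z_on)

p, a = 0.1, 1.0
basal = promoter_activity(p, 0, 0, 2.0, 2.0, 5.0)
only1 = promoter_activity(p, a, 0, 2.0, 2.0, 5.0)
only2 = promoter_activity(p, 0, a, 2.0, 2.0, 5.0)
both  = promoter_activity(p, a, a, 2.0, 2.0, 5.0)

# Greater-than-additive (synergistic) response: the joint response exceeds
# the sum of the individual responses.
synergy = (both - basal) > (only1 - basal) + (only2 - basal)
```

With a three-body factor w12 > 1 the joint response is super-additive at these concentrations; sweeping p, a1, a2 and the interaction strengths through such a function is how the parameter domains for greater-than-additive versus less-than-additive effects can be mapped.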

  15. Effects of winglet on transonic flutter characteristics of a cantilevered twin-engine-transport wing model

    NASA Technical Reports Server (NTRS)

    Ruhlin, C. L.; Bhatia, K. G.; Nagaraja, K. S.

    1986-01-01

    A transonic model and a low-speed model were flutter tested in the Langley Transonic Dynamics Tunnel at Mach numbers up to 0.90. Transonic flutter boundaries were measured for 10 different model configurations, which included variations in wing fuel, nacelle pylon stiffness, and wingtip configuration. The winglet effects were evaluated by testing the transonic model, having a specific wing fuel and nacelle pylon stiffness, with each of three wingtips: a nominal tip, a winglet, and a nominal tip ballasted to simulate the winglet mass. The addition of the winglet substantially reduced the flutter speed of the wing at transonic Mach numbers. The winglet effect was configuration-dependent and was primarily due to winglet aerodynamics rather than mass. Flutter analyses using modified strip-theory aerodynamics (experimentally weighted) correlated reasonably well with test results. The four transonic flutter mechanisms predicted by analysis were obtained experimentally. The analysis satisfactorily predicted the mass-density-ratio effects on subsonic flutter obtained using the low-speed model. Additional analyses were made to determine the flutter sensitivity to several parameters at transonic speeds.

  16. Clustered multistate models with observation level random effects, mover-stayer effects and dynamic covariates: modelling transition intensities and sojourn times in a study of psoriatic arthritis.

    PubMed

    Yiu, Sean; Farewell, Vernon T; Tom, Brian D M

    2018-02-01

    In psoriatic arthritis, it is important to understand the joint activity (represented by swelling and pain) and damage processes because both are related to severe physical disability. The paper aims to provide a comprehensive investigation into both processes occurring over time, in particular their relationship, by specifying a joint multistate model at the individual hand joint level, which also accounts for many of their important features. As there are multiple hand joints, such an analysis will be based on the use of clustered multistate models. Here we consider an observation level random-effects structure with dynamic covariates and allow for the possibility that a subpopulation of patients is at minimal risk of damage. Such an analysis is found to provide further understanding of the activity-damage relationship beyond that provided by previous analyses. Consideration is also given to the modelling of mean sojourn times and jump probabilities. In particular, a novel model parameterization which allows easily interpretable covariate effects to act on these quantities is proposed.
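
Two of the quantities modelled above, mean sojourn times and jump probabilities, fall straight out of a continuous-time Markov generator matrix. The intensities below are hypothetical (not the paper's fitted values) for a three-state joint process: inactive, active, damaged (absorbing).

```python
import numpy as np

# Hypothetical transition-intensity (generator) matrix Q for states
# 0 = inactive joint, 1 = active joint, 2 = damaged joint (absorbing).
Q = np.array([[-0.30,  0.25, 0.05],
              [ 0.20, -0.50, 0.30],
              [ 0.00,  0.00, 0.00]])

exit_rates = -np.diag(Q)[:2]                 # total leaving rate per transient state
mean_sojourn = 1.0 / exit_rates              # expected time spent per visit
jump_probs = Q[:2] / exit_rates[:, None]     # scale each row by its exit rate
np.fill_diagonal(jump_probs, 0.0)            # P(next state = j | leave state i)
```

Each row of the jump matrix sums to one, and the mean sojourn in a state is the reciprocal of its total exit intensity; covariate effects parameterised directly on these quantities (rather than on raw intensities) are what make the proposed model easy to interpret.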

  17. A Tutorial on Multilevel Survival Analysis: Methods, Models and Applications

    PubMed Central

    Austin, Peter C.

    2017-01-01

    Data that have a multilevel structure occur frequently across a range of disciplines, including epidemiology, health services research, public health, education and sociology. We describe three families of regression models for the analysis of multilevel survival data. First, Cox proportional hazards models with mixed effects incorporate cluster-specific random effects that modify the baseline hazard function. Second, piecewise exponential survival models partition the duration of follow-up into mutually exclusive intervals and fit a model that assumes that the hazard function is constant within each interval. This is equivalent to a Poisson regression model that incorporates the duration of exposure within each interval. By incorporating cluster-specific random effects, generalised linear mixed models can be used to analyse these data. Third, after partitioning the duration of follow-up into mutually exclusive intervals, one can use discrete time survival models that use a complementary log-log generalised linear model to model the occurrence of the outcome of interest within each interval. Random effects can be incorporated to account for within-cluster homogeneity in outcomes. We illustrate the application of these methods using data on patients hospitalised with a heart attack, implemented in three statistical programming languages (R, SAS and Stata). PMID:29307954

  18. A Tutorial on Multilevel Survival Analysis: Methods, Models and Applications.

    PubMed

    Austin, Peter C

    2017-08-01

    Data that have a multilevel structure occur frequently across a range of disciplines, including epidemiology, health services research, public health, education and sociology. We describe three families of regression models for the analysis of multilevel survival data. First, Cox proportional hazards models with mixed effects incorporate cluster-specific random effects that modify the baseline hazard function. Second, piecewise exponential survival models partition the duration of follow-up into mutually exclusive intervals and fit a model that assumes that the hazard function is constant within each interval. This is equivalent to a Poisson regression model that incorporates the duration of exposure within each interval. By incorporating cluster-specific random effects, generalised linear mixed models can be used to analyse these data. Third, after partitioning the duration of follow-up into mutually exclusive intervals, one can use discrete time survival models that use a complementary log-log generalised linear model to model the occurrence of the outcome of interest within each interval. Random effects can be incorporated to account for within-cluster homogeneity in outcomes. We illustrate the application of these methods using data on patients hospitalised with a heart attack, implemented in three statistical programming languages (R, SAS and Stata).
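
    The interval-expansion step shared by the piecewise exponential and discrete-time approaches can be made concrete. Below is a minimal standard-library Python sketch of building a person-period dataset; the field names and cut points are illustrative, not taken from the tutorial.

```python
# Expand subject-level survival records into a person-period dataset, the
# data layout required by piecewise exponential (Poisson with exposure) and
# discrete-time (complementary log-log) survival models.

def person_periods(subjects, cut_points):
    """subjects: list of (id, follow_up_time, event_flag).
    cut_points: right edges of the mutually exclusive intervals."""
    rows = []
    for sid, time, event in subjects:
        start = 0.0
        for k, end in enumerate(cut_points):
            if time <= start:
                break  # subject no longer at risk in later intervals
            exposure = min(time, end) - start               # time at risk in interval k
            occurred = 1 if (event and time <= end) else 0  # event in this interval?
            rows.append({"id": sid, "interval": k,
                         "exposure": exposure, "event": occurred})
            start = end
    return rows

data = person_periods([(1, 2.5, 1), (2, 4.0, 0)], cut_points=[1.0, 2.0, 3.0, 4.0])
for r in data:
    print(r)
```

The resulting rows can be fed to any GLM routine (Poisson with `log(exposure)` offset, or binomial with a cloglog link on `event`), with cluster-specific random effects added as the entry describes.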

  19. Is job a viable unit of analysis? A multilevel analysis of demand-control-support models.

    PubMed

    Morrison, David; Payne, Roy L; Wall, Toby D

    2003-07-01

    The literature has ignored the fact that the demand-control (DC) and demand-control-support (DCS) models of stress are about jobs and not individuals' perceptions of their jobs. Using multilevel modeling, the authors report results of individual- and job-level analyses from a study of over 6,700 people in 81 different jobs. Support for additive versions of the models came when individuals were the unit of analysis. DC and DCS models are only helpful for understanding the effects of individual perceptions of jobs and their relationship to psychological states. When job perceptions are aggregated and their relationship to the collective experience of jobholders is assessed, the models prove of little value. Role set may be a better unit of analysis.

  20. Dynamic GSCA (Generalized Structured Component Analysis) with Applications to the Analysis of Effective Connectivity in Functional Neuroimaging Data

    ERIC Educational Resources Information Center

    Jung, Kwanghee; Takane, Yoshio; Hwang, Heungsun; Woodward, Todd S.

    2012-01-01

    We propose a new method of structural equation modeling (SEM) for longitudinal and time series data, named Dynamic GSCA (Generalized Structured Component Analysis). The proposed method extends the original GSCA by incorporating a multivariate autoregressive model to account for the dynamic nature of data taken over time. Dynamic GSCA also…

  1. Review on the Modeling of Electrostatic MEMS

    PubMed Central

    Chuang, Wan-Chun; Lee, Hsin-Li; Chang, Pei-Zen; Hu, Yuh-Chung

    2010-01-01

    Electrostatic-driven microelectromechanical systems devices, in most cases, consist of couplings of such energy domains as electromechanics, optical electricity, thermoelectricity, and electromagnetism. Their nonlinear working state makes their analysis complicated. This article reviews the physical models of pull-in voltage, dynamic characteristic analysis, air damping effects, reliability, numerical modeling methods, and applications of electrostatic-driven MEMS devices. PMID:22219707

  2. Modelling a Complex System: Using Novice-Expert Analysis for Developing an Effective Technology-Enhanced Learning Environment

    ERIC Educational Resources Information Center

    Wu, Hsin-Kai

    2010-01-01

    The purposes of this article are to present the design of a technology-enhanced learning environment (Air Pollution Modeling Environment [APoME]) that was informed by a novice-expert analysis and to discuss high school students' development of modelling practices in the learning environment. APoME was designed to help high school students…

  3. Software reliability: Application of a reliability model to requirements error analysis

    NASA Technical Reports Server (NTRS)

    Logan, J.

    1980-01-01

    The application of a software reliability model having a well defined correspondence of computer program properties to requirements error analysis is described. Requirements error categories which can be related to program structural elements are identified and their effect on program execution considered. The model is applied to a hypothetical B-5 requirement specification for a program module.

  4. Improving models of democracy: the example of lagged effects of economic development, education, and gender equality.

    PubMed

    Balaev, Mikhail

    2014-07-01

    The author examines how time-delayed effects of economic development, education, and gender equality influence political democracy. A literature review shows an inadequate understanding of lagged effects, which raises methodological and theoretical issues for current quantitative studies of democracy. Using country-years as the unit of analysis, the author estimates a series of OLS PCSE models for each predictor with a systematic analysis of the distributions of the lagged effects. A second set of multiple OLS PCSE regressions is then estimated including all three independent variables. The results show that economic development, education, and gender equality have three distinct trajectories of time-delayed effects: economic development has long-term effects, education produces continuous effects regardless of timing, and gender equality has the most prominent immediate and short-term effects. The results call for a reassessment of model specifications and theoretical setups in quantitative studies of democracy. Copyright © 2014 Elsevier Inc. All rights reserved.

  5. Use of Linear Prediction Uncertainty Analysis to Guide Conditioning of Models Simulating Surface-Water/Groundwater Interactions

    NASA Astrophysics Data System (ADS)

    Hughes, J. D.; White, J.; Doherty, J.

    2011-12-01

    Linear prediction uncertainty analysis in a Bayesian framework was applied to guide the conditioning of an integrated surface water/groundwater model that will be used to predict the effects of groundwater withdrawals on surface-water and groundwater flows. Linear prediction uncertainty analysis is an effective approach for identifying (1) raw and processed data most effective for model conditioning prior to inversion, (2) specific observations and periods of time critically sensitive to specific predictions, and (3) additional observation data that would reduce model uncertainty relative to specific predictions. We present results for a two-dimensional groundwater model of a 2,186 km2 area of the Biscayne aquifer in south Florida implicitly coupled to a surface-water routing model of the actively managed canal system. The model domain includes 5 municipal well fields withdrawing more than 1 Mm3/day and 17 operable surface-water control structures that control freshwater releases from the Everglades and freshwater discharges to Biscayne Bay. More than 10 years of daily observation data from 35 groundwater wells and 24 surface water gages are available to condition model parameters. A dense parameterization was used to fully characterize the contribution of the inversion null space to predictive uncertainty and included bias-correction parameters. This approach allows better resolution of the boundary between the inversion null space and solution space. Bias-correction parameters (e.g., rainfall, potential evapotranspiration, and structure flow multipliers) absorb information that is present in structural noise that may otherwise contaminate the estimation of more physically-based model parameters. This allows greater precision in predictions that are entirely solution-space dependent, and reduces the propensity for bias in predictions that are not. 
Results show that application of this analysis is an effective means of identifying those surface-water and groundwater data, both raw and processed, that minimize predictive uncertainty, while simultaneously identifying the maximum solution-space dimensionality of the inverse problem supported by the data.

  6. Scenario Analysis of Soil and Water Conservation in Xiejia Watershed Based on Improved CSLE Model

    NASA Astrophysics Data System (ADS)

    Liu, Jieying; Yu, Ming; Wu, Yong; Huang, Yao; Nie, Yawen

    2018-01-01

    Drawing on existing research results and related data, we use the scenario analysis method to evaluate the effects of different soil and water conservation measures on soil erosion in a small watershed. Based on the analysis of soil erosion scenarios and model simulation budgets in the study area, all simulated scenarios yield soil erosion rates lower than the observed 2013 baseline. Soil and water conservation engineering measures are more effective in reducing soil erosion than soil and water conservation biological measures and soil and water conservation tillage measures.

  7. Evaluation of Fish Passage at Whitewater Parks Using 2D and 3D Hydraulic Modeling

    NASA Astrophysics Data System (ADS)

    Hardee, T.; Nelson, P. A.; Kondratieff, M.; Bledsoe, B. P.

    2016-12-01

    In-stream whitewater parks (WWPs) are increasingly popular recreational amenities that typically create waves by constricting flow through a chute to increase velocities and form a hydraulic jump. However, the hydraulic conditions these structures create can limit longitudinal habitat connectivity and potentially inhibit upstream fish migration, especially of native fishes. An improved understanding of the fundamental hydraulic processes and potential environmental effects of whitewater parks is needed to inform management decisions about Recreational In-Channel Diversions (RICDs). Here, we use hydraulic models to compute a continuous and spatially explicit description of velocity and depth along potential fish swimming paths in the flow field, and the ensemble of potential paths is compared to fish swimming performance data to predict fish passage via logistic regression analysis. While 3D models have been shown to accurately predict trout movement through WWP structures, 2D methods can provide a more cost-effective and manager-friendly approach to assessing the effects of similar hydraulic structures on fish passage when 3D analysis is not feasible. We use 2D models to examine the hydraulics in several WWP structures on the North Fork of the St. Vrain River at Lyons, Colorado, and we compare these model results to fish passage predictions from a 3D model. Our analysis establishes a foundation for a practical, transferable and physically-rigorous 2D modeling approach for mechanistically evaluating the effects of hydraulic structures on fish passage.

  8. A Bifactor Approach to Model Multifaceted Constructs in Statistical Mediation Analysis.

    PubMed

    Gonzalez, Oscar; MacKinnon, David P

    Statistical mediation analysis allows researchers to identify the most important mediating constructs in the causal process studied. Identifying specific mediators is especially relevant when the hypothesized mediating construct consists of multiple related facets. The general definition of the construct and its facets might relate differently to an outcome. However, current methods do not allow researchers to study the relationships between general and specific aspects of a construct to an outcome simultaneously. This study proposes a bifactor measurement model for the mediating construct as a way to parse variance and represent the general aspect and specific facets of a construct simultaneously. Monte Carlo simulation results are presented to help determine the properties of mediated effect estimation when the mediator has a bifactor structure and a specific facet of a construct is the true mediator. This study also investigates the conditions when researchers can detect the mediated effect when the multidimensionality of the mediator is ignored and treated as unidimensional. Simulation results indicated that the mediation model with a bifactor mediator measurement model yielded unbiased estimates and adequate power to detect the mediated effect with a sample size greater than 500 and medium a- and b-paths. Also, results indicate that parameter bias and detection of the mediated effect in both the data-generating model and the misspecified model varies as a function of the amount of facet variance represented in the mediation model. This study contributes to the largely unexplored area of measurement issues in statistical mediation analysis.

  9. Cost-effectiveness Analysis of Nutritional Support for the Prevention of Pressure Ulcers in High-Risk Hospitalized Patients.

    PubMed

    Tuffaha, Haitham W; Roberts, Shelley; Chaboyer, Wendy; Gordon, Louisa G; Scuffham, Paul A

    2016-06-01

    To evaluate the cost-effectiveness of nutritional support compared with standard care in preventing pressure ulcers (PrUs) in high-risk hospitalized patients. An economic model using data from a systematic literature review. A meta-analysis of randomized controlled trials on the efficacy of nutritional support in reducing the incidence of PrUs was conducted. Modeled cohort of hospitalized patients at high risk of developing PrUs and malnutrition simulated during their hospital stay and up to 1 year. Standard care included PrU prevention strategies, such as redistribution surfaces, repositioning, and skin protection strategies, along with standard hospital diet. In addition to the standard care, the intervention group received nutritional support comprising patient education, nutrition goal setting, and the consumption of high-protein supplements. The analysis was from a healthcare payer perspective. Key outcomes of the model included the average costs and quality-adjusted life years. Model results were tested in univariate sensitivity analyses, and decision uncertainty was characterized using a probabilistic sensitivity analysis. Compared with standard care, nutritional support was cost saving at AU $425 per patient and marginally more effective with an average 0.005 quality-adjusted life years gained. The probability of nutritional support being cost-effective was 87%. Nutritional support to prevent PrUs in high-risk hospitalized patients is cost-effective with substantial cost savings predicted. Hospitals should implement the recommendations from the current PrU practice guidelines and offer nutritional support to high-risk patients.
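
    The reported 87% "probability of being cost-effective" is the standard summary of a probabilistic sensitivity analysis: the share of simulated parameter draws in which the intervention's incremental net monetary benefit is positive. A toy standard-library sketch, with illustrative distributions rather than the study's actual model or data:

```python
import random

random.seed(1)

def prob_cost_effective(n_draws, wtp):
    """Fraction of simulated draws in which the intervention has positive
    incremental net monetary benefit: NMB = wtp * dQALY - dCost.
    The sampling distributions below are invented for illustration."""
    wins = 0
    for _ in range(n_draws):
        d_qaly = random.gauss(0.005, 0.004)   # incremental QALYs (~0.005 gained)
        d_cost = random.gauss(-425.0, 300.0)  # incremental cost (~AU$425 saved)
        if wtp * d_qaly - d_cost > 0:         # incremental NMB > 0?
            wins += 1
    return wins / n_draws

p = prob_cost_effective(10_000, wtp=50_000)
print(f"P(cost-effective at AU$50k/QALY) = {p:.2f}")
```

Because the intervention here is both cheaper and slightly more effective on average (dominant, as in the abstract), this probability is high at any plausible willingness-to-pay threshold.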

  10. New segregation analysis of panic disorder

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vieland, V.J.; Fyer, A.J.; Chapman, T.

    1996-04-09

    We performed simple segregation analyses of panic disorder using 126 families of probands with DSM-III-R panic disorder who were ascertained for a family study of anxiety disorders at an anxiety disorders research clinic. We present parameter estimates for dominant, recessive, and arbitrary single major locus models without sex effects, as well as for a nongenetic transmission model, and compare these models to each other and to models obtained by other investigators. We rejected the nongenetic transmission model when comparing it to the recessive model. Consistent with some previous reports, we find comparable support for dominant and recessive models, and in both cases estimate nonzero phenocopy rates. The effect of restricting the analysis to families of probands without any lifetime history of comorbid major depression (MDD) was also examined. No notable differences in parameter estimates were found in that subsample, although the power of that analysis was low. Consistency between the findings in our sample and in another independently collected sample suggests the possibility of pooling such samples in the future in order to achieve the necessary power for more complex analyses. 32 refs., 4 tabs.

  11. A Variational Formulation for the Finite Element Analysis of Sound Wave Propagation in a Spherical Shell

    NASA Technical Reports Server (NTRS)

    Lebiedzik, Catherine

    1995-01-01

    Development of design tools to furnish optimal acoustic environments for lightweight aircraft demands the ability to simulate the acoustic system on a workstation. In order to form an effective mathematical model of the phenomena at hand, we have begun by studying the propagation of acoustic waves inside closed spherical shells. Using a fully-coupled fluid-structure interaction model based upon variational principles, we have written a finite element analysis program and are in the process of examining several test cases. Future investigations are planned to increase model accuracy by incorporating non-linear and viscous effects.

  12. Cost-effectiveness and budget impact analyses of a colorectal cancer screening programme in a high adenoma prevalence scenario using MISCAN-Colon microsimulation model.

    PubMed

    Arrospide, Arantzazu; Idigoras, Isabel; Mar, Javier; de Koning, Harry; van der Meulen, Miriam; Soto-Gordoa, Myriam; Martinez-Llorente, Jose Miguel; Portillo, Isabel; Arana-Arri, Eunate; Ibarrondo, Oliver; Lansdorp-Vogelaar, Iris

    2018-04-25

    The Basque Colorectal Cancer Screening Programme began in 2009 and the implementation has been complete since 2013. Faecal immunological testing was used for screening in individuals between 50 and 69 years old. Colorectal Cancer in the Basque country is characterized by unusual epidemiological features, given that Colorectal Cancer incidence is similar to other European countries while adenoma prevalence is higher. The objective of our study was to economically evaluate the programme via cost-effectiveness and budget impact analyses with microsimulation models. We applied the Microsimulation Screening Analysis (MISCAN)-Colon model to predict trends in Colorectal Cancer incidence and mortality and to quantify the short- and long-term effects and costs of the Basque Colorectal Cancer Screening Programme. The model was calibrated to the Basque demographics in 2008 and age-specific Colorectal Cancer incidence data in the Basque Cancer Registry from 2005 to 2008, before the screening began. The model was also calibrated to the high adenoma prevalence observed for the Basque population in a previously published study. The multi-cohort approach used in the model included all the cohorts in the programme during 30 years of implementation, with lifetime follow-up. Unit costs were obtained from the Basque Health Service and both cost-effectiveness analysis and budget impact analysis were carried out. The goodness-of-fit of the model adaptation to observed programme data provided evidence of the model's validity. In the cost-effectiveness analysis, the savings from treatment were larger than the added costs due to screening. Thus, the Basque programme was dominant compared to no screening, as life expectancy increased by 29.3 days per person. The savings in the budget analysis appeared 10 years after the complete implementation of the programme. The average annual budget was €73.4 million from year 2023 onwards. 
This economic evaluation showed a screening intervention with a major health gain that also produced net savings when a follow-up long enough to capture the late economic benefits was used. The number of colonoscopies required was high but remained within the capacity of the Basque Health Service. So far in Europe, no other population-based Colorectal Cancer screening programme has been evaluated by budget impact analysis.

  13. Watershed Planning within a Quantitative Scenario Analysis Framework.

    PubMed

    Merriam, Eric R; Petty, J Todd; Strager, Michael P

    2016-07-24

    There is a critical need for tools and methodologies capable of managing aquatic systems within heavily impacted watersheds. Current efforts often fall short as a result of an inability to quantify and predict complex cumulative effects of current and future land use scenarios at relevant spatial scales. The goal of this manuscript is to provide methods for conducting a targeted watershed assessment that enables resource managers to produce landscape-based cumulative effects models for use within a scenario analysis management framework. Sites are first selected for inclusion within the watershed assessment by identifying sites that fall along independent gradients and combinations of known stressors. Field and laboratory techniques are then used to obtain data on the physical, chemical, and biological effects of multiple land use activities. Multiple linear regression analysis is then used to produce landscape-based cumulative effects models for predicting aquatic conditions. Lastly, methods for incorporating cumulative effects models within a scenario analysis framework for guiding management and regulatory decisions (e.g., permitting and mitigation) within actively developing watersheds are discussed and demonstrated for 2 sub-watersheds within the mountaintop mining region of central Appalachia. The watershed assessment and management approach provided herein enables resource managers to facilitate economic and development activity while protecting aquatic resources and producing opportunity for net ecological benefits through targeted remediation.

  14. Economic Modeling and Analysis of Educational Vouchers

    ERIC Educational Resources Information Center

    Epple, Dennis; Romano, Richard

    2012-01-01

    The analysis of educational vouchers has evolved from market-based analogies to models that incorporate distinctive features of the educational environment. These distinctive features include peer effects, scope for private school pricing and admissions based on student characteristics, the linkage of household residential and school choices in…

  15. Random-effects meta-analysis: the number of studies matters.

    PubMed

    Guolo, Annamaria; Varin, Cristiano

    2017-06-01

    This paper investigates the impact of the number of studies on meta-analysis and meta-regression within the random-effects model framework. It is frequently neglected that inference in random-effects models requires a substantial number of studies included in meta-analysis to guarantee reliable conclusions. Several authors warn about the risk of inaccurate results of the traditional DerSimonian and Laird approach especially in the common case of meta-analysis involving a limited number of studies. This paper presents a selection of likelihood and non-likelihood methods for inference in meta-analysis proposed to overcome the limitations of the DerSimonian and Laird procedure, with a focus on the effect of the number of studies. The applicability and the performance of the methods are investigated in terms of Type I error rates and empirical power to detect effects, according to scenarios of practical interest. Simulation studies and applications to real meta-analyses highlight that it is not possible to identify an approach uniformly superior to alternatives. The overall recommendation is to avoid the DerSimonian and Laird method when the number of meta-analysis studies is modest and prefer a more comprehensive procedure that compares alternative inferential approaches. R code for meta-analysis according to all of the inferential methods examined in the paper is provided.
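
    For reference, the DerSimonian and Laird procedure that the paper cautions against with few studies is short enough to state in full. A standard-library sketch of the method-of-moments estimator (the study effects and variances in the example call are made up):

```python
def dersimonian_laird(effects, variances):
    """Classic DerSimonian-Laird random-effects pooling.
    effects: per-study estimates; variances: their within-study variances."""
    k = len(effects)
    w = [1.0 / v for v in variances]                      # fixed-effect weights
    sw = sum(w)
    fe = sum(wi * yi for wi, yi in zip(w, effects)) / sw  # fixed-effect mean
    q = sum(wi * (yi - fe) ** 2 for wi, yi in zip(w, effects))  # Cochran's Q
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (k - 1)) / c)                    # moment estimate of tau^2
    ws = [1.0 / (v + tau2) for v in variances]            # random-effects weights
    mu = sum(wi * yi for wi, yi in zip(ws, effects)) / sum(ws)
    se = (1.0 / sum(ws)) ** 0.5
    return mu, se, tau2

mu, se, tau2 = dersimonian_laird([0.2, 0.5, -0.1, 0.4], [0.04, 0.05, 0.03, 0.06])
print(mu, se, tau2)
```

With k this small, the paper's point is precisely that this `se` (and hence any Wald-type confidence interval built from it) can be badly anti-conservative, so likelihood-based alternatives should be compared.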

  16. Sensitivity Analysis in Sequential Decision Models.

    PubMed

    Chen, Qiushi; Ayer, Turgay; Chhatwal, Jagpreet

    2017-02-01

    Sequential decision problems, commonly solved using Markov decision processes (MDPs), are frequently encountered in medical decision making. Modeling guidelines recommend conducting sensitivity analyses in decision-analytic models to assess the robustness of the model results against uncertainty in model parameters. However, standard methods of conducting sensitivity analyses cannot be directly applied to sequential decision problems because this would require evaluating all possible decision sequences, typically in the order of trillions, which is not practically feasible. As a result, most MDP-based modeling studies do not examine confidence in their recommended policies. In this study, we provide an approach to estimate uncertainty and confidence in the results of sequential decision models. First, we provide a probabilistic univariate method to identify the most sensitive parameters in MDPs. Second, we present a probabilistic multivariate approach to estimate the overall confidence in the recommended optimal policy considering joint uncertainty in the model parameters. We provide a graphical representation, which we call a policy acceptability curve, to summarize the confidence in the optimal policy by incorporating stakeholders' willingness to accept the base case policy. For a cost-effectiveness analysis, we provide an approach to construct a cost-effectiveness acceptability frontier, which shows the most cost-effective policy as well as the confidence in it for a given willingness-to-pay threshold. We demonstrate our approach using a simple MDP case study. We developed a method to conduct sensitivity analysis in sequential decision models, which could increase the credibility of these models among stakeholders.
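
    The multivariate idea can be illustrated on a toy problem: repeatedly sample the MDP's parameters, solve each sampled model, and record how often the base-case policy remains optimal. This is one point on what the authors call a policy acceptability curve. A fully hypothetical two-state sketch (all numbers invented, not from the case study):

```python
import random

random.seed(7)

def optimal_action(p_good, reward_treat, gamma=0.9, iters=200):
    """Value iteration on a 2-state chain: state 0 = sick, state 1 = well.
    In the sick state, action 0 = wait, action 1 = treat (cost folded into
    reward_treat). The well state pays reward 1 per step and is absorbing."""
    v = [0.0, 0.0]
    for _ in range(iters):
        q_wait = gamma * (0.1 * v[1] + 0.9 * v[0])             # small chance of recovery
        q_treat = reward_treat + gamma * (p_good * v[1] + (1 - p_good) * v[0])
        v = [max(q_wait, q_treat), 1.0 + gamma * v[1]]
    return 1 if q_treat > q_wait else 0

# probabilistic sensitivity analysis: how often does "treat" stay optimal?
n = 500
hits = sum(
    optimal_action(p_good=random.betavariate(8, 2),       # uncertain success prob.
                   reward_treat=random.gauss(-0.2, 0.1))  # uncertain net reward
    == 1
    for _ in range(n)
)
print(f"confidence in base-case policy 'treat': {hits / n:.2f}")
```

Sweeping the stakeholder's acceptability threshold over such confidence values traces out the acceptability curve; the full method additionally ranks parameters by univariate sensitivity first.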

  17. Cause-effect analysis: improvement of a first year engineering students' calculus teaching model

    NASA Astrophysics Data System (ADS)

    van der Hoff, Quay; Harding, Ansie

    2017-01-01

    This study focuses on the mathematics department at a South African university and in particular on teaching of calculus to first year engineering students. The paper reports on a cause-effect analysis, often used for business improvement. The cause-effect analysis indicates that there are many factors that impact on secondary school teaching of mathematics, factors that the tertiary sector has no control over. The analysis also indicates the undesirable issues that are at the root of impeding success in the calculus module. Most important is that students are not encouraged to become independent thinkers from an early age. This triggers problems in follow-up courses where students are expected to have learned to deal with the work load and understanding of certain concepts. A new model was designed to lessen the impact of these undesirable issues.

  18. [Numerical simulation of the effect of virtual stent release pose on the expansion results].

    PubMed

    Li, Jing; Peng, Kun; Cui, Xinyang; Fu, Wenyu; Qiao, Aike

    2018-04-01

    The current finite element analysis of vascular stent expansion does not take into account the effect of the stent release pose on the expansion results. In this study, stent and vessel models were established in Pro/E. Five finite element assembly models were constructed in ABAQUS: a 0 degree model without eccentricity, a 3 degree model without eccentricity, a 5 degree model without eccentricity, a 0 degree model with axial eccentricity, and a 0 degree model with radial eccentricity. These models were divided into two groups of experiments for numerical simulation with respect to angle and eccentricity. Mechanical parameters such as the foreshortening rate, radial recoil rate and dog-boning rate were calculated, and the influence of angle and eccentricity on the numerical simulation was obtained by comparative analysis. The calculated residual stenosis rates were 38.3%, 38.4%, 38.4%, 35.7% and 38.2% for the five models, respectively. The results indicate that the release pose has little effect on the simulation results, so it can be neglected when high accuracy is not required, and the basic model (0 degrees, without eccentricity) is adequate for numerical simulation.
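
    The deformation metrics compared in such stent simulations have widely used definitions, though exact formulations vary between papers; the ratios below are one common convention and should be treated as illustrative rather than the authors' exact formulas:

```python
def foreshortening(length_before, length_after):
    """Relative loss of stent length on expansion."""
    return (length_before - length_after) / length_before

def radial_recoil(d_loaded, d_unloaded):
    """Relative loss of diameter after the balloon is deflated."""
    return (d_loaded - d_unloaded) / d_loaded

def dog_boning(d_end, d_central):
    """Relative over-expansion of the stent ends versus its middle."""
    return (d_end - d_central) / d_end

# illustrative dimensions in mm, not from the study
print(foreshortening(10.0, 9.6))  # stent 4% shorter after expansion
print(radial_recoil(3.0, 2.85))   # 5% diameter recoil on unloading
print(dog_boning(3.2, 3.0))       # ends flare 6.25% wider than the centre
```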

  19. Models of Students' Thinking Concerning the Greenhouse Effect and Teaching Implications.

    ERIC Educational Resources Information Center

    Koulaidis, Vasilis; Christidou, Vasilia

    1999-01-01

    Primary school students (n=40) ages 11 and 12 years were interviewed concerning their conceptions of the greenhouse effect. Analysis of the data led to the formation of seven distinct models of thinking regarding this phenomenon. (Author/CCM)

  20. Dynamic Modeling of Cell-Free Biochemical Networks Using Effective Kinetic Models

    DTIC Science & Technology

    2015-03-16

    sensitivity value was the maximum uncertainty in that value estimated by the Sobol method. 2.4. Global Sensitivity Analysis of the Reduced Order Coagulation...sensitivity analysis, using the variance-based method of Sobol, to estimate which parameters controlled the performance of the reduced order model [69]. We...Environment. Comput. Sci. Eng. 2007, 9, 90–95. 69. Sobol, I. Global sensitivity indices for nonlinear mathematical models and their Monte Carlo estimates

  1. Advances in the meta-analysis of heterogeneous clinical trials I: The inverse variance heterogeneity model.

    PubMed

    Doi, Suhail A R; Barendregt, Jan J; Khan, Shahjahan; Thalib, Lukman; Williams, Gail M

    2015-11-01

    This article examines an improved alternative to the random effects (RE) model for meta-analysis of heterogeneous studies. It is shown that the known issues of underestimation of the statistical error and spuriously overconfident estimates with the RE model can be resolved by the use of an estimator under the fixed effect model assumption with a quasi-likelihood based variance structure - the IVhet model. Extensive simulations confirm that this estimator retains a correct coverage probability and a lower observed variance than the RE model estimator, regardless of heterogeneity. When the proposed IVhet method is applied to the controversial meta-analysis of intravenous magnesium for the prevention of mortality after myocardial infarction, the pooled OR is 1.01 (95% CI 0.71-1.46) which not only favors the larger studies but also indicates more uncertainty around the point estimate. In comparison, under the RE model the pooled OR is 0.71 (95% CI 0.57-0.89) which, given the simulation results, reflects underestimation of the statistical error. Given the compelling evidence generated, we recommend that the IVhet model replace both the FE and RE models. To facilitate this, it has been implemented into free meta-analysis software called MetaXL which can be downloaded from www.epigear.com. Copyright © 2015 Elsevier Inc. All rights reserved.
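
    The IVhet estimator described above keeps inverse-variance (fixed-effect) weights for the point estimate but inflates the variance for between-study heterogeneity. A standard-library sketch of the published formulation, using the DerSimonian-Laird moment estimate of tau squared inside the variance term (worth verifying against MetaXL before any real use; the example data are made up):

```python
def ivhet(effects, variances):
    """IVhet pooled estimate: fixed-effect weights with a
    heterogeneity-inflated variance for the pooled effect."""
    k = len(effects)
    w = [1.0 / v for v in variances]                      # inverse-variance weights
    sw = sum(w)
    mu = sum(wi * yi for wi, yi in zip(w, effects)) / sw  # fixed-effect point estimate
    q = sum(wi * (yi - mu) ** 2 for wi, yi in zip(w, effects))  # Cochran's Q
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (k - 1)) / c)                    # moment estimate of tau^2
    # variance keeps FE weights but adds tau^2 to each study's variance
    var = sum((wi / sw) ** 2 * (vi + tau2) for wi, vi in zip(w, variances))
    return mu, var ** 0.5

mu, se = ivhet([0.2, 0.5, -0.1, 0.4], [0.04, 0.05, 0.03, 0.06])
print(mu, se)
```

Note how this matches the abstract's description: the point estimate favours the larger (low-variance) studies as in a fixed-effect analysis, while the standard error widens with heterogeneity, giving the "more uncertainty around the point estimate" reported for the magnesium example.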

  2. Aeroservoelastic Model Validation and Test Data Analysis of the F/A-18 Active Aeroelastic Wing

    NASA Technical Reports Server (NTRS)

    Brenner, Martin J.; Prazenica, Richard J.

    2003-01-01

    Model validation and flight test data analysis require careful consideration of the effects of uncertainty, noise, and nonlinearity. Uncertainty prevails in the data analysis techniques and results in a composite model uncertainty from unmodeled dynamics, assumptions and mechanics of the estimation procedures, noise, and nonlinearity. A fundamental requirement for reliable and robust model development is an attempt to account for each of these sources of error, in particular, for model validation, robust stability prediction, and flight control system development. This paper is concerned with data processing procedures for uncertainty reduction in model validation for stability estimation and nonlinear identification. F/A-18 Active Aeroelastic Wing (AAW) aircraft data is used to demonstrate signal representation effects on uncertain model development, stability estimation, and nonlinear identification. Data is decomposed using adaptive orthonormal best-basis and wavelet-basis signal decompositions for signal denoising into linear and nonlinear identification algorithms. Nonlinear identification from a wavelet-based Volterra kernel procedure is used to extract nonlinear dynamics from aeroelastic responses, and to assist model development and uncertainty reduction for model validation and stability prediction by removing a class of nonlinearity from the uncertainty.

  3. Monte Carlo based statistical power analysis for mediation models: methods and software.

    PubMed

    Zhang, Zhiyong

    2014-12-01

    The existing literature on statistical power analysis for mediation models often assumes data normality and is based on a less powerful Sobel test instead of the more powerful bootstrap test. This study proposes to estimate statistical power to detect mediation effects on the basis of the bootstrap method through Monte Carlo simulation. Nonnormal data with excessive skewness and kurtosis are allowed in the proposed method. A free R package called bmem is developed to conduct the power analysis discussed in this study. Four examples, including a simple mediation model, a multiple-mediator model with a latent mediator, a multiple-group mediation model, and a longitudinal mediation model, are provided to illustrate the proposed method.
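The bootstrap-based power estimation can be sketched as a Monte Carlo loop: simulate mediation data, form a percentile bootstrap confidence interval for the indirect effect a·b, and count how often the interval excludes zero. This is a simplified stand-in for the bmem package, assuming normal errors and least-squares slopes (the record's nonnormal-data option is omitted here).

```python
import numpy as np

def mediation_power(a, b, c, n, n_sims=100, n_boot=100, alpha=0.05, seed=0):
    """Monte Carlo power for the indirect effect a*b in a simple
    mediation model (X -> M -> Y), judged by a percentile bootstrap
    confidence interval excluding zero."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_sims):
        x = rng.normal(size=n)
        m = a * x + rng.normal(size=n)
        y = b * m + c * x + rng.normal(size=n)
        idx = rng.integers(0, n, size=(n_boot, n))  # bootstrap resamples
        ab = np.empty(n_boot)
        for j in range(n_boot):
            xs, ms, ys = x[idx[j]], m[idx[j]], y[idx[j]]
            a_hat = np.polyfit(xs, ms, 1)[0]          # slope of M on X
            design = np.column_stack([ms, xs, np.ones(n)])
            b_hat = np.linalg.lstsq(design, ys, rcond=None)[0][0]  # Y on M given X
            ab[j] = a_hat * b_hat
        lo, hi = np.percentile(ab, [100 * alpha / 2, 100 * (1 - alpha / 2)])
        hits += (lo > 0) or (hi < 0)
    return hits / n_sims

power = mediation_power(0.4, 0.4, 0.1, n=100, n_sims=50, n_boot=100)
```

The effect sizes, sample sizes, and simulation counts above are illustrative; in practice the bootstrap and simulation counts would be much larger.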

  4. Transient modeling/analysis of hyperbolic heat conduction problems employing mixed implicit-explicit alpha method

    NASA Technical Reports Server (NTRS)

    Tamma, Kumar K.; D'Costa, Joseph F.

    1991-01-01

    This paper describes the evaluation of mixed implicit-explicit finite element formulations for hyperbolic heat conduction problems involving non-Fourier effects. In particular, mixed implicit-explicit formulations employing the alpha method proposed by Hughes et al. (1987, 1990) are described for the numerical simulation of hyperbolic heat conduction models, which involves time-dependent relaxation effects. Existing analytical approaches for modeling/analysis of such models involve complex mathematical formulations for obtaining closed-form solutions, while in certain numerical formulations the difficulties include severe oscillatory solution behavior (which often disguises the true response) in the vicinity of the thermal disturbances, which propagate with finite velocities. In view of these factors, the alpha method is evaluated to assess the control of the amount of numerical dissipation for predicting the transient propagating thermal disturbances. Numerical test models are presented, and pertinent conclusions are drawn for the mixed-time integration simulation of hyperbolic heat conduction models involving non-Fourier effects.

  5. Analysis of an electrohydraulic aircraft control surface servo and comparison with test results

    NASA Technical Reports Server (NTRS)

    Edwards, J. W.

    1972-01-01

    An analysis of an electrohydraulic aircraft control-surface system is made in which the system is modeled as a lumped, two-mass, spring-coupled system controlled by a servo valve. Both linear and nonlinear models are developed, and the effects of hinge-moment loading are included. Transfer functions of the system and approximate literal factors of the transfer functions for several cases are presented. The damping action of dynamic pressure feedback is analyzed. Comparisons of the model responses with results from tests made on a highly resonant rudder control-surface servo indicate the adequacy of the model. The effects of variations in hinge-moment loading are illustrated.

  6. Analysis of the effectiveness of steam retorting of oil shale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacobs, H.R.; Pensel, R.W.; Udell, K.S.

    A numerical model is developed to describe the retorting of oil shale using superheated steam. The model describes not only the temperature history of the shale but predicts the evolution of shale oil from kerogen decomposition and the breakdown of the carbonates existing in the shale matrix. The heat transfer coefficients between the water and the shale are determined from experiments utilizing the model to reduce the data. Similarly, the model is used with thermogravimetric analysis experiments to develop an improved kinetics expression for kerogen decomposition in a steam environment. Numerical results are presented which indicate the effect of oil shale particle size and steam temperature on oil production.

  7. Directional pair distribution function for diffraction line profile analysis of atomistic models

    PubMed Central

    Leonardi, Alberto; Leoni, Matteo; Scardi, Paolo

    2013-01-01

    The concept of the directional pair distribution function is proposed to describe line broadening effects in powder patterns calculated from atomistic models of nano-polycrystalline microstructures. The approach provides at the same time a description of the size effect for domains of any shape and a detailed explanation of the strain effect caused by the local atomic displacement. The latter is discussed in terms of different strain types, also accounting for strain field anisotropy and grain boundary effects. The results can in addition be directly read in terms of traditional line profile analysis, such as that based on the Warren–Averbach method. PMID:23396818

  8. Cost-Effectiveness Analysis of Second-Line Chemotherapy Agents for Advanced Gastric Cancer.

    PubMed

    Lam, Simon W; Wai, Maya; Lau, Jessica E; McNamara, Michael; Earl, Marc; Udeh, Belinda

    2017-01-01

    Gastric cancer is the fifth most common malignancy and the second leading cause of cancer-related mortality. Chemotherapy options for patients who fail first-line treatment are limited. Thus, the objective of this study was to assess the cost-effectiveness of second-line treatment options for patients with advanced or metastatic gastric cancer. Cost-effectiveness analysis using a Markov model to compare six possible second-line treatment options for patients with advanced gastric cancer who have failed previous chemotherapy: irinotecan, docetaxel, paclitaxel, ramucirumab, paclitaxel plus ramucirumab, and palliative care. The model was performed from a third-party payer's perspective to compare lifetime costs and health benefits associated with the studied second-line therapies. Costs included only relevant direct medical costs. The model assumed chemotherapy cycle lengths of 30 days and a maximum of 24 cycles. A systematic review of the literature was performed to identify clinical data sources and utility and cost data. Quality-adjusted life years (QALYs) and incremental cost-effectiveness ratios (ICERs) were calculated. The primary outcome measure for this analysis was the ICER between different therapies, where the incremental cost was divided by the number of QALYs saved. The ICER was compared with a willingness-to-pay (WTP) threshold set at $50,000/QALY gained; an exploratory analysis using $160,000/QALY gained was also performed. The model's robustness was tested with 1-way sensitivity analyses and a probabilistic sensitivity analysis (PSA) of 10,000 Monte Carlo simulations. Irinotecan had the lowest lifetime cost and was associated with a QALY gain of 0.35 year. Docetaxel, ramucirumab alone, and palliative care were dominated strategies. Paclitaxel and the combination of paclitaxel plus ramucirumab led to higher QALYs gained, at an incremental cost of $86,815 and $1,056,125 per QALY gained, respectively. Based on our prespecified WTP threshold, the base-case analysis demonstrated that irinotecan alone is the most cost-effective regimen; both paclitaxel alone and the combination of paclitaxel and ramucirumab were not cost-effective (ICERs above $50,000/QALY). Both the 1-way sensitivity analyses and the PSA demonstrated the model's robustness. The PSA illustrated that paclitaxel plus ramucirumab was extremely unlikely to be cost-effective at a WTP threshold below $400,000/QALY gained. Irinotecan alone appears to be the most cost-effective second-line regimen for patients with gastric cancer. Paclitaxel may be cost-effective if the WTP threshold were set at $160,000/QALY gained. © 2016 Pharmacotherapy Publications, Inc.
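The dominance bookkeeping behind such comparisons can be separated from the clinical Markov model itself. A minimal sketch, with purely illustrative cost/QALY numbers (not the study's data): sort strategies by cost, remove strongly and extendedly dominated options, and compute ICERs along the efficiency frontier.

```python
def icer_frontier(strategies):
    """Given name -> (lifetime_cost, qalys), return ICERs along the
    cost-effectiveness frontier; the cheapest strategy is the reference."""
    items = sorted(strategies.items(), key=lambda kv: kv[1][0])
    # Strong dominance: costs more but gains no QALYs over the frontier.
    frontier = []
    for name, (cost, q) in items:
        if frontier and q <= frontier[-1][2]:
            continue
        frontier.append((name, cost, q))
    # Extended dominance: ICERs must be increasing along the frontier.
    changed = True
    while changed:
        changed = False
        for i in range(1, len(frontier) - 1):
            _, c0, q0 = frontier[i - 1]
            _, c1, q1 = frontier[i]
            _, c2, q2 = frontier[i + 1]
            if (c1 - c0) / (q1 - q0) > (c2 - c1) / (q2 - q1):
                del frontier[i]
                changed = True
                break
    icers = {frontier[0][0]: None}  # reference strategy has no ICER
    for (n0, c0, q0), (n1, c1, q1) in zip(frontier, frontier[1:]):
        icers[n1] = (c1 - c0) / (q1 - q0)
    return icers

# Illustrative numbers only, not taken from the study above.
icers = icer_frontier({
    "palliative care": (20_000, 0.20),
    "irinotecan":      (25_000, 0.35),
    "paclitaxel":      (60_000, 0.55),
})
```

Comparing each frontier ICER against the WTP threshold then yields the "most cost-effective regimen" conclusion reported in the abstract.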

  9. Scientists' internal models of the greenhouse effect

    NASA Astrophysics Data System (ADS)

    Libarkin, J. C.; Miller, H.; Thomas, S. R.

    2013-12-01

    A prior study utilized exploratory factor analysis to identify models underlying drawings of the greenhouse effect made by entering university freshmen. This analysis identified four archetype models of the greenhouse effect that appear within the college enrolling population. The current study collected drawings made by 144 geoscientists, from undergraduate geoscience majors through professionals. These participants scored highly on a standardized assessment of climate change understanding and expressed confidence in their understanding; many also indicated that they teach climate change in their courses. Although geoscientists held slightly more sophisticated greenhouse effect models than entering freshmen, very few held complete, explanatory models. As with freshmen, many scientists (44%) depict greenhouse gases in a layer in the atmosphere; 52% of participants depicted this or another layer as a physical barrier to escaping energy. In addition, 32% of participants indicated that incoming light from the Sun remains unchanged at Earth's surface, in alignment with a common model held by students. Finally, 3-20% of scientists depicted physical greenhouses, ozone, or holes in the atmosphere, all of which correspond to non-explanatory models commonly seen within students and represented in popular literature. For many scientists, incomplete models of the greenhouse effect are clearly enough to allow for reasoning about climate change. These data suggest that: 1) better representations about interdisciplinary concepts, such as the greenhouse effect, are needed for both scientist and public understanding; and 2) the scientific community needs to carefully consider how much understanding of a model is needed before necessary reasoning can occur.

  10. Impact of Domain Analysis on Reuse Methods

    DTIC Science & Technology

    1989-11-06

    return on the investment. The potential negative effects a "bad" domain analysis has on developing systems in the domain also increases the risks of a...importance of domain analysis as part of a software reuse program. A particular goal is to assist in avoiding the potential negative effects of ad hoc or...are specification objects discovered by performing object-oriented analysis. Object-based analysis approaches thus serve to capture a model of reality

  11. Cost-effectiveness of vaccination against cervical cancer: a multi-regional analysis assessing the impact of vaccine characteristics and alternative vaccination scenarios.

    PubMed

    Suárez, Eugenio; Smith, Jennifer S; Bosch, F Xavier; Nieminen, Pekka; Chen, Chien-Jen; Torvinen, Saku; Demarteau, Nadia; Standaert, Baudouin

    2008-09-15

    Mathematical models provide valuable insights into the public health and economic impact of cervical cancer vaccination programmes. An in-depth economic analysis should explore the effects of different vaccine-related factors and vaccination scenarios (independent of screening practices) on health benefits and costs. In this analysis, a Markov cohort model was used to explore the impact of vaccine characteristics (e.g. cross-type protection and waning of immunity) and different vaccination scenarios (e.g. age at vaccination and multiple cohort strategies) on the cost-effectiveness results of cervical cancer vaccination programmes. The analysis was applied across different regions in the world (Chile, Finland, Ireland, Poland and Taiwan) to describe the influence of location-specific conditions. The results indicate that in all the different settings cervical cancer vaccination becomes more cost-effective with broader and sustained vaccine protection, with vaccination at younger ages, and with the inclusion of several cohorts. When other factors were varied, the cost-effectiveness of vaccination was most negatively impacted by increasing the discount rate applied to costs and health effects.

  12. Improving Learning for All Students through Equity-Based Inclusive Reform Practices: Effectiveness of a Fully Integrated Schoolwide Model on Student Reading and Math Achievement

    ERIC Educational Resources Information Center

    Choi, Jeong Hoon; Meisenheimer, Jessica M.; McCart, Amy B.; Sailor, Wayne

    2017-01-01

    The present investigation examines the schoolwide applications model (SAM) as a potentially effective school reform model for increasing equity-based inclusive education practices while enhancing student reading and math achievement for all students. A 3-year quasi-experimental comparison group analysis using latent growth modeling (LGM) was used…

  13. Considerations in the use of models available for fuel treatment analysis

    Treesearch

    Charles W. McHugh

    2006-01-01

    Fire managers are required to evaluate and justify the effectiveness of planned fuel treatments in modifying fire growth, behavior and effects on resources and assets. With the number of models currently available, today’s fire manager can become overwhelmed when deciding which model to use. Each model has a required level of expertise in order to develop the necessary...

  14. The Research on the Factors of Purchase Intention for Fresh Agricultural Products in an E-Commerce Environment

    NASA Astrophysics Data System (ADS)

    Han, Dan; Mu, Jing

    2017-12-01

    Based on the characteristics of e-commerce of fresh agricultural products in China, and using the correlation analysis method, a relational model between product knowledge, perceived benefit, perceived risk and purchase intention is constructed. A logistic model is used to carry out the empirical analysis. The influencing factors and the mechanism of online purchase intention are explored. The results show that consumers’ product knowledge, perceived benefit and perceived risk all affect their purchase intention. Consumers’ product knowledge has a positive effect on perceived benefit, and perceived benefit has a positive effect on purchase intention. Consumers’ product knowledge has a negative effect on perceived risk, perceived benefit has a negative effect on perceived risk, and perceived risk has a negative effect on purchase intention. The empirical analysis yields some feasible suggestions for the government and e-commerce enterprises.
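A logistic model of purchase intention like the one described can be sketched with plain gradient descent. The data, effect sizes, and predictor names below are hypothetical, for illustration only; the study's actual survey variables and fitting procedure are not reproduced here.

```python
import math, random

def fit_logistic(X, y, lr=0.1, epochs=200):
    """Fit P(buy = 1) = sigmoid(w.x + b) by stochastic gradient descent.
    Each row of X holds (perceived_benefit, perceived_risk) scores."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))
            g = p - yi                      # gradient of the log-loss
            w = [wj - lr * g * xj for wj, xj in zip(w, xi)]
            b -= lr * g
    return w, b

# Hypothetical data: benefit raises purchase odds, risk lowers them.
random.seed(0)
X, y = [], []
for _ in range(300):
    benefit, risk = random.gauss(0, 1), random.gauss(0, 1)
    p = 1.0 / (1.0 + math.exp(-(1.5 * benefit - 1.5 * risk)))
    X.append((benefit, risk))
    y.append(1 if random.random() < p else 0)
w, b = fit_logistic(X, y)   # expect w[0] > 0 (benefit), w[1] < 0 (risk)
```

The signs of the fitted coefficients are what carry the abstract's conclusions (positive effect of perceived benefit, negative effect of perceived risk).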

  15. A mathematical analysis of the Janus combat simulation weather effects models and sensitivity analysis of sky-to-ground brightness ratio on target detection

    NASA Astrophysics Data System (ADS)

    Shorts, Vincient F.

    1994-09-01

    The Janus combat simulation offers the user a wide variety of weather effects options to employ during the execution of any simulation run, which can directly influence detection of opposing forces. Realistic weather effects are required if the simulation is to accurately reproduce 'real world' results. This thesis examines the mathematics of the Janus weather effects models. One weather effect option in Janus is the sky-to-ground brightness ratio (SGR), a measure of the sun angle in relation to the horizon, which affects an optical sensor's ability to detect targets. The derivation of SGR is reviewed, and SGR's effect on the number of optical detections and on detection ranges is analyzed using an unmanned aerial vehicle (UAV) search scenario. For comparison, the UAVs are equipped with a combination of optical and thermal sensors.

  16. A Project Team Analysis Using Tuckman's Model of Small-Group Development.

    PubMed

    Natvig, Deborah; Stark, Nancy L

    2016-12-01

    Concerns about equitable workloads for nursing faculty have been well documented, yet a standardized system for workload management does not exist. A project team was challenged to establish an academic workload management system when two dissimilar universities were consolidated. Tuckman's model of small-group development was used as the framework for the analysis of processes and effectiveness of a workload project team. Agendas, notes, and meeting minutes were used as the primary sources of information. Analysis revealed the challenges the team encountered. Utilization of a team charter was an effective tool in guiding the team to become a highly productive group. Lessons learned from the analysis are discussed. Guiding a diverse group into a highly productive team is complex. The use of Tuckman's model of small-group development provided a systematic mechanism to review and understand group processes and tasks. [J Nurs Educ. 2016;55(12):675-681.]. Copyright 2016, SLACK Incorporated.

  17. An Effective Model of the Retinoic Acid Induced HL-60 Differentiation Program.

    PubMed

    Tasseff, Ryan; Jensen, Holly A; Congleton, Johanna; Dai, David; Rogers, Katharine V; Sagar, Adithya; Bunaciu, Rodica P; Yen, Andrew; Varner, Jeffrey D

    2017-10-30

    In this study, we present an effective model of All-Trans Retinoic Acid (ATRA)-induced differentiation of HL-60 cells. The model describes reinforcing feedback between an ATRA-inducible signalsome complex involving many proteins, including Vav1, a guanine nucleotide exchange factor, and the activation of the mitogen activated protein kinase (MAPK) cascade. We decomposed the effective model into three modules: a signal initiation module that sensed and transformed an ATRA signal into program activation signals; a signal integration module that controlled the expression of upstream transcription factors; and a phenotype module which encoded the expression of functional differentiation markers from the ATRA-inducible transcription factors. We identified an ensemble of effective model parameters using measurements taken from ATRA-induced HL-60 cells. Using these parameters, model analysis predicted that MAPK activation was bistable as a function of ATRA exposure. Conformational experiments supported ATRA-induced bistability. Additionally, the model captured intermediate and phenotypic gene expression data. Knockout analysis suggested Gfi-1 and PPARg were critical to the ATRA-induced differentiation program. These findings, combined with other literature evidence, suggested that reinforcing feedback is central to hyperactive signaling in a diversity of cell fate programs.

  18. Incremental Net Effects in Multiple Regression

    ERIC Educational Resources Information Center

    Lipovetsky, Stan; Conklin, Michael

    2005-01-01

    A regular problem in regression analysis is estimating the comparative importance of the predictors in the model. This work considers the 'net effects', or shares of the predictors in the coefficient of the multiple determination, which is a widely used characteristic of the quality of a regression model. Estimation of the net effects can be a…

  19. Bearing-Load Modeling and Analysis Study for Mechanically Connected Structures

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.

    2006-01-01

    Bearing-load response for a pin-loaded hole is studied within the context of two-dimensional finite element analyses. Pin-loaded-hole configurations are representative of mechanically connected structures, such as a stiffener fastened to a rib of an isogrid panel, that are idealized as part of a larger structural component. Within this context, the larger structural component may be idealized as a two-dimensional shell finite element model to identify load paths and high stress regions. Finite element modeling and analysis aspects of a pin-loaded hole are considered in the present paper including the use of linear and nonlinear springs to simulate the pin-bearing contact condition. Simulating pin-connected structures within a two-dimensional finite element analysis model using nonlinear spring or gap elements provides an effective way for accurate prediction of the local effective stress state and peak forces.
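The nonlinear-spring idealization of the pin-bearing contact can be sketched as a gap element: zero force until the pin closes the initial clearance, then a linear bearing stiffness. The one-degree-of-freedom solve below uses bisection; the stiffnesses, loads, and clearance are arbitrary illustrative values, not taken from the paper.

```python
def gap_spring_force(delta, k, gap):
    """Piecewise-linear gap element: no force until the displacement
    closes the initial clearance, then linear bearing stiffness."""
    pen = delta - gap
    return k * pen if pen > 0 else 0.0

def solve_pin(load, k_support, k_bearing, gap, tol=1e-10):
    """1-DOF sketch: support spring in parallel with a gap element,
    solved for displacement by bisection on the force residual."""
    lo, hi = 0.0, load / k_support + gap + 1.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        r = k_support * mid + gap_spring_force(mid, k_bearing, gap) - load
        if r > 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

u_open = solve_pin(10.0, 2.0, 8.0, gap=100.0)  # gap never closes
u_closed = solve_pin(10.0, 2.0, 8.0, gap=0.0)  # bearing engaged at once
```

In a full shell model this nonlinearity would sit in the element library (gap or nonlinear spring elements), but the load-path logic is the same.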

  20. Detecting malicious chaotic signals in wireless sensor network

    NASA Astrophysics Data System (ADS)

    Upadhyay, Ranjit Kumar; Kumari, Sangeeta

    2018-02-01

    In this paper, an e-epidemic Susceptible-Infected-Vaccinated (SIV) model is proposed to analyze the effect of node immunization and worm attack dynamics in a wireless sensor network. A modified nonlinear incidence rate with a cyrtoid-type functional response is considered using a sleep and active mode approach. A detailed stability analysis and sufficient criteria for the persistence of the model system are established. We also establish different types of bifurcation results for the different equilibria at critical points of the control parameters. We perform a detailed Hopf bifurcation analysis and determine the direction and stability of the bifurcating periodic solutions using the center manifold theorem. Numerical simulations are carried out to confirm the theoretical results. The impact of the control parameters on the dynamics of the model system is investigated and malicious chaotic signals are detected. Finally, we analyze the effect of time delay on the dynamics of the model system.

  1. Rigorous control conditions diminish treatment effects in weight loss randomized controlled trials

    PubMed Central

    Dawson, John A.; Kaiser, Kathryn A.; Affuso, Olivia; Cutter, Gary R.; Allison, David B.

    2015-01-01

    Background: It has not been established whether control conditions with large weight losses (WLs) diminish expected treatment effects in WL or prevention of weight gain (PWG) randomized controlled trials (RCTs). Subjects/Methods: We performed a meta-analysis of 239 WL/PWG RCTs that include a control group and at least one treatment group. A maximum likelihood meta-analysis framework is used in order to model and understand the relationship between treatment effects and control group outcomes. Results: Under the informed model, an increase in control group WL of one kilogram corresponds with an expected shrinkage of the treatment effect by 0.309 kg [95% CI (−0.480, −0.138), p = 0.00081]; this result is robust against violations of the model assumptions. Conclusions: We find that control conditions with large weight losses diminish expected treatment effects. Our investigation may be helpful to clinicians as they design future WL/PWG studies. PMID:26449419
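The key quantity in such an "informed model" is the slope relating per-study treatment effects to control-group outcomes. A weighted least-squares slope with inverse-variance weights is a simplification of the paper's maximum likelihood framework, but it shows the computation; the data below are illustrative.

```python
def wls_slope(x, y, w):
    """Weighted least-squares slope of y on x, e.g. treatment effect
    (kg) regressed on control-group weight loss (kg), with weights
    typically set to inverse variances of the study effects."""
    sw = sum(w)
    xbar = sum(wi * xi for wi, xi in zip(w, x)) / sw
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sw
    num = sum(wi * (xi - xbar) * (yi - ybar) for wi, xi, yi in zip(w, x, y))
    den = sum(wi * (xi - xbar) ** 2 for wi, xi in zip(w, x))
    return num / den

# Illustrative studies: control-group WL (kg) vs observed treatment effect (kg)
control_wl = [0.0, 1.0, 2.0, 4.0]
effect = [2.1, 1.7, 1.5, 0.9]
weights = [1.0, 2.0, 1.5, 1.0]       # e.g. inverse variances
slope = wls_slope(control_wl, effect, weights)   # expected to be negative
```

A negative slope here is exactly the "shrinkage of the treatment effect" per kilogram of control-group weight loss reported in the abstract.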

  2. Advances and trends in the development of computational models for tires

    NASA Technical Reports Server (NTRS)

    Noor, A. K.; Tanner, J. A.

    1985-01-01

    Status and some recent developments of computational models for tires are summarized. Discussion focuses on a number of aspects of tire modeling and analysis including: tire materials and their characterization; evolution of tire models; characteristics of effective finite element models for analyzing tires; analysis needs for tires; and impact of the advances made in finite element technology, computational algorithms, and new computing systems on tire modeling and analysis. An initial set of benchmark problems has been proposed in concert with the U.S. tire industry. Extensive sets of experimental data will be collected for these problems and used for evaluating and validating different tire models. Also, the new Aircraft Landing Dynamics Facility (ALDF) at NASA Langley Research Center is described.

  3. International Space Station Configuration Analysis and Integration

    NASA Technical Reports Server (NTRS)

    Anchondo, Rebekah

    2016-01-01

    Ambitious engineering projects, such as NASA's International Space Station (ISS), require dependable modeling, analysis, visualization, and robotics to ensure that complex mission strategies are carried out cost effectively, sustainably, and safely. Learn how Booz Allen Hamilton's Modeling, Analysis, Visualization, and Robotics Integration Center (MAVRIC) team performs engineering analysis of the ISS Configuration based primarily on the use of 3D CAD models. To support mission planning and execution, the team tracks the configuration of ISS and maintains configuration requirements to ensure operational goals are met. The MAVRIC team performs multi-disciplinary integration and trade studies to ensure future configurations meet stakeholder needs.

  4. Dietary interventions to prevent and manage diabetes in worksite settings: a meta-analysis.

    PubMed

    Shrestha, Archana; Karmacharya, Biraj Man; Khudyakov, Polyna; Weber, Mary Beth; Spiegelman, Donna

    2018-01-25

    The translation of lifestyle interventions to improve glucose tolerance into the workplace has been rare. The objective of this meta-analysis is to summarize the evidence for the effectiveness of dietary interventions in worksite settings at lowering blood sugar levels. We searched for studies in PubMed, Embase, Econlit, Ovid, Cochrane, Web of Science, and the Cumulative Index to Nursing and Allied Health Literature. Search terms were as follows: (1) Exposure-based: nutrition/diet/dietary intervention/health promotion/primary prevention/health behavior/health education/food/program evaluation; (2) Outcome-based: diabetes/hyperglycemia/glucose/HbA1c/glycated hemoglobin; and (3) Setting-based: workplace/worksite/occupational/industry/job/employee. We manually searched review articles and reference lists of articles identified from 1969 to December 2016. We tested for between-study heterogeneity and calculated the pooled effect sizes for changes in HbA1c (%) and fasting glucose (mg/dl) using random-effects models for meta-analysis in 2016. A total of 17 articles out of 1663 initially selected articles were included in the meta-analysis. With a random-effects model, worksite dietary interventions led to a pooled -0.18% (95% CI, -0.29 to -0.06; P<0.001) difference in HbA1c. With the random-effects model, the interventions resulted in 2.60 mg/dl lower fasting glucose, with borderline significance (95% CI: -5.27 to 0.08, P=0.06). In the multivariate meta-regression model, interventions with a high percentage of female participants, and interventions delivered directly to individuals rather than through environmental changes, were more effective. Workplace dietary interventions can improve HbA1c. The effects were larger for interventions with a greater proportion of female participants and for individual-level interventions.

  5. A Cost-Effectiveness/Benefit Analysis Model for Postsecondary Vocational Programs. Technical Report.

    ERIC Educational Resources Information Center

    Kim, Jin Eun

    A cost-effectiveness/benefit analysis is defined as a technique for measuring the outputs of existing and new programs in relation to their specified program objectives, against the costs of those programs. In terms of its specific use, the technique is conceptualized as a systems analysis method, an evaluation method, and a planning tool for…

  6. Analysis of radiative and phase-change phenomena with application to space-based thermal energy storage

    NASA Technical Reports Server (NTRS)

    Lund, Kurt O.

    1991-01-01

    The simplified geometry for the analysis is an infinite, axisymmetric annulus with a specified solar flux at the outer radius. The inner radius is either adiabatic (modeling Flight Experiment conditions) or convective (modeling Solar Dynamic conditions). Liquid LiF either contacts the outer wall (modeling ground-based testing) or faces a void gap at the outer wall (modeling possible space-based conditions). The analysis is presented in three parts: Part 3 considers an adiabatic inner wall and linearized radiation equations; Part 2 adds effects of convection at the inner wall; and Part 1 includes the effect of the void gap, as well as the previous effects, and develops the radiation model further. The main results are the differences in melting behavior which can occur between ground-based 1-g experiments and the microgravity flight experiments. Under 1 gravity, melted PCM will always contact the outer wall having the heat flux source, thus providing conductance from this source to the phase-change front. In space-based tests, where a void gap may form during solidification, the situation is reversed; radiation is now the only mode of heat transfer, and the majority of melting takes place from the inner wall.

  7. Effects of Tropospheric Spatio-Temporal Correlated Noise on the Analysis of Space Geodetic Data

    NASA Technical Reports Server (NTRS)

    Romero-Wolf, A. F.; Jacobs, C. S.

    2011-01-01

    The standard VLBI analysis models measurement noise as purely thermal errors modeled according to uncorrelated Gaussian distributions. As the price of recording bits steadily decreases, thermal errors will soon no longer dominate. It is therefore expected that troposphere and instrumentation/clock errors will become increasingly dominant. Given that both of these errors have correlated spectra, properly modeling the error distributions will become more relevant for optimal analysis. This paper will discuss the advantages of including the correlations between tropospheric delays using a Kolmogorov spectrum and the frozen flow model pioneered by Treuhaft and Lanyi. We will show examples of applying these correlated noise spectra to the weighting of VLBI data analysis.
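Weighting data by a full covariance matrix rather than per-point variances is the core of the correlated-noise approach described above. A minimal generalized least-squares sketch for estimating a constant signal, with a toy equicorrelated covariance standing in for a Kolmogorov-spectrum troposphere model:

```python
import numpy as np

def gls_mean(y, cov):
    """Generalized least-squares estimate of a constant signal under
    correlated Gaussian noise, plus its standard error.  The usual
    uncorrelated weighting is the special case cov = diag(sigma_i^2)."""
    ci = np.linalg.inv(cov)
    one = np.ones(len(y))
    var = 1.0 / (one @ ci @ one)
    return var * (one @ ci @ y), np.sqrt(var)

y = np.array([1.0, 2.0, 3.0])
est_iid, se_iid = gls_mean(y, np.eye(3))           # uncorrelated errors
corr = np.full((3, 3), 0.5) + 0.5 * np.eye(3)      # equicorrelated, rho = 0.5
est_cor, se_cor = gls_mean(y, corr)
```

Note that positive correlations inflate the standard error relative to the naive uncorrelated analysis, which is why ignoring them yields overconfident VLBI error bars.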

  8. Comparative dynamic analysis of the full Grossman model.

    PubMed

    Ried, W

    1998-08-01

    The paper applies the method of comparative dynamic analysis to the full Grossman model. For a particular class of solutions, it derives the equations implicitly defining the complete trajectories of the endogenous variables. Relying on the concept of Frisch decision functions, the impact of any parametric change on an endogenous variable can be decomposed into a direct and an indirect effect. The focus of the paper is on marginal changes in the rate of health capital depreciation. It also analyses the impact of either initial financial wealth or the initial stock of health capital. While the direction of most effects remains ambiguous in the full model, the assumption of a zero consumption benefit of health is sufficient to obtain a definite sign for any direct or indirect effect.

  9. Structure Damage Simulations Accounting for Inertial Effects and Impact and Optimization of Grid-Stiffened Non-Circular Shells

    NASA Technical Reports Server (NTRS)

    Mei, Chuh; Jaunky, Navin

    1999-01-01

    The goal of this research project is to develop a modelling and analysis strategy for the penetration of aluminium plates impacted by titanium impactors. Finite element analysis is used to study this penetration in order to assess the effect of such uncontained engine debris impacts on aircraft-like skin panels. LS-DYNA3D is used in the simulations to model the impactor, test fixture frame and target barrier plate. The effects of mesh refinement, contact modeling, and impactor initial velocity and orientation were studied. The research project also includes development of a design tool for optimum design of grid-stiffened non-circular shells or panels subjected to buckling.

  10. Exact Analysis of Squared Cross-Validity Coefficient in Predictive Regression Models

    ERIC Educational Resources Information Center

    Shieh, Gwowen

    2009-01-01

    In regression analysis, the notion of population validity is of theoretical interest for describing the usefulness of the underlying regression model, whereas the presumably more important concept of population cross-validity represents the predictive effectiveness for the regression equation in future research. It appears that the inference…

  11. USE OF WEIBULL FUNCTION FOR NON-LINEAR ANALYSIS OF EFFECTS OF LOW LEVELS OF SIMULATED HERBICIDE DRIFT ON PLANTS

    EPA Science Inventory

    We compared two regression models, which are based on the Weibull and probit functions, for the analysis of pesticide toxicity data from laboratory studies on Illinois crop and native plant species. Both mathematical models are continuous, differentiable, strictly positive, and...
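
    Both candidate models reduce to a smooth, strictly positive dose-response curve. As a rough sketch of the Weibull form (with illustrative scale and shape parameters, not values fitted to the EPA data):

```python
import math

def weibull_response(dose, b=2.0, c=1.5):
    """Weibull-type dose-response curve: fraction of control plant growth
    remaining at a given herbicide dose. The scale (b) and shape (c)
    parameters are illustrative, not fitted to the study's data."""
    return math.exp(-((dose / b) ** c))

doses = [0.0, 0.5, 1.0, 2.0, 4.0]
responses = [weibull_response(d) for d in doses]
# The curve starts at 1.0 (no herbicide) and declines smoothly with dose.
```

    The function is continuous, differentiable, and strictly positive for all doses, which is what makes it attractive for non-linear analysis of low-level drift effects.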

  12. A General Approach to Causal Mediation Analysis

    ERIC Educational Resources Information Center

    Imai, Kosuke; Keele, Luke; Tingley, Dustin

    2010-01-01

    Traditionally in the social sciences, causal mediation analysis has been formulated, understood, and implemented within the framework of linear structural equation models. We argue and demonstrate that this is problematic for 3 reasons: the lack of a general definition of causal mediation effects independent of a particular statistical model, the…

  13. Subsatellite Orbital Analysis Program (SOAP) user's guide

    NASA Astrophysics Data System (ADS)

    Castle, K. G.; Voss, J. M.; Gibson, J. S.

    1981-07-01

    The features and use of the Subsatellite Orbital Analysis Program (SOAP) are examined. The model simulates several Earth-orbiting vehicles, their pilots, control systems, and interaction with the environment. The use of the program, input and output capabilities, executive structures, and properties of the vehicles and environmental effects which it models are described.

  14. Subsatellite Orbital Analysis Program (SOAP) user's guide

    NASA Technical Reports Server (NTRS)

    Castle, K. G.; Voss, J. M.; Gibson, J. S.

    1981-01-01

    The features and use of the Subsatellite Orbital Analysis Program (SOAP) are examined. The model simulates several Earth-orbiting vehicles, their pilots, control systems, and interaction with the environment. The use of the program, input and output capabilities, executive structures, and properties of the vehicles and environmental effects which it models are described.

  15. Scaling effects in the impact response of graphite-epoxy composite beams

    NASA Technical Reports Server (NTRS)

    Jackson, Karen E.; Fasanella, Edwin L.

    1989-01-01

    In support of crashworthiness studies on composite airframes and substructure, an experimental and analytical study was conducted to characterize size effects in the large deflection response of scale model graphite-epoxy beams subjected to impact. Scale model beams of 1/2, 2/3, 3/4, 5/6, and full scale were constructed of four different laminate stacking sequences including unidirectional, angle ply, cross ply, and quasi-isotropic. The beam specimens were subjected to eccentric axial impact loads which were scaled to provide homologous beam responses. Comparisons of the load and strain time histories between the scale model beams and the prototype should verify the scaling law and demonstrate the use of scale model testing for determining the impact behavior of composite structures. The nonlinear structural analysis finite element program DYCAST (DYnamic Crash Analysis of STructures) was used to model the beam response. DYCAST analysis predictions of beam strain response are compared to experimental data and the results are presented.

  16. Effect of anaerobic digestion on sequential pyrolysis kinetics of organic solid wastes using thermogravimetric analysis and distributed activation energy model.

    PubMed

    Li, Xiaowei; Mei, Qingqing; Dai, Xiaohu; Ding, Guoji

    2017-03-01

    Thermogravimetric analysis, a Gaussian-fit-peak model (GFPM), and a distributed activation energy model (DAEM) were used for the first time to explore the effect of anaerobic digestion on the sequential pyrolysis kinetics of four organic solid wastes (OSW). Results showed that the OSW weight loss mainly occurred in the second pyrolysis stage, relating to organic matter decomposition. Compared with the raw substrate, the weight loss of the corresponding digestate was lower in the range of 180-550°C but higher in 550-900°C. GFPM analysis revealed that organic components volatilized at peak temperatures of 188-263, 373-401 and 420-462°C degraded faster during anaerobic digestion than those at 274-327°C. DAEM analysis showed that anaerobic digestion had differing effects on the activation energy for pyrolysis of the four OSW, possibly because of their different organic compositions. Further investigation of the specific organic fractions, i.e., protein-like and carbohydrate-like groups, is required to confirm this assumption.
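
    The DAEM and related kinetic analyses ultimately rest on Arrhenius kinetics. A minimal sketch of the underlying idea, recovering an activation energy from rate constants by a linear fit of ln k against 1/T (all values synthetic, not taken from the study):

```python
import math

R = 8.314  # J/(mol*K), universal gas constant

def arrhenius_k(T, A, Ea):
    """Rate constant from the Arrhenius equation k = A*exp(-Ea/(R*T))."""
    return A * math.exp(-Ea / (R * T))

# Synthetic data: rate constants generated from a known activation energy
# (Ea_true and A_true are illustrative, not values from the paper).
Ea_true, A_true = 150e3, 1e10          # 150 kJ/mol
temps = [600.0, 700.0, 800.0, 900.0]   # K
ks = [arrhenius_k(T, A_true, Ea_true) for T in temps]

# Linearise: ln k = ln A - (Ea/R)*(1/T), then fit the slope by least squares.
xs = [1.0 / T for T in temps]
ys = [math.log(k) for k in ks]
n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
        sum((x - xbar) ** 2 for x in xs)
Ea_est = -slope * R  # recovered activation energy, J/mol
```

    A DAEM generalises this by assuming a distribution of activation energies across many parallel first-order reactions rather than a single Ea.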

  17. Modeled interactive effects of precipitation, temperature, and [CO2] on ecosystem carbon and water dynamics in different climatic zones

    Treesearch

    Yiqi Luo; Dieter Gerten; Guerric Le Maire; William J. Parton; Ensheng Weng; Xuhui Zhou; Cindy Keough; Claus Beier; Philippe Ciais; Wolfgang Cramer; Jeffrey S. Dukes; Bridget Emmett; Paul J. Hanson; Alan Knapp; Sune Linder; Dan Nepstad; Lindsey. Rustad

    2008-01-01

    Interactive effects of multiple global change factors on ecosystem processes are complex. It is relatively expensive to explore those interactions in manipulative experiments. We conducted a modeling analysis to identify potentially important interactions and to stimulate hypothesis formulation for experimental research. Four models were used to quantify interactive...

  18. Methods for network meta-analysis of continuous outcomes using individual patient data: a case study in acupuncture for chronic pain.

    PubMed

    Saramago, Pedro; Woods, Beth; Weatherly, Helen; Manca, Andrea; Sculpher, Mark; Khan, Kamran; Vickers, Andrew J; MacPherson, Hugh

    2016-10-06

    Network meta-analysis methods, which are an extension of the standard pair-wise synthesis framework, allow for the simultaneous comparison of multiple interventions and consideration of the entire body of evidence in a single statistical model. There are well-established advantages to using individual patient data to perform network meta-analysis and methods for network meta-analysis of individual patient data have already been developed for dichotomous and time-to-event data. This paper describes appropriate methods for the network meta-analysis of individual patient data on continuous outcomes. This paper introduces and describes network meta-analysis of individual patient data models for continuous outcomes using the analysis of covariance framework. Comparisons are made between this approach and change score and final score only approaches, which are frequently used and have been proposed in the methodological literature. A motivating example on the effectiveness of acupuncture for chronic pain is used to demonstrate the methods. Individual patient data on 28 randomised controlled trials were synthesised. Consistency of endpoints across the evidence base was obtained through standardisation and mapping exercises. Individual patient data availability avoided the use of non-baseline-adjusted models, allowing instead for analysis of covariance models to be applied and thus improving the precision of treatment effect estimates while adjusting for baseline imbalance. The network meta-analysis of individual patient data using the analysis of covariance approach is advocated to be the most appropriate modelling approach for network meta-analysis of continuous outcomes, particularly in the presence of baseline imbalance. Further methods developments are required to address the challenge of analysing aggregate level data in the presence of baseline imbalance.
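
    The advantage of baseline adjustment can be seen in a deterministic toy example (invented numbers, not the acupuncture IPD): with baseline imbalance between arms, a final-score-only contrast and a change-score contrast are both biased, while the analysis-of-covariance estimate recovers the true effect:

```python
def mean(v):
    return sum(v) / len(v)

def within_slope(base, final):
    """Within-group regression slope of final score on baseline."""
    bb, fb = mean(base), mean(final)
    num = sum((b - bb) * (f - fb) for b, f in zip(base, final))
    den = sum((b - bb) ** 2 for b in base)
    return num / den

# Deterministic toy data: outcome = 0.5*baseline, plus a true treatment
# effect of 3 in the treated arm; baselines are imbalanced on purpose.
control_base  = [10.0, 12.0, 14.0, 16.0]
treated_base  = [14.0, 16.0, 18.0, 20.0]
control_final = [0.5 * b for b in control_base]
treated_final = [0.5 * b + 3.0 for b in treated_base]

final_only   = mean(treated_final) - mean(control_final)           # biased up
change_score = (mean(treated_final) - mean(treated_base)) - \
               (mean(control_final) - mean(control_base))          # biased down
b = 0.5 * (within_slope(control_base, control_final) +
           within_slope(treated_base, treated_final))              # pooled slope
ancova = final_only - b * (mean(treated_base) - mean(control_base))
```

    Here `final_only` gives 5 and `change_score` gives 1, while `ancova` recovers the true effect of 3, mirroring the paper's argument for baseline-adjusted models when individual patient data are available.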

  19. The Role of Prostatitis in Prostate Cancer: Meta-Analysis

    PubMed Central

    Yunxia, Zhang; Zhu, Hong; Liu, Junjiang; Pumill, Chris

    2013-01-01

    Objective: To use systematic review methods to quantify the association between prostatitis and prostate cancer, under both fixed and random effects models. Evidence Acquisition: Case control studies of prostate cancer with information on prostatitis history. All studies published between 1990 and 2012 were collected to calculate a pooled odds ratio. Selection criteria were as follows: human case control studies; published from May 1990 to July 2012; containing the numbers of prostatitis and prostate cancer cases. Evidence Synthesis: In total, 20 case control studies were included. A significant association between prostatitis and prostate cancer was found under both the fixed effect model (pooled OR=1.50, 95% CI: 1.39-1.62) and the random effects model (pooled OR=1.64, 95% CI: 1.36-1.98). Personal-interview-based case control studies showed a higher level of association (fixed effect model: pooled OR=1.59, 95% CI: 1.47-1.73; random effects model: pooled OR=1.87, 95% CI: 1.52-2.29) than clinical-based studies (fixed effect model: pooled OR=1.05, 95% CI: 0.86-1.28; random effects model: pooled OR=0.98, 95% CI: 0.67-1.45). Additionally, pooled ORs were calculated for each decade. In the fixed effect model: 1990s: OR=1.58, 95% CI: 1.35-1.84; 2000s: OR=1.59, 95% CI: 1.40-1.79; 2010s: OR=1.37, 95% CI: 1.22-1.56. In the random effects model: 1990s: OR=1.98, 95% CI: 1.08-3.62; 2000s: OR=1.64, 95% CI: 1.23-2.19; 2010s: OR=1.34, 95% CI: 1.03-1.73. Finally, a meta-analysis stratified by country was conducted. In fixed effect models: US: pooled OR=1.45, 95% CI: 1.34-1.57; China: pooled OR=4.67, 95% CI: 3.08-7.07; Cuba: pooled OR=1.43, 95% CI: 1.00-2.04; Italy: pooled OR=0.61, 95% CI: 0.13-2.90. In random effects models: US: pooled OR=1.50, 95% CI: 1.25-1.80; China: pooled OR=4.67, 95% CI: 3.08-7.07; Cuba: pooled OR=1.43, 95% CI: 1.00-2.04; Italy: pooled OR=0.61, 95% CI: 0.13-2.90. Conclusions: The present meta-analysis provides statistical evidence that the association between prostatitis and prostate cancer is significant. PMID:24391995
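
    Pooled odds ratios of the kind reported above can be computed by inverse-variance weighting on the log-OR scale, with the DerSimonian-Laird estimate of between-study variance for the random effects model. A sketch with hypothetical study results (not the studies in this meta-analysis):

```python
import math

# Hypothetical study-level results as (OR, CI lower, CI upper).
studies = [(1.4, 1.1, 1.8), (1.7, 1.2, 2.4), (1.2, 0.9, 1.6), (2.0, 1.3, 3.1)]

# Work on the log-OR scale; the SE is recovered from the 95% CI width.
y = [math.log(or_) for or_, lo, hi in studies]
se = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for or_, lo, hi in studies]
w = [1.0 / s ** 2 for s in se]

# Fixed effect: inverse-variance weighted mean of the log ORs.
fixed = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)

# Random effects: DerSimonian-Laird estimate of between-study variance tau^2,
# then re-weight each study by 1/(se^2 + tau^2).
Q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, y))
df = len(studies) - 1
C = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (Q - df) / C)
w_re = [1.0 / (s ** 2 + tau2) for s in se]
random_ = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)

pooled_or_fixed, pooled_or_random = math.exp(fixed), math.exp(random_)
```

    The random effects weights are more equal across studies, which is why random effects pooled ORs (as in the abstract) often sit further from the large-study estimates than the fixed effect ones.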

  20. Modelling the delay between pharmacokinetics and EEG effects of morphine in rats: binding kinetic versus effect compartment models.

    PubMed

    de Witte, Wilhelmus E A; Rottschäfer, Vivi; Danhof, Meindert; van der Graaf, Piet H; Peletier, Lambertus A; de Lange, Elizabeth C M

    2018-05-18

    Drug-target binding kinetics (as determined by association and dissociation rate constants, k_on and k_off) can be an important determinant of the kinetics of drug action. However, the effect compartment model is used most frequently instead of a target binding model to describe hysteresis. Here we investigate when the drug-target binding model should be used in lieu of the effect compartment model. The utility of the effect compartment (EC), the target binding kinetics (TB) and the combined effect compartment-target binding kinetics (EC-TB) model were tested on either plasma (EC_PL, TB_PL and EC-TB_PL) or brain extracellular fluid (ECF) (EC_ECF, TB_ECF and EC-TB_ECF) morphine concentrations and EEG amplitude in rats. It was also analyzed when a significant shift in the time to maximal target occupancy (Tmax_TO) with increasing dose, the discriminating feature between the TB and EC model, occurs in the TB model. All TB models assumed a linear relationship between target occupancy and drug effect on the EEG amplitude. All three model types performed similarly in describing the morphine pharmacodynamics data, although the EC model provided the best statistical result. The analysis of the shift in Tmax_TO (∆Tmax_TO) as a result of increasing dose revealed that ∆Tmax_TO decreases towards zero if k_off is much smaller than the elimination rate constant or if the target concentration is larger than the initial morphine concentration. The results for the morphine PKPD modelling and the analysis of ∆Tmax_TO indicate that the EC and TB models do not necessarily lead to different drug effect versus time curves for different doses if a delay between drug concentrations and drug effect (hysteresis) is described. Drawing mechanistic conclusions from successfully fitting one of these two models should therefore be avoided. Since the TB model can be informed by in vitro measurements of k_on and k_off, a target binding model should be considered more often for mechanistic modelling purposes.

  1. A path analysis model for explaining unsafe behavior in workplaces: the effect of perceived work pressure.

    PubMed

    Ghasemi, Fakhradin; Kalatpour, Omid; Moghimbeigi, Abbas; Mohhamadfam, Iraj

    2018-06-01

    Unsafe behavior is closely related to occupational accidents. Work pressure is one of the main factors affecting employees' behavior. The aim of the present study was to provide a path analysis model explaining how work pressure affects safety behavior. Using a self-administered questionnaire, six variables supposed to affect employees' safety behavior were measured. The path analysis model was constructed based on several hypotheses. The goodness of fit of the model was assessed using both absolute and comparative fit indices. Work pressure was determined not to influence safety behavior directly. However, it negatively influenced other variables. Group attitude and personal attitude toward safety were the main factors mediating the effect of work pressure on safety behavior. Among the variables investigated in the present study, group attitude, personal attitude and work pressure had the strongest effects on safety behavior. Managers should consider that in order to improve employees' safety behavior, work pressure should be reduced to a reasonable level, and concurrently a supportive environment, which ensures a positive group attitude toward safety, should be provided. Replication of the study is recommended.
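
    In a path analysis of this kind, an indirect effect is the product of the coefficients along a path and the total effect is the sum over all paths. A minimal sketch with hypothetical standardized coefficients (not the fitted values from this study):

```python
# Hypothetical standardized path coefficients (invented for illustration):
# work pressure -> attitudes -> safety behavior.
a1 = -0.40   # work pressure -> group attitude toward safety
b1 =  0.50   # group attitude -> safety behavior
a2 = -0.30   # work pressure -> personal attitude toward safety
b2 =  0.45   # personal attitude -> safety behavior
direct = 0.0 # the study found no direct path from pressure to behavior

# Indirect effect = product of coefficients along each mediated path;
# total effect = direct effect + sum of all indirect effects.
indirect = a1 * b1 + a2 * b2
total = direct + indirect
```

    With these illustrative values the total effect of work pressure on safety behavior is negative and entirely mediated, which is the pattern the abstract describes.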

  2. Evidence synthesis for medical decision making and the appropriate use of quality scores.

    PubMed

    Doi, Suhail A R

    2014-09-01

    Meta-analyses today continue to be run using conventional random-effects models that ignore tangible information from studies such as the quality of the studies involved, despite the expectation that results of better quality studies reflect more valid results. Previous research has suggested that quality scores derived from such quality appraisals are unlikely to be useful in meta-analysis, because they would produce biased estimates of effects that are unlikely to be offset by a variance reduction within the studied models. However, previous discussions took place in the context of such scores viewed in terms of their ability to maximize their association with both the magnitude and direction of bias. In this review, another look is taken at this concept, this time asserting that probabilistic bias quantification is not possible or even required of quality scores when used in meta-analysis for redistribution of weights. The use of such a model is contrasted with the conventional random effects model of meta-analysis to demonstrate why the latter is inadequate in the face of a properly specified quality score weighting method.

  3. Effects of Voice Coding and Speech Rate on a Synthetic Speech Display in a Telephone Information System

    DTIC Science & Technology

    1988-05-01

    [Garbled OCR fragment] Figure 2. Original limited-capacity channel model (From Broadbent, 1958). ... an unlimited variety of human voices for digital recording sources. Synthesis by analysis: analysis-synthesis methods electronically model the human voice.

  4. Robust Linear Models for Cis-eQTL Analysis.

    PubMed

    Rantalainen, Mattias; Lindgren, Cecilia M; Holmes, Christopher C

    2015-01-01

    Expression Quantitative Trait Loci (eQTL) analysis enables characterisation of functional genetic variation influencing expression levels of individual genes. In outbred populations, including humans, eQTLs are commonly analysed using the conventional linear model, adjusting for relevant covariates, assuming an allelic dosage model and a Gaussian error term. However, gene expression data generally have noise that induces heavy-tailed errors relative to the Gaussian distribution and often include atypical observations, or outliers. Such departures from modelling assumptions can lead to an increased rate of type II errors (false negatives), and to some extent also type I errors (false positives). Careful model checking can reduce the risk of type I errors but often not type II errors, since it is generally too time-consuming to carefully check all models with a non-significant effect in large-scale and genome-wide studies. Here we propose the application of a robust linear model for eQTL analysis to reduce adverse effects of deviations from the assumption of Gaussian residuals. We present results from a simulation study as well as results from the analysis of real eQTL data sets. Our findings suggest that in many situations robust models have the potential to provide more reliable eQTL results compared to conventional linear models, particularly in respect to reducing type II errors due to non-Gaussian noise. Post-genomic data, such as that generated in genome-wide eQTL studies, are often noisy and frequently contain atypical observations. Robust statistical models have the potential to provide more reliable results and increased statistical power under non-Gaussian conditions. The results presented here suggest that robust models should be considered routinely alongside other commonly used methodologies for eQTL analysis.
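
    A common way to implement such a robust model is iteratively reweighted least squares with Huber weights, which caps the influence of outlying observations. A self-contained sketch for a single predictor (a simple stand-in, not the authors' implementation):

```python
def fit_line(x, y, w=None):
    """Weighted least-squares fit of y = a + b*x; returns (a, b)."""
    if w is None:
        w = [1.0] * len(x)
    sw = sum(w)
    xb = sum(wi * xi for wi, xi in zip(w, x)) / sw
    yb = sum(wi * yi for wi, yi in zip(w, y)) / sw
    b = sum(wi * (xi - xb) * (yi - yb) for wi, xi, yi in zip(w, x, y)) / \
        sum(wi * (xi - xb) ** 2 for wi, xi in zip(w, x))
    return yb - b * xb, b

def huber_fit(x, y, c=1.345, iters=20):
    """Robust line fit via iteratively reweighted least squares with
    Huber weights: residuals beyond c*scale are progressively downweighted."""
    a, b = fit_line(x, y)
    for _ in range(iters):
        r = [yi - (a + b * xi) for xi, yi in zip(x, y)]
        # Robust scale estimate: median absolute deviation, rescaled.
        s = sorted(abs(ri) for ri in r)[len(r) // 2] / 0.6745 or 1e-9
        w = [1.0 if abs(ri) <= c * s else c * s / abs(ri) for ri in r]
        a, b = fit_line(x, y, w)
    return a, b

# Simulated dosage/expression relation y = 1 + 2x, with one gross outlier
# mimicking an atypical expression value.
x = list(range(10))
y = [1.0 + 2.0 * xi for xi in x]
y[9] += 50.0
a_ols, b_ols = fit_line(x, y)    # ordinary least squares, pulled by the outlier
a_rob, b_rob = huber_fit(x, y)   # robust fit, close to the true slope of 2
```

    In an eQTL setting the same idea applies per gene-SNP pair: the robust fit limits how much a single atypical expression measurement can distort the estimated allelic dosage effect.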

  5. A Unified Analysis of Structured Sonar-terrain Data using Bayesian Functional Mixed Models.

    PubMed

    Zhu, Hongxiao; Caspers, Philip; Morris, Jeffrey S; Wu, Xiaowei; Müller, Rolf

    2018-01-01

    Sonar emits pulses of sound and uses the reflected echoes to gain information about target objects. It offers a low cost, complementary sensing modality for small robotic platforms. While existing analytical approaches often assume independence across echoes, real sonar data can have more complicated structures due to device setup or experimental design. In this paper, we consider sonar echo data collected from multiple terrain substrates with a dual-channel sonar head. Our goals are to identify the differential sonar responses to terrains and study the effectiveness of this dual-channel design in discriminating targets. We describe a unified analytical framework that achieves these goals rigorously, simultaneously, and automatically. The analysis was done by treating the echo envelope signals as functional responses and the terrain/channel information as covariates in a functional regression setting. We adopt functional mixed models that facilitate the estimation of terrain and channel effects while capturing the complex hierarchical structure in data. This unified analytical framework incorporates both Gaussian models and robust models. We fit the models using a full Bayesian approach, which enables us to perform multiple inferential tasks under the same modeling framework, including selecting models, estimating the effects of interest, identifying significant local regions, discriminating terrain types, and describing the discriminatory power of local regions. Our analysis of the sonar-terrain data identifies time regions that reflect differential sonar responses to terrains. The discriminant analysis suggests that a multi- or dual-channel design achieves target identification performance comparable with or better than a single-channel design.

  6. A Unified Analysis of Structured Sonar-terrain Data using Bayesian Functional Mixed Models

    PubMed Central

    Zhu, Hongxiao; Caspers, Philip; Morris, Jeffrey S.; Wu, Xiaowei; Müller, Rolf

    2017-01-01

    Sonar emits pulses of sound and uses the reflected echoes to gain information about target objects. It offers a low cost, complementary sensing modality for small robotic platforms. While existing analytical approaches often assume independence across echoes, real sonar data can have more complicated structures due to device setup or experimental design. In this paper, we consider sonar echo data collected from multiple terrain substrates with a dual-channel sonar head. Our goals are to identify the differential sonar responses to terrains and study the effectiveness of this dual-channel design in discriminating targets. We describe a unified analytical framework that achieves these goals rigorously, simultaneously, and automatically. The analysis was done by treating the echo envelope signals as functional responses and the terrain/channel information as covariates in a functional regression setting. We adopt functional mixed models that facilitate the estimation of terrain and channel effects while capturing the complex hierarchical structure in data. This unified analytical framework incorporates both Gaussian models and robust models. We fit the models using a full Bayesian approach, which enables us to perform multiple inferential tasks under the same modeling framework, including selecting models, estimating the effects of interest, identifying significant local regions, discriminating terrain types, and describing the discriminatory power of local regions. Our analysis of the sonar-terrain data identifies time regions that reflect differential sonar responses to terrains. The discriminant analysis suggests that a multi- or dual-channel design achieves target identification performance comparable with or better than a single-channel design. PMID:29749977

  7. A comparison between standard methods and structural nested modelling when bias from a healthy worker survivor effect is suspected: an iron-ore mining cohort study.

    PubMed

    Björ, Ove; Damber, Lena; Jonsson, Håkan; Nilsson, Tohr

    2015-07-01

    Iron-ore miners are exposed to extremely dusty and physically arduous work environments. The demanding activities of mining select healthier workers with longer work histories (ie, the Healthy Worker Survivor Effect (HWSE)), and could have a reversing effect on the exposure-response association. The objective of this study was to evaluate an iron-ore mining cohort to determine whether the effect of respirable dust was confounded by the presence of an HWSE. When an HWSE exists, standard modelling methods, such as Cox regression analysis, produce biased results. We compared results from g-estimation of accelerated failure-time modelling adjusted for HWSE with corresponding unadjusted Cox regression modelling results. For all-cause mortality when adjusting for the HWSE, cumulative exposure from respirable dust was associated with a 6% decrease of life expectancy if exposed ≥15 years, compared with never being exposed. Respirable dust continued to be associated with mortality after censoring outcomes known to be associated with dust when adjusting for the HWSE. In contrast, results based on Cox regression analysis did not support that an association was present. The adjustment for the HWSE made a difference when estimating the risk of mortality from respirable dust. The results of this study, therefore, support the recommendation that standard methods of analysis should be complemented with structural modelling analysis techniques, such as g-estimation of accelerated failure-time modelling, to adjust for the HWSE. Published by the BMJ Publishing Group Limited.

  8. Addition of docetaxel and/or zoledronic acid to standard of care for hormone-naive prostate cancer: a cost-effectiveness analysis.

    PubMed

    Zhang, Pengfei; Wen, Feng; Fu, Ping; Yang, Yu; Li, Qiu

    2017-07-31

    The effectiveness of the addition of docetaxel and/or zoledronic acid to the standard of care (SOC) for hormone-naive prostate cancer has been evaluated in the STAMPEDE trial. The object of the present analysis was to evaluate the cost-effectiveness of these treatment options in the treatment of advanced hormone-naive prostate cancer in China. A cost-effectiveness analysis using a Markov model was carried out from the Chinese societal perspective. The efficacy data were obtained from the STAMPEDE trial and health utilities were derived from previous studies. Transition probabilities were calculated based on the survival in each group. The primary endpoint in the analysis was the incremental cost-effectiveness ratio (ICER), and model uncertainties were explored by 1-way sensitivity analysis and probabilistic sensitivity analysis. SOC alone generated an effectiveness of 2.65 quality-adjusted life years (QALYs) at a lifetime cost of $20,969.23. At a cost of $25,001.34, SOC plus zoledronic acid was associated with 2.69 QALYs, resulting in an ICER of $100,802.75/QALY compared with SOC alone. SOC plus docetaxel gained an effectiveness of 2.85 QALYs at a cost of $28,764.66, while the effectiveness and cost data in the SOC plus zoledronic acid/docetaxel group were 2.78 QALYs and $32,640.95. Based on the results of the analysis, SOC plus zoledronic acid, SOC plus docetaxel, and SOC plus zoledronic acid/docetaxel are unlikely to be cost-effective options in patients with advanced hormone-naive prostate cancer compared with SOC alone.
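
    The reported ICER for SOC plus zoledronic acid can be reproduced directly from the cost and QALY figures in the abstract:

```python
# Incremental cost-effectiveness ratio (ICER) of SOC plus zoledronic acid
# versus SOC alone, using the costs and QALYs reported in the abstract.
cost_soc, qaly_soc = 20969.23, 2.65
cost_zol, qaly_zol = 25001.34, 2.69

icer = (cost_zol - cost_soc) / (qaly_zol - qaly_soc)
# Matches the reported $100,802.75 per QALY gained.
```

    An ICER this far above typical Chinese willingness-to-pay thresholds is what drives the conclusion that the zoledronic acid strategies are unlikely to be cost-effective.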

  9. AADL Fault Modeling and Analysis Within an ARP4761 Safety Assessment

    DTIC Science & Technology

    2014-10-01

    [Table-of-contents fragment] ... Analysis Generator; Mapping to OpenFTA Format File; Mapping to Generic XML Format; AADL and FTA Mapping Rules; Issues... Keywords: Preliminary System Safety Assessment (PSSA), System Safety Assessment (SSA), Common Cause Analysis (CCA), Fault Tree Analysis (FTA), Failure Modes and Effects Analysis (FMEA), Failure Modes and Effects Summary, Markov Analysis (MA), and Dependence Diagrams (DDs), also referred to as Reliability Block Diagrams (RBDs).

  10. Value of Information Analysis Applied to the Economic Evaluation of Interventions Aimed at Reducing Juvenile Delinquency: An Illustration.

    PubMed

    Eeren, Hester V; Schawo, Saskia J; Scholte, Ron H J; Busschbach, Jan J V; Hakkaart, Leona

    2015-01-01

    To investigate whether a value of information analysis, commonly applied in health care evaluations, is feasible and meaningful in the field of crime prevention. Interventions aimed at reducing juvenile delinquency are increasingly being evaluated according to their cost-effectiveness. Results of cost-effectiveness models are subject to uncertainty in their cost and effect estimates. Further research can reduce that parameter uncertainty. The value of such further research can be estimated using a value of information analysis, as illustrated in the current study. We built upon an earlier published cost-effectiveness model that demonstrated the comparison of two interventions aimed at reducing juvenile delinquency. Outcomes were presented as costs per criminal activity free year. At a societal willingness-to-pay of €71,700 per criminal activity free year, further research to eliminate parameter uncertainty was valued at €176 million. Therefore, in this illustrative analysis, the value of information analysis determined that society should be willing to spend a maximum of €176 million in reducing decision uncertainty in the cost-effectiveness of the two interventions. Moreover, the results suggest that reducing uncertainty in some specific model parameters might be more valuable than in others. Using a value of information framework to assess the value of conducting further research in the field of crime prevention proved to be feasible. The results were meaningful and can be interpreted according to health care evaluation studies. This analysis can be helpful in justifying additional research funds to further inform the reimbursement decision in regard to interventions for juvenile delinquents.
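
    The value of perfect information is the gap between the expected net benefit of deciding with perfect information and that of deciding with current information. A Monte Carlo sketch with invented distributions (not the parameters of the juvenile-delinquency model):

```python
import random

random.seed(42)

# All distributional parameters below are invented for illustration.
wtp = 71700.0   # willingness to pay per criminal-activity-free year
n = 20000

nb_samples = []
for _ in range(n):
    # Uncertain effects (criminal-activity-free years) and costs per youth.
    eff_a, cost_a = random.gauss(1.00, 0.15), random.gauss(40000, 5000)
    eff_b, cost_b = random.gauss(1.10, 0.20), random.gauss(52000, 8000)
    nb_a = wtp * eff_a - cost_a   # net monetary benefit of intervention A
    nb_b = wtp * eff_b - cost_b
    nb_samples.append((nb_a, nb_b))

# Current information: choose the option with the best expected net benefit.
mean_a = sum(a for a, b in nb_samples) / n
mean_b = sum(b for a, b in nb_samples) / n
nb_current = max(mean_a, mean_b)

# Perfect information: choose the best option in every simulated world.
nb_perfect = sum(max(a, b) for a, b in nb_samples) / n

evpi_per_person = nb_perfect - nb_current  # never negative by construction
```

    Multiplying the per-person EVPI by the size of the affected population gives the population EVPI, i.e. the ceiling on what further research is worth, which is how the €176 million figure in the abstract should be read.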

  11. Rigorous derivation of the effective model describing a non-isothermal fluid flow in a vertical pipe filled with porous medium

    NASA Astrophysics Data System (ADS)

    Beneš, Michal; Pažanin, Igor

    2018-03-01

    This paper reports an analytical investigation of non-isothermal fluid flow in a thin (or long) vertical pipe filled with porous medium via asymptotic analysis. We assume that the fluid inside the pipe is cooled (or heated) by the surrounding medium and that the flow is governed by the prescribed pressure drop between the pipe's ends. Starting from the dimensionless Darcy-Brinkman-Boussinesq system, we formally derive a macroscopic model describing the effective flow at small Brinkman-Darcy number. The asymptotic approximation is given by the explicit formulae for the velocity, pressure and temperature clearly acknowledging the effects of the cooling (heating) and porous structure. The theoretical error analysis is carried out to indicate the order of accuracy and to provide a rigorous justification of the effective model.

  12. The Importance of Model Structure in the Cost-Effectiveness Analysis of Primary Care Interventions for the Management of Hypertension.

    PubMed

    Peñaloza-Ramos, Maria Cristina; Jowett, Sue; Sutton, Andrew John; McManus, Richard J; Barton, Pelham

    2018-03-01

    Management of hypertension can lead to significant reductions in blood pressure, thereby reducing the risk of cardiovascular disease. Modeling the course of cardiovascular disease is not without complications, and uncertainty surrounding the structure of a model will almost always arise once a choice of a model structure is defined. The objective was to provide a practical illustration of the impact on cost-effectiveness results of changing or adapting model structures in a previously published cost-utility analysis of a primary care intervention for the management of hypertension: Targets and Self-Management for the Control of Blood Pressure in Stroke and at Risk Groups (TASMIN-SR). The case study assessed the structural uncertainty arising from model structure and from the exclusion of secondary events. Four alternative model structures were implemented. Long-term cost-effectiveness was estimated and the results compared with those from the TASMIN-SR model. The main cost-effectiveness results obtained in the TASMIN-SR study did not change with the implementation of alternative model structures. Choice of model type was limited to a cohort Markov model, and because of the lack of epidemiological data, only model 4 captured structural uncertainty arising from the exclusion of secondary events in the case study model. The results of this study indicate that the main conclusions drawn from the TASMIN-SR model of cost-effectiveness were robust to changes in model structure and the inclusion of secondary events. Even though one of the models produced results that were different to those of TASMIN-SR, the fact that the main conclusions were identical suggests that a more parsimonious model may have sufficed.

  13. Verification of relationship model between Korean new elderly class's recovery resilience and productive aging.

    PubMed

    Cho, Gun-Sang; Kim, Dae-Sung; Yi, Eun-Surk

    2015-12-01

    The purpose of this study was to verify a relationship model between the Korean new elderly class's recovery resilience and productive aging. As of 2013, this study sampled preliminary elderly people in Gyeonggi-do and other provinces nationwide. Data from a total of 484 effective subjects were analyzed. The collected data were processed using IBM SPSS 20.0 and AMOS 20.0, and underwent descriptive statistical analysis, confirmatory factor analysis, and structural model verification. The path coefficient associated with model fitness was examined. The standardized path coefficient between recovery resilience and productive aging was β=0.975 (t=14.790), revealing a statistically significant positive effect. Thus, the proposed basic model of a direct path from recovery resilience to productive aging was found to fit the data.

  14. Verification of relationship model between Korean new elderly class’s recovery resilience and productive aging

    PubMed Central

    Cho, Gun-Sang; Kim, Dae-Sung; Yi, Eun-Surk

    2015-01-01

The purpose of this study was to verify a relationship model between the Korean new elderly class’s recovery resilience and productive aging. As of 2013, this study sampled preliminary elderly people in Gyeonggi-do and other provinces nationwide. Data from a total of 484 effective subjects were analyzed. The collected data were processed using IBM SPSS 20.0 and AMOS 20.0, and underwent descriptive statistical analysis, confirmatory factor analysis, and structural model verification. The path coefficient associated with model fit was examined. The standardized path coefficient between recovery resilience and productive aging was β=0.975 (t=14.790), revealing a statistically significant positive effect. Thus, the proposed basic model on the direct path from recovery resilience to productive aging was found to fit the data. PMID:26730383

  15. Effects of vicarious punishment: a meta-analysis.

    PubMed

    Malouff, John; Thorsteinsson, Einar; Schutte, Nicola; Rooke, Sally Erin

    2009-07-01

Vicarious punishment involves observing a model exhibit a behavior that leads to punishment for the model. If observers then exhibit the behavior at a lower rate than individuals in a control group, vicarious punishment has occurred. The authors report the results of a meta-analysis of studies that tested for vicarious-punishment effects. Across 21 research samples and 876 participants, the viewing of a model experiencing punishment for a behavior led to a significantly lower level of the behavior by the observers, d = 0.58. Vicarious punishment occurred consistently with (a) live and filmed models, (b) severe and nonsevere punishment for the model, (c) positive punishment alone or positive plus negative punishment, (d) various types of behavior, (e) adults and children, and (f) male and female participants. The findings have implications for the use of models in reducing undesirable behavior.
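The pooled d reported above comes from combining per-study effect sizes. A minimal sketch of fixed-effect inverse-variance pooling of standardized mean differences is shown below; the per-study d values and variances are invented, not the meta-analysis's data.

```python
import numpy as np

# Illustrative fixed-effect pooling of standardized mean differences (d).
# Study-level effect sizes and variances are made up.
d = np.array([0.40, 0.65, 0.72, 0.55])        # per-study effect sizes
v = np.array([0.050, 0.080, 0.120, 0.060])    # per-study sampling variances

w = 1.0 / v                                    # inverse-variance weights
d_pooled = np.sum(w * d) / np.sum(w)           # weighted mean effect
se_pooled = np.sqrt(1.0 / np.sum(w))           # standard error of the pool
ci = (d_pooled - 1.96 * se_pooled, d_pooled + 1.96 * se_pooled)
```

A random-effects version would add a between-study variance estimate (e.g., DerSimonian-Laird) to each v before weighting.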

  16. Using the MWC model to describe heterotropic interactions in hemoglobin

    PubMed Central

    Rapp, Olga

    2017-01-01

Hemoglobin is a classical model allosteric protein. Research on hemoglobin parallels the development of key cooperativity and allostery concepts, such as the ‘all-or-none’ Hill formalism, the stepwise Adair binding formulation and the concerted Monod-Wyman-Changeux (MWC) allosteric model. While it is clear that the MWC model adequately describes the cooperative binding of oxygen to hemoglobin, rationalizing the effects of H+, CO2 or organophosphate ligands on hemoglobin-oxygen saturation using the same model remains controversial. According to the MWC model, allosteric ligands exert their effect on protein function by modulating the quaternary conformational transition of the protein. However, data fitting analysis of hemoglobin oxygen saturation curves in the presence or absence of inhibitory ligands persistently revealed effects on both elementary MWC parameters, the relative oxygen affinity (c) and the allosteric constant (L). The recent realization that data fitting analysis using the traditional MWC model equation may not provide reliable estimates for L and c thus calls for a re-examination of previous data using alternative fitting strategies. In the current manuscript, we present two simple strategies for obtaining reliable estimates for MWC mechanistic parameters of hemoglobin steady-state saturation curves in cases of both evolutionary and physiological variations. Our results suggest that the simple MWC model provides a reasonable description that can also account for heterotropic interactions in hemoglobin. The results, moreover, offer a general roadmap for successful data fitting analysis using the MWC model. PMID:28793329
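The two-state MWC saturation function referred to above has a standard closed form for an n-site protein. The sketch below evaluates it for a tetramer; the chosen L and c values are illustrative, not fitted hemoglobin parameters.

```python
import numpy as np

# Two-state MWC fractional saturation for an n-site oligomer:
#   Y(a) = [a(1+a)^(n-1) + L c a (1+c a)^(n-1)] / [(1+a)^n + L (1+c a)^n]
# a = normalized ligand activity, L = T/R allosteric constant,
# c = ratio of T- to R-state association constants. Values are illustrative.
def mwc_saturation(a, L=1e5, c=0.01, n=4):
    num = a * (1 + a) ** (n - 1) + L * c * a * (1 + c * a) ** (n - 1)
    den = (1 + a) ** n + L * (1 + c * a) ** n
    return num / den

a = np.logspace(-2, 3, 200)    # ligand activity sweep
y = mwc_saturation(a)          # sigmoidal (cooperative) saturation curve
```

Heterotropic effectors would enter this picture, in the MWC view, by rescaling L; fitting strategies differ in how L and c are constrained during regression.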

  17. Waveform model for an eccentric binary black hole based on the effective-one-body-numerical-relativity formalism

    NASA Astrophysics Data System (ADS)

    Cao, Zhoujian; Han, Wen-Biao

    2017-08-01

Binary black hole systems are among the most important sources for gravitational wave detection. They are also good objects for theoretical research in general relativity. A gravitational waveform template is important to data analysis. An effective-one-body-numerical-relativity (EOBNR) model has played an essential role in the LIGO data analysis. For future space-based gravitational wave detection, many binary systems will retain some orbital eccentricity. At the same time, the eccentric binary is also an interesting topic for theoretical study in general relativity. In this paper, we construct the first eccentric binary waveform model based on an effective-one-body-numerical-relativity framework. Our basic assumption in the model construction is that the involved eccentricity is small. We have compared our eccentric EOBNR model to the circular one used in the LIGO data analysis. We have also tested our eccentric EOBNR model against another recently proposed eccentric binary waveform model; against numerical relativity simulation results; and against perturbation approximation results for extreme mass ratio binary systems. Compared to numerical relativity simulations with an eccentricity as large as about 0.2, the overlap factor for our eccentric EOBNR model is better than 0.98 for all tested cases, including spinless and spinning binaries, and equal-mass and unequal-mass binaries. Hopefully, our eccentric model can be the starting point to develop a faithful template for future space-based gravitational wave detectors.
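The overlap factor quoted above is a normalized inner product between two waveforms, maximized over relative time (and phase) shifts. The sketch below computes a white-noise version of this match via the FFT for toy chirp signals; a real analysis would weight the inner product by the detector noise spectrum.

```python
import numpy as np

# White-noise waveform overlap, maximized over circular time shifts.
# The signals here are toy chirps, not EOBNR waveforms.
def overlap(h1, h2):
    h1 = h1 / np.sqrt(np.vdot(h1, h1).real)          # unit-normalize
    h2 = h2 / np.sqrt(np.vdot(h2, h2).real)
    # Circular cross-correlation over all time lags via the FFT:
    corr = np.fft.ifft(np.fft.fft(h1) * np.conj(np.fft.fft(h2)))
    return np.max(np.abs(corr))

t = np.linspace(0.0, 1.0, 4096)
h = np.sin(2 * np.pi * (20 * t + 30 * t**2))          # toy chirp signal
h_shifted = np.roll(h, 100)                           # same signal, shifted
```

Two copies of the same waveform that differ only by a time shift give an overlap of 1; model-versus-simulation comparisons like those in the abstract report how far below 1 the best match falls.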

  18. Photovoltaic System Modeling. Uncertainty and Sensitivity Analyses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hansen, Clifford W.; Martin, Curtis E.

    2015-08-01

We report an uncertainty and sensitivity analysis for modeling AC energy from photovoltaic systems. Output from a PV system is predicted by a sequence of models. We quantify uncertainty in the output of each model using empirical distributions of each model's residuals. We propagate uncertainty through the sequence of models by sampling these distributions to obtain an empirical distribution of a PV system's output. We consider models that: (1) translate measured global horizontal, direct and global diffuse irradiance to plane-of-array irradiance; (2) estimate effective irradiance; (3) predict cell temperature; (4) estimate DC voltage, current and power; (5) reduce DC power for losses due to inefficient maximum power point tracking or mismatch among modules; and (6) convert DC to AC power. Our analysis considers a notional PV system comprising an array of FirstSolar FS-387 modules and a 250 kW AC inverter; we use measured irradiance and weather at Albuquerque, NM. We found the uncertainty in PV system output to be relatively small, on the order of 1% for daily energy. We found that uncertainty in the models for POA irradiance and effective irradiance were the dominant contributors to uncertainty in predicted daily energy. Our analysis indicates that efforts to reduce the uncertainty in PV system output predictions may yield the greatest improvements by focusing on the POA and effective irradiance models.
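The propagation scheme described above can be sketched as a Monte Carlo chain: apply each nominal model in sequence and perturb each step with residuals sampled from its empirical error distribution. The model chain, coefficients, and residual spreads below are invented placeholders standing in for the report's fitted models.

```python
import numpy as np

# Monte Carlo propagation of per-model residual uncertainty through a
# simplified PV model chain. All coefficients and spreads are invented.
rng = np.random.default_rng(0)
n = 10_000
ghi = 800.0                                          # measured GHI, W/m^2

poa = ghi * 1.1 + rng.normal(0.0, 20.0, n)           # plane-of-array model
eff = poa * 0.97 + rng.normal(0.0, 10.0, n)          # effective irradiance
pdc = eff * 0.18 * 100 + rng.normal(0.0, 150.0, n)   # DC power model, W
pac = pdc * 0.96                                     # inverter model

lo, hi = np.percentile(pac, [2.5, 97.5])             # empirical 95% interval
rel_spread = (hi - lo) / np.median(pac)
```

In the report's full analysis the normal draws are replaced by resampling each model's observed residuals, so no distributional form needs to be assumed.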

  19. KSC VAB Aeroacoustic Hazard Assessment

    NASA Technical Reports Server (NTRS)

    Oliveira, Justin M.; Yedo, Sabrina; Campbell, Michael D.; Atkinson, Joseph P.

    2010-01-01

    NASA Kennedy Space Center (KSC) carried out an analysis of the effects of aeroacoustics produced by stationary solid rocket motors in processing areas at KSC. In the current paper, attention is directed toward the acoustic effects of a motor burning within the Vehicle Assembly Building (VAB). The analysis was carried out with support from ASRC Aerospace who modeled transmission effects into surrounding facilities. Calculations were done using semi-analytical models for both aeroacoustics and transmission. From the results it was concluded that acoustic hazards in proximity to the source of ignition and plume can be severe; acoustic hazards in the far-field are significantly lower.

  20. Validation of methods to control for immortal time bias in a pharmacoepidemiologic analysis of renin-angiotensin system inhibitors in type 2 diabetes.

    PubMed

    Yang, Xilin; Kong, Alice Ps; Luk, Andrea Oy; Ozaki, Risa; Ko, Gary Tc; Ma, Ronald Cw; Chan, Juliana Cn; So, Wing Yee

    2014-01-01

Pharmacoepidemiologic analysis can confirm whether drug efficacy in a randomized controlled trial (RCT) translates to effectiveness in real settings. We examined methods used to control for immortal time bias in an analysis of renin-angiotensin system (RAS) inhibitors as the reference cardioprotective drug. We analyzed data from 3928 patients with type 2 diabetes who were recruited into the Hong Kong Diabetes Registry between 1996 and 2005 and followed up to July 30, 2005. Different Cox models were used to obtain hazard ratios (HRs) for cardiovascular disease (CVD) associated with RAS inhibitors. These HRs were then compared to the HR of 0.92 reported in a recent meta-analysis of RCTs. During a median follow-up period of 5.45 years, 7.23% (n = 284) of patients developed CVD and 38.7% (n = 1519) were started on RAS inhibitors, with immortal time accounting for 39.1% of the users' follow-up. In multivariable analysis, time-dependent drug-exposure Cox models and Cox models that moved immortal time from users to nonusers both severely inflated the HR, and time-fixed models that included immortal time deflated the HR. Use of time-fixed Cox models that excluded immortal time resulted in an HR of only 0.89 (95% CI, 0.68-1.17) for CVD associated with RAS inhibitors, which is closer to the values reported in RCTs. In pharmacoepidemiologic analysis, time-dependent drug exposure models and models that move immortal time from users to nonusers may introduce substantial bias in investigations of the effects of RAS inhibitors on CVD in type 2 diabetes.
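The core of the immortal-time problem is how person-time before the first prescription is classified. The toy sketch below shows the accounting step that precedes any Cox fit: follow-up before drug start is attributed to unexposed time rather than to the user. The patient records are invented for illustration.

```python
# Toy person-time accounting to avoid immortal time bias.
# Each record: (total follow-up in years, drug start time or None, event flag).
patients = [
    (6.0, 2.0, True),    # user: 2 yr pre-treatment time, 4 yr exposed
    (5.0, None, False),  # never-user
    (3.5, 1.0, False),   # user
    (4.0, None, True),   # never-user
]

exposed_time = unexposed_time = 0.0
for follow_up, drug_start, event in patients:
    if drug_start is None:
        unexposed_time += follow_up
    else:
        # Time before the first prescription is "immortal" for the user:
        # counting it as exposed would artificially favor the drug.
        unexposed_time += drug_start
        exposed_time += follow_up - drug_start
```

A time-dependent Cox model encodes the same split by letting the exposure covariate switch from 0 to 1 at `drug_start`; the abstract's point is that misallocating this pre-treatment time, in either direction, distorts the HR.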

  1. Application of ISO22000 and Failure Mode and Effect Analysis (fmea) for Industrial Processing of Poultry Products

    NASA Astrophysics Data System (ADS)

    Varzakas, Theodoros H.; Arvanitoyannis, Ioannis S.

A Failure Mode and Effect Analysis (FMEA) model has been applied for the risk assessment of poultry slaughtering and manufacturing. In this work, a comparison of ISO 22000 analysis with HACCP is carried out for poultry slaughtering, processing and packaging. Critical Control Points and prerequisite programs (PrPs) have been identified and implemented in the cause-and-effect diagram (also known as an Ishikawa, tree or fishbone diagram).
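FMEA worksheets typically rank failure modes by a Risk Priority Number, RPN = severity × occurrence × detection, each scored on a 1-10 scale. The hazards and scores below are invented examples for a poultry line, not values from this study.

```python
# Illustrative FMEA Risk Priority Number (RPN) ranking.
# Scores are (severity, occurrence, detection), each 1-10; all invented.
hazards = {
    "Salmonella survival after scalding": (9, 4, 3),
    "Metal fragment from cutting blade": (8, 2, 2),
    "Temperature abuse during chilling": (7, 5, 4),
}

rpn = {name: s * o * d for name, (s, o, d) in hazards.items()}
ranked = sorted(rpn.items(), key=lambda kv: kv[1], reverse=True)
```

The highest-RPN failure modes are the natural candidates for Critical Control Points or strengthened prerequisite programs.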

  2. The hydrodynamic basis of the vacuum cleaner effect in continuous-flow PCNL instruments: an empiric approach and mathematical model.

    PubMed

    Mager, R; Balzereit, C; Gust, K; Hüsch, T; Herrmann, T; Nagele, U; Haferkamp, A; Schilling, D

    2016-05-01

Passive removal of stone fragments in the irrigation stream is one of the characteristics of continuous-flow PCNL instruments. So far, the physical principle of this so-called vacuum cleaner effect has not been fully understood. The aim of the study was to empirically prove the existence of the vacuum cleaner effect, to develop a physical hypothesis and to generate a mathematical model for this phenomenon. In an empiric approach, common low-pressure PCNL instruments and conventional PCNL sheaths were tested using an in vitro model. Flow characteristics were visualized by coloring of irrigation fluid. Influence of irrigation pressure, sheath diameter, sheath design, nephroscope design and position of the nephroscope was assessed. Experiments were digitally recorded for further slow-motion analysis to deduce a physical model. In each tested nephroscope design, we could observe the vacuum cleaner effect. Increasing the irrigation pressure and reducing the sheath cross section strengthened the effect. Slow-motion analysis of colored flow revealed a synergism of two effects causing suction and transportation of the stone. For the first time, our model showed a flow reversal in the sheath as an integral part of the origin of the stone transportation during the vacuum cleaner effect. The application of Bernoulli's equation provided the explanation of these effects and confirmed our experimental results. We broaden the understanding of PCNL with a conclusive physical model, which explains the fluid mechanics of the vacuum cleaner effect.
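Bernoulli's equation links the suction to the geometry: where the flow cross section narrows, velocity rises and static pressure falls by Δp = ½ρ(v₂² − v₁²). The back-of-envelope numbers below (flow rate, cross sections) are invented, not measurements from this study.

```python
# Back-of-envelope Bernoulli estimate of the pressure drop behind the
# "vacuum cleaner" effect. All geometry and flow values are invented.
rho = 1000.0                       # irrigation fluid density, kg/m^3
q = 0.5e-6                         # volumetric flow rate, m^3/s (0.5 mL/s)
a_wide, a_narrow = 10e-6, 3e-6     # wide and narrowed cross sections, m^2

v_wide = q / a_wide                # continuity: v = Q / A
v_narrow = q / a_narrow
dp = 0.5 * rho * (v_narrow**2 - v_wide**2)   # static pressure drop, Pa
```

Even a modest pressure difference of this kind, acting across a fragment sitting near the narrowed section, produces a net force toward the sheath, consistent with the suction observed in the slow-motion recordings.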

  3. Cost-effectiveness analysis of a patient-centered care model for management of psoriasis.

    PubMed

    Parsi, Kory; Chambers, Cindy J; Armstrong, April W

    2012-04-01

    Cost-effectiveness analyses help policymakers make informed decisions regarding funding allocation of health care resources. Cost-effectiveness analysis of technology-enabled models of health care delivery is necessary to assess sustainability of novel online, patient-centered health care models. We sought to compare cost-effectiveness of conventional in-office care with a patient-centered, online model for follow-up treatment of patients with psoriasis. Cost-effectiveness analysis was performed from a societal perspective on a randomized controlled trial comparing a patient-centered online model with in-office visits for treatment of patients with psoriasis during a 24-week period. Quality-adjusted life expectancy was calculated using the life table method. Costs were generated from the original study parameters and national averages for salaries and services. No significant difference existed in the mean change in Dermatology Life Quality Index scores between the two groups (online: 3.51 ± 4.48 and in-office: 3.88 ± 6.65, P value = .79). Mean improvement in quality-adjusted life expectancy was not significantly different between the groups (P value = .93), with a gain of 0.447 ± 0.48 quality-adjusted life years for the online group and a gain of 0.463 ± 0.815 quality-adjusted life years for the in-office group. The cost of follow-up psoriasis care with online visits was 1.7 times less than the cost of in-person visits ($315 vs $576). Variations in travel time existed among patients depending on their distance from the dermatologist's office. From a societal perspective, the patient-centered online care model appears to be cost saving, while maintaining similar effectiveness to standard in-office care. Copyright © 2011 American Academy of Dermatology, Inc. Published by Mosby, Inc. All rights reserved.
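The comparison reported above reduces to a few arithmetic quantities: per-arm costs and QALY gains, their ratio, and the incremental cost per incremental QALY. The sketch below uses the means quoted in the abstract; treating them as point estimates (ignoring the non-significant QALY difference) is a simplification for illustration.

```python
# Point-estimate cost-effectiveness arithmetic using the abstract's means.
cost_online, cost_office = 315.0, 576.0      # mean cost per arm, USD
qaly_online, qaly_office = 0.447, 0.463      # mean QALY gain per arm

cost_ratio = cost_office / cost_online       # how much dearer in-office care is
dc = cost_office - cost_online               # incremental cost
dq = qaly_office - qaly_online               # incremental QALYs (not significant)
icer = dc / dq                               # incremental cost per extra QALY
```

Because the QALY difference was not statistically significant, the study's conclusion rests on the cost comparison: the online model is cost saving at similar effectiveness, rather than on the ICER point estimate.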

  4. Developing a model for effective leadership in healthcare: a concept mapping approach

    PubMed Central

    Hargett, Charles William; Doty, Joseph P; Hauck, Jennifer N; Webb, Allison MB; Cook, Steven H; Tsipis, Nicholas E; Neumann, Julie A; Andolsek, Kathryn M; Taylor, Dean C

    2017-01-01

    Purpose Despite increasing awareness of the importance of leadership in healthcare, our understanding of the competencies of effective leadership remains limited. We used a concept mapping approach (a blend of qualitative and quantitative analysis of group processes to produce a visual composite of the group’s ideas) to identify stakeholders’ mental model of effective healthcare leadership, clarifying the underlying structure and importance of leadership competencies. Methods Literature review, focus groups, and consensus meetings were used to derive a representative set of healthcare leadership competency statements. Study participants subsequently sorted and rank-ordered these statements based on their perceived importance in contributing to effective healthcare leadership in real-world settings. Hierarchical cluster analysis of individual sortings was used to develop a coherent model of effective leadership in healthcare. Results A diverse group of 92 faculty and trainees individually rank-sorted 33 leadership competency statements. The highest rated statements were “Acting with Personal Integrity”, “Communicating Effectively”, “Acting with Professional Ethical Values”, “Pursuing Excellence”, “Building and Maintaining Relationships”, and “Thinking Critically”. Combining the results from hierarchical cluster analysis with our qualitative data led to a healthcare leadership model based on the core principle of Patient Centeredness and the core competencies of Integrity, Teamwork, Critical Thinking, Emotional Intelligence, and Selfless Service. Conclusion Using a mixed qualitative-quantitative approach, we developed a graphical representation of a shared leadership model derived in the healthcare setting. This model may enhance learning, teaching, and patient care in this important area, as well as guide future research. PMID:29355249

  5. Nonlinear Poisson equation for heterogeneous media.

    PubMed

    Hu, Langhua; Wei, Guo-Wei

    2012-08-22

The Poisson equation is a widely accepted model for electrostatic analysis. However, the Poisson equation is derived based on electric polarizations in a linear, isotropic, and homogeneous dielectric medium. This article introduces a nonlinear Poisson equation to take into consideration hyperpolarization effects due to intensive charges and possible nonlinear, anisotropic, and heterogeneous media. A variational principle is utilized to derive the nonlinear Poisson model from an electrostatic energy functional. To apply the proposed nonlinear Poisson equation for the solvation analysis, we also construct a nonpolar solvation energy functional based on the nonlinear Poisson equation by using the geometric measure theory. At a fixed temperature, the proposed nonlinear Poisson theory is extensively validated by the electrostatic analysis of the Kirkwood model and a set of 20 proteins, and the solvation analysis of a set of 17 small molecules whose experimental measurements are also available for a comparison. Moreover, the nonlinear Poisson equation is further applied to the solvation analysis of 21 compounds at different temperatures. Numerical results are compared to theoretical prediction, experimental measurements, and those obtained from other theoretical methods in the literature. A good agreement between our results and experimental data as well as theoretical results suggests that the proposed nonlinear Poisson model is a potentially useful model for electrostatic analysis involving hyperpolarization effects. Copyright © 2012 Biophysical Society. Published by Elsevier Inc. All rights reserved.
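A nonlinear Poisson-type equation can be solved numerically by iterating a linear solve against the nonlinear source term. The minimal 1D sketch below uses u'' = sinh(u), a generic nonlinear-media toy (reminiscent of Poisson-Boltzmann theory), with finite differences and a damped Picard/Jacobi iteration; it is not the paper's energy functional.

```python
import numpy as np

# 1D nonlinear Poisson toy problem: u'' = sinh(u) on [0, 1],
# boundary conditions u(0) = 1, u(1) = 0. Solved by finite differences
# with a damped Picard/Jacobi fixed-point iteration.
n = 41
x = np.linspace(0.0, 1.0, n)
h = x[1] - x[0]
u = np.linspace(1.0, 0.0, n)               # initial guess obeying the BCs

for _ in range(20000):
    u_new = u.copy()
    # Discrete equation: (u[i+1] + u[i-1] - 2 u[i]) / h^2 = sinh(u[i])
    u_new[1:-1] = 0.5 * (u[2:] + u[:-2] - h * h * np.sinh(u[1:-1]))
    u = 0.5 * u + 0.5 * u_new              # damping for stability

# Discrete residual of the nonlinear equation at the interior nodes:
residual = u[2:] + u[:-2] - 2 * u[1:-1] - h * h * np.sinh(u[1:-1])
```

For production electrostatics one would use Newton's method and a 3D mesh, but the structure (linear operator plus nonlinear source, iterated to a self-consistent field) is the same.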

  6. Equivalent plate modeling for conceptual design of aircraft wing structures

    NASA Technical Reports Server (NTRS)

    Giles, Gary L.

    1995-01-01

This paper describes an analysis method that generates conceptual-level design data for aircraft wing structures. A key requirement is that this data must be produced in a timely manner so that it can be used effectively by multidisciplinary synthesis codes for performing systems studies. Such a capability is being developed by enhancing an equivalent plate structural analysis computer code to provide a more comprehensive, robust and user-friendly analysis tool. The paper focuses on recent enhancements to the Equivalent Laminated Plate Solution (ELAPS) analysis code that significantly expand the modeling capability and improve the accuracy of results. Modeling additions include use of out-of-plane plate segments for representing winglets and advanced wing concepts such as C-wings along with a new capability for modeling the internal rib and spar structure. The accuracy of calculated results is improved by including transverse shear effects in the formulation and by using multiple sets of assumed displacement functions in the analysis. Typical results are presented to demonstrate these new features. Example configurations include a C-wing transport aircraft, a representative fighter wing and a blended-wing-body transport. These applications are intended to demonstrate and quantify the benefits of using equivalent plate modeling of wing structures during conceptual design.

  7. Stochastic Control of Multi-Scale Networks: Modeling, Analysis and Algorithms

    DTIC Science & Technology

    2014-10-20

Reported publications under this effort (fragmentary in this record) include: B. T. Swapna, Atilla Eryilmaz and Ness B. Shroff, "Throughput-Delay Analysis of Random Linear Network Coding for Wireless …" (2012); and John S. Baras and Shanshan Zheng, "Sequential Anomaly Detection in Wireless Sensor Networks and Effects of Long-Range Dependent Data," Sequential Analysis (2012), doi: 10.1080/07474946.2012.719435.

  8. Pilot modeling and closed-loop analysis of flexible aircraft in the pitch tracking task

    NASA Technical Reports Server (NTRS)

    Schmidt, D. K.

    1983-01-01

The issue addressed is the appropriate modeling technique for pilot/vehicle analysis of large flexible aircraft, when the frequency separation between the rigid-body mode and the dynamic aeroelastic modes is reduced. This situation was shown to have significant effects on pitch-tracking performance and subjective rating of the task, obtained via fixed-base simulation. Further, the dynamics in these cases are not well modeled with a rigid-body-like model obtained by including only 'static elastic' effects, for example. It is shown that pilot/vehicle analysis of this data supports the hypothesis that an appropriate pilot-model structure is an optimal-control pilot model of full order. This is in contrast to the contention that a representative model is of reduced order when the subject is controlling high-order dynamics as in a flexible vehicle. The key appears to be in the correct assessment of the pilot's objective of attempting to control 'rigid-body' vehicle response, a response that must be estimated by the pilot from observations contaminated by aeroelastic dynamics. Finally, a model-based metric is shown to correlate well with the pilot's subjective ratings.

  9. Moment method analysis of linearly tapered slot antennas

    NASA Technical Reports Server (NTRS)

    Koeksal, Adnan

    1993-01-01

    A method of moments (MOM) model for the analysis of the Linearly Tapered Slot Antenna (LTSA) is developed and implemented. The model employs an unequal size rectangular sectioning for conducting parts of the antenna. Piecewise sinusoidal basis functions are used for the expansion of conductor current. The effect of the dielectric is incorporated in the model by using equivalent volume polarization current density and solving the equivalent problem in free-space. The feed section of the antenna including the microstripline is handled rigorously in the MOM model by including slotline short circuit and microstripline currents among the unknowns. Comparison with measurements is made to demonstrate the validity of the model for both the air case and the dielectric case. Validity of the model is also verified by extending the model to handle the analysis of the skew-plate antenna and comparing the results to those of a skew-segmentation modeling results of the same structure and to available data in the literature. Variation of the radiation pattern for the air LTSA with length, height, and taper angle is investigated, and the results are tabulated. Numerical results for the effect of the dielectric thickness and permittivity are presented.

  10. Analysis of baseline, average, and longitudinally measured blood pressure data using linear mixed models.

    PubMed

    Hossain, Ahmed; Beyene, Joseph

    2014-01-01

This article compares baseline, average, and longitudinal data analysis methods for identifying genetic variants in a genome-wide association study using the Genetic Analysis Workshop 18 data. We apply methods that include (a) linear mixed models with baseline measures, (b) random intercept linear mixed models with mean measures outcome, and (c) random intercept linear mixed models with longitudinal measurements. In the linear mixed models, covariates are included as fixed effects, whereas relatedness among individuals is incorporated as the variance-covariance structure of the random effect for the individuals. The overall strategy of applying linear mixed models to decorrelate the data is based on Aulchenko et al.'s GRAMMAR. By analyzing systolic and diastolic blood pressure, which are used separately as outcomes, we compare the 3 methods in identifying a known genetic variant that is associated with blood pressure from chromosome 3 and simulated phenotype data. We also analyze the real phenotype data to illustrate the methods. We conclude that the linear mixed model with longitudinal measurements of diastolic blood pressure is the most accurate among the methods at identifying the known single-nucleotide polymorphism, but linear mixed models with baseline measures perform best with systolic blood pressure as the outcome.
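The GRAMMAR idea referenced above is a two-stage procedure: first remove the polygenic (relatedness) component with a mixed model, then test each SNP against the residuals. The toy sketch below collapses the mixed-model stage to a generalized least squares fit with a known covariance V on simulated, unrelated data; real GRAMMAR estimates the variance components and uses an empirical kinship matrix.

```python
import numpy as np

# Toy two-stage GRAMMAR-style association test on simulated data.
# The kinship matrix, variance components, and SNP effect are all invented.
rng = np.random.default_rng(1)
n = 200
kinship = np.eye(n)                      # unrelated toy cohort
V = 0.4 * kinship + 0.6 * np.eye(n)      # sigma_g^2 * K + sigma_e^2 * I
snp = rng.binomial(2, 0.3, n).astype(float)   # genotype coded 0/1/2
y = 0.5 * snp + rng.multivariate_normal(np.zeros(n), V)  # phenotype

# Stage 1: GLS intercept-only fit under V, then take residuals.
Vinv = np.linalg.inv(V)
ones = np.ones(n)
mu = (ones @ Vinv @ y) / (ones @ Vinv @ ones)
resid = y - mu

# Stage 2: simple regression of the residuals on the SNP.
sc = snp - snp.mean()
beta = (sc @ resid) / (sc**2).sum()      # should be near the simulated 0.5
```

In a genome-wide scan, stage 1 is done once and stage 2 is repeated cheaply for every SNP, which is the computational appeal of the decorrelation strategy.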

  11. [Analysis of the stability and adaptability of near infrared spectra qualitative analysis model].

    PubMed

    Cao, Wu; Li, Wei-jun; Wang, Ping; Zhang, Li-ping

    2014-06-01

The stability and adaptability of near-infrared spectra qualitative analysis models were studied. Separate modeling can significantly improve the stability and adaptability of a model, but its ability to improve adaptability is limited. Joint modeling can improve not only the adaptability of the model but also its stability; at the same time, compared to separate modeling, it can shorten the modeling time, reduce the modeling workload, extend the period of validity of the model, and improve modeling efficiency. The model-adaptability experiment shows that the correct recognition rate of the separate modeling method is relatively low and cannot meet application requirements, whereas the joint modeling method reaches a correct recognition rate of 90% and significantly enhances the recognition effect. The model-stability experiment shows that the identification results of the jointly built model are better than those of the separately built model, and that the method has good application value.

  12. Overview of MSFC AMSD Integrated Modeling and Analysis

    NASA Technical Reports Server (NTRS)

    Cummings, Ramona; Russell, Kevin (Technical Monitor)

    2002-01-01

    Structural, thermal, dynamic, and optical models of the NGST AMSD mirror assemblies are being finalized and integrated for predicting cryogenic vacuum test performance of the developing designs. Analyzers in use by the MSFC Modeling and Analysis Team are identified, with overview of approach to integrate simulated effects. Guidelines to verify the individual models and calibration cases for comparison with the vendors' analyses are presented. In addition, baseline and proposed additional scenarios for the cryogenic vacuum testing are briefly described.

  13. Mechanisms and mediation in survival analysis: towards an integrated analytical framework.

    PubMed

    Pratschke, Jonathan; Haase, Trutz; Comber, Harry; Sharp, Linda; de Camargo Cancela, Marianna; Johnson, Howard

    2016-02-29

    A wide-ranging debate has taken place in recent years on mediation analysis and causal modelling, raising profound theoretical, philosophical and methodological questions. The authors build on the results of these discussions to work towards an integrated approach to the analysis of research questions that situate survival outcomes in relation to complex causal pathways with multiple mediators. The background to this contribution is the increasingly urgent need for policy-relevant research on the nature of inequalities in health and healthcare. The authors begin by summarising debates on causal inference, mediated effects and statistical models, showing that these three strands of research have powerful synergies. They review a range of approaches which seek to extend existing survival models to obtain valid estimates of mediation effects. They then argue for an alternative strategy, which involves integrating survival outcomes within Structural Equation Models via the discrete-time survival model. This approach can provide an integrated framework for studying mediation effects in relation to survival outcomes, an issue of great relevance in applied health research. The authors provide an example of how these techniques can be used to explore whether the social class position of patients has a significant indirect effect on the hazard of death from colon cancer. The results suggest that the indirect effects of social class on survival are substantial and negative (-0.23 overall). In addition to the substantial direct effect of this variable (-0.60), its indirect effects account for more than one quarter of the total effect. The two main pathways for this indirect effect, via emergency admission (-0.12), on the one hand, and hospital caseload, on the other, (-0.10) are of similar size. The discrete-time survival model provides an attractive way of integrating time-to-event data within the field of Structural Equation Modelling. 
The authors demonstrate the efficacy of this approach in identifying complex causal pathways that mediate the effects of a socio-economic baseline covariate on the hazard of death from colon cancer. The results show that this approach has the potential to shed light on a class of research questions which is of particular relevance in health research.
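The effect decomposition quoted in the abstract is simple path arithmetic: the total effect is the direct effect plus the sum of the mediated (indirect) pathways. The sketch below reproduces that arithmetic with the abstract's coefficients; the small remainder term is inferred so that the indirect pathways total the reported -0.23, and is labeled as such.

```python
# Path-analytic effect decomposition using the abstract's coefficients.
direct = -0.60                 # direct effect of social class on the hazard
via_emergency = -0.12          # indirect pathway via emergency admission
via_caseload = -0.10           # indirect pathway via hospital caseload
other_indirect = -0.01         # inferred remainder so indirect totals -0.23

indirect = via_emergency + via_caseload + other_indirect
total = direct + indirect                 # total effect
share_indirect = indirect / total         # fraction of total that is mediated
```

The mediated share exceeds one quarter of the total effect, matching the abstract's statement that the indirect pathways account for more than a quarter of the social-class effect on colon cancer survival.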

  14. Uncertainty and sensitivity analysis for photovoltaic system modeling.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hansen, Clifford W.; Pohl, Andrew Phillip; Jordan, Dirk

    2013-12-01

We report an uncertainty and sensitivity analysis for modeling DC energy from photovoltaic systems. We consider two systems, each comprised of a single module using either crystalline silicon or CdTe cells, and located either at Albuquerque, NM, or Golden, CO. Output from a PV system is predicted by a sequence of models. Uncertainty in the output of each model is quantified by empirical distributions of each model's residuals. We sample these distributions to propagate uncertainty through the sequence of models to obtain an empirical distribution for each PV system's output. We considered models that: (1) translate measured global horizontal, direct and global diffuse irradiance to plane-of-array irradiance; (2) estimate effective irradiance from plane-of-array irradiance; (3) predict cell temperature; and (4) estimate DC voltage, current and power. We found the uncertainty in PV system output to be relatively small, on the order of 1% for daily energy. Four alternative models were considered for the POA irradiance modeling step; we did not find the choice of one of these models to be of great significance. However, we observed that the POA irradiance model introduced a bias of upwards of 5% of daily energy which translates directly to a systematic difference in predicted energy. Sensitivity analyses relate uncertainty in the PV system output to uncertainty arising from each model. We found that the residuals arising from the POA irradiance and the effective irradiance models were the dominant contributors to residuals for daily energy, for either technology or location considered. This analysis indicates that efforts to reduce the uncertainty in PV system output should focus on improvements to the POA and effective irradiance models.
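A simple sampling-based way to obtain the sensitivity ranking described above is to correlate each step's sampled residuals with the final output and compare the squared correlations. The residual spreads and the linearized output below are synthetic placeholders, not the report's fitted models.

```python
import numpy as np

# Sampling-based sensitivity sketch: attribute output spread to each
# modeling step via squared correlation with the output. Data are synthetic.
rng = np.random.default_rng(2)
n = 20_000
r_poa = rng.normal(0.0, 20.0, n)     # POA irradiance residuals
r_eff = rng.normal(0.0, 10.0, n)     # effective-irradiance residuals
r_temp = rng.normal(0.0, 1.0, n)     # cell-temperature residuals

# Linearized output: each residual enters with an invented gain.
output = 1.0 * r_poa + 1.0 * r_eff + 0.2 * r_temp

sens = {name: np.corrcoef(r, output)[0, 1] ** 2
        for name, r in [("poa", r_poa), ("eff", r_eff), ("temp", r_temp)]}
```

For independent inputs and a linear response these squared correlations approximate first-order variance-based sensitivity indices, reproducing the qualitative finding that the irradiance models dominate.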

  15. Analysis and design of a capsule landing system and surface vehicle control system for Mars exploration

    NASA Technical Reports Server (NTRS)

    Frederick, D. K.; Lashmet, P. K.; Sandor, G. N.; Shen, C. N.; Smith, E. V.; Yerazunis, S. W.

    1973-01-01

    Problems related to the design and control of a mobile planetary vehicle to implement a systematic plan for the exploration of Mars are reported. Problem areas include: vehicle configuration, control, dynamics, systems and propulsion; systems analysis, terrain modeling and path selection; and chemical analysis of specimens. These tasks are summarized: vehicle model design, mathematical model of vehicle dynamics, experimental vehicle dynamics, obstacle negotiation, electrochemical controls, remote control, collapsibility and deployment, construction of a wheel tester, wheel analysis, payload design, system design optimization, effect of design assumptions, accessory optimal design, on-board computer subsystem, laser range measurement, discrete obstacle detection, obstacle detection systems, terrain modeling, path selection system simulation and evaluation, gas chromatograph/mass spectrometer system concepts, and chromatograph model evaluation and improvement.

  16. Review of rigorous coupled-wave analysis and of homogeneous effective medium approximations for high spatial-frequency surface-relief gratings

    NASA Technical Reports Server (NTRS)

    Glytsis, Elias N.; Brundrett, David L.; Gaylord, Thomas K.

    1993-01-01

    A review of the rigorous coupled-wave analysis as applied to the diffraction of electromagnetic waves by gratings is presented. The analysis is valid for any polarization, angle of incidence, and conical diffraction. Cascaded and/or multiplexed gratings as well as material anisotropy can be incorporated under the same formalism. Small-period rectangular-groove gratings can also be modeled using approximately equivalent uniaxial homogeneous layers (effective media). The ordinary and extraordinary refractive indices of these layers depend on the grating's filling factor, the refractive indices of the substrate and superstrate, and the ratio of the free-space wavelength to the grating period. Comparisons of the homogeneous effective medium approximations with the rigorous coupled-wave analysis are presented. Antireflection designs (single-layer or multilayer) using the effective medium models are presented and compared. These ultra-short-period antireflection gratings can also be used to produce soft x-rays. Comparisons of the rigorous coupled-wave analysis with experimental results on soft x-ray generation by gratings are also included.

  17. JWST ISIM Distortion Analysis Challenge

    NASA Technical Reports Server (NTRS)

    Cifie, Emmanuel; Matzinger, Liz; Kuhn, Jonathan; Fan, Terry

    2004-01-01

    Very tight distortion requirements are imposed on the JWST's ISIM structure due to the sensitivity of the telescope's mirror segment and science instrument positioning. The ISIM structure is a three-dimensional truss with asymmetric gusseting and metal fittings. One of the primary challenges for ISIM's analysis team is predicting the thermal distortion of the structure both from the bulk cooldown from ambient to cryo and from the smaller temperature changes within the cryogenic operating environment. As a first cut at estimating thermal distortions, a finite element model of bar elements was created. Elements representing joint areas and metal fittings use effective properties that match the behavior of the stack-up of the composite tube, gusset, and adhesive under mechanical and thermal loads. These properties were derived by matching tip deflections of a simplified solid-model T-joint. Because of the structure's asymmetric gusseting, this effective property model is a first attempt at predicting rotations that cannot be captured with a smeared CTE approach. In addition to the finite element analysis, several first-order calculations have been performed to gauge the feasibility of the material design. Because of the stringent thermal distortion requirements at cryogenic temperatures, a composite tube material with near-zero or negative CTE is required. A preliminary hand analysis of the contributions of the various components along the distortion path between FGS and the other instruments, neglecting second-order effects, was performed. A plot of bounding tube longitudinal and transverse CTEs for thermal stability requirements was generated to help determine the feasibility of meeting these requirements. This analysis is a work in progress en route to a large-degree-of-freedom, high-fidelity FEA model for distortion analysis. Methods of model reduction, such as superelements, are currently being investigated.

  18. Analysis of Atmospheric Aerosol Data Sets and Application of Radiative Transfer Models to Compute Aerosol Effects

    NASA Technical Reports Server (NTRS)

    Schmid, Beat; Bergstrom, Robert W.; Redemann, Jens

    2002-01-01

    This report is the final report for "Analysis of Atmospheric Aerosol Data Sets and Application of Radiative Transfer Models to Compute Aerosol Effects". It is a bibliographic compilation of 29 peer-reviewed publications (published, in press or submitted) produced under this Cooperative Agreement and 30 first-authored conference presentations. The tasks outlined in the various proposals are listed below with a brief comment as to the research performed. Copies of title/abstract pages of peer-reviewed publications are attached.

  19. Linear mixed-effects models for within-participant psychology experiments: an introductory tutorial and free, graphical user interface (LMMgui).

    PubMed

    Magezi, David A

    2015-01-01

    Linear mixed-effects models (LMMs) are increasingly being used for data analysis in cognitive neuroscience and experimental psychology, where within-participant designs are common. The current article provides an introductory review of the use of LMMs for within-participant data analysis and describes a free, simple, graphical user interface (LMMgui). LMMgui uses the package lme4 (Bates et al., 2014a,b) in the statistical environment R (R Core Team).
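LMMgui itself wraps R's lme4; as a language-neutral illustration of the underlying idea, the sketch below (Python with NumPy, all names and parameter values hypothetical) simulates a within-participant design and removes each participant's random intercept by centring before estimating the fixed condition effect:

```python
import numpy as np

rng = np.random.default_rng(1)
n_subj, n_rep = 30, 20

# Simulated within-participant design: each subject contributes n_rep
# measurements per condition, has their own baseline (random intercept),
# and the fixed condition effect is 0.5.
subj_intercept = rng.normal(0.0, 1.0, n_subj)
cond = np.tile(np.repeat([0, 1], n_rep), n_subj)
subj = np.repeat(np.arange(n_subj), 2 * n_rep)
y = subj_intercept[subj] + 0.5 * cond + rng.normal(0.0, 0.3, subj.size)

# Minimal random-intercept logic: centre each subject's data on its own
# mean (removing the random intercept), then estimate the condition effect.
subj_mean = np.bincount(subj, weights=y) / (2 * n_rep)
y_centred = y - subj_mean[subj]
effect = y_centred[cond == 1].mean() - y_centred[cond == 0].mean()
```

A full LMM additionally estimates the variance components; the centring step here only conveys why between-subject baseline differences do not contaminate the within-participant effect.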

  20. Meta-analysis of diagnostic test data: a bivariate Bayesian modeling approach.

    PubMed

    Verde, Pablo E

    2010-12-30

    In the last decades, the amount of published results on clinical diagnostic tests has expanded very rapidly. The counterpart to this development has been the formal evaluation and synthesis of diagnostic results. However, published results present substantial heterogeneity and can be regarded as so far removed from the classical domain of meta-analysis that they provide a rather severe test of classical statistical methods. Recently, bivariate random effects meta-analytic methods, which model the pairs of sensitivities and specificities, have been presented from the classical point of view. In this work a bivariate Bayesian modeling approach is presented. This approach substantially extends the scope of classical bivariate methods by allowing the structural distribution of the random effects to depend on multiple sources of variability. Meta-analysis is summarized by the predictive posterior distributions for sensitivity and specificity. This new approach also allows substantial model checking, model diagnostics, and model selection. Statistical computations are implemented in public domain statistical software (WinBUGS and R) and illustrated with real data examples. Copyright © 2010 John Wiley & Sons, Ltd.

  1. A method for fitting regression splines with varying polynomial order in the linear mixed model.

    PubMed

    Edwards, Lloyd J; Stewart, Paul W; MacDougall, James E; Helms, Ronald W

    2006-02-15

    The linear mixed model has become a widely used tool for longitudinal analysis of continuous variables. The use of regression splines in these models offers the analyst additional flexibility in the formulation of descriptive analyses, exploratory analyses and hypothesis-driven confirmatory analyses. We propose a method for fitting piecewise polynomial regression splines with varying polynomial order in the fixed effects and/or random effects of the linear mixed model. The polynomial segments are explicitly constrained by side conditions for continuity and some smoothness at the points where they join. By using a reparameterization of this explicitly constrained linear mixed model, an implicitly constrained linear mixed model is constructed that simplifies implementation of fixed-knot regression splines. The proposed approach is relatively simple, handles splines in one variable or multiple variables, and can be easily programmed using existing commercial software such as SAS or S-plus. The method is illustrated using two examples: an analysis of longitudinal viral load data from a study of subjects with acute HIV-1 infection and an analysis of 24-hour ambulatory blood pressure profiles.
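Fixed-knot regression splines of the kind discussed above are often built from a truncated power basis, in which continuity and smoothness at the knots hold by construction. The sketch below uses hypothetical data and knot placement, and shows only the fixed-effects design matrix rather than the paper's full constrained mixed-model reparameterization:

```python
import numpy as np

def truncated_power_basis(x, knots, degree=2):
    """Spline design matrix: global polynomial columns plus one truncated
    power column per knot. Continuity (and smoothness up to degree-1) at
    each knot holds by construction, without explicit side conditions."""
    cols = [x**d for d in range(degree + 1)]
    cols += [np.clip(x - k, 0.0, None)**degree for k in knots]
    return np.column_stack(cols)

# Hypothetical profile with a bend, loosely in the spirit of longitudinal
# viral-load trajectories; knot locations are arbitrary here.
x = np.linspace(0.0, 10.0, 200)
y = np.where(x < 5.0, x, 10.0 - x)
X = truncated_power_basis(x, knots=[3.0, 7.0])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
mse = np.mean((X @ beta - y)**2)
```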

  2. Protons in head-and-neck cancer: bridging the gap of evidence.

    PubMed

    Ramaekers, Bram L T; Grutters, Janneke P C; Pijls-Johannesma, Madelon; Lambin, Philippe; Joore, Manuela A; Langendijk, Johannes A

    2013-04-01

    To use Normal Tissue Complication Probability (NTCP) models and comparative planning studies to explore the (cost-)effectiveness of swallowing sparing intensity modulated proton radiotherapy (IMPT) compared with swallowing sparing intensity modulated radiotherapy with photons (IMRT) in head and neck cancer (HNC). A Markov model was constructed to examine and compare the costs and quality-adjusted life years (QALYs) of the following strategies: (1) IMPT for all patients; (2) IMRT for all patients; and (3) IMPT if efficient. The assumption of equal survival for IMPT and IMRT in the base case analysis was relaxed in a sensitivity analysis. Intensity modulated proton radiation therapy and IMRT for all patients yielded 6.620 and 6.520 QALYs and cost €50,989 and €41,038, respectively. Intensity modulated proton radiation therapy if efficient yielded 6.563 QALYs and cost €43,650. The incremental cost-effectiveness ratio of IMPT if efficient versus IMRT for all patients was €60,278 per QALY gained. In the sensitivity analysis, IMRT was more effective (0.967 QALYs) and less expensive (€8218) and thus dominated IMPT for all patients. Cost-effectiveness analysis based on normal tissue complication probability models and planning studies proved feasible and informative and enables the analysis of individualized strategies. The increased effectiveness of IMPT does not seem to outweigh the higher costs for all head-and-neck cancer patients. However, when assuming equal survival among both modalities, there seems to be value in identifying those patients for whom IMPT is cost-effective. Copyright © 2013 Elsevier Inc. All rights reserved.
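The incremental cost-effectiveness ratio (ICER) used above is simply the extra cost divided by the extra effect between two strategies. A minimal sketch, using the rounded figures quoted in the abstract (so the result differs slightly from the reported €60,278 per QALY, which was presumably computed from unrounded inputs):

```python
def icer(cost_new, qaly_new, cost_ref, qaly_ref):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained
    when moving from the reference strategy to the new one."""
    return (cost_new - cost_ref) / (qaly_new - qaly_ref)

# "IMPT if efficient" versus "IMRT for all" (costs in euros, effects in QALYs),
# taken from the rounded values in the abstract.
ratio = icer(cost_new=43650, qaly_new=6.563, cost_ref=41038, qaly_ref=6.520)
```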

  3. Modeling spanwise nonuniformity in the cross-sectional analysis of composite beams

    NASA Astrophysics Data System (ADS)

    Ho, Jimmy Cheng-Chung

    Spanwise nonuniformity effects are modeled in the cross-sectional analysis of beam theory. This modeling adheres to an established numerical framework on cross-sectional analysis of uniform beams with arbitrary cross-sections. This framework is based on two concepts: decomposition of the rotation tensor and the variational-asymptotic method. Allowance of arbitrary materials and geometries in the cross-section is from discretization of the warping field by finite elements. By this approach, dimensional reduction from three-dimensional elasticity is performed rigorously and the sectional strain energy is derived to be asymptotically-correct. Elastic stiffness matrices are derived for inputs into the global beam analysis. Recovery relations for the displacement, stress, and strain fields are also derived with care to be consistent with the energy. Spanwise nonuniformity effects appear in the form of pointwise and sectionwise derivatives, which are approximated by finite differences. The formulation also accounts for the effects of spanwise variations in initial twist and/or curvature. A linearly tapered isotropic strip is analyzed to demonstrate spanwise nonuniformity effects on the cross-sectional analysis. The analysis is performed analytically by the variational-asymptotic method. Results from beam theory are validated against solutions from plane stress elasticity. These results demonstrate that spanwise nonuniformity effects become significant as the rate at which the cross-sections vary increases. The modeling of transverse shear modes of deformation is accomplished by transforming the strain energy into generalized Timoshenko form. Approximations in this transformation procedure from previous works, when applied to uniform beams, are identified. The approximations are not used in the present work so as to retain more accuracy. 
Comparison of present results with those previously published shows that these approximations sometimes change the results measurably and thus are inappropriate. Static and dynamic results, from the global beam analysis, are calculated to show the differences between using stiffness constants from previous works and the present work. As a form of validation of the transformation procedure, calculations from the global beam analysis of initially twisted isotropic beams from using curvilinear coordinate axes featuring twist are shown to be equivalent to calculations using Cartesian coordinates.

  4. What do we mean by sensitivity analysis? The need for comprehensive characterization of "global" sensitivity in Earth and Environmental systems models

    NASA Astrophysics Data System (ADS)

    Razavi, Saman; Gupta, Hoshin V.

    2015-05-01

    Sensitivity analysis is an essential paradigm in Earth and Environmental Systems modeling. However, the term "sensitivity" has a clear definition, based on partial derivatives, only when specified locally around a particular point (e.g., optimal solution) in the problem space. Accordingly, no unique definition exists for "global sensitivity" across the problem space, when considering one or more model responses to different factors such as model parameters or forcings. A variety of approaches have been proposed for global sensitivity analysis, based on different philosophies and theories, and each of these formally characterizes a different "intuitive" understanding of sensitivity. These approaches focus on different properties of the model response at a fundamental level and may therefore lead to different (even conflicting) conclusions about the underlying sensitivities. Here we revisit the theoretical basis for sensitivity analysis, summarize and critically evaluate existing approaches in the literature, and demonstrate their flaws and shortcomings through conceptual examples. We also demonstrate the difficulty involved in interpreting "global" interaction effects, which may undermine the value of existing interpretive approaches. With this background, we identify several important properties of response surfaces that are associated with the understanding and interpretation of sensitivities in the context of Earth and Environmental System models. Finally, we highlight the need for a new, comprehensive framework for sensitivity analysis that effectively characterizes all of the important sensitivity-related properties of model response surfaces.
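The point that "sensitivity" is well defined only locally can be made concrete with a toy model containing an interaction: derivative-based sensitivities computed at two different points rank the same two factors in opposite orders. All functions and values below are illustrative:

```python
import numpy as np

def model(x1, x2):
    # Toy response surface with an interaction term.
    return x1 + x1 * x2

def local_sensitivity(f, x, h=1e-6):
    """Forward-difference partial derivatives at a single point: the only
    setting in which "sensitivity" has a unique, derivative-based meaning."""
    base = f(*x)
    grads = []
    for i in range(len(x)):
        xp = x.astype(float).copy()
        xp[i] += h
        grads.append((f(*xp) - base) / h)
    return np.array(grads)

# The derivative-based ranking of the two factors flips between points,
# so no single "global" ordering of sensitivities is implied.
s_a = local_sensitivity(model, np.array([0.1, 2.0]))  # x1 dominates here
s_b = local_sensitivity(model, np.array([5.0, 0.0]))  # x2 dominates here
```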

  5. CFD Analysis of an Installation Used to Measure the Skin-Friction Penalty of Acoustic Treatments

    NASA Technical Reports Server (NTRS)

    Spalart, Philippe R.; Garbaruk, Andrey; Howerton, Brian M.

    2017-01-01

    There is a drive to devise acoustic treatments with reduced skin-friction and therefore fuel-burn penalty for engine nacelles on commercial airplanes. The studies have been experimental, and the effects on skin-friction are deduced from measurements of the pressure drop along a duct. We conduct a detailed CFD analysis of the installation, for two purposes. The first is to predict the effects of the finite size of the rig, including its near-square cross-section and the moderate length of the treated patch; this introduces transient and blockage effects, which have not been included so far in the analysis. In addition, the flow is compressible, so that even with homogeneous surface conditions, it is not homogeneous in the streamwise direction. The second purpose is to extract an effective sand-grain roughness size for a particular liner, which in turn can be used in a CFD analysis of the aircraft, leading to actual predictions of the effect of acoustic treatments on fuel burn in service. The study is entirely based on classical turbulence models, with an appropriate modification for effective roughness effects, rather than directly modeling the liners.

  6. The use of mixed effects ANCOVA to characterize vehicle emission profiles

    DOT National Transportation Integrated Search

    2000-09-01

    A mixed effects analysis of covariance model to characterize mileage dependent emissions profiles for any given group of vehicles having a common model design is used in this paper. These types of evaluations are used by the U.S. Environmental Protec...

  7. Interventions to Improve Medication Adherence among Older Adults: Meta-Analysis of Adherence Outcomes among Randomized Controlled Trials

    ERIC Educational Resources Information Center

    Conn, Vicki S.; Hafdahl, Adam R.; Cooper, Pamela S.; Ruppar, Todd M.; Mehr, David R.; Russell, Cynthia L.

    2009-01-01

    Purpose: This study investigated the effectiveness of interventions to improve medication adherence (MA) in older adults. Design and Methods: Meta-analysis was used to synthesize results of 33 published and unpublished randomized controlled trials. Random-effects models were used to estimate overall mean effect sizes (ESs) for MA, knowledge,…

  8. Percutaneous Trigger Finger Release: A Cost-effectiveness Analysis.

    PubMed

    Gancarczyk, Stephanie M; Jang, Eugene S; Swart, Eric P; Makhni, Eric C; Kadiyala, Rajendra Kumar

    2016-07-01

    Percutaneous trigger finger releases (TFRs) performed in the office setting are becoming more prevalent. This study compares the costs of in-hospital open TFRs, open TFRs performed in ambulatory surgical centers (ASCs), and in-office percutaneous releases. An expected-value decision-analysis model was constructed from the payer perspective to estimate total costs of the three competing treatment strategies for TFR. Model parameters were estimated based on the best available literature and were tested using multiway sensitivity analysis. Percutaneous TFR performed in the office and then, if needed, revised open TFR performed in the ASC, was the most cost-effective strategy, with an attributed cost of $603. The cost associated with an initial open TFR performed in the ASC was approximately 7% higher. Initial open TFR performed in the hospital was the least cost-effective, with an attributed cost nearly twice that of primary percutaneous TFR. An initial attempt at percutaneous TFR is more cost-effective than an open TFR. Currently, only about 5% of TFRs are performed in the office; therefore, a substantial opportunity exists for cost savings in the future. Decision model level II.
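The expected-value decision logic described above weights each downstream cost by its probability. A minimal sketch with hypothetical probabilities and costs (the abstract's $603 figure derives from literature parameters not reproduced here):

```python
def expected_cost(initial_cost, failure_prob, revision_cost):
    """Expected-value decision analysis: cost of the initial procedure plus
    the probability-weighted cost of a revision when it fails."""
    return initial_cost + failure_prob * revision_cost

# Hypothetical inputs for illustration only: a cheap in-office percutaneous
# release with a higher failure rate versus a pricier open release in an ASC.
office_first = expected_cost(initial_cost=300, failure_prob=0.10, revision_cost=1200)
asc_open = expected_cost(initial_cost=640, failure_prob=0.03, revision_cost=1200)
```

In a full analysis these parameters would be varied jointly, as in the paper's multiway sensitivity analysis, to see where the ordering of strategies flips.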

  9. Bayesian dynamic mediation analysis.

    PubMed

    Huang, Jing; Yuan, Ying

    2017-12-01

    Most existing methods for mediation analysis assume that mediation is a stationary, time-invariant process, which overlooks the inherently dynamic nature of many human psychological processes and behavioral activities. In this article, we consider mediation as a dynamic process that continuously changes over time. We propose Bayesian multilevel time-varying coefficient models to describe and estimate such dynamic mediation effects. By taking the nonparametric penalized spline approach, the proposed method is flexible and able to accommodate any shape of the relationship between time and mediation effects. Simulation studies show that the proposed method works well and faithfully reflects the true nature of the mediation process. By modeling mediation effect nonparametrically as a continuous function of time, our method provides a valuable tool to help researchers obtain a more complete understanding of the dynamic nature of the mediation process underlying psychological and behavioral phenomena. We also briefly discuss an alternative approach of using dynamic autoregressive mediation model to estimate the dynamic mediation effect. The computer code is provided to implement the proposed Bayesian dynamic mediation analysis. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  10. Software Safety Analysis of a Flight Guidance System

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W. (Technical Monitor); Tribble, Alan C.; Miller, Steven P.; Lempia, David L.

    2004-01-01

    This document summarizes the safety analysis performed on a Flight Guidance System (FGS) requirements model. In particular, the safety properties desired of the FGS model are identified and the presence of the safety properties in the model is formally verified. Chapter 1 provides an introduction to the entire project, while Chapter 2 gives a brief overview of the problem domain, the nature of accidents, model based development, and the four-variable model. Chapter 3 outlines the approach. Chapter 4 presents the results of the traditional safety analysis techniques and illustrates how the hazardous conditions associated with the system trace into specific safety properties. Chapter 5 presents the results of the formal methods analysis technique model checking that was used to verify the presence of the safety properties in the requirements model. Finally, Chapter 6 summarizes the main conclusions of the study, first and foremost that model checking is a very effective verification technique to use on discrete models with reasonable state spaces. Additional supporting details are provided in the appendices.

  11. General model for the pointing error analysis of Risley-prism system based on ray direction deviation in light refraction

    NASA Astrophysics Data System (ADS)

    Zhang, Hao; Yuan, Yan; Su, Lijuan; Huang, Fengzhen; Bai, Qing

    2016-09-01

    The Risley-prism-based light beam steering apparatus delivers superior pointing accuracy and is used in imaging LIDAR and imaging microscopes. A general model for pointing error analysis of Risley prisms, based on ray direction deviation in light refraction, is proposed in this paper. The model captures incident beam deviation, assembly deflections, and prism rotational error. We first derive the transmission matrices of the model. Then, the independent and cumulative effects of the different errors are analyzed through this model. An accuracy study of the model shows that the prediction deviation of the pointing error for each error source is less than 4.1×10⁻⁵° when the error amplitude is 0.1°. Detailed analyses indicate that different error sources affect the pointing accuracy to varying degrees, and that the major error source is the incident beam deviation. Prism tilting has a relatively large effect on the pointing accuracy when the prism tilts in the principal section. The cumulative effect analyses of multiple errors show that the pointing error can be reduced by tuning the bearing tilting in the same direction. The cumulative effect of rotational error is relatively large when the difference between the two prism rotational angles equals 0 or π, and relatively small when the difference equals π/2. These results suggest that this analysis can help uncover the error distribution and aid in the measurement calibration of Risley-prism systems.
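Pointing-error models of this kind are built on ray direction deviation at each refracting surface. As a self-contained illustration (not the paper's transmission matrices), the vector form of Snell's law gives the refracted ray direction from the incident direction and the surface normal:

```python
import numpy as np

def refract(d, n, n1, n2):
    """Vector form of Snell's law: unit direction of the refracted ray,
    given unit incident direction d and unit surface normal n chosen to
    point against d (so that cos_i > 0). Assumes no total internal reflection."""
    mu = n1 / n2
    cos_i = -np.dot(d, n)
    cos_t = np.sqrt(1.0 - mu**2 * (1.0 - cos_i**2))
    return mu * d + (mu * cos_i - cos_t) * n

# A ray 10 degrees off normal entering glass (n = 1.5, illustrative value)
# bends toward the normal, per sin(theta_t) = (n1/n2) * sin(theta_i).
d_in = np.array([np.sin(np.radians(10.0)), 0.0, -np.cos(np.radians(10.0))])
t = refract(d_in, np.array([0.0, 0.0, 1.0]), n1=1.0, n2=1.5)
angle_out = np.degrees(np.arcsin(np.linalg.norm(t[:2])))
```

A Risley-prism model applies this relation at each prism face in turn, which is where small deviations in the incident beam or prism orientation propagate into pointing error.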

  12. Accounting for dropout in xenografted tumour efficacy studies: integrated endpoint analysis, reduced bias and better use of animals.

    PubMed

    Martin, Emma C; Aarons, Leon; Yates, James W T

    2016-07-01

    Xenograft studies are commonly used to assess the efficacy of new compounds and characterise their dose-response relationship. Analysis often involves comparing the final tumour sizes across dose groups. This can cause bias, as often in xenograft studies a tumour burden limit (TBL) is imposed for ethical reasons, leading to the animals with the largest tumours being excluded from the final analysis. This means the average tumour size, particularly in the control group, is underestimated, leading to an underestimate of the treatment effect. Four methods to account for dropout due to the TBL are proposed, which use all the available data instead of only final observations: modelling, pattern mixture models, treating dropouts as censored using the M3 method and joint modelling of tumour growth and dropout. The methods were applied to both a simulated data set and a real example. All four proposed methods led to an improvement in the estimate of treatment effect in the simulated data. The joint modelling method performed most strongly, with the censoring method also providing a good estimate of the treatment effect, but with higher uncertainty. In the real data example, the dose-response estimated using the censoring and joint modelling methods was higher than the very flat curve estimated from average final measurements. Accounting for dropout using the proposed censoring or joint modelling methods allows the treatment effect to be recovered in studies where it may have been obscured due to dropout caused by the TBL.
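The censoring idea behind the M3 method can be sketched directly: animals still in the study contribute a normal density to the likelihood, while animals withdrawn at the tumour burden limit contribute the probability that their tumour exceeded the limit. A minimal illustration with hypothetical values:

```python
import math

def log_likelihood(obs, pred, sd, limit):
    """M3-style log-likelihood: observed tumour sizes contribute a normal
    density; dropouts (obs entry None) contribute P(size > limit)."""
    ll = 0.0
    for y, mu in zip(obs, pred):
        if y is None:  # dropout: tumour exceeded the burden limit
            p_exceed = 0.5 * math.erfc((limit - mu) / (sd * math.sqrt(2.0)))
            ll += math.log(p_exceed)
        else:
            ll += (-0.5 * math.log(2.0 * math.pi * sd**2)
                   - (y - mu)**2 / (2.0 * sd**2))
    return ll

# A model predicting a large tumour for the dropout animal explains the
# data far better than one predicting a small tumour for it.
ll_big = log_likelihood([500.0, None], [480.0, 1900.0], sd=100.0, limit=2000.0)
ll_small = log_likelihood([500.0, None], [480.0, 1000.0], sd=100.0, limit=2000.0)
```

This is why censoring recovers the treatment effect: growth models that underestimate control-group tumours are penalised for making the observed dropouts improbable.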

  13. On 3D inelastic analysis methods for hot section components

    NASA Technical Reports Server (NTRS)

    Mcknight, R. L.; Chen, P. C.; Dame, L. T.; Holt, R. V.; Huang, H.; Hartle, M.; Gellin, S.; Allen, D. H.; Haisler, W. E.

    1986-01-01

    Accomplishments are described for the 2-year program to develop advanced 3-D inelastic structural stress analysis methods and solution strategies for more accurate and cost-effective analysis of combustors, turbine blades and vanes. The approach was to develop a matrix of formulation elements and constitutive models. Three constitutive models were developed in conjunction with optimized iterating techniques, accelerators, and convergence criteria within a framework of dynamic time incrementing. Three formulation models were developed: an eight-noded mid-surface shell element, a nine-noded mid-surface shell element and a twenty-noded isoparametric solid element. A separate computer program was developed for each combination of constitutive model and formulation model. Each program provides a functional stand-alone capability for performing cyclic nonlinear structural analysis. In addition, the analysis capabilities incorporated into each program can be abstracted in subroutine form for incorporation into other codes or to form new combinations.

  14. The 3D inelastic analysis methods for hot section components

    NASA Technical Reports Server (NTRS)

    Mcknight, R. L.; Maffeo, R. J.; Tipton, M. T.; Weber, G.

    1992-01-01

    A two-year program to develop advanced 3D inelastic structural stress analysis methods and solution strategies for more accurate and cost effective analysis of combustors, turbine blades, and vanes is described. The approach was to develop a matrix of formulation elements and constitutive models. Three constitutive models were developed in conjunction with optimized iterating techniques, accelerators, and convergence criteria within a framework of dynamic time incrementing. Three formulation models were developed: an eight-noded midsurface shell element; a nine-noded midsurface shell element; and a twenty-noded isoparametric solid element. A separate computer program has been developed for each combination of constitutive model-formulation model. Each program provides a functional stand alone capability for performing cyclic nonlinear structural analysis. In addition, the analysis capabilities incorporated into each program can be abstracted in subroutine form for incorporation into other codes or to form new combinations.

  15. [The mediating role of organizational citizenship behavior between organizational justice and organizational effectiveness in nursing organizations].

    PubMed

    Park, Wall Yun; Yoon, Sook Hee

    2009-04-01

    This study was a secondary analysis to verify the mediating role of organizational citizenship behavior (OCB) between organizational justice (OJ) and organizational effectiveness (OE) in nursing organizations. The subjects were RN-BSNs and their colleagues in Seoul and Busan. The data were collected over 20 days, between September 13 and October 2, 2004. Two hundred eighty-three data sets were used for the final analysis. The fitness of the models was tested using AMOS 5. The fitness of the hypothetical model was moderate. Procedural Justice (PJ), Interaction Justice (IJ) and Distributive Justice (DJ) had direct effects on Job Satisfaction (JS), Organizational Commitment (OC) and Turnover Intention (TI) in OE, and indirect effects on JS, OC and TI mediated by OCB. The modified model, with improved fitness, showed the causal relations among the OE measures. In the modified model, PJ, IJ and DJ had direct positive effects on OCB and on JS and OC in OE, and indirect effects on JS and OC mediated by OCB. JS and OC in OE had direct negative effects on TI. OCB mediated the relationship between OJ and OE, so nursing managers should enhance the OCB of nurses in order to improve OE.

  16. Measuring the effect of fuel treatments on forest carbon using landscape risk analysis

    Treesearch

    A.A. Ager; M.A. Finney; A. McMahan; J. Carthcart

    2010-01-01

    Wildfire simulation modelling was used to examine whether fuel reduction treatments can potentially reduce future wildfire emissions and provide carbon benefits. In contrast to previous reports, the current study modelled landscape scale effects of fuel treatments on fire spread and intensity, and used a probabilistic framework to quantify wildfire effects on carbon...

  17. Characteristics and Models of Effective Professional Development: The Case of School Teachers in Qatar

    ERIC Educational Resources Information Center

    Abu-Tineh, Abdullah M.; Sadiq, Hissa M.

    2018-01-01

    The purpose of this study was to investigate the characteristics of effective professional development and effective models of professional development as perceived by school teachers in the State of Qatar. This study is quantitative in nature and was conducted using a survey methodology. Means, standard deviations, t-test, and one-way analysis of…

  18. The Effect of Attending Tutoring on Course Grades in Calculus I

    ERIC Educational Resources Information Center

    Rickard, Brian; Mills, Melissa

    2018-01-01

    Tutoring centres are common in universities in the United States, but there are few published studies that statistically examine the effects of tutoring on student success. This study utilizes multiple regression analysis to model the effect of tutoring attendance on final course grades in Calculus I. Our model predicted that every three visits to…

  19. Modeling Outcomes with Floor or Ceiling Effects: An Introduction to the Tobit Model

    ERIC Educational Resources Information Center

    McBee, Matthew

    2010-01-01

    In gifted education research, it is common for outcome variables to exhibit strong floor or ceiling effects due to insufficient range of measurement of many instruments when used with gifted populations. Common statistical methods (e.g., analysis of variance, linear regression) produce biased estimates when such effects are present. In practice,…
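The bias the abstract describes is easy to reproduce: when an instrument ceiling truncates high scores, the naive sample mean underestimates the latent mean, whereas a Tobit likelihood treats at-ceiling scores as censored. A minimal sketch with hypothetical score distributions:

```python
import math
import numpy as np

rng = np.random.default_rng(2)
latent = rng.normal(100.0, 15.0, 5000)   # hypothetical latent ability scores
observed = np.minimum(latent, 115.0)     # instrument ceiling at 115
bias = latent.mean() - observed.mean()   # naive mean is pulled downward

def tobit_loglik(y, mu, sigma, ceiling):
    """Tobit log-likelihood with a ceiling: at-ceiling scores contribute
    P(latent >= ceiling); the rest contribute a normal density."""
    ll = 0.0
    for yi, mi in zip(y, mu):
        if yi >= ceiling:
            z = (ceiling - mi) / sigma
            ll += math.log(0.5 * math.erfc(z / math.sqrt(2.0)))
        else:
            ll += (-math.log(sigma * math.sqrt(2.0 * math.pi))
                   - (yi - mi)**2 / (2.0 * sigma**2))
    return ll
```

Maximising this likelihood over `mu` recovers the latent mean from the ceiling-affected data; the naive mean cannot.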

  20. Three Methods of Estimating a Model of Group Effects: A Comparison with Reference to School Effect Studies.

    ERIC Educational Resources Information Center

    Igra, Amnon

    1980-01-01

    Three methods of estimating a model of school effects are compared: ordinary least squares; an approach based on the analysis of covariance; and, a residualized input-output approach. Results are presented using a matrix algebra formulation, and advantages of the first two methods are considered. (Author/GK)

  1. An Ounce of Prevention, a Pound of Uncertainty: The Cost-Effectiveness of School-Based Drug Prevention Programs.

    ERIC Educational Resources Information Center

    Caulkins, Jonathan P.; Rydell, C. Peter; Everingham, Susan S.; Chiesa, James; Bushway, Shawn

    This book describes an analysis of the cost-effectiveness of model school-based drug prevention programs at reducing cocaine consumption. It compares prevention's cost-effectiveness with that of several enforcement programs and with that of treating heavy cocaine users. It also assesses the cost of nationwide implementation of model prevention…

  2. Integrated Modeling Approach for Optimal Management of Water, Energy and Food Security Nexus

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Xiaodong; Vesselinov, Velimir Valentinov

    We report that water, energy and food (WEF) are inextricably interrelated. Effective planning and management of limited WEF resources to meet current and future socioeconomic demands for sustainable development is challenging. WEF production/delivery may also produce environmental impacts; as a result, greenhouse-gas emission control will affect WEF nexus management as well. Nexus management for WEF security necessitates integrated tools for predictive analysis that are capable of identifying the tradeoffs among various sectors and generating cost-effective planning and management strategies and policies. To address these needs, we have developed an integrated model analysis framework and tool called WEFO. WEFO provides a multi-period socioeconomic model for predicting how to satisfy WEF demands based on model inputs representing production costs, socioeconomic demands, and environmental controls. WEFO is applied to quantitatively analyze the interrelationships and trade-offs among system components including energy supply, electricity generation, water supply-demand, food production, and mitigation of environmental impacts. WEFO is demonstrated on a hypothetical nexus management problem consistent with real-world management scenarios. Model parameters are analyzed using global sensitivity analysis and their effects on total system cost are quantified. Lastly, the obtained results demonstrate how these types of analyses can help decision-makers and stakeholders make cost-effective decisions for optimal WEF management.

  3. Integrated Modeling Approach for Optimal Management of Water, Energy and Food Security Nexus

    DOE PAGES

    Zhang, Xiaodong; Vesselinov, Velimir Valentinov

    2016-12-28

    We report that water, energy and food (WEF) are inextricably interrelated. Effective planning and management of limited WEF resources to meet current and future socioeconomic demands for sustainable development is challenging. WEF production/delivery may also produce environmental impacts; as a result, greenhouse-gas emission control will affect WEF nexus management as well. Nexus management for WEF security necessitates integrated tools for predictive analysis that are capable of identifying the tradeoffs among various sectors and generating cost-effective planning and management strategies and policies. To address these needs, we have developed an integrated model analysis framework and tool called WEFO. WEFO provides a multi-period socioeconomic model for predicting how to satisfy WEF demands based on model inputs representing production costs, socioeconomic demands, and environmental controls. WEFO is applied to quantitatively analyze the interrelationships and trade-offs among system components, including energy supply, electricity generation, water supply and demand, food production, and mitigation of environmental impacts. WEFO is demonstrated on a hypothetical nexus management problem consistent with real-world management scenarios. Model parameters are analyzed using global sensitivity analysis and their effects on total system cost are quantified. The results demonstrate how these types of analyses can help decision-makers and stakeholders make cost-effective decisions for optimal WEF management.

  4. Application of global sensitivity analysis methods to Takagi-Sugeno-Kang rainfall-runoff fuzzy models

    NASA Astrophysics Data System (ADS)

    Jacquin, A. P.; Shamseldin, A. Y.

    2009-04-01

    This study analyses the sensitivity of the parameters of Takagi-Sugeno-Kang rainfall-runoff fuzzy models previously developed by the authors. These models can be classified into two types: the first is intended to account for the effect of changes in catchment wetness, and the second incorporates seasonality as a source of non-linearity in the rainfall-runoff relationship. The sensitivity analysis is performed using two global sensitivity analysis methods, namely Regional Sensitivity Analysis (RSA) and Sobol's Variance Decomposition (SVD). In general, the RSA method has the disadvantage of not being able to detect sensitivities arising from parameter interactions. By contrast, the SVD method is suitable for analysing models whose response surface is expected to be affected by interactions at a local scale and/or local optima, as is the case for the rainfall-runoff fuzzy models analysed in this study. Data from six catchments of different geographical locations and sizes are used in the sensitivity analysis. The sensitivity of the model parameters is analysed in terms of two measures of goodness of fit, assessing the model performance from different points of view: the Nash-Sutcliffe criterion and the index of volumetric fit. The results of the study show that the sensitivity of the model parameters depends on both the type of non-linear effect (i.e. changes in catchment wetness or seasonality) that dominates the catchment's rainfall-runoff relationship and the measure used to assess model performance. Acknowledgements: This research was supported by FONDECYT, Research Grant 11070130. We would also like to express our gratitude to Prof. Kieran M. O'Connor from the National University of Ireland, Galway, for providing the data used in this study.
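
    Sobol's Variance Decomposition attributes the output variance to individual parameters and their interactions. A minimal sketch of a first-order Sobol estimator on a toy three-parameter function (pure NumPy; the toy model and uniform parameter ranges are invented for illustration and are not the authors' fuzzy models):

```python
import numpy as np

rng = np.random.default_rng(0)

def model(x):
    # Toy stand-in for a model response (e.g. a goodness-of-fit score as a
    # function of three parameters); NOT the rainfall-runoff fuzzy model.
    return x[:, 0] + 2.0 * x[:, 1] ** 2 + 0.1 * x[:, 2]

n, d = 100_000, 3
A = rng.uniform(0.0, 1.0, (n, d))      # two independent sample matrices
B = rng.uniform(0.0, 1.0, (n, d))
fA, fB = model(A), model(B)
var = np.var(np.concatenate([fA, fB]))

first_order = []
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                # swap column i between the matrices
    # Saltelli-style estimator of the first-order index S_i
    first_order.append(np.mean(fB * (model(ABi) - fA)) / var)

print([round(s, 2) for s in first_order])
```

    For this toy function the quadratic term dominates, so the second parameter receives by far the largest index; parameters with negligible indices could be fixed in a subsequent calibration.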

  5. Resolving the Effects of Maternal and Offspring Genotype on Dyadic Outcomes in Genome Wide Complex Trait Analysis (“M-GCTA”)

    PubMed Central

    Pourcain, Beate St.; Smith, George Davey; York, Timothy P.; Evans, David M.

    2014-01-01

    Genome wide complex trait analysis (GCTA) is extended to include environmental effects of the maternal genotype on offspring phenotype (“maternal effects”, M-GCTA). The model includes parameters for the direct effects of the offspring genotype, maternal effects and the covariance between direct and maternal effects. Analysis of simulated data, conducted in OpenMx, confirmed that model parameters could be recovered by full information maximum likelihood (FIML) and evaluated the biases that arise in conventional GCTA when indirect genetic effects are ignored. Estimates derived from FIML in OpenMx showed very close agreement to those obtained by restricted maximum likelihood using the published algorithm for GCTA. The method was also applied to illustrative perinatal phenotypes from ∼4,000 mother-offspring pairs from the Avon Longitudinal Study of Parents and Children. The relative merits of extended GCTA in contrast to quantitative genetic approaches based on analyzing the phenotypic covariance structure of kinships are considered. PMID:25060210

  6. Comparison of two gas chromatograph models and analysis of binary data

    NASA Technical Reports Server (NTRS)

    Keba, P. S.; Woodrow, P. T.

    1972-01-01

    The overall objective of the gas chromatograph system studies is to generate fundamental design criteria and techniques to be used in the optimum design of the system. The particular tasks currently being undertaken are the comparison of two mathematical models of the chromatograph and the analysis of binary system data. The predictions of the two mathematical models, an equilibrium absorption model and a non-equilibrium absorption model, exhibit the same weakness: an inability to predict chromatogram spreading for certain systems. The analysis of binary data using the equilibrium absorption model confirms that, for the systems considered, superposition of predicted single-component behaviors is a first-order representation of actual binary data. Composition effects produce non-idealities which limit the rigorous validity of superposition.

  7. Continuum Damage Mechanics Models for the Analysis of Progressive Failure in Open-Hole Tension Laminates

    NASA Technical Reports Server (NTRS)

    Song, Kyonchan; Li, Yingyong; Rose, Cheryl A.

    2011-01-01

    The performance of a state-of-the-art continuum damage mechanics model for intralaminar damage, coupled with a cohesive zone model for delamination, is examined for failure prediction of quasi-isotropic open-hole tension laminates. Limitations of continuum representations of intra-ply damage and the effect of mesh orientation on the analysis predictions are discussed. It is shown that accurate prediction of matrix crack paths and stress redistribution after cracking requires a mesh aligned with the fiber orientation. Based on these results, an aligned mesh is proposed for analysis of the open-hole tension specimens, consisting of different meshes within the individual plies such that the element edges are aligned with the ply fiber direction. The modeling approach is assessed by comparing analysis predictions to experimental data for specimen configurations in which failure is dominated by complex interactions between matrix cracks and delaminations. It is shown that the different failure mechanisms observed in the tests are well predicted. In addition, the modeling approach is demonstrated to predict the proper trends in the effect of scaling on strength and failure mechanisms of quasi-isotropic open-hole tension laminates.

  8. Detecting a periodic signal in the terrestrial cratering record

    NASA Technical Reports Server (NTRS)

    Grieve, Richard A. F.; Rupert, James D.; Goodacre, Alan K.; Sharpton, Virgil L.

    1988-01-01

    A time-series analysis of model periodic data, where the period and phase are known, has been performed in order to investigate whether a significant period can be detected consistently from a mix of random and periodic impacts. Special attention is given to the effect of age uncertainties and random ages in the detection of a periodic signal. An equivalent analysis is performed with observed data on crater ages and compared with the model data, and the effects of the temporal distribution of crater ages on the results from the time-series analysis are studied. Evidence for a consistent 30-m.y. period is found to be weak.

  9. Simulation-based power calculations for planning a two-stage individual participant data meta-analysis.

    PubMed

    Ensor, Joie; Burke, Danielle L; Snell, Kym I E; Hemming, Karla; Riley, Richard D

    2018-05-18

    Researchers and funders should consider the statistical power of planned Individual Participant Data (IPD) meta-analysis projects, as they are often time-consuming and costly. We propose simulation-based power calculations utilising a two-stage framework, and illustrate the approach for a planned IPD meta-analysis of randomised trials with continuous outcomes where the aim is to identify treatment-covariate interactions. The simulation approach has four steps: (i) specify an underlying (data generating) statistical model for trials in the IPD meta-analysis; (ii) use readily available information (e.g. from publications) and prior knowledge (e.g. number of studies promising IPD) to specify model parameter values (e.g. control group mean, intervention effect, treatment-covariate interaction); (iii) simulate an IPD meta-analysis dataset of a particular size from the model, and apply a two-stage IPD meta-analysis to obtain the summary estimate of interest (e.g. interaction effect) and its associated p-value; (iv) repeat the previous step (e.g. thousands of times), then estimate the power to detect a genuine effect by the proportion of summary estimates with a significant p-value. In a planned IPD meta-analysis of lifestyle interventions to reduce weight gain in pregnancy, 14 trials (1183 patients) promised their IPD to examine a treatment-BMI interaction (i.e. whether baseline BMI modifies intervention effect on weight gain). Using our simulation-based approach, a two-stage IPD meta-analysis has < 60% power to detect a reduction of 1 kg weight gain for a 10-unit increase in BMI. Additional IPD from ten other published trials (containing 1761 patients) would improve power to over 80%, but only if a fixed-effect meta-analysis was appropriate. Pre-specified adjustment for prognostic factors would increase power further. Incorrect dichotomisation of BMI would reduce power by over 20%, similar to immediately throwing away IPD from ten trials. 
Simulation-based power calculations could inform the planning and funding of IPD projects, and should be used routinely.
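
    The four-step recipe above can be sketched directly: simulate IPD from a data-generating model, fit each trial separately (stage 1), pool the interaction estimates with inverse-variance weighting (stage 2), and count significant results. The trial sizes, effect sizes and error variance below are invented for illustration and are not the paper's actual planning inputs:

```python
import math
import numpy as np

rng = np.random.default_rng(1)

def ipd_power(n_trials=14, n_per_trial=85, interaction=-0.1,
              sigma=1.0, n_sim=300, alpha=0.05):
    """Steps (i)-(iv): simulate IPD, fit per-trial regressions, pool with a
    fixed-effect meta-analysis, and estimate power as the rejection rate."""
    hits = 0
    for _ in range(n_sim):
        est, var = [], []
        for _ in range(n_trials):                        # stage 1, per trial
            x = rng.normal(0.0, 1.0, n_per_trial)        # centred covariate (e.g. BMI)
            t = rng.integers(0, 2, n_per_trial)          # randomised treatment arm
            y = (0.5 * t + 0.2 * x + interaction * t * x
                 + rng.normal(0.0, sigma, n_per_trial))  # steps (i)-(ii): data model
            X = np.column_stack([np.ones(n_per_trial), t, x, t * x])
            beta = np.linalg.lstsq(X, y, rcond=None)[0]
            resid = y - X @ beta
            s2 = resid @ resid / (n_per_trial - 4)
            cov = s2 * np.linalg.inv(X.T @ X)
            est.append(beta[3])                          # interaction estimate
            var.append(cov[3, 3])
        w = 1.0 / np.array(var)                          # stage 2: inverse-variance
        pooled = np.sum(w * np.array(est)) / np.sum(w)   # fixed-effect pooled estimate
        z = pooled * math.sqrt(np.sum(w))                # pooled estimate / its SE
        p = math.erfc(abs(z) / math.sqrt(2.0))           # two-sided normal p-value
        hits += p < alpha
    return hits / n_sim                                  # step (iv): power

print(ipd_power())
```

    Varying `n_trials` or `n_per_trial` then mimics the paper's question of how much extra IPD would be needed to reach a target power.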

  10. Sensitivity analysis of the parameters of an HIV/AIDS model with condom campaign and antiretroviral therapy

    NASA Astrophysics Data System (ADS)

    Marsudi, Hidayat, Noor; Wibowo, Ratno Bagus Edy

    2017-12-01

    In this article, we present a deterministic model for the transmission dynamics of HIV/AIDS in which both a condom campaign and antiretroviral therapy are important for disease management. We calculate the effective reproduction number using the next-generation matrix method and investigate the local and global stability of the disease-free equilibrium of the model. A sensitivity analysis of the effective reproduction number with respect to the model parameters was carried out. Our results show that the efficacy rate of the condom campaign, the transmission rate for contact with the asymptomatic infective, the progression rate from the asymptomatic infective to the pre-AIDS infective, the transmission rate for contact with the pre-AIDS infective, the ARV therapy rate, the proportion of susceptibles receiving the condom campaign and the proportion of pre-AIDS infectives receiving ARV therapy are highly sensitive parameters that affect the transmission dynamics of HIV/AIDS infection.
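
    Such parameter sensitivities are commonly reported as normalized forward sensitivity indices, Λ_p = (∂R/∂p)(p/R). The sketch below computes them by central differences for a deliberately simplified reproduction-number formula; the functional form and parameter values are illustrative assumptions, not the article's model:

```python
# Illustrative parameter values (assumed, not from the article).
BASE = dict(beta=0.35, eps=0.8, kappa=0.6, mu=0.02, delta=0.1, tau=0.3)

def R_eff(beta, eps, kappa, mu, delta, tau):
    # Simplified stand-in: transmission (beta) reduced by condom efficacy (eps)
    # and coverage (kappa); infectious period shortened by the ART rate (tau).
    return beta * (1.0 - eps * kappa) / (mu + delta + tau)

def sensitivity_index(param, h=1e-6):
    """Normalized forward sensitivity index: (dR/dp) * (p / R)."""
    up, down = dict(BASE), dict(BASE)
    up[param] *= 1.0 + h
    down[param] *= 1.0 - h
    dR_dp = (R_eff(**up) - R_eff(**down)) / (2.0 * h * BASE[param])
    return dR_dp * BASE[param] / R_eff(**BASE)

for p in ("beta", "eps", "kappa", "tau"):
    print(p, round(sensitivity_index(p), 3))
```

    An index of +1 for the transmission rate means a 1% increase in that parameter raises R by 1%, while the negative indices for condom efficacy, coverage and the ART rate quantify how strongly each intervention lowers R.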

  11. Uncertainty analysis in seismic tomography

    NASA Astrophysics Data System (ADS)

    Owoc, Bartosz; Majdański, Mariusz

    2017-04-01

    The velocity field obtained from seismic travel-time tomography depends on several factors, such as regularization, inversion path and model parameterization. The result also depends strongly on the initial velocity model and on the precision of travel-time picking. In this research we test the dependence on the starting model in layered tomography and compare it with the effect of picking precision. Moreover, for manual travel-time picking the uncertainty distribution in our analysis is asymmetric, which shifts the results toward faster velocities. For the calculations we use the JIVE3D travel-time tomography code. We used data from geo-engineering and industrial-scale investigations collected by our team from IG PAS.

  12. Bayesian multimodel inference of soil microbial respiration models: Theory, application and future prospective

    NASA Astrophysics Data System (ADS)

    Elshall, A. S.; Ye, M.; Niu, G. Y.; Barron-Gafford, G.

    2015-12-01

    Models in biogeoscience involve uncertainties in observation data, model inputs, model structure, model processes and modeling scenarios. To accommodate different sources of uncertainty, multimodel analyses such as model combination, model selection, model elimination or model discrimination are becoming more popular. To illustrate the theoretical and practical challenges of multimodel analysis, we use an example from microbial soil respiration modeling. Global soil respiration releases more than ten times more carbon dioxide to the atmosphere than all anthropogenic emissions, so improving our understanding of microbial soil respiration is essential for improving climate change models. This study focuses on a poorly understood phenomenon: soil microbial respiration pulses in response to episodic rainfall pulses (the "Birch effect"). We hypothesize that the "Birch effect" is generated by three mechanisms. To test our hypothesis, we developed and assessed five evolving microbial-enzyme models against field measurements from a semiarid savannah characterized by pulsed precipitation. The five models evolve step-wise, such that the first model includes none of the three mechanisms while the fifth includes all three. The basic component of Bayesian multimodel analysis is the estimation of the marginal likelihood, which is used to rank the candidate models based on their overall likelihood with respect to the observation data. The first part of the study focuses on using this Bayesian scheme to discriminate between the five candidate models. The second part discusses some theoretical and practical challenges, mainly the effect of the choice of likelihood function and of the marginal likelihood estimation method on both model ranking and Bayesian model averaging. 
The study shows that making valid inference from scientific data is not a trivial task, since we are not only uncertain about the candidate scientific models, but also about the statistical methods that are used to discriminate between these models.
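
    The core step of such a Bayesian scheme, ranking candidate models by marginal likelihood, can be illustrated on synthetic data with brute-force integration over a one-parameter uniform prior. The "respiration" series, the two candidate models and the prior range below are invented for illustration; real applications estimate the marginal likelihood with MCMC-based methods:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic series with a decaying pulse after a simulated rain event at t=0.
t = np.linspace(0.0, 10.0, 60)
y = 1.0 + 2.0 * np.exp(-t) + rng.normal(0.0, 0.3, t.size)

def log_lik(pred, sigma=0.3):
    # Gaussian observation model.
    return -0.5 * np.sum(((y - pred) / sigma) ** 2 + np.log(2.0 * np.pi * sigma**2))

def log_marginal(model, grid):
    """log p(y|M) = log of the integral of p(y|a,M) p(a) da, uniform prior on the grid."""
    ll = np.array([log_lik(model(a)) for a in grid])
    m = ll.max()                                   # log-sum-exp for stability
    da = grid[1] - grid[0]
    return m + np.log(np.sum(np.exp(ll - m)) * da / (grid[-1] - grid[0]))

grid = np.linspace(0.01, 5.0, 400)
lm_flat = log_marginal(lambda a: np.full_like(t, a), grid)      # no pulse mechanism
lm_pulse = log_marginal(lambda a: 1.0 + a * np.exp(-t), grid)   # with pulse mechanism
log_post = np.array([lm_flat, lm_pulse])
post = np.exp(log_post - np.logaddexp(lm_flat, lm_pulse))       # equal model priors
print(post.round(4))
```

    The posterior model probabilities overwhelmingly favor the model that includes the pulse mechanism, mirroring how the five candidate respiration models would be discriminated.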

  13. Static analysis of a sonar dome rubber window

    NASA Technical Reports Server (NTRS)

    Lai, J. L.

    1978-01-01

    The application of NASTRAN (level 16.0.1) to the static analysis of a sonar dome rubber window (SDRW) was demonstrated. The assessment of the conventional model (neglecting the enclosed fluid) for the stress analysis of the SDRW was made by comparing its results to those based on a sophisticated model (including the enclosed fluid). The fluid was modeled with isoparametric linear hexahedron elements with approximate material properties whose shear modulus was much smaller than its bulk modulus. The effect of the chosen material property for the fluid is discussed.

  14. Can the Medical-nursing Combined Care Promote the Accessibility of Health Services for the Elderly in Nursing Home? A Study Protocol of Analysis of the Effectiveness Regarding Health Service Utilization, Health Status and Satisfaction with Care.

    PubMed

    Bao, J; Wang, X-J; Yang, Y; Dong, R-Q; Mao, Z-F

    2015-12-01

    Currently, the segmentation of healthcare and daily care for the elderly living in nursing homes often results in the elderly not receiving timely and effective medical treatment. Medical-nursing combined care, which has been put into practice in several areas of China, was developed to enhance the accessibility of healthcare for the elderly. The aim of the study is to explore the effectiveness of the new care service, based on the Andersen model, regarding health service utilization, health status and service satisfaction. The effectiveness of medical-nursing combined care will be measured in a cross-sectional study in nine nursing homes in Jianghan District, Wuhan, China, with 1067 elderly residents expected to participate. The questionnaire, containing items on demographics, health service use and service satisfaction together with the SF-36 v2 instrument, is developed based on the conceptual framework of the Andersen behavioural model of health service utilization. Descriptive analysis, variance analysis, multiple-factor analysis and correlation analysis will be performed to compare the sociological characteristics, health service use, health status and service satisfaction of the elderly living in different modes of nursing homes, to explore the factors influencing care effectiveness, and to study the relationship between health behaviour and health outcomes. The study design, analysing the effects of medical-nursing combined care and performing a horizontal comparison among nursing homes under the framework of the Andersen model, is novel. Recruitment and questionnaire design are important issues, and successful data collection and quality control are also necessary. Taking these into account, this study is expected to provide evidence for the effectiveness of the medical-nursing combined care service in China.

  15. Application of digital human modeling and simulation for vision analysis of pilots in a jet aircraft: a case study.

    PubMed

    Karmakar, Sougata; Pal, Madhu Sudan; Majumdar, Deepti; Majumdar, Dhurjati

    2012-01-01

    Ergonomic evaluation of visual demands becomes crucial when operators must make rapid decisions under extreme time constraints, as in the navigation of a jet aircraft. The research reported here comprises an ergonomic evaluation of pilot vision in a jet aircraft in a virtual environment, demonstrating how the vision analysis tools of digital human modeling software can be used effectively for such studies. Three dynamic digital pilot models, representative of the smallest, average and largest Indian pilots, were generated from an anthropometric database and interfaced with a digital prototype of the cockpit in Jack software for analysis of vision within and outside the cockpit. Vision analysis tools such as view cones, eye view windows, blind spot areas, obscuration zones and reflection zones were employed during evaluation of the visual fields. A vision analysis tool was also used to study kinematic changes of the pilot's body joints during a simulated gazing activity. From the present study, it can be concluded that the vision analysis tools of digital human modeling software are very effective for evaluating the position and alignment of displays and controls in a workstation, based on their priorities within the visual fields and the anthropometry of the targeted users, long before the development of a physical prototype.

  16. Network meta-analysis of multiple outcome measures accounting for borrowing of information across outcomes

    PubMed Central

    2014-01-01

    Background Network meta-analysis (NMA) enables simultaneous comparison of multiple treatments while preserving randomisation. When summarising evidence to inform an economic evaluation, it is important that the analysis accurately reflects the dependency structure within the data, as correlations between outcomes may have implications for estimating the net benefit associated with treatment. A multivariate NMA offers a framework for evaluating multiple treatments across multiple outcome measures while accounting for the correlation structure between outcomes. Methods The standard NMA model is extended to multiple outcome settings in two stages. In the first stage, information is borrowed across outcomes as well as across studies through modelling the within-study and between-study correlation structure. In the second stage, we make use of the additional assumption that intervention effects are exchangeable between outcomes to predict effect estimates for all outcomes, including effect estimates on outcomes where evidence is either sparse or the treatment had not been considered by any of the studies included in the analysis. We apply the methods to binary outcome data from a systematic review evaluating the effectiveness of nine home safety interventions on uptake of three poisoning prevention practices (safe storage of medicines, safe storage of other household products, and possession of a poison control centre telephone number) in households with children. Analyses are conducted in WinBUGS using Markov chain Monte Carlo (MCMC) simulations. Results Univariate and first-stage multivariate models produced broadly similar point estimates of intervention effects, but the uncertainty around the multivariate estimates varied depending on the prior distribution specified for the between-study covariance structure. 
The second stage multivariate analyses produced more precise effect estimates while enabling intervention effects to be predicted for all outcomes, including intervention effects on outcomes not directly considered by the studies included in the analysis. Conclusions Accounting for the dependency between outcomes in a multivariate meta-analysis may or may not improve the precision of effect estimates from a network meta-analysis compared to analysing each outcome separately. PMID:25047164

  17. Marginalized zero-inflated Poisson models with missing covariates.

    PubMed

    Benecha, Habtamu K; Preisser, John S; Divaris, Kimon; Herring, Amy H; Das, Kalyan

    2018-05-11

    Unlike zero-inflated Poisson regression, marginalized zero-inflated Poisson (MZIP) models for counts with excess zeros provide estimates with direct interpretations for the overall effects of covariates on the marginal mean. In the presence of missing covariates, MZIP and many other count data models are ordinarily fitted using complete case analysis methods due to lack of appropriate statistical methods and software. This article presents an estimation method for MZIP models with missing covariates. The method, which is applicable to other missing data problems, is illustrated and compared with complete case analysis by using simulations and dental data on the caries preventive effects of a school-based fluoride mouthrinse program. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Cost-Effectiveness of a Model Infection Control Program for Preventing Multi-Drug-Resistant Organism Infections in Critically Ill Surgical Patients.

    PubMed

    Jayaraman, Sudha P; Jiang, Yushan; Resch, Stephen; Askari, Reza; Klompas, Michael

    2016-10-01

    Interventions to contain two multi-drug-resistant Acinetobacter (MDRA) outbreaks reduced the incidence of multi-drug-resistant (MDR) organisms, specifically methicillin-resistant Staphylococcus aureus, vancomycin-resistant Enterococcus, and Clostridium difficile, in the general surgery intensive care unit (ICU) of our hospital. We therefore conducted a cost-effectiveness analysis of a proactive model infection-control program to reduce transmission of MDR organisms, based on the practices used to control the MDRA outbreaks. We created a model of a proactive infection control program based on the 2011 MDRA outbreak response, built a decision analysis model, and performed univariable and probabilistic sensitivity analyses to evaluate the cost-effectiveness of the proposed program compared with standard infection control practices. The cost of a proactive infection control program would be $68,509 per year. The incremental cost-effectiveness ratio (ICER) was calculated to be $3,804 per transmission of MDR organisms averted in a one-year period compared with standard infection control. On the basis of the probabilistic sensitivity analysis, the program would have a 42% probability of being cost-effective at a willingness-to-pay (WTP) threshold of $14,000 per transmission averted, rising to 100% at $22,000 per transmission averted. This analysis gives an estimated ICER for implementing a proactive program to prevent transmission of MDR organisms in the general surgery ICU. To better understand the causal relations between the critical steps in the program and the rate reductions, a randomized study of a package of interventions to prevent healthcare-associated infections should be considered.
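
    The ICER arithmetic itself is simple: incremental cost divided by incremental effect. A back-of-envelope sketch, where the incremental effect of 18 transmissions averted per year is a hypothetical figure chosen for illustration, not a number reported by the study:

```python
def icer(delta_cost, delta_effect):
    """Incremental cost-effectiveness ratio: extra dollars per extra unit of effect
    (here, per additional transmission averted) versus the comparator."""
    return delta_cost / delta_effect

# Hypothetical: the program adds $68,509/yr in cost and averts 18 more
# transmissions than standard practice.
print(round(icer(68_509, 18)))
```

    The program is then judged cost-effective whenever this ratio falls below the decision-maker's willingness-to-pay threshold per transmission averted.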

  19. "A Bayesian sensitivity analysis to evaluate the impact of unmeasured confounding with external data: a real world comparative effectiveness study in osteoporosis".

    PubMed

    Zhang, Xiang; Faries, Douglas E; Boytsov, Natalie; Stamey, James D; Seaman, John W

    2016-09-01

    Observational studies are frequently used to assess the effectiveness of medical interventions in routine clinical practice. However, the use of observational data for comparative effectiveness research is challenged by selection bias and the potential for unmeasured confounding. This is especially problematic for analyses using a healthcare administrative database, in which key clinical measures are often not available. This paper provides an approach to conducting a sensitivity analysis to investigate the impact of unmeasured confounding in observational studies. In a real-world osteoporosis comparative effectiveness study, the bone mineral density (BMD) score, an important predictor of fracture risk and a factor in the selection of osteoporosis treatments, is unavailable in the database, and the lack of baseline BMD could potentially lead to significant selection bias. We implemented Bayesian twin-regression models, which simultaneously model both the observed outcome and the unmeasured confounder, using information from external sources. A sensitivity analysis was also conducted to assess the robustness of our conclusions to changes in such external data. The use of Bayesian modeling in this study suggests that the lack of baseline BMD did have a strong impact on the analysis, reversing the direction of the estimated effect (odds ratio of fracture incidence at 24 months: 0.40 vs. 1.36, with/without adjusting for unmeasured baseline BMD). The Bayesian twin-regression models provide a flexible sensitivity analysis tool to quantitatively assess the impact of unmeasured confounding in observational studies. Copyright © 2016 John Wiley & Sons, Ltd.

  20. A Cross-Cultural Analysis of the Effectiveness of the Learning Organization Model in School Contexts

    ERIC Educational Resources Information Center

    Alavi, Seyyed Babak; McCormick, John

    2004-01-01

    It has been argued that some management theories and models may not be universal and are based on some cultural assumptions. It is argued in this paper that the effectiveness of applying the Learning Organization (LO) model in school contexts across different countries may be associated with cultural differences such as individualism,…

  1. Developing an Effective Plan for Smart Sanctions: A Network Analysis Approach

    DTIC Science & Technology

    2012-10-31

    data and a network model that realistically simulates the Iranian nuclear development program. We then utilize several network analysis techniques...the Iran Watch (iranwatch.org) watchdog website. Using this data, which at first glance seems obtuse and unwieldy, we constructed network models in... model is created, nodes were evaluated using several measures of centrality. The team then analyzed this network utilizing four of the most common

  2. Testing Mediation Using Multiple Regression and Structural Equation Modeling Analyses in Secondary Data

    ERIC Educational Resources Information Center

    Li, Spencer D.

    2011-01-01

    Mediation analysis in child and adolescent development research is possible using large secondary data sets. This article provides an overview of two statistical methods commonly used to test mediated effects in secondary analysis: multiple regression and structural equation modeling (SEM). Two empirical studies are presented to illustrate the…

  3. MICROARRAY DATA ANALYSIS USING MULTIPLE STATISTICAL MODELS

    EPA Science Inventory

    Microarray Data Analysis Using Multiple Statistical Models

    Wenjun Bao1, Judith E. Schmid1, Amber K. Goetz1, Ming Ouyang2, William J. Welsh2,Andrew I. Brooks3,4, ChiYi Chu3,Mitsunori Ogihara3,4, Yinhe Cheng5, David J. Dix1. 1National Health and Environmental Effects Researc...

  4. Conjoint Analysis: A Study of the Effects of Using Person Variables.

    ERIC Educational Resources Information Center

    Fraas, John W.; Newman, Isadore

    Three statistical techniques--conjoint analysis, a multiple linear regression model, and a multiple linear regression model with a surrogate person variable--were used to estimate the relative importance of five university attributes for students in the process of selecting a college. The five attributes include: availability and variety of…

  5. The Use of Multiple Regression Models to Determine if Conjoint Analysis Should Be Conducted on Aggregate Data.

    ERIC Educational Resources Information Center

    Fraas, John W.; Newman, Isadore

    1996-01-01

    In a conjoint-analysis consumer-preference study, researchers must determine whether the product factor estimates, which measure consumer preferences, should be calculated and interpreted for each respondent or collectively. Multiple regression models can determine whether to aggregate data by examining factor-respondent interaction effects. This…

  6. A Model of Practice in Special Education: Dynamic Ecological Analysis

    ERIC Educational Resources Information Center

    Hannant, Barbara; Lim, Eng Leong; McAllum, Ruth

    2010-01-01

    Dynamic Ecological Analysis (DEA) is a model of practice which increases a team's efficacy by enabling the development of more effective interventions through collaboration and collective reflection. This process has proved to be useful in: a) clarifying thinking and problem-solving, b) transferring knowledge and thinking to significant parties,…

  7. Video Modeling for Children and Adolescents with Autism Spectrum Disorder: A Meta-Analysis

    ERIC Educational Resources Information Center

    Thompson, Teresa Lynn

    2014-01-01

    The objective of this research was to conduct a meta-analysis to examine existing research studies on video modeling as an effective teaching tool for children and adolescents diagnosed with Autism Spectrum Disorder (ASD). Study eligibility criteria included (a) single case research design using multiple baselines, alternating treatment designs,…

  8. Methods for integrated modeling of landscape change: Interior Northwest Landscape Analysis System.

    Treesearch

    Jane L. Hayes; Alan. A. Ager; R. James Barbour

    2004-01-01

    The Interior Northwest Landscape Analysis System (INLAS) links a number of resource, disturbance, and landscape simulations models to examine the interactions of vegetative succession, management, and disturbance with policy goals. The effects of natural disturbance like wildfire, herbivory, forest insects and diseases, as well as specific management actions are...

  9. Methods for calculating confidence and credible intervals for the residual between-study variance in random effects meta-regression models

    PubMed Central

    2014-01-01

    Background: Meta-regression is becoming increasingly used to model study-level covariate effects. However, this type of statistical analysis presents many difficulties and challenges. Here, two methods for calculating confidence intervals for the magnitude of the residual between-study variance in random effects meta-regression models are developed. A further suggestion for calculating credible intervals using informative prior distributions for the residual between-study variance is presented.
    Methods: Two recently proposed and, under the assumptions of the random effects model, exact methods for constructing confidence intervals for the between-study variance in random effects meta-analyses are extended to the meta-regression setting. The use of Generalised Cochran heterogeneity statistics is extended to the meta-regression setting, and a Newton-Raphson procedure is developed to implement the Q profile method for meta-analysis and meta-regression. WinBUGS is used to implement informative priors for the residual between-study variance in the context of Bayesian meta-regressions.
    Results: Results are obtained for two contrasting examples, where the first example involves a binary covariate and the second involves a continuous covariate. Intervals for the residual between-study variance are wide for both examples.
    Conclusions: Statistical methods, and R computer software, are available to compute exact confidence intervals for the residual between-study variance under the random effects model for meta-regression. These frequentist methods are almost as easily implemented as their established counterparts for meta-analysis. Bayesian meta-regressions are also easily performed by analysts who are comfortable using WinBUGS. Estimates of the residual between-study variance in random effects meta-regressions should be routinely reported and accompanied by some measure of their uncertainty. Confidence and/or credible intervals are well suited to this purpose. PMID:25196829
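The Q-profile idea in the record above, inverting a heterogeneity statistic against chi-square quantiles to bound the between-study variance, can be sketched for the simpler meta-analysis case (no covariates, so the reference distribution has k - 1 degrees of freedom). The effect estimates and variances below are invented for illustration, and the Newton-Raphson step is replaced by a stock bracketing root-finder:

```python
import numpy as np
from scipy.stats import chi2
from scipy.optimize import brentq

def q_stat(tau2, y, v):
    """Generalised Q statistic at a candidate between-study variance tau2."""
    w = 1.0 / (v + tau2)
    mu = np.sum(w * y) / np.sum(w)      # weighted mean at this tau2
    return np.sum(w * (y - mu) ** 2)

def q_profile_ci(y, v, level=0.95):
    """Q-profile CI for tau2: invert Q(tau2) against chi2 quantiles (df = k - 1)."""
    k = len(y)
    lo_crit = chi2.ppf(1 - (1 - level) / 2, k - 1)   # defines the lower limit
    hi_crit = chi2.ppf((1 - level) / 2, k - 1)       # defines the upper limit
    bracket = 100.0 * (np.var(y) + max(v))           # Q is tiny out here
    lower = 0.0 if q_stat(0.0, y, v) < lo_crit else brentq(
        lambda t: q_stat(t, y, v) - lo_crit, 0.0, bracket)
    upper = 0.0 if q_stat(0.0, y, v) < hi_crit else brentq(
        lambda t: q_stat(t, y, v) - hi_crit, 0.0, bracket)
    return lower, upper

# invented example: 8 study effect estimates with within-study variances
y = np.array([0.10, 0.35, 0.52, 0.20, 0.70, 0.15, 0.45, 0.60])
v = np.array([0.030, 0.020, 0.025, 0.040, 0.015, 0.035, 0.020, 0.030])
lo, hi = q_profile_ci(y, v)
print(f"tau2 95% CI: ({lo:.4f}, {hi:.4f})")
```

Because Q(tau2) is decreasing in tau2, each confidence limit is the value of tau2 at which Q crosses the corresponding chi-square quantile, truncated at zero.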

  10. Effects of Combined Loads on the Nonlinear Response and Residual Strength of Damaged Stiffened Shells

    NASA Technical Reports Server (NTRS)

    Starnes, James H., Jr.; Rose, Cheryl A.; Rankin, Charles C.

    1996-01-01

    The results of an analytical study of the nonlinear response of stiffened fuselage shells with long cracks are presented. The shells are modeled with a hierarchical modeling strategy and analyzed with a nonlinear shell analysis code that maintains the shell in a nonlinear equilibrium state while the crack is grown. The analysis accurately accounts for global and local structural response phenomena. Results are presented for various combinations of internal pressure and mechanical loads, and the effects of crack orientation on the shell response are described. The effects of combined loading conditions and the effects of varying structural parameters on the stress-intensity factors associated with a crack are presented.

  11. A cross-species analysis method to analyze animal models' similarity to human's disease state

    PubMed Central

    2012-01-01

    Background: Animal models are indispensable tools in studying the causes of human diseases and searching for treatments. The scientific value of an animal model depends on how accurately it mimics the human disease. The primary goal of the current study was to develop a cross-species method that uses animal models' expression data to evaluate their similarity to human disease states and to assess drug molecules' efficiency in drug research. We thereby hoped to show that it is feasible and useful to compare gene expression profiles across species in studies of pathology, toxicology, drug repositioning, and drug mechanisms of action.
    Results: We developed a cross-species analysis method that uses existing animal gene expression data in public databases to analyze animal models' similarity to human diseases and their effectiveness in drug research, and mined meaningful information to support drug research, such as potential drug candidates, possible drug repositioning, side effects, and pharmacological analysis. New animal models can be evaluated by our method before they are used in drug discovery. We applied the method to several cases of known animal-model expression profiles and obtained useful information for drug research. We found that trichostatin A and some other HDAC inhibitors can have very similar responses across cell lines and species at the gene expression level. The mouse hypoxia model accurately mimicked human hypoxia, while the mouse diabetes drug model may have some limitations. The transgenic Alzheimer's mouse was a useful model, and we analyzed in depth the biological mechanisms of some drugs in this case. In addition, all the cases provide ideas for drug discovery and drug repositioning.
    Conclusions: We developed a new cross-species gene expression module comparison method that uses animal models' expression data to analyse the effectiveness of animal models in drug research. Moreover, through data integration, our method can be applied in drug research to suggest potential drug candidates, possible drug repositioning, side effects, and pharmacological information. PMID:23282076

  12. A cross-species analysis method to analyze animal models' similarity to human's disease state.

    PubMed

    Yu, Shuhao; Zheng, Lulu; Li, Yun; Li, Chunyan; Ma, Chenchen; Li, Yixue; Li, Xuan; Hao, Pei

    2012-01-01

    Animal models are indispensable tools in studying the causes of human diseases and searching for treatments. The scientific value of an animal model depends on how accurately it mimics the human disease. The primary goal of the current study was to develop a cross-species method that uses animal models' expression data to evaluate their similarity to human disease states and to assess drug molecules' efficiency in drug research. We thereby hoped to show that it is feasible and useful to compare gene expression profiles across species in studies of pathology, toxicology, drug repositioning, and drug mechanisms of action. We developed a cross-species analysis method that uses existing animal gene expression data in public databases to analyze animal models' similarity to human diseases and their effectiveness in drug research, and mined meaningful information to support drug research, such as potential drug candidates, possible drug repositioning, side effects, and pharmacological analysis. New animal models can be evaluated by our method before they are used in drug discovery. We applied the method to several cases of known animal-model expression profiles and obtained useful information for drug research. We found that trichostatin A and some other HDAC inhibitors can have very similar responses across cell lines and species at the gene expression level. The mouse hypoxia model accurately mimicked human hypoxia, while the mouse diabetes drug model may have some limitations. The transgenic Alzheimer's mouse was a useful model, and we analyzed in depth the biological mechanisms of some drugs in this case. In addition, all the cases provide ideas for drug discovery and drug repositioning. We developed a new cross-species gene expression module comparison method that uses animal models' expression data to analyse the effectiveness of animal models in drug research. Moreover, through data integration, our method can be applied in drug research to suggest potential drug candidates, possible drug repositioning, side effects, and pharmacological information.

  13. Traffic effects on bird counts on North American Breeding Bird Survey routes

    USGS Publications Warehouse

    Griffith, Emily H.; Sauer, John R.; Royle, J. Andrew

    2010-01-01

    The North American Breeding Bird Survey (BBS) is an annual roadside survey used to estimate population change in >420 species of birds that breed in North America. Roadside sampling has been criticized, in part because traffic noise can interfere with bird counts. Since 1997, data have been collected on the numbers of vehicles that pass during counts at each stop. We assessed the effect of traffic by modeling total vehicles as a covariate of counts in hierarchical Poisson regression models used to estimate population change. We selected species for analysis that represent birds detected at low and high abundance and birds with songs of low and high frequencies. Increases in vehicle counts were associated with decreases in bird counts in most of the species examined. The size and direction of these effects remained relatively constant between two alternative models that we analyzed. Although this analysis indicated only a small effect of incorporating traffic effects when modeling roadside counts of birds, we suggest that continued evaluation of changes in traffic at BBS stops should be a component of future BBS analyses.
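The covariate adjustment described above can be sketched on simulated data: a log-linear Poisson regression of bird counts on a vehicle covariate, fitted by iteratively reweighted least squares (a standard GLM fitting scheme, not necessarily the hierarchical model the authors used). The negative traffic effect is built into the simulation:

```python
import numpy as np

def fit_poisson_irls(X, y, n_iter=25):
    """Fit a Poisson log-linear model by iteratively reweighted least squares."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        eta = X @ beta
        mu = np.exp(eta)
        z = eta + (y - mu) / mu    # working response for the log link
        W = mu                     # IRLS weights
        beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))
    return beta

rng = np.random.default_rng(42)
n = 2000
vehicles = rng.uniform(0.0, 3.0, n)            # standardised vehicle counts (invented)
X = np.column_stack([np.ones(n), vehicles])
true_beta = np.array([1.5, -0.3])              # more traffic -> fewer birds counted
y = rng.poisson(np.exp(X @ true_beta))
beta_hat = fit_poisson_irls(X, y)
print("intercept, traffic effect:", beta_hat)
```

A negative fitted traffic coefficient corresponds to the paper's finding that higher vehicle counts are associated with lower bird counts.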

  14. The value of a statistical life: a meta-analysis with a mixed effects regression model.

    PubMed

    Bellavance, François; Dionne, Georges; Lebeau, Martin

    2009-03-01

    The value of a statistical life (VSL) is a very controversial topic, but one which is essential to the optimization of governmental decisions. We see great variability in the values obtained from different studies. The source of this variability needs to be understood in order to offer public decision-makers better guidance in choosing a value and to set clearer guidelines for future research on the topic. This article presents a meta-analysis based on 39 observations obtained from 37 studies (from nine different countries) which all use a hedonic wage method to calculate the VSL. Our meta-analysis is innovative in that it is the first to use the mixed effects regression model [Raudenbush, S.W., 1994. Random effects models. In: Cooper, H., Hedges, L.V. (Eds.), The Handbook of Research Synthesis. Russell Sage Foundation, New York] to analyze studies on the value of a statistical life. We conclude that the variability found in the values studied stems in large part from differences in methodologies.
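Random-effects pooling of study estimates, as used in the meta-analysis above, can be sketched with the moment-based DerSimonian-Laird estimator (a common choice, though not necessarily the exact mixed-effects specification of the paper). The effect values and variances are invented:

```python
import numpy as np

def dersimonian_laird(y, v):
    """Moment-based random-effects pooling of estimates y with sampling variances v."""
    w = 1.0 / v                                   # fixed-effect weights
    mu_fe = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - mu_fe) ** 2)              # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)       # between-study variance, truncated at 0
    w_re = 1.0 / (v + tau2)                       # random-effects weights
    mu_re = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return mu_re, tau2, se

# invented VSL-style estimates (in millions) and their sampling variances
y = np.array([4.2, 6.8, 5.1, 9.0, 3.5, 7.4])
v = np.array([0.8, 1.2, 0.6, 2.0, 0.9, 1.5])
mu, tau2, se = dersimonian_laird(y, v)
print(f"pooled VSL: {mu:.2f}, tau2: {tau2:.2f}, SE: {se:.2f}")
```

A non-zero tau2 is exactly the between-study variability whose sources the authors set out to explain with study-level covariates.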

  15. Evaluating disease management program effectiveness: an introduction to survival analysis.

    PubMed

    Linden, Ariel; Adams, John L; Roberts, Nancy

    2004-01-01

    Currently, the most widely used method in the disease management industry for evaluating program effectiveness is the "total population approach." This model is a pretest-posttest design, whose most basic limitation is that, without a control group, sources of bias and/or competing extraneous confounding factors may offer a plausible rationale for the change from baseline. Survival analysis allows for the inclusion of data from censored cases: those subjects who either "survived" the program without experiencing the event (e.g., achievement of target clinical levels, hospitalization), left the program prematurely due to disenrollment from the health plan or program, or were lost to follow-up. Additionally, independent variables may be included in the model to help explain the variability in the outcome measure. In order to maximize the potential of this statistical method, the validity of the model and research design must be assured. This paper reviews survival analysis as an alternative, and more appropriate, approach to evaluating DM program effectiveness than the current total population approach.
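The survival-analysis idea above, keeping censored subjects in the risk set until they drop out, can be sketched with a minimal Kaplan-Meier estimator. The follow-up times and censoring flags are invented:

```python
import numpy as np

def kaplan_meier(times, observed):
    """Kaplan-Meier survival curve; observed=False marks censored subjects."""
    order = np.argsort(times)
    times, observed = np.asarray(times)[order], np.asarray(observed)[order]
    surv, curve = 1.0, []
    for t in np.unique(times[observed]):
        at_risk = np.sum(times >= t)              # still enrolled just before t
        events = np.sum((times == t) & observed)  # events (not censorings) at t
        surv *= 1.0 - events / at_risk
        curve.append((t, surv))
    return curve

# invented follow-up months; False = disenrolled or lost to follow-up (censored)
times    = [2, 3, 3, 5, 6, 7, 8, 8, 10, 12]
observed = [True, True, False, True, False, True, True, True, False, True]
for t, s in kaplan_meier(times, observed):
    print(f"month {t:2d}: S(t) = {s:.3f}")
```

Censored subjects contribute to the at-risk denominator up to the point they leave, which is the key advantage over simply dropping them from a pre/post comparison.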

  16. Use of multicriteria decision analysis for assessing the benefit and risk of over-the-counter analgesics.

    PubMed

    Moore, Andrew; Crossley, Anne; Ng, Bernard; Phillips, Lawrence; Sancak, Özgür; Rainsford, K D

    2017-10-01

    To test the ability of a multicriteria decision analysis (MCDA) model to incorporate disparate data sources of varying quality along with clinical judgement in a benefit-risk assessment of six well-known pain-relief drugs. Six over-the-counter (OTC) analgesics were evaluated against three favourable effects and eight unfavourable effects by seven experts who specialise in the relief of pain, in a 2-day facilitated workshop whose input data and judgements were later peer-reviewed by five additional experts. Ibuprofen salts and solubilised formulations emerged with the best benefit-risk profile, followed by naproxen, ibuprofen acid, diclofenac, paracetamol and aspirin. Multicriteria decision analysis enabled participants to evaluate the OTC analgesics against a range of favourable and unfavourable effects in a group setting that allowed all issues to be openly aired and debated. The model was easily communicated to and understood by the peer reviewers, so it should be comprehensible to physicians, pharmacists and other health professionals. © 2017 Royal Pharmaceutical Society.

  17. Disaster Reintegration Model: A Qualitative Analysis on Developing Korean Disaster Mental Health Support Model

    PubMed Central

    O’Donnell, Meaghan

    2018-01-01

    This study sought to describe the mental health problems experienced by Korean disaster survivors, using a qualitative research method to provide empirical resources for effective disaster mental health support in Korea. Participants were 16 adults or elderly adults who had experienced one or more disasters at least 12 months previously, recruited via theoretical sampling. Participants underwent in-depth individual interviews on their disaster experiences, which were recorded and transcribed for qualitative analysis following Strauss and Corbin's (1998) grounded theory. After open coding, participants' experiences were categorized into 130 codes, 43 sub-categories and 17 categories. The categories were further analyzed into a paradigm model, a conditional model and the Disaster Reintegration Model, which proposes potentially effective mental health recovery strategies for disaster survivors, health providers and administrators. To provide effective assistance for the mental health recovery of disaster survivors, both personal and public resilience should be promoted, with consideration of cultural and spiritual elements. PMID:29463030

  18. Computing Linear Mathematical Models Of Aircraft

    NASA Technical Reports Server (NTRS)

    Duke, Eugene L.; Antoniewicz, Robert F.; Krambeer, Keith D.

    1991-01-01

    The Derivation and Definition of Linear Aircraft Model (LINEAR) computer program provides the user with a powerful, flexible, standard, documented, and verified software tool for linearization of mathematical models of the aerodynamics of aircraft. It is intended for use as a software tool to drive linear analysis of stability and design of control laws for aircraft. LINEAR is capable of both extracting such linearized engine effects as net thrust, torque, and gyroscopic effects, and including these effects in the linear model of the system. It is designed to provide easy selection of the state, control, and observation variables used in a particular model, and also provides the flexibility of allowing alternate formulations of both state and observation equations. Written in FORTRAN.

  19. Cost-effectiveness analysis of population-based tobacco control strategies in the prevention of cardiovascular diseases in Tanzania.

    PubMed

    Ngalesoni, Frida; Ruhago, George; Mayige, Mary; Oliveira, Tiago Cravo; Robberstad, Bjarne; Norheim, Ole Frithjof; Higashi, Hideki

    2017-01-01

    Tobacco consumption contributes significantly to the global burden of disease. The prevalence of smoking is estimated to be increasing in many low-income countries, including Tanzania, especially among women and youth. Even so, the implementation of tobacco control measures has been discouraging in the country. Efforts to foster investment in tobacco control are hindered by lack of evidence on what works and at what cost. We aim to estimate the cost and cost-effectiveness of population-based tobacco control strategies in the prevention of cardiovascular diseases (CVD) in Tanzania. A cost-effectiveness analysis was performed using an Excel-based Markov model, from a governmental perspective. We employed an ingredient approach and step-down methodologies in the costing exercise following a government perspective. Epidemiological data and efficacy inputs were derived from the literature. We used disability-adjusted life years (DALYs) averted as the outcome measure. A probabilistic sensitivity analysis was carried out with Ersatz to incorporate uncertainties in the model parameters. Our model results showed that all five tobacco control strategies were very cost-effective since they fell below the ceiling ratio of one GDP per capita suggested by the WHO. Increase in tobacco taxes was the most cost-effective strategy, while a workplace smoking ban was the least cost-effective option, with a cost-effectiveness ratio of US$5 and US$267, respectively. Even though all five interventions are deemed very cost-effective in the prevention of CVD in Tanzania, more research on budget impact analysis is required to further assess the government's ability to implement these interventions.
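The Markov cohort logic behind such a cost-effectiveness model can be sketched in miniature. All transition probabilities, costs, and the disability weight below are invented placeholders, not the Tanzanian inputs used in the study:

```python
import numpy as np

def run_cohort(p_cvd, program_cost, horizon=30):
    """Track a cohort through Well / CVD / Dead states over annual cycles;
    return discounted cost and DALYs accrued per cohort member."""
    p_die_well, p_die_cvd, dw = 0.01, 0.08, 0.2    # invented transition/disability inputs
    cvd_care_cost = 60.0                            # invented annual CVD care cost
    state = np.array([1.0, 0.0, 0.0])               # everyone starts Well
    T = np.array([[1 - p_cvd - p_die_well, p_cvd, p_die_well],
                  [0.0, 1 - p_die_cvd, p_die_cvd],
                  [0.0, 0.0, 1.0]])
    cost = dalys = 0.0
    for year in range(horizon):
        d_c, d_e = 1.03 ** -year, 1.015 ** -year    # discount costs 3%, effects 1.5%
        cost += d_c * (program_cost + cvd_care_cost * state[1])
        dalys += d_e * (dw * state[1] + 1.0 * state[2])   # YLD for CVD, YLL for Dead
        state = state @ T
    return cost, dalys

cost0, daly0 = run_cohort(p_cvd=0.020, program_cost=0.0)    # no intervention
cost1, daly1 = run_cohort(p_cvd=0.014, program_cost=10.0)   # invented intervention
icer = (cost1 - cost0) / (daly0 - daly1)
print(f"cost per DALY averted: {icer:.1f}")
```

Comparing the resulting ratio against a willingness-to-pay ceiling such as one GDP per capita is exactly the decision rule the abstract applies.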

  20. Distribution of lod scores in oligogenic linkage analysis.

    PubMed

    Williams, J T; North, K E; Martin, L J; Comuzzie, A G; Göring, H H; Blangero, J

    2001-01-01

    In variance component oligogenic linkage analysis it can happen that the residual additive genetic variance bounds to zero when estimating the effect of the ith quantitative trait locus. Using quantitative trait Q1 from the Genetic Analysis Workshop 12 simulated general population data, we compare the observed lod scores from oligogenic linkage analysis with the empirical lod score distribution under a null model of no linkage. We find that zero residual additive genetic variance in the null model alters the usual distribution of the likelihood-ratio statistic.
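A well-known consequence of a variance component sitting on the boundary of its parameter space, as in the zero residual additive genetic variance above, is that the usual chi-square null for the likelihood-ratio statistic becomes a 50:50 mixture of a point mass at zero and a chi-square with one degree of freedom. A minimal sketch of the corrected p-value:

```python
from scipy.stats import chi2

def lrt_pvalue_boundary(lrt):
    """p-value for testing one variance component equal to zero, using the
    0.5*chi2(0) + 0.5*chi2(1) mixture null (parameter on the boundary)."""
    if lrt <= 0.0:
        return 1.0
    return 0.5 * chi2.sf(lrt, df=1)

naive = chi2.sf(3.2, df=1)          # treats the null as a plain chi2(1)
mixed = lrt_pvalue_boundary(3.2)    # accounts for the boundary
print(f"naive p = {naive:.4f}, mixture p = {mixed:.4f}")
```

Ignoring the boundary doubles the p-value and makes the linkage test conservative, which is one reason empirical null distributions like the one in this record can differ from the textbook chi-square.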

  1. A Nonlinear Model for Gene-Based Gene-Environment Interaction.

    PubMed

    Sa, Jian; Liu, Xu; He, Tao; Liu, Guifen; Cui, Yuehua

    2016-06-04

    A vast amount of literature has confirmed the role of gene-environment (G×E) interaction in the etiology of complex human diseases. Traditional methods are predominantly focused on the analysis of interaction between a single nucleotide polymorphism (SNP) and an environmental variable. Given that genes are the functional units, it is crucial to understand how gene effects (rather than single SNP effects) are influenced by an environmental variable to affect disease risk. Motivated by the increasing awareness of the power of gene-based association analysis over single-variant-based approaches, in this work we proposed a sparse principal component regression (sPCR) model to understand the gene-based G×E interaction effect on complex disease. We first extracted the sparse principal components for the SNPs in a gene; the effect of each principal component was then modeled by a varying-coefficient (VC) model. The model can jointly model variants in a gene whose effects are nonlinearly influenced by an environmental variable. In addition, the varying-coefficient sPCR (VC-sPCR) model has a nice interpretation property, since the sparsity of the principal component loadings indicates the relative importance of the corresponding SNPs in each component. We applied our method to a human birth weight dataset from a Thai population. We analyzed 12,005 genes across 22 chromosomes and found one significant interaction effect using the Bonferroni correction method and one suggestive interaction. The model performance was further evaluated through simulation studies. Our model provides a systematic approach to evaluating gene-based G×E interaction.
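The gene-based G×E idea can be sketched, without the sparsity or varying-coefficient machinery, by summarising a gene's SNPs with their first principal component and testing its product with an environmental variable. The genotypes and trait are simulated, with the interaction built in:

```python
import numpy as np

rng = np.random.default_rng(7)
n, p = 1000, 12                                        # subjects, SNPs in one gene
G = rng.binomial(2, 0.3, size=(n, p)).astype(float)    # 0/1/2 genotype codes
E = rng.normal(size=n)                                 # environmental exposure

# first principal component of the centred genotype matrix
Gc = G - G.mean(axis=0)
U, S, Vt = np.linalg.svd(Gc, full_matrices=False)
pc1 = Gc @ Vt[0]

# simulate a trait whose genetic effect depends on E (the G x E interaction)
y = 0.2 * pc1 + 0.5 * pc1 * E + rng.normal(size=n)

# regress the trait on the component, the environment, and their product
X = np.column_stack([np.ones(n), pc1, E, pc1 * E])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("estimated interaction coefficient:", beta[3])
```

The product term plays the role that the varying coefficient plays in the VC-sPCR model, with the simplification that here the gene effect changes linearly rather than nonlinearly with E.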

  2. Bayesian inference for multivariate meta-analysis Box-Cox transformation models for individual patient data with applications to evaluation of cholesterol lowering drugs

    PubMed Central

    Kim, Sungduk; Chen, Ming-Hui; Ibrahim, Joseph G.; Shah, Arvind K.; Lin, Jianxin

    2013-01-01

    In this paper, we propose a class of Box-Cox transformation regression models with multidimensional random effects for analyzing multivariate responses for individual patient data (IPD) in meta-analysis. Our modeling formulation uses a multivariate normal response meta-analysis model with multivariate random effects, in which each response is allowed to have its own Box-Cox transformation. Prior distributions are specified for the Box-Cox transformation parameters as well as the regression coefficients in this complex model, and the Deviance Information Criterion (DIC) is used to select the best transformation model. Since the model is quite complex, a novel Monte Carlo Markov chain (MCMC) sampling scheme is developed to sample from the joint posterior of the parameters. This model is motivated by a very rich dataset comprising 26 clinical trials involving cholesterol lowering drugs where the goal is to jointly model the three dimensional response consisting of Low Density Lipoprotein Cholesterol (LDL-C), High Density Lipoprotein Cholesterol (HDL-C), and Triglycerides (TG) (LDL-C, HDL-C, TG). Since the joint distribution of (LDL-C, HDL-C, TG) is not multivariate normal and in fact quite skewed, a Box-Cox transformation is needed to achieve normality. In the clinical literature, these three variables are usually analyzed univariately: however, a multivariate approach would be more appropriate since these variables are correlated with each other. A detailed analysis of these data is carried out using the proposed methodology. PMID:23580436

  3. Bayesian inference for multivariate meta-analysis Box-Cox transformation models for individual patient data with applications to evaluation of cholesterol-lowering drugs.

    PubMed

    Kim, Sungduk; Chen, Ming-Hui; Ibrahim, Joseph G; Shah, Arvind K; Lin, Jianxin

    2013-10-15

    In this paper, we propose a class of Box-Cox transformation regression models with multidimensional random effects for analyzing multivariate responses for individual patient data in meta-analysis. Our modeling formulation uses a multivariate normal response meta-analysis model with multivariate random effects, in which each response is allowed to have its own Box-Cox transformation. Prior distributions are specified for the Box-Cox transformation parameters as well as the regression coefficients in this complex model, and the deviance information criterion is used to select the best transformation model. Because the model is quite complex, we develop a novel Monte Carlo Markov chain sampling scheme to sample from the joint posterior of the parameters. This model is motivated by a very rich dataset comprising 26 clinical trials involving cholesterol-lowering drugs where the goal is to jointly model the three-dimensional response consisting of low density lipoprotein cholesterol (LDL-C), high density lipoprotein cholesterol (HDL-C), and triglycerides (TG) (LDL-C, HDL-C, TG). Because the joint distribution of (LDL-C, HDL-C, TG) is not multivariate normal and in fact quite skewed, a Box-Cox transformation is needed to achieve normality. In the clinical literature, these three variables are usually analyzed univariately; however, a multivariate approach would be more appropriate because these variables are correlated with each other. We carry out a detailed analysis of these data by using the proposed methodology. Copyright © 2013 John Wiley & Sons, Ltd.
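The role of the Box-Cox transformation described above can be sketched on skewed synthetic data standing in for one response (say, TG-like values); the parameters of the simulated distribution are invented:

```python
import numpy as np
from scipy.stats import boxcox, skew

rng = np.random.default_rng(0)
tg = rng.lognormal(mean=4.6, sigma=0.5, size=500)   # invented right-skewed values

tg_bc, lam = boxcox(tg)    # transformation parameter chosen by maximum likelihood
print(f"lambda = {lam:.3f}")
print(f"skewness before: {skew(tg):.3f}, after: {skew(tg_bc):.3f}")
```

For lognormal-like data the estimated lambda lands near zero (the log transform), which is the kind of data-driven normalisation the multivariate model applies to each response separately.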

  4. Health effects models for nuclear power plant accident consequence analysis. Part 1, Introduction, integration, and summary: Revision 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, J.S.; Abrahmson, S.; Bender, M.A.

    1993-10-01

    This report is a revision of NUREG/CR-4214, Rev. 1, Part 1 (1990), Health Effects Models for Nuclear Power Plant Accident Consequence Analysis. This revision has been made to incorporate changes to the health effects models recommended in two addenda to the NUREG/CR-4214, Rev. 1, Part II, 1989 report. The first of these addenda provided recommended changes to the health effects models for low-LET radiations based on recent reports from UNSCEAR, ICRP and NAS/NRC (BEIR V). The second addendum presented changes needed to incorporate alpha-emitting radionuclides into the accident exposure source term. As in the earlier version of this report, models are provided for early and continuing effects, cancers and thyroid nodules, and genetic effects. Weibull dose-response functions are recommended for evaluating the risks of early and continuing health effects. Three potentially lethal early effects -- the hematopoietic, pulmonary, and gastrointestinal syndromes -- are considered. Linear and linear-quadratic models are recommended for estimating the risks of seven types of cancer in adults: leukemia, bone, lung, breast, gastrointestinal, thyroid, and "other". For most cancers, both incidence and mortality are addressed. Five classes of genetic diseases -- dominant, X-linked, aneuploidy, unbalanced translocations, and multifactorial diseases -- are also considered. Data are provided that should enable analysts to consider the timing and severity of each type of health risk.

  5. Screen or not to screen for peripheral arterial disease: guidance from a decision model.

    PubMed

    Vaidya, Anil; Joore, Manuela A; Ten Cate-Hoek, Arina J; Ten Cate, Hugo; Severens, Johan L

    2014-01-29

    Asymptomatic peripheral arterial disease (PAD) is associated with a greater risk of acute cardiovascular events. This study aims to determine the cost-effectiveness of one-time-only PAD screening using the Ankle Brachial Index (ABI) test and subsequent antiplatelet preventive treatment (low-dose aspirin or clopidogrel) in individuals at high risk for acute cardiovascular events, compared to no screening and no treatment, using decision-analytic modelling. A probabilistic Markov model was developed to evaluate the lifetime cost-effectiveness of the strategy of selective PAD screening and consequent preventive treatment compared to no screening and no preventive treatment. The analysis was conducted from the Dutch societal perspective and, to address decision uncertainty, probabilistic sensitivity analysis was performed. Results were based on average values of 1000 Monte Carlo simulations, using discount rates of 1.5% and 4% for effects and costs, respectively. One-way sensitivity analyses were performed to identify the two most influential model parameters affecting model outputs. Then, a two-way sensitivity analysis was conducted for combinations of values tested for these two most influential parameters. For the PAD screening strategy, life years and quality-adjusted life years gained were 21.79 and 15.66, respectively, at a lifetime cost of 26,548 Euros. Compared to no screening and treatment (20.69 life years, 15.58 quality-adjusted life years, 28,052 Euros), these results indicate that PAD screening and treatment is a dominant strategy. The cost-effectiveness acceptability curves show an 88% probability of PAD screening being cost-effective at a willingness-to-pay (WTP) threshold of 40,000 Euros. In a scenario analysis using clopidogrel as an alternative antiplatelet drug, the PAD screening strategy remained dominant.
    This decision analysis suggests that targeted ABI screening and consequent secondary prevention of cardiovascular events using low-dose aspirin or clopidogrel in the identified patients is a cost-effective strategy. Implementation of targeted PAD screening and subsequent treatment in primary care practices and in public health programs is likely to improve societal health and to save health care costs by reducing catastrophic cardiovascular events.

  6. A statistical analysis of seat belt effectiveness in 1973-1975 model cars involved in towaway crashes. Volume 1

    DOT National Transportation Integrated Search

    1976-09-01

    Standardized injury rates and seat belt effectiveness measures are derived from a probability sample of towaway accidents involving 1973-1975 model cars. The data were collected in five different geographic regions. Weighted sample size available for...

  7. Diagnostics of Robust Growth Curve Modeling Using Student's "t" Distribution

    ERIC Educational Resources Information Center

    Tong, Xin; Zhang, Zhiyong

    2012-01-01

    Growth curve models with different types of distributions of random effects and of intraindividual measurement errors for robust analysis are compared. After demonstrating the influence of distribution specification on parameter estimation, 3 methods for diagnosing the distributions for both random effects and intraindividual measurement errors…

  8. EFFECTS ON ELECTROSTATIC PRECIPITATION OF CHANGES IN GRAIN LOADING, SIZE DISTRIBUTION, RESISTIVITY, AND TEMPERATURE

    EPA Science Inventory

    The paper discusses the simulation of the effects of changes to particle loading, particle size distribution, and electrostatic precipitator (ESP) operating temperatures using ESP models. It also illustrates the usefulness of modern ESP models for this type of analysis. Increasin...

  9. Assessment of Poisson, probit and linear models for genetic analysis of presence and number of black spots in Corriedale sheep.

    PubMed

    Peñagaricano, F; Urioste, J I; Naya, H; de los Campos, G; Gianola, D

    2011-04-01

    Black skin spots are associated with pigmented fibres in wool, an important quality fault. Our objective was to assess alternative models for the genetic analysis of presence (BINBS) and number (NUMBS) of black spots in Corriedale sheep. During 2002-08, 5624 records from 2839 animals in two flocks, aged 1 through 6 years, were taken at shearing. Four models were considered: linear and probit for BINBS, and linear and Poisson for NUMBS. All models included flock-year and age as fixed effects, and animal and permanent environmental effects as random. Models were fitted to the whole data set and were also compared based on their predictive ability in cross-validation. Estimates of heritability ranged from 0.154 to 0.230 for BINBS and 0.269 to 0.474 for NUMBS. For BINBS, the probit model fitted the data slightly better than the linear model. Predictions of random effects from these models were highly correlated, and both models exhibited similar predictive ability. For NUMBS, the Poisson model, with a residual term to account for overdispersion, performed better than the linear model in goodness of fit and predictive ability. Predictions of random effects from the Poisson model were more strongly correlated with those from the BINBS models than those from the linear model. Overall, the use of probit or linear models for BINBS and of a Poisson model with a residual for NUMBS seems a reasonable choice for genetic selection purposes in Corriedale sheep. © 2010 Blackwell Verlag GmbH.

  10. Real longitudinal data analysis for real people: building a good enough mixed model.

    PubMed

    Cheng, Jing; Edwards, Lloyd J; Maldonado-Molina, Mildred M; Komro, Kelli A; Muller, Keith E

    2010-02-20

    Mixed effects models have become very popular, especially for the analysis of longitudinal data. One challenge is how to build a good enough mixed effects model. In this paper, we suggest a systematic strategy for addressing this challenge and introduce easily implemented practical advice for building mixed effects models. A general discussion of the scientific strategies motivates the recommended five-step procedure for model fitting. The need to model both the mean structure (the fixed effects) and the covariance structure (the random effects and residual error) creates the fundamental flexibility and complexity, and some very practical recommendations help to conquer that complexity. Centering, scaling, and full-rank coding of all the predictor variables radically improve the chances of convergence, computing speed, and numerical accuracy. Applying computational and assumption diagnostics from univariate linear models to mixed model data greatly helps to detect and solve the related computational problems. The approach helps to fit more general covariance models, a crucial step in selecting a credible covariance model needed for defensible inference. A detailed demonstration of the recommended strategy is based on data from a published study of a randomized trial of a multicomponent intervention to prevent young adolescents' alcohol use. The discussion highlights a need for additional covariance and inference tools for mixed models, as well as the need to improve how scientists and statisticians teach and review the process of finding a good enough mixed model. (c) 2009 John Wiley & Sons, Ltd.
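The advice above on centering and scaling predictors can be illustrated numerically: a raw polynomial design matrix (e.g., age and age squared for adolescents) is badly conditioned, and standardising the predictor repairs it. The age range is invented:

```python
import numpy as np

age = np.linspace(11.0, 15.0, 60)                 # e.g., adolescent ages in years
X_raw = np.column_stack([np.ones_like(age), age, age ** 2])

z = (age - age.mean()) / age.std()                # centred and scaled predictor
X_std = np.column_stack([np.ones_like(z), z, z ** 2])

print(f"condition number raw:          {np.linalg.cond(X_raw):.1e}")
print(f"condition number standardised: {np.linalg.cond(X_std):.1e}")
```

A smaller condition number means the normal equations are far less sensitive to rounding, which is exactly why standardisation improves convergence and numerical accuracy in mixed model software.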

  11. Extending existing structural identifiability analysis methods to mixed-effects models.

    PubMed

    Janzén, David L I; Jirstrand, Mats; Chappell, Michael J; Evans, Neil D

    2018-01-01

    The concept of structural identifiability for state-space models is expanded to cover mixed-effects state-space models. Two methods applicable for the analytical study of the structural identifiability of mixed-effects models are presented. The two methods are based on previously established techniques for non-mixed-effects models; namely the Taylor series expansion and the input-output form approach. By generating an exhaustive summary, and by assuming an infinite number of subjects, functions of random variables can be derived which in turn determine the distribution of the system's observation function(s). By considering the uniqueness of the analytical statistical moments of the derived functions of the random variables, the structural identifiability of the corresponding mixed-effects model can be determined. The two methods are applied to a set of examples of mixed-effects models to illustrate how they work in practice. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. Landsat analysis of tropical forest succession employing a terrain model

    NASA Technical Reports Server (NTRS)

    Barringer, T. H.; Robinson, V. B.; Coiner, J. C.; Bruce, R. C.

    1980-01-01

    Landsat multispectral scanner (MSS) data have yielded a dual classification of rain forest and shadow in an analysis of a semi-deciduous forest on Mindoro Island, Philippines. Both a spatial terrain model, using a fifth-order polynomial trend surface analysis for quantitatively estimating the general spatial variation in the data set, and a spectral terrain model, based on the MSS data, have been set up. A discriminant analysis, using both sets of data, has suggested that shadowing effects may be due primarily to local variations in the spectral regions and can therefore be compensated for through the decomposition of the spatial variation in both elevation and MSS data.

  13. Eliciting mixed emotions: a meta-analysis comparing models, types, and measures.

    PubMed

    Berrios, Raul; Totterdell, Peter; Kellett, Stephen

    2015-01-01

    The idea that people can experience two oppositely valenced emotions has been controversial ever since early attempts to investigate the construct of mixed emotions. This meta-analysis examined the robustness with which mixed emotions have been elicited experimentally. A systematic literature search identified 63 experimental studies that instigated the experience of mixed emotions. Studies were distinguished according to the structure of the underlying affect model-dimensional or discrete-as well as according to the type of mixed emotions studied (e.g., happy-sad, fearful-happy, positive-negative). The meta-analysis using a random-effects model revealed a moderate to high effect size for the elicitation of mixed emotions (d_IG+ = 0.77), which remained consistent regardless of the structure of the affect model, and across different types of mixed emotions. Several methodological and design moderators were tested. Studies using the minimum index (i.e., the minimum value between a pair of oppositely valenced affects) resulted in smaller effect sizes, whereas subjective measures of mixed emotions increased the effect sizes. The presence of more women in the samples was also associated with larger effect sizes. The current study indicates that mixed emotions are a robust, measurable and non-artifactual experience. The results are discussed in terms of the implications for an affect system that has greater versatility and flexibility than previously thought.
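
    The random-effects pooling underlying such a meta-analysis can be sketched with the DerSimonian-Laird moment estimator (a standard choice, though the abstract does not name the estimator used; the input numbers in the usage example are hypothetical):

```python
import numpy as np

def random_effects_pool(d, var):
    """DerSimonian-Laird random-effects pooling of effect sizes.

    d   : per-study effect sizes (e.g. standardised mean differences)
    var : their within-study sampling variances
    """
    d, var = np.asarray(d, float), np.asarray(var, float)
    w = 1.0 / var                          # fixed-effect weights
    d_fixed = np.sum(w * d) / np.sum(w)    # fixed-effect pooled estimate
    q = np.sum(w * (d - d_fixed) ** 2)     # Cochran's Q
    df = len(d) - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)          # between-study variance
    w_star = 1.0 / (var + tau2)            # random-effects weights
    d_pooled = np.sum(w_star * d) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return d_pooled, se, tau2

# Hypothetical studies: homogeneous effects give tau2 = 0 and the pooled
# estimate collapses to the fixed-effect answer.
print(random_effects_pool([0.5, 0.5, 0.5], [0.1, 0.1, 0.1]))
print(random_effects_pool([0.2, 0.8, 1.4], [0.05, 0.05, 0.05]))
```

    With heterogeneous studies, tau2 > 0 inflates every study's variance, pulling the weights toward equality, which is the defining behaviour of the random-effects model.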

  14. Probabilistic sensitivity analysis incorporating the bootstrap: an example comparing treatments for the eradication of Helicobacter pylori.

    PubMed

    Pasta, D J; Taylor, J L; Henning, J M

    1999-01-01

    Decision-analytic models are frequently used to evaluate the relative costs and benefits of alternative therapeutic strategies for health care. Various types of sensitivity analysis are used to evaluate the uncertainty inherent in the models. Although probabilistic sensitivity analysis is more difficult theoretically and computationally, the results can be much more powerful and useful than deterministic sensitivity analysis. The authors show how a Monte Carlo simulation can be implemented using standard software to perform a probabilistic sensitivity analysis incorporating the bootstrap. The method is applied to a decision-analytic model evaluating the cost-effectiveness of Helicobacter pylori eradication. The necessary steps are straightforward and are described in detail. The use of the bootstrap avoids certain difficulties encountered with theoretical distributions. The probabilistic sensitivity analysis provided insights into the decision-analytic model beyond the traditional base-case and deterministic sensitivity analyses and should become the standard method for assessing sensitivity.
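
    A minimal version of the approach can be sketched in a few lines: resample patient-level outcomes with replacement, push each bootstrap replicate through the decision model, and read off the probability that one strategy is preferred. All numbers below are invented for illustration and are not taken from the H. pylori study:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical patient-level eradication outcomes (1 = success) for two
# regimens; in practice these would come from trial data.
success_a = rng.binomial(1, 0.90, size=200)
success_b = rng.binomial(1, 0.60, size=200)
cost_a, cost_b = 120.0, 80.0   # drug cost per course (hypothetical)
cost_failure = 500.0           # downstream cost of a failed eradication

def net_cost(success, drug_cost):
    # Expected cost per patient = drug cost + cost of managing failures.
    return drug_cost + cost_failure * (1 - success.mean())

# Bootstrap: resample patients to propagate sampling uncertainty through
# the decision model, avoiding any parametric distributional assumption.
n_boot = 2000
a_cheaper = 0
for _ in range(n_boot):
    ia = rng.integers(0, len(success_a), len(success_a))
    ib = rng.integers(0, len(success_b), len(success_b))
    if net_cost(success_a[ia], cost_a) < net_cost(success_b[ib], cost_b):
        a_cheaper += 1

print("P(regimen A has lower expected cost):", a_cheaper / n_boot)
```

    The loop is the Monte Carlo simulation described in the abstract; replacing the resampling step with draws from theoretical distributions recovers conventional probabilistic sensitivity analysis.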

  15. Integrated Data Analysis for Fusion: A Bayesian Tutorial for Fusion Diagnosticians

    NASA Astrophysics Data System (ADS)

    Dinklage, Andreas; Dreier, Heiko; Fischer, Rainer; Gori, Silvio; Preuss, Roland; Toussaint, Udo von

    2008-03-01

    Integrated Data Analysis (IDA) offers a unified way of combining information relevant to fusion experiments, and in doing so addresses typical issues arising in fusion data analysis. In IDA, all information is consistently formulated as probability density functions quantifying uncertainties in the analysis within the Bayesian probability theory. For a single diagnostic, IDA allows the identification of faulty measurements and improvements in the setup. For a set of diagnostics, IDA gives joint error distributions allowing the comparison and integration of different diagnostics results. Validation of physics models can be performed by model comparison techniques. Typical data analysis applications benefit from IDA capabilities of nonlinear error propagation, the inclusion of systematic effects and the comparison of different physics models. Applications range from outlier detection and background discrimination to model assessment and the design of diagnostics. In order to cope with next step fusion device requirements, appropriate techniques are explored for fast analysis applications.
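
    The basic building block of combining diagnostics can be illustrated with two Gaussian measurements of the same quantity. Under Bayes' theorem with a flat prior, the joint posterior is again Gaussian with an inverse-variance-weighted mean; the measurement values below are hypothetical:

```python
import numpy as np

# Two hypothetical diagnostics measure the same plasma quantity (say,
# line-averaged electron density) with Gaussian uncertainties.
mu = np.array([3.2, 3.6])      # measurements (arbitrary units)
sigma = np.array([0.4, 0.2])   # their 1-sigma uncertainties

# Bayesian combination with a flat prior: multiply the two Gaussian
# likelihoods, i.e. inverse-variance weighting.
w = 1.0 / sigma**2
mu_post = np.sum(w * mu) / np.sum(w)
sigma_post = np.sqrt(1.0 / np.sum(w))

print("posterior mean:", mu_post)
print("posterior sigma:", sigma_post)
```

    The posterior uncertainty is smaller than either individual measurement's, which is why combining diagnostics in a joint probabilistic model pays off; a large discrepancy between `mu_post` and one of the inputs is also the natural flag for a faulty measurement.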

  16. Attenuation Factors for B(E2) in the Microscopic Description of Multiphonon States ---A Simple Model Analysis---

    NASA Astrophysics Data System (ADS)

    Matsuyanagi, K.

    1982-05-01

    With an exactly solvable O(4) model of Piepenbring, Silvestre-Brac and Szymanski, we demonstrate that the attenuation factor for the B(E2) values, derived by the lowest-order approximation of the multiphonon method, takes excellent care of the kinematical anharmonicity effects, if multiphonon states are defined in the intrinsic subspace orthogonal to the pairing rotation. It is also shown that the other attenuation effect characterizing the interacting boson model is not a dominant effect in the model analysed here.

  17. A health hierarchy of effects model: a synthesis of advertising and health hierarchy conceptualizations.

    PubMed

    Rouse, R A

    1991-01-01

    Work by both advertising and health researchers has independently yielded hierarchy of effects models which can be used to predict campaign success. Unfortunately, however, previous work has been criticized as "common sense" approaches which are more "assumed" than "proven." This analysis argues that much of the problem is due to the lack of precision often associated with over-simplified "uni-dimensional" models. Instead, this perspective synthesizes a "two-dimensional" health hierarchy of effects model and outlines a pragmatic strategy for campaign measurement.

  18. Quantifying the impact of between-study heterogeneity in multivariate meta-analyses

    PubMed Central

    Jackson, Dan; White, Ian R; Riley, Richard D

    2012-01-01

    Measures that quantify the impact of heterogeneity in univariate meta-analysis, including the very popular I2 statistic, are now well established. Multivariate meta-analysis, where studies provide multiple outcomes that are pooled in a single analysis, is also becoming more commonly used. The question of how to quantify heterogeneity in the multivariate setting is therefore raised. It is the univariate R2 statistic, the ratio of the variance of the estimated treatment effect under the random and fixed effects models, that generalises most naturally, so this statistic provides our basis. This statistic is then used to derive a multivariate analogue of I2. We also provide a multivariate H2 statistic, the ratio of a generalisation of Cochran's heterogeneity statistic and its associated degrees of freedom, with an accompanying generalisation of the usual I2 statistic. Our proposed heterogeneity statistics can be used alongside all the usual estimates and inferential procedures used in multivariate meta-analysis. We apply our methods to some real datasets and show how our statistics are equally appropriate in the context of multivariate meta-regression, where study level covariate effects are included in the model. Our heterogeneity statistics may be used when applying any procedure for fitting the multivariate random effects model. Copyright © 2012 John Wiley & Sons, Ltd. PMID:22763950
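
    For reference, the univariate building blocks that the paper generalises (Cochran's Q, H2 as Q over its degrees of freedom, and I2) can be computed directly; the example inputs are hypothetical:

```python
import numpy as np

def heterogeneity(effects, variances):
    """Univariate Cochran's Q, H^2 and I^2, the quantities whose
    multivariate analogues the paper develops."""
    y = np.asarray(effects, float)
    w = 1.0 / np.asarray(variances, float)
    y_fixed = np.sum(w * y) / np.sum(w)    # fixed-effect pooled estimate
    q = np.sum(w * (y - y_fixed) ** 2)     # Cochran's Q
    df = len(y) - 1
    h2 = q / df                            # H^2: Q over its df
    i2 = max(0.0, (q - df) / q) if q > 0 else 0.0  # I^2 as a proportion
    return q, h2, i2

# Identical study effects: no heterogeneity at all.
print(heterogeneity([0.3, 0.3, 0.3], [0.1, 0.1, 0.1]))
# Widely spread effects: substantial heterogeneity.
print(heterogeneity([0.1, 0.5, 1.2], [0.04, 0.04, 0.04]))
```

    In the multivariate setting each study contributes a vector of outcomes and Q becomes a quadratic form in the multivariate residuals, but the interpretation of the resulting statistics is the same.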

  19. Cost-Utility Analysis of Bariatric Surgery in Italy: Results of Decision-Analytic Modelling

    PubMed Central

    Lucchese, Marcello; Borisenko, Oleg; Mantovani, Lorenzo Giovanni; Cortesi, Paolo Angelo; Cesana, Giancarlo; Adam, Daniel; Burdukova, Elisabeth; Lukyanov, Vasily; Di Lorenzo, Nicola

    2017-01-01

    Objective To evaluate the cost-effectiveness of bariatric surgery in Italy from a third-party payer perspective over a medium-term (10 years) and a long-term (lifetime) horizon. Methods A state-transition Markov model was developed, in which patients may experience surgery, post-surgery complications, diabetes mellitus type 2, cardiovascular diseases or die. Transition probabilities, costs, and utilities were obtained from the Italian and international literature. Three types of surgeries were considered: gastric bypass, sleeve gastrectomy, and adjustable gastric banding. A base-case analysis was performed for the population, the characteristics of which were obtained from surgery candidates in Italy. Results In the base-case analysis, over 10 years, bariatric surgery led to a cost increment of EUR 2,661 and generated an additional 1.1 quality-adjusted life years (QALYs). Over a lifetime, surgery led to savings of EUR 8,649, an additional 0.5 life years and 3.2 QALYs. Bariatric surgery was cost-effective at 10 years with an incremental cost-effectiveness ratio of EUR 2,412/QALY and dominant over conservative management over a lifetime. Conclusion In a comprehensive decision analytic model, a current mix of surgical methods for bariatric surgery was cost-effective at 10 years and cost-saving over the lifetime of the Italian patient cohort considered in this analysis. PMID:28601866
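
    The mechanics of a state-transition Markov cohort model and the resulting incremental cost-effectiveness ratio (ICER) can be sketched as follows. Every number here is invented for illustration; these are not the paper's calibrated Italian inputs, and the state space is collapsed to three states:

```python
import numpy as np

# Minimal two-strategy Markov cohort sketch. States: well, sick, dead.
P = {  # annual transition matrices (rows sum to 1), hypothetical
    "surgery":      np.array([[0.92, 0.05, 0.03],
                              [0.10, 0.80, 0.10],
                              [0.00, 0.00, 1.00]]),
    "conservative": np.array([[0.85, 0.10, 0.05],
                              [0.05, 0.80, 0.15],
                              [0.00, 0.00, 1.00]]),
}
annual_cost = {"surgery": np.array([500.0, 3000.0, 0.0]),
               "conservative": np.array([800.0, 3500.0, 0.0])}
upfront = {"surgery": 9000.0, "conservative": 0.0}
utility = np.array([0.85, 0.55, 0.0])   # QALY weight per state-year

def run(strategy, years=10):
    state = np.array([1.0, 0.0, 0.0])   # whole cohort starts well
    cost, qaly = upfront[strategy], 0.0
    for _ in range(years):
        state = state @ P[strategy]     # advance the cohort one cycle
        cost += state @ annual_cost[strategy]
        qaly += state @ utility
    return cost, qaly

c_s, q_s = run("surgery")
c_c, q_c = run("conservative")
print("ICER (EUR/QALY):", (c_s - c_c) / (q_s - q_c))
```

    A full model would add discounting, complication states and lifetime horizons, but the accounting is the same: accumulate costs and QALYs per strategy, then divide the cost difference by the QALY difference.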

  20. `spup' - An R Package for Analysis of Spatial Uncertainty Propagation and Application to Trace Gas Emission Simulations

    NASA Astrophysics Data System (ADS)

    Sawicka, K.; Breuer, L.; Houska, T.; Santabarbara Ruiz, I.; Heuvelink, G. B. M.

    2016-12-01

    Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Advances in uncertainty propagation analysis and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition as universally applicable, including case studies with spatial models and spatial model inputs. Due to the growing popularity and applicability of the open source R programming language we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining the uncertainty propagation starting from input data and model parameters, via the environmental model onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo techniques, as well as several uncertainty visualization functions. Here we will demonstrate that the 'spup' package is an effective and easy-to-use tool even in very complex case studies, and that it can be used in multi-disciplinary research and model-based decision support. As an example, we use the ecological LandscapeDNDC model to analyse propagation of uncertainties associated with spatial variability of the model driving forces such as rainfall, nitrogen deposition and fertilizer inputs. The uncertainty propagation is analysed for the prediction of emissions of N2O and CO2 for a German low mountainous, agriculturally developed catchment. The study tests the effect of spatial correlations on spatially aggregated model outputs, and could inform best management practices and model improvement strategies.
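
    The Monte Carlo propagation that 'spup' automates for spatial R workflows can be illustrated generically. The sketch below uses Python rather than spup's R API, and the emission model and uncertainty figures are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical emission model: N2O flux responding nonlinearly to
# nitrogen input (made up for illustration, not LandscapeDNDC).
def n2o_flux(n_input):
    return 0.01 * n_input ** 1.5

# Step 1: uncertainty model for the input, here a Gaussian with
# mean 100 and standard deviation 10 (hypothetical units of kg N/ha).
n_samples = rng.normal(100.0, 10.0, size=10_000)

# Step 2: propagate by running the model for every sample.
flux = n2o_flux(n_samples)

# Step 3: summarise the output uncertainty.
print("mean flux:", flux.mean())
print("95% interval:", np.percentile(flux, [2.5, 97.5]))
```

    Because the model is nonlinear, the mean output differs slightly from the output at the mean input, which is exactly the kind of effect that analytical error propagation misses and Monte Carlo propagation captures; spatial inputs add correlated random fields at step 1 but leave the scheme unchanged.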
