Models for evaluating the performability of degradable computing systems
NASA Technical Reports Server (NTRS)
Wu, L. T.
1982-01-01
Recent advances in multiprocessor technology have established the need for unified methods of evaluating computing system performance and reliability. In response to this modeling need, a general modeling framework that permits the modeling, analysis, and evaluation of degradable computing systems is considered. Within this framework, several user-oriented performance variables are identified and shown to be proper generalizations of the traditional notions of system performance and reliability. Furthermore, a time-varying version of the model is developed to generalize the traditional fault-tree reliability evaluation methods for phased missions.
A General Linear Model (GLM) was used to evaluate the deviation of predicted values from expected values for a complex environmental model. For this demonstration, we used the default level interface of the Regional Mercury Cycling Model (R-MCM) to simulate epilimnetic total mer...
Evaluation of airborne lidar data to predict vegetation Presence/Absence
Palaseanu-Lovejoy, M.; Nayegandhi, A.; Brock, J.; Woodman, R.; Wright, C.W.
2009-01-01
This study evaluates the capabilities of the Experimental Advanced Airborne Research Lidar (EAARL) in delineating vegetation assemblages in Jean Lafitte National Park, Louisiana. Five-meter-resolution grids of bare earth, canopy height, canopy-reflection ratio, and height of median energy were derived from EAARL data acquired in September 2006. Ground-truth data were collected along transects to assess species composition, canopy cover, and ground cover. To decide which model is more accurate, comparisons of general linear models and generalized additive models were conducted using conventional evaluation methods (i.e., sensitivity, specificity, Kappa statistics, and area under the curve) and two new indices, net reclassification improvement and integrated discrimination improvement. Generalized additive models were superior to general linear models in modeling presence/absence in training vegetation categories, but no statistically significant differences between the two models were achieved in determining the classification accuracy at validation locations using conventional evaluation methods, although statistically significant improvements in net reclassifications were observed. © 2009 Coastal Education and Research Foundation.
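The conventional evaluation measures named above (sensitivity, specificity, and the Kappa statistic) reduce to simple confusion-matrix arithmetic. The following sketch illustrates them with made-up counts; none of the numbers come from the study:

```python
def binary_metrics(tp, fp, fn, tn):
    """Conventional presence/absence evaluation measures computed from a
    2x2 confusion matrix (counts of true/false positives and negatives)."""
    n = tp + fp + fn + tn
    sensitivity = tp / (tp + fn)   # fraction of actual presences detected
    specificity = tn / (tn + fp)   # fraction of actual absences detected
    p_observed = (tp + tn) / n     # raw agreement
    # chance agreement from the marginal totals (Cohen's kappa)
    p_chance = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
    kappa = (p_observed - p_chance) / (1 - p_chance)
    return sensitivity, specificity, kappa

# Hypothetical counts for one vegetation class at validation locations:
sens, spec, kappa = binary_metrics(tp=40, fp=10, fn=10, tn=40)
```

Kappa discounts the agreement expected by chance, which is why it can rank two models differently than raw accuracy does.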
Presenting an evaluation model of the trauma registry software.
Asadi, Farkhondeh; Paydar, Somayeh
2018-04-01
Trauma accounts for approximately 10% of deaths worldwide and is considered a global concern. This problem has led healthcare policy makers and managers to adopt a basic strategy in this context. Trauma registries play an important and basic role in decreasing the mortality and disabilities caused by trauma injuries. Today, various software applications are designed for trauma registries. Evaluating this software improves management and increases the efficiency and effectiveness of these systems. Therefore, the aim of this study is to present an evaluation model for trauma registry software. The present study is applied research. General and specific criteria of trauma registry software were identified by reviewing the literature, including books, articles, scientific documents, valid websites, and related software in this domain. Based on these general and specific criteria and the related software, a model for evaluating trauma registry software was proposed. From the proposed model, a checklist was designed and its validity and reliability were evaluated. Using the Delphi technique, the model was presented to 12 experts and specialists. To analyze the results, an agreement coefficient of 75% was set as the threshold for applying changes. Finally, once the model was approved by the experts and professionals, the final version of the evaluation model for trauma registry software was presented. The evaluation criteria for trauma registry software were presented in two groups: (1) general criteria and (2) specific criteria. General criteria of trauma registry software were classified into four main categories: (1) usability, (2) security, (3) maintainability, and (4) interoperability. Specific criteria were divided into four main categories: (1) data submission and entry, (2) reporting, (3) quality control, and (4) decision and research support.
The model presented in this research introduces important general and specific criteria of trauma registry software, along with the subcriteria related to each main criterion. This model was validated by experts in the field. Therefore, it can be used as a comprehensive model and a standard evaluation tool for measuring the efficiency and effectiveness of trauma registry software and improving its performance. Copyright © 2018 Elsevier B.V. All rights reserved.
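The 75% Delphi agreement threshold described above can be illustrated with a minimal sketch, under the assumed reading that a criterion is retained when at least 75% of the 12 experts approve it (criterion names and votes below are invented for illustration):

```python
def screen_criteria(votes, threshold=0.75):
    """Delphi-style screening: votes maps each candidate criterion to a
    list of expert approvals (True/False). A criterion passes when the
    approval fraction meets the agreement threshold."""
    return {c: sum(v) / len(v) >= threshold for c, v in votes.items()}

# Hypothetical panel of 12 experts voting on two candidate criteria:
result = screen_criteria({
    "usability":      [True] * 11 + [False],     # 11/12 ~ 0.92: retained
    "custom_reports": [True] * 8 + [False] * 4,  # 8/12 ~ 0.67: needs change
})
```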
Evaluation and Sensitivity Analysis of an Ocean Model Response to Hurricane Ivan (PREPRINT)
2009-05-18
analysis of upper-limb meridional overturning circulation interior ocean pathways in the tropical/subtropical Atlantic. In: Interhemispheric Water...diminishing returns are encountered when either resolution is increased...Coupled ocean-atmosphere general circulation models have become...northwest Caribbean Sea and GOM. Evaluation is difficult because ocean general circulation models incorporate a large suite of numerical algorithms
Performability modeling with continuous accomplishment sets
NASA Technical Reports Server (NTRS)
Meyer, J. F.
1979-01-01
A general modeling framework that permits the definition, formulation, and evaluation of performability is described. It is shown that performability relates directly to system effectiveness, and is a proper generalization of both performance and reliability. A hierarchical modeling scheme is used to formulate the capability function used to evaluate performability. The case in which performance variables take values in a continuous accomplishment set is treated explicitly.
General background on modeling and specifics of modeling vapor intrusion are given. Three classical model applications are described and related to the problem of petroleum vapor intrusion. These indicate the need for model calibration and uncertainty analysis. Evaluation of Bi...
Toward a Model Framework of Generalized Parallel Componential Processing of Multi-Symbol Numbers
ERIC Educational Resources Information Center
Huber, Stefan; Cornelsen, Sonja; Moeller, Korbinian; Nuerk, Hans-Christoph
2015-01-01
In this article, we propose and evaluate a new model framework of parallel componential multi-symbol number processing, generalizing the idea of parallel componential processing of multi-digit numbers to the case of negative numbers by treating polarity signs similarly to single digits. In a first step, we evaluated this account by defining…
Multilevel Evaluation Alignment: An Explication of a Four-Step Model
ERIC Educational Resources Information Center
Yang, Huilan; Shen, Jianping; Cao, Honggao; Warfield, Charles
2004-01-01
Using the evaluation work on the W.K. Kellogg Foundation's Unleashing Resources Initiative as an example, in this article we explicate a general four-step model appropriate for multilevel evaluation alignment. We review the relevant literature, argue for the need for evaluation alignment in a multilevel context, explain the four-step model,…
Judge, Timothy A; Hurst, Charlice; Simon, Lauren S
2009-05-01
The authors investigated core self-evaluations and educational attainment as mediating mechanisms for the influence of appearance (physical attractiveness) and intelligence (general mental ability) on income and financial strain. The direct effects of core self-evaluations on financial strain, as well as the indirect effects through income, were also considered. Longitudinal data were obtained as part of a national study, the Harvard Study of Health and Life Quality, and proposed models were evaluated with structural equation modeling. Results supported a partially mediated model, such that general mental ability and physical attractiveness exhibited both direct and indirect effects on income, as mediated by educational attainment and core self-evaluations. Finally, income negatively predicted financial strain, whereas core self-evaluations had both a direct and an indirect (through income) negative effect on financial strain. Overall, the results suggest that looks (physical attractiveness), brains (intelligence), and personality (core self-evaluations) are all important to income and financial strain. (c) 2009 APA, all rights reserved.
The theory and programming of statistical tests for evaluating the Real-Time Air-Quality Model (RAM) using the Regional Air Pollution Study (RAPS) data base are fully documented in four volumes. Moreover, the tests are generally applicable to other model evaluation problems. Volu...
A Survey of Model Evaluation Approaches with a Tutorial on Hierarchical Bayesian Methods
ERIC Educational Resources Information Center
Shiffrin, Richard M.; Lee, Michael D.; Kim, Woojae; Wagenmakers, Eric-Jan
2008-01-01
This article reviews current methods for evaluating models in the cognitive sciences, including theoretically based approaches, such as Bayes factors and minimum description length measures; simulation approaches, including model mimicry evaluations; and practical approaches, such as validation and generalization measures. This article argues…
Building a generalized distributed system model
NASA Technical Reports Server (NTRS)
Mukkamala, Ravi
1991-01-01
A number of topics related to building a generalized distributed system model are discussed. The effects of distributed database modeling on evaluation of transaction rollbacks, the measurement of effects of distributed database models on transaction availability measures, and a performance analysis of static locking in replicated distributed database systems are covered.
Presenting an Evaluation Model for the Cancer Registry Software.
Moghaddasi, Hamid; Asadi, Farkhondeh; Rabiei, Reza; Rahimi, Farough; Shahbodaghi, Reihaneh
2017-12-01
As cancer is increasingly prevalent, cancer registries are of great importance as the core of cancer control programs, and many different software applications have been designed for this purpose. Establishing a comprehensive evaluation model is therefore essential for evaluating and comparing such a wide range of software. In this study, the criteria for cancer registry software were determined by studying the relevant documents and two functional software applications in this field. The evaluation tool was a checklist, and to validate the model, this checklist was presented to experts in the form of a questionnaire. To analyze the validation results, an agreement coefficient of 75% was set as the threshold for applying changes. Finally, once the model was approved, the final version of the evaluation model for cancer registry software was presented. The evaluation model of this study comprises an evaluation tool and an evaluation method. The evaluation tool is a checklist including the general and specific criteria of cancer registry software along with their sub-criteria. Based on the findings, a criteria-based evaluation method was chosen. The model of this study encompasses the various dimensions of cancer registry software and a proper method for evaluating it. The strong point of this evaluation model is the separation of general criteria from specific ones while preserving the comprehensiveness of the criteria. Since this model has been validated, it can be used as a standard for evaluating cancer registry software.
Validation Test Report for GDEM4
2010-08-19
standard deviations called the Generalized Digital Environmental Model (GDEM). The present document describes the development and evaluation of GDEM4...the newest version of GDEM. As part of the evaluation of GDEM4, comparisons are made in this report to GDEM3 and to four other ocean climatologies...depth climatology of temperature and salinity and their standard deviations called the Generalized Digital Environmental Model (GDEM). The history of
A blended supervision model in Australian general practice training.
Ingham, Gerard; Fry, Jennifer
2016-05-01
The Royal Australian College of General Practitioners' Standards for general practice training allow different models of registrar supervision, provided these models achieve the outcomes of facilitating registrars' learning and ensuring patient safety. In this article, we describe a model of supervision called 'blended supervision', and its initial implementation and evaluation. The blended supervision model integrates offsite supervision with available local supervision resources. It is a pragmatic alternative to traditional supervision. Further evaluation of the cost-effectiveness, safety and effectiveness of this model is required, as is the recruitment and training of remote supervisors. A framework of questions was developed to outline the training practice's supervision methods and explain how blended supervision is achieving supervision and teaching outcomes. The supervision and teaching framework can be used to understand the supervision methods of all practices, not just practices using blended supervision.
A Model of International Communication Media Appraisal: Phase IV, Generalizing the Model to Film.
ERIC Educational Resources Information Center
Johnson, J. David
A study tested a causal model of international communication media appraisal using audience evaluations of tests of two films conducted in the Philippines. It was the fourth in a series of tests of the model in both developed and developing countries. In general the model posited determinative relationships between three exogenous variables…
The theory and programming of statistical tests for evaluating the Real-Time Air-Quality Model (RAM) using the Regional Air Pollution Study (RAPS) data base are fully documented in four report volumes. Moreover, the tests are generally applicable to other model evaluation problem...
Evaluating a Computational Model of Social Causality and Responsibility
2006-01-01
Evaluating a Computational Model of Social Causality and Responsibility. Wenji Mao, University of Southern California, Institute for Creative...empirically evaluate a computational model of social causality and responsibility against human social judgments. Results from our experimental...developed a general computational model of social causality and responsibility [10, 11] that formalizes the factors people use in reasoning about
Assessing the Evaluative Content of Personality Questionnaires Using Bifactor Models.
Biderman, Michael D; McAbee, Samuel T; Job Chen, Zhuo; Hendy, Nhung T
2018-01-01
Exploratory bifactor models with keying factors were applied to item response data for the NEO-FFI-3 and HEXACO-PI-R questionnaires. Loadings on a general factor and positive and negative keying factors correlated with independent estimates of item valence, suggesting that item valence influences responses to these questionnaires. Correlations between personality domain scores and measures of self-esteem, depression, and positive and negative affect were all reduced significantly when the influence of evaluative content represented by the general and keying factors was removed. Findings support the need to model personality inventories in ways that capture reactions to evaluative item content.
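One common way to write such a bifactor-with-keying-factors measurement model (the notation here is hypothetical, not necessarily the authors') is:

```latex
x_{ij} \;=\; \tau_j \;+\; \lambda^{G}_{j}\, G_i \;+\; \lambda^{K}_{j}\, K^{(s_j)}_i \;+\; \lambda^{D}_{j}\, D^{(d_j)}_i \;+\; \varepsilon_{ij}
```

where $G_i$ is the general (evaluative) factor, $K^{(s_j)}_i$ is the positive- or negative-keying factor matching item $j$'s keying direction $s_j$, and $D^{(d_j)}_i$ is the personality-domain factor for item $j$'s domain. Partialling out $G$ and the $K$ factors, as the study describes, is what removes shared evaluative variance from the domain scores.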
Large deviation approach to the generalized random energy model
NASA Astrophysics Data System (ADS)
Dorlas, T. C.; Dukes, W. M. B.
2002-05-01
The generalized random energy model is a generalization of the random energy model introduced by Derrida to mimic the ultrametric structure of the Parisi solution of the Sherrington-Kirkpatrick model of a spin glass. It was solved exactly in two special cases by Derrida and Gardner. A complete solution for the thermodynamics in the general case was given by Capocaccia et al. Here we use large deviation theory to analyse the model in a very straightforward way. We also show that the variational expression for the free energy can be evaluated easily using the Cauchy-Schwarz inequality.
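For the simplest, single-level member of this family (Derrida's REM, with $2^N$ energies drawn i.i.d. from $\mathcal{N}(0, N/2)$, one common normalization, assumed here), the large-deviation route the abstract describes amounts to a Laplace/Varadhan evaluation of the partition function:

```latex
\frac{1}{N}\ln Z_N \;\longrightarrow\; \sup_{|e|\le\sqrt{\ln 2}} \bigl[\, s(e) - \beta e \,\bigr],
\qquad s(e) = \ln 2 - e^{2},
```

so that

```latex
-\beta f(\beta) =
\begin{cases}
\ln 2 + \beta^{2}/4, & \beta \le \beta_c = 2\sqrt{\ln 2},\\[2pt]
\beta\sqrt{\ln 2},   & \beta > \beta_c.
\end{cases}
```

The supremum sits at $e = -\beta/2$ in the high-temperature phase and pins to the band edge $e = -\sqrt{\ln 2}$ below the critical temperature, which is the freezing transition; the generalized model nests such computations level by level.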
Watershed scale response to climate change--Trout Lake Basin, Wisconsin
Walker, John F.; Hunt, Randall J.; Hay, Lauren E.; Markstrom, Steven L.
2012-01-01
Fourteen basins for which the Precipitation Runoff Modeling System has been calibrated and evaluated were selected as study sites. Precipitation Runoff Modeling System is a deterministic, distributed parameter watershed model developed to evaluate the effects of various combinations of precipitation, temperature, and land use on streamflow and general basin hydrology. Output from five General Circulation Model simulations and four emission scenarios were used to develop an ensemble of climate-change scenarios for each basin. These ensembles were simulated with the corresponding Precipitation Runoff Modeling System model. This fact sheet summarizes the hydrologic effect and sensitivity of the Precipitation Runoff Modeling System simulations to climate change for the Trout River Basin at Trout Lake in northern Wisconsin.
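The ensemble construction these fact sheets share is the cross product of GCM simulations and emission scenarios; the sketch below illustrates it with placeholder labels (the fact sheets count five GCMs and four scenarios but the names here are invented):

```python
from itertools import product

# Placeholder labels, not the actual GCM or scenario names:
gcms = ["GCM-A", "GCM-B", "GCM-C", "GCM-D", "GCM-E"]
scenarios = ["emissions-1", "emissions-2", "emissions-3", "emissions-4"]

# One PRMS climate-change run per (GCM, emission scenario) combination
ensemble = list(product(gcms, scenarios))
```

Each of the resulting 20 (GCM, scenario) pairs drives one run of the calibrated Precipitation Runoff Modeling System model for a basin.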
Watershed scale response to climate change--Clear Creek Basin, Iowa
Christiansen, Daniel E.; Hay, Lauren E.; Markstrom, Steven L.
2012-01-01
Fourteen basins for which the Precipitation Runoff Modeling System has been calibrated and evaluated were selected as study sites. Precipitation Runoff Modeling System is a deterministic, distributed parameter watershed model developed to evaluate the effects of various combinations of precipitation, temperature, and land use on streamflow and general basin hydrology. Output from five General Circulation Model simulations and four emission scenarios were used to develop an ensemble of climate-change scenarios for each basin. These ensembles were simulated with the corresponding Precipitation Runoff Modeling System model. This fact sheet summarizes the hydrologic effect and sensitivity of the Precipitation Runoff Modeling System simulations to climate change for the Clear Creek Basin, near Coralville, Iowa.
Watershed scale response to climate change--Feather River Basin, California
Koczot, Kathryn M.; Markstrom, Steven L.; Hay, Lauren E.
2012-01-01
Fourteen basins for which the Precipitation Runoff Modeling System has been calibrated and evaluated were selected as study sites. Precipitation Runoff Modeling System is a deterministic, distributed parameter watershed model developed to evaluate the effects of various combinations of precipitation, temperature, and land use on streamflow and general basin hydrology. Output from five General Circulation Model simulations and four emission scenarios were used to develop an ensemble of climate-change scenarios for each basin. These ensembles were simulated with the corresponding Precipitation Runoff Modeling System model. This fact sheet summarizes the hydrologic effect and sensitivity of the Precipitation Runoff Modeling System simulations to climate change for the Feather River Basin, California.
Watershed scale response to climate change--South Fork Flathead River Basin, Montana
Chase, Katherine J.; Hay, Lauren E.; Markstrom, Steven L.
2012-01-01
Fourteen basins for which the Precipitation Runoff Modeling System has been calibrated and evaluated were selected as study sites. Precipitation Runoff Modeling System is a deterministic, distributed parameter watershed model developed to evaluate the effects of various combinations of precipitation, temperature, and land use on streamflow and general basin hydrology. Output from five General Circulation Model simulations and four emission scenarios were used to develop an ensemble of climate-change scenarios for each basin. These ensembles were simulated with the corresponding Precipitation Runoff Modeling System model. This fact sheet summarizes the hydrologic effect and sensitivity of the Precipitation Runoff Modeling System simulations to climate change for the South Fork Flathead River Basin, Montana.
Watershed scale response to climate change--Cathance Stream Basin, Maine
Dudley, Robert W.; Hay, Lauren E.; Markstrom, Steven L.; Hodgkins, Glenn A.
2012-01-01
Fourteen basins for which the Precipitation Runoff Modeling System has been calibrated and evaluated were selected as study sites. Precipitation Runoff Modeling System is a deterministic, distributed parameter watershed model developed to evaluate the effects of various combinations of precipitation, temperature, and land use on streamflow and general basin hydrology. Output from five General Circulation Model simulations and four emission scenarios were used to develop an ensemble of climate-change scenarios for each basin. These ensembles were simulated with the corresponding Precipitation Runoff Modeling System model. This fact sheet summarizes the hydrologic effect and sensitivity of the Precipitation Runoff Modeling System simulations to climate change for the Cathance Stream Basin, Maine.
Watershed scale response to climate change--Pomperaug River Watershed, Connecticut
Bjerklie, David M.; Hay, Lauren E.; Markstrom, Steven L.
2012-01-01
Fourteen basins for which the Precipitation Runoff Modeling System has been calibrated and evaluated were selected as study sites. Precipitation Runoff Modeling System is a deterministic, distributed parameter watershed model developed to evaluate the effects of various combinations of precipitation, temperature, and land use on streamflow and general basin hydrology. Output from five General Circulation Model simulations and four emission scenarios were used to develop an ensemble of climate-change scenarios for each basin. These ensembles were simulated with the corresponding Precipitation Runoff Modeling System model. This fact sheet summarizes the hydrologic effect and sensitivity of the Precipitation Runoff Modeling System simulations to climate change for the Pomperaug River Basin at Southbury, Connecticut.
Watershed scale response to climate change--Starkweather Coulee Basin, North Dakota
Vining, Kevin C.; Hay, Lauren E.; Markstrom, Steven L.
2012-01-01
Fourteen basins for which the Precipitation Runoff Modeling System has been calibrated and evaluated were selected as study sites. Precipitation Runoff Modeling System is a deterministic, distributed parameter watershed model developed to evaluate the effects of various combinations of precipitation, temperature, and land use on streamflow and general basin hydrology. Output from five General Circulation Model simulations and four emission scenarios were used to develop an ensemble of climate-change scenarios for each basin. These ensembles were simulated with the corresponding Precipitation Runoff Modeling System model. This fact sheet summarizes the hydrologic effect and sensitivity of the Precipitation Runoff Modeling System simulations to climate change for the Starkweather Coulee Basin near Webster, North Dakota.
Watershed scale response to climate change--Sagehen Creek Basin, California
Markstrom, Steven L.; Hay, Lauren E.; Regan, R. Steven
2012-01-01
Fourteen basins for which the Precipitation Runoff Modeling System has been calibrated and evaluated were selected as study sites. Precipitation Runoff Modeling System is a deterministic, distributed parameter watershed model developed to evaluate the effects of various combinations of precipitation, temperature, and land use on streamflow and general basin hydrology. Output from five General Circulation Model simulations and four emission scenarios were used to develop an ensemble of climate-change scenarios for each basin. These ensembles were simulated with the corresponding Precipitation Runoff Modeling System model. This fact sheet summarizes the hydrologic effect and sensitivity of the Precipitation Runoff Modeling System simulations to climate change for the Sagehen Creek Basin near Truckee, California.
Watershed scale response to climate change--Sprague River Basin, Oregon
Risley, John; Hay, Lauren E.; Markstrom, Steven L.
2012-01-01
Fourteen basins for which the Precipitation Runoff Modeling System has been calibrated and evaluated were selected as study sites. Precipitation Runoff Modeling System is a deterministic, distributed parameter watershed model developed to evaluate the effects of various combinations of precipitation, temperature, and land use on streamflow and general basin hydrology. Output from five General Circulation Model simulations and four emission scenarios were used to develop an ensemble of climate-change scenarios for each basin. These ensembles were simulated with the corresponding Precipitation Runoff Modeling System model. This fact sheet summarizes the hydrologic effect and sensitivity of the Precipitation Runoff Modeling System simulations to climate change for the Sprague River Basin near Chiloquin, Oregon.
Watershed scale response to climate change--Black Earth Creek Basin, Wisconsin
Hunt, Randall J.; Walker, John F.; Westenbroek, Steven M.; Hay, Lauren E.; Markstrom, Steven L.
2012-01-01
Fourteen basins for which the Precipitation Runoff Modeling System has been calibrated and evaluated were selected as study sites. Precipitation Runoff Modeling System is a deterministic, distributed parameter watershed model developed to evaluate the effects of various combinations of precipitation, temperature, and land use on streamflow and general basin hydrology. Output from five General Circulation Model simulations and four emission scenarios were used to develop an ensemble of climate-change scenarios for each basin. These ensembles were simulated with the corresponding Precipitation Runoff Modeling System model. This fact sheet summarizes the hydrologic effect and sensitivity of the Precipitation Runoff Modeling System simulations to climate change for the Black Earth Creek Basin, Wisconsin.
Watershed scale response to climate change--East River Basin, Colorado
Battaglin, William A.; Hay, Lauren E.; Markstrom, Steven L.
2012-01-01
Fourteen basins for which the Precipitation Runoff Modeling System has been calibrated and evaluated were selected as study sites. Precipitation Runoff Modeling System is a deterministic, distributed parameter watershed model developed to evaluate the effects of various combinations of precipitation, temperature, and land use on streamflow and general basin hydrology. Output from five General Circulation Model simulations and four emission scenarios were used to develop an ensemble of climate-change scenarios for each basin. These ensembles were simulated with the corresponding Precipitation Runoff Modeling System model. This fact sheet summarizes the hydrologic effect and sensitivity of the Precipitation Runoff Modeling System simulations to climate change for the East River Basin, Colorado.
Watershed scale response to climate change--Naches River Basin, Washington
Mastin, Mark C.; Hay, Lauren E.; Markstrom, Steven L.
2012-01-01
Fourteen basins for which the Precipitation Runoff Modeling System has been calibrated and evaluated were selected as study sites. Precipitation Runoff Modeling System is a deterministic, distributed parameter watershed model developed to evaluate the effects of various combinations of precipitation, temperature, and land use on streamflow and general basin hydrology. Output from five General Circulation Model simulations and four emission scenarios were used to develop an ensemble of climate-change scenarios for each basin. These ensembles were simulated with the corresponding Precipitation Runoff Modeling System model. This fact sheet summarizes the hydrologic effect and sensitivity of the Precipitation Runoff Modeling System simulations to climate change for the Naches River Basin below Tieton River in Washington.
Watershed scale response to climate change--Flint River Basin, Georgia
Hay, Lauren E.; Markstrom, Steven L.
2012-01-01
Fourteen basins for which the Precipitation Runoff Modeling System has been calibrated and evaluated were selected as study sites. Precipitation Runoff Modeling System is a deterministic, distributed parameter watershed model developed to evaluate the effects of various combinations of precipitation, temperature, and land use on streamflow and general basin hydrology. Output from five General Circulation Model simulations and four emission scenarios were used to develop an ensemble of climate-change scenarios for each basin. These ensembles were simulated with the corresponding Precipitation Runoff Modeling System model. This fact sheet summarizes the hydrologic effect and sensitivity of the Precipitation Runoff Modeling System simulations to climate change for the Flint River Basin at Montezuma, Georgia.
A Moderate Constructivist E-Learning Instructional Model Evaluated on Computer Specialists
ERIC Educational Resources Information Center
Alonso, Fernando; Manrique, Daniel; Vines, Jose M.
2009-01-01
This paper presents a novel instructional model for e-learning and an evaluation study to determine the effectiveness of this model for teaching Java language programming to information technology specialists working for the Spanish Public Administration. This is a general-purpose model that combines objectivist and constructivist learning…
Evaluating Social Causality and Responsibility Models: An Initial Report
2005-01-01
ICT Technical Report ICT-TR-03-2005: Evaluating Social Causality and Responsibility...social intelligent agents. In this report, we present a general computational model of social causality and responsibility, and empirical results of...
A Rationale for Participant Evaluation
ERIC Educational Resources Information Center
Boody, Robert M.
2009-01-01
There are many different models or approaches to doing program evaluation. Fitzpatrick, Sanders, and Worthen classify them into five general approaches: (a) objectives oriented, (b) management oriented, (c) consumer oriented, (d) expertise oriented, and (e) participant oriented. Within each of these general categories, of course, reside many…
Goodness of Model-Data Fit and Invariant Measurement
ERIC Educational Resources Information Center
Engelhard, George, Jr.; Perkins, Aminah
2013-01-01
In this commentary, Engelhard and Perkins remark that Maydeu-Olivares has presented a framework for evaluating the goodness of model-data fit for item response theory (IRT) models and correctly point out that overall goodness-of-fit evaluations of IRT models and data are not generally explored within most applications in educational and…
Vogelgesang, Felicitas; Schlattmann, Peter; Dewey, Marc
2018-05-01
Meta-analyses require a thoroughly planned procedure to obtain unbiased overall estimates. From a statistical point of view, not only model selection but also model implementation in the software affects the results. The present simulation study investigates the accuracy of different implementations of general and generalized bivariate mixed models in SAS (using proc mixed, proc glimmix and proc nlmixed), Stata (using gllamm, xtmelogit and midas) and R (using reitsma from package mada and glmer from package lme4). Both models incorporate the relationship between sensitivity and specificity - the two outcomes of interest in meta-analyses of diagnostic accuracy studies - utilizing random effects. Model performance is compared in nine meta-analytic scenarios reflecting the combination of three meta-analysis sizes (89, 30 and 10 studies) with three pairs of sensitivity/specificity values (97%/87%; 85%/75%; 90%/93%). The evaluation of accuracy in terms of bias, standard error and mean squared error reveals that all implementations of the generalized bivariate model calculate sensitivity and specificity estimates with deviations of less than two percentage points. proc mixed, which together with reitsma implements the general bivariate mixed model proposed by Reitsma, instead shows convergence problems. The random-effect parameters are in general underestimated. This study shows that flexibility and simplicity of model specification, together with convergence robustness, should influence implementation recommendations, as the accuracy in terms of bias was acceptable in all implementations using the generalized approach. Schattauer GmbH.
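For context, the bivariate structure both variants share (written here in commonly used notation, not necessarily the paper's) places a joint normal random effect on the logit-transformed study-level sensitivities and specificities:

```latex
\begin{pmatrix}\operatorname{logit} Se_i \\ \operatorname{logit} Sp_i\end{pmatrix}
\;\sim\; \mathcal{N}\!\left(
\begin{pmatrix}\mu_{Se} \\ \mu_{Sp}\end{pmatrix},\;
\begin{pmatrix}\sigma_{Se}^{2} & \sigma_{Se,Sp} \\ \sigma_{Se,Sp} & \sigma_{Sp}^{2}\end{pmatrix}
\right)
```

The "general" model of Reitsma adds an approximate normal within-study error to the observed logits, whereas the "generalized" model keeps exact binomial within-study likelihoods for the observed counts; this distinction is what drives the convergence and bias differences reported above.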
ESEA Title I Evaluation and Reporting System: User's Guide.
ERIC Educational Resources Information Center
Tallmadge, G. Kasten; Wood, Christine T.
This guidebook concentrates primarily on describing the impact-assessment component of the Elementary and Secondary Education Act (ESEA) Title I evaluation and reporting system for users of the system. Three general evaluation models are presented, along with implementation information for each. The first model, a norm-referenced design, may be…
An Information Search Model of Evaluative Concerns in Intergroup Interaction
ERIC Educational Resources Information Center
Vorauer, Jacquie D.
2006-01-01
In an information search model, evaluative concerns during intergroup interaction are conceptualized as a joint function of uncertainty regarding and importance attached to out-group members' views of oneself. High uncertainty generally fosters evaluative concerns during intergroup exchanges. Importance depends on whether out-group members'…
NASA Technical Reports Server (NTRS)
Al-Jaar, Robert Y.; Desrochers, Alan A.
1989-01-01
The main objective of this research is to develop a generic modeling methodology with a flexible and modular framework to aid in the design and performance evaluation of integrated manufacturing systems using a unified model. After a thorough examination of the available modeling methods, the Petri Net approach was adopted. The concurrent and asynchronous nature of manufacturing systems are easily captured by Petri Net models. Three basic modules were developed: machine, buffer, and Decision Making Unit. The machine and buffer modules are used for modeling transfer lines and production networks. The Decision Making Unit models the functions of a computer node in a complex Decision Making Unit Architecture. The underlying model is a Generalized Stochastic Petri Net (GSPN) that can be used for performance evaluation and structural analysis. GSPN's were chosen because they help manage the complexity of modeling large manufacturing systems. There is no need to enumerate all the possible states of the Markov Chain since they are automatically generated from the GSPN model.
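As a toy illustration of the Petri-net modeling style described above, the sketch below simulates a single machine-buffer module with an immediate "start" transition and an exponentially timed "finish" transition under race semantics. It is a plain discrete-event simulation, not a GSPN reachability analysis, and the rates and place names are illustrative only.

```python
import random

def simulate(n_parts=1000, rate_proc=1.0, seed=1):
    """Simulate a minimal stochastic Petri net: a machine takes parts from
    an input buffer, processes them with exponential service time, and
    deposits them in an output buffer."""
    random.seed(seed)
    marking = {"in_buf": n_parts, "busy": 0, "out_buf": 0}
    t = 0.0
    while marking["out_buf"] < n_parts:
        if marking["busy"] == 0 and marking["in_buf"] > 0:
            # 'start' is an immediate transition: fires with zero delay.
            marking["in_buf"] -= 1
            marking["busy"] += 1
        else:
            # 'finish' is a timed transition with an exponential delay.
            t += random.expovariate(rate_proc)
            marking["busy"] -= 1
            marking["out_buf"] += 1
    return t, marking

total_time, final = simulate()
print(round(total_time / 1000, 3))  # mean time per part, near 1/rate_proc
```

A true GSPN analysis would instead generate the reachable markings and solve the embedded Markov chain, as the abstract notes; the simulation view is the complementary way to estimate the same throughput.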
You are such a bad child! Appraisals as mechanisms of parental negative and positive affect.
Gavita, Oana Alexandra; David, Daniel; DiGiuseppe, Raymond
2014-01-01
Although parent cognitions are considered important predictors that determine specific emotional reactions and parental practices, models of the cognitive strategies for regulating parental distress or positive emotions are not well developed. Our aim was to investigate the nature of cognitions involved in parental distress and satisfaction, in terms of their specificity (parental or general) and their processing levels (inferential or evaluative cognitions). We hypothesized that parents' specific evaluative cognitions would mediate the impact of more general and inferential cognitive structures on their affective reactions. We used bootstrapping procedures to test the proposed mediation models. The results show that specific evaluative parental cognitions indeed mediate the relationship between general cognitions and parental distress. In terms of cognitive processing levels, it seems that when parents hold both low self-efficacy and negative global evaluations of the self/child, this adds significantly to their distress.
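The bootstrapped mediation test used in studies like this one can be sketched generically. The code below estimates the indirect (a*b) effect with ordinary least squares and a percentile bootstrap confidence interval; the simulated data and path coefficients are hypothetical, not the study's.

```python
import random, statistics

def cov(a, b):
    ma, mb = statistics.fmean(a), statistics.fmean(b)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / (len(a) - 1)

def indirect_effect(x, m, y):
    """a*b indirect effect: a from M ~ X, b from Y ~ M + X (both OLS)."""
    a = cov(x, m) / cov(x, x)
    det = cov(m, m) * cov(x, x) - cov(x, m) ** 2
    b = (cov(m, y) * cov(x, x) - cov(x, y) * cov(x, m)) / det
    return a * b

def bootstrap_ci(x, m, y, n_boot=1000, seed=0):
    """Percentile bootstrap CI for the indirect effect."""
    random.seed(seed)
    n, effects = len(x), []
    for _ in range(n_boot):
        idx = [random.randrange(n) for _ in range(n)]
        effects.append(indirect_effect([x[i] for i in idx],
                                       [m[i] for i in idx],
                                       [y[i] for i in idx]))
    effects.sort()
    return effects[int(0.025 * n_boot)], effects[int(0.975 * n_boot)]

# Simulated data with a true indirect path X -> M -> Y (a=0.5, b=0.7).
random.seed(42)
X = [random.gauss(0, 1) for _ in range(300)]
M = [0.5 * xi + random.gauss(0, 1) for xi in X]
Y = [0.7 * mi + 0.2 * xi + random.gauss(0, 1) for xi, mi in zip(X, M)]
lo, hi = bootstrap_ci(X, M, Y)
print(lo > 0)  # a CI excluding zero indicates mediation
```

A structural equation model would estimate the same paths simultaneously with latent variables; the bootstrap logic for the indirect effect is the same.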
ERIC Educational Resources Information Center
Brethower, Karen S.; Rummler, Geary A.
1979-01-01
Presents general systems models (ballistic system, guided system, and adaptive system) and an evaluation matrix to help in examining training evaluation alternatives and in deciding what evaluation is appropriate. Includes some guidelines for conducting evaluation studies using four designs (control group, reversal, multiple baseline, and…
ERIC Educational Resources Information Center
Mechling, Linda C.; Ayres, Kevin M.; Foster, Ashley L.; Bryant, Kathryn J.
2015-01-01
The purpose of this study was to evaluate the ability of four high school-aged students with a diagnosis of autism spectrum disorder and moderate intellectual disability to generalize performance of skills when using materials different from those presented through video models. An adapted alternating treatments design was used to evaluate student…
Relevance of the c-statistic when evaluating risk-adjustment models in surgery.
Merkow, Ryan P; Hall, Bruce L; Cohen, Mark E; Dimick, Justin B; Wang, Edward; Chow, Warren B; Ko, Clifford Y; Bilimoria, Karl Y
2012-05-01
The measurement of hospital quality based on outcomes requires risk adjustment. The c-statistic is a popular tool used to judge model performance, but can be limited, particularly when evaluating specific operations in focused populations. Our objectives were to examine the interpretation and relevance of the c-statistic when used in models with increasingly similar case mix and to consider an alternative perspective on model calibration based on a graphical depiction of model fit. From the American College of Surgeons National Surgical Quality Improvement Program (2008-2009), patients were identified who underwent a general surgery procedure, and procedure groups were increasingly restricted: colorectal-all, colorectal-elective cases only, and colorectal-elective cancer cases only. Mortality and serious morbidity outcomes were evaluated using logistic regression-based risk adjustment, and model c-statistics and calibration curves were used to compare model performance. During the study period, 323,427 general, 47,605 colorectal-all, 39,860 colorectal-elective, and 21,680 colorectal cancer patients were studied. Mortality ranged from 1.0% in general surgery to 4.1% in the colorectal-all group, and serious morbidity ranged from 3.9% in general surgery to 12.4% in the colorectal-all procedural group. As case mix was restricted, c-statistics progressively declined from the general to the colorectal cancer surgery cohorts for both mortality and serious morbidity (mortality: 0.949 to 0.866; serious morbidity: 0.861 to 0.668). Calibration was evaluated graphically by examining predicted vs observed number of events over risk deciles. For both mortality and serious morbidity, there was no qualitative difference in calibration identified between the procedure groups. 
In the present study, we demonstrate how the c-statistic can become less informative and, in certain circumstances, can lead to incorrect model-based conclusions as case mix is restricted and patients become more homogeneous. Although it remains an important tool, caution is advised when the c-statistic is advanced as the sole measure of model performance. Copyright © 2012 American College of Surgeons. All rights reserved.
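The c-statistic discussed above is simply the concordance probability, which can be computed directly. The sketch below uses small hypothetical risk vectors; restricting the cohort so that predicted risks cluster together shrinks the separation between cases and non-cases, which is exactly why the c-statistic declines as case mix narrows.

```python
def c_statistic(risk, outcome):
    """Concordance: probability that a randomly chosen event case has a
    higher predicted risk than a randomly chosen non-event case
    (ties count one half)."""
    events = [r for r, o in zip(risk, outcome) if o == 1]
    nonevents = [r for r, o in zip(risk, outcome) if o == 0]
    concordant = sum((e > n) + 0.5 * (e == n)
                     for e in events for n in nonevents)
    return concordant / (len(events) * len(nonevents))

# Hypothetical heterogeneous cohort: risks spread widely.
risk = [0.01, 0.02, 0.05, 0.30, 0.60, 0.90]
outcome = [0, 0, 1, 0, 1, 1]
print(round(c_statistic(risk, outcome), 3))
```

If all predicted risks were nearly identical, as in a highly restricted case mix, the same function would return a value near 0.5 even for a well-calibrated model, which is the paper's central caution.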
Hallinan, Christine M
2010-01-01
In this paper, program logic is used to 'map out' the planning, development and evaluation of the general practice Pap nurse program in the Australian general practice arena. The incorporation of program logic into the evaluative process supports a greater appreciation of the theoretical assumptions and external influences that underpin general practice Pap nurse activity. The creation of a program logic model is a conscious strategy that results in an explicit understanding of the challenges ahead, the resources available and the time frames for outcomes. Program logic also enables recognition that all players in the general practice arena need to be acknowledged by policy makers, bureaucrats and program designers when addressing, through policy, issues relating to the equity and accessibility of health initiatives. Logic modelling allows decision makers to consider the complexities of causal associations when developing health care proposals and programs. It enables the Pap nurse in general practice program to be represented diagrammatically by linking outcomes (short, medium and long term) with both the program activities and program assumptions. The research methodology used in the evaluation of the Pap nurse in general practice program includes a descriptive study design and the incorporation of program logic, with a retrospective analysis of Australian data from 2001 to 2009. For the purposes of gaining both empirical and contextual data for this paper, a data set analysis and literature review were performed. The application of program logic as an evaluative tool for analysis of the Pap PN incentive program facilitates a greater understanding of complex general practice activity triggers, and allows that understanding to be incorporated into policy to facilitate Pap PN activity, increase general practice cervical smear rates and ultimately decrease the burden of disease.
DOT National Transportation Integrated Search
2001-03-05
A systems modeling approach is presented for assessment of harm in the automotive accident environment. The methodology is presented in general form and then applied to evaluate vehicle aggressivity in frontal crashes. The methodology consists of par...
2010-01-01
the 1/8° climatological monthly mean temperature and salinity fields from the Generalized Digital Environmental Model (GDEM) climatology (NAVOCEANO…1424. NAVOCEANO, 2003. Database description for the Generalized Digital Environmental Model (GDEM-V) Version 3.0. OAML-DBD-72
Alexandrowicz, Rainer W; Friedrich, Fabian; Jahn, Rebecca; Soulier, Nathalie
2015-01-01
The present study compares the 30-, 20-, and 12-item versions of the General Health Questionnaire (GHQ) in the original coding and four different recoding schemes (Bimodal, Chronic, Modified Likert and a newly proposed Modified Chronic) with respect to their psychometric qualities. The dichotomized versions (i.e. Bimodal, Chronic and Modified Chronic) were evaluated with the Rasch Model, and the polytomous original version and the Modified Likert version were evaluated with the Partial Credit Model. In general, the versions under consideration showed agreement with the model assumptions. However, the recoded versions exhibited some deficits with respect to the Outfit index. Because of the item deficits and for theoretical reasons, we argue in favor of using any of the three length versions with the original four-category coding scheme. Nevertheless, any of the versions appears apt for clinical use from a psychometric perspective.
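The recoding schemes compared above map each GHQ item's four response categories to item scores. The sketch below shows the three classical schemes as commonly published; the "modified" variants evaluated in the study may differ from these standard forms, so treat this purely as an illustration of how the recodings change total scores.

```python
# Common GHQ item recodings: response category (0-3) -> item score.
# The study's "Modified Likert" and "Modified Chronic" variants are not
# reproduced here; these are the standard published schemes.
SCHEMES = {
    "likert":  {0: 0, 1: 1, 2: 2, 3: 3},  # polytomous; Partial Credit Model
    "bimodal": {0: 0, 1: 0, 2: 1, 3: 1},  # dichotomous; Rasch Model
    "chronic": {0: 0, 1: 1, 2: 1, 3: 1},  # counts any endorsement
}

def score(responses, scheme):
    """Total score for one respondent under a given recoding scheme."""
    table = SCHEMES[scheme]
    return sum(table[r] for r in responses)

resp = [0, 1, 2, 3, 1, 2]  # hypothetical responses to six items
for name in SCHEMES:
    print(name, score(resp, name))
```

The same response vector yields different totals (and different dichotomous/polytomous structure) under each scheme, which is why each recoding calls for a different measurement model.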
Using Survival Analysis to Improve Estimates of Life Year Gains in Policy Evaluations.
Meacock, Rachel; Sutton, Matt; Kristensen, Søren Rud; Harrison, Mark
2017-05-01
Policy evaluations taking a lifetime horizon have converted estimated changes in short-term mortality to expected life year gains using general population life expectancy. However, the life expectancy of the affected patients may differ from that of the general population. In trials, survival models are commonly used to extrapolate life year gains. The objective was to demonstrate the feasibility and materiality of using parametric survival models to extrapolate future survival in health care policy evaluations. We used our previous cost-effectiveness analysis of a pay-for-performance program as a motivating example. We first used the cohort of patients admitted prior to the program to compare 3 methods for estimating remaining life expectancy. We then used a difference-in-differences framework to estimate the life year gains associated with the program using general population life expectancy and survival models. Patient-level data from Hospital Episode Statistics were used for patients admitted to hospitals in England for pneumonia between 1 April 2007 and 31 March 2008 and between 1 April 2009 and 31 March 2010, and linked to death records for the period from 1 April 2007 to 31 March 2011. In our cohort of patients, using parametric survival models rather than general population life expectancy figures reduced the estimated mean life years remaining by 30% (9.19 v. 13.15 years, respectively). However, the estimated mean life year gains associated with the program are larger using survival models (0.380 years) compared with general population life expectancy (0.154 years). Using general population life expectancy to estimate the impact of health care policies can overestimate life expectancy but underestimate the impact of policies on life year gains. Using a longer follow-up period improved the accuracy of estimated survival and program impact considerably.
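The contrast above between general population life expectancy and model-based extrapolation can be illustrated with the simplest parametric choice, an exponential survival model, whose censoring-adjusted maximum-likelihood hazard is the number of deaths divided by total follow-up time. The follow-up data below are hypothetical, and a real analysis (like the study's) would compare several parametric families, e.g. Weibull or Gompertz, before extrapolating.

```python
def exp_mle_rate(times, events):
    """MLE hazard for an exponential model with right-censoring:
    lambda = (number of deaths) / (total follow-up time)."""
    return sum(events) / sum(times)

def mean_life_years(rate):
    """Mean of an exponential survival distribution."""
    return 1.0 / rate

# Hypothetical follow-up (years) and death indicators for admitted patients;
# a time of 4.0 with event 0 means censored at the end of follow-up.
times  = [0.2, 1.5, 3.0, 4.0, 4.0, 4.0, 0.8, 2.5, 4.0, 4.0]
events = [1,   1,   1,   0,   0,   0,   1,   1,   0,   0]
rate = exp_mle_rate(times, events)
print(round(mean_life_years(rate), 2))  # model-based remaining life years
```

Comparing this model-based figure with a general population life-table expectancy for the same patients is exactly the 30% discrepancy the study quantifies; the difference-in-differences step then converts a mortality change into life year gains.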
NASA Astrophysics Data System (ADS)
Samanta, Rome; Chakraborty, Mainak; Ghosal, Ambar
2016-03-01
We evaluate the Majorana phases for a general 3 × 3 complex symmetric neutrino mass matrix on the basis of Mohapatra-Rodejohann's phase convention using the three rephasing invariant quantities I12, I13 and I23 proposed by Sarkar and Singh. We find them interesting as they allow us to evaluate each Majorana phase in a model-independent way even if one eigenvalue is zero. Utilizing the solution of a general complex symmetric mass matrix for eigenvalues and mixing angles, we determine the Majorana phases for both hierarchies, normal and inverted, taking into account the constraints from neutrino oscillation global fit data as well as the bound on the sum of the three light neutrino masses (Σimi) and the neutrinoless double beta decay (ββ0ν) parameter |m11|. This methodology of finding the Majorana phases is thereafter applied in some predictive models for both hierarchical cases (normal and inverted) to evaluate the corresponding Majorana phases, and it is shown that all the sub-cases presented in the inverted hierarchy section can be realized in a model with texture zeros and scaling ansatz within the framework of inverse seesaw, although one of the sub-cases following the normal hierarchy is yet to be established. Except for the case of quasi-degenerate neutrinos, the methodology obtained in this work is able to evaluate the corresponding Majorana phases, given any model of neutrino masses.
NASA Astrophysics Data System (ADS)
Shiri, Jalal; Nazemi, Amir Hossein; Sadraddini, Ali Ashraf; Landeras, Gorka; Kisi, Ozgur; Fard, Ahmad Fakheri; Marti, Pau
2013-02-01
Accurate estimation of reference evapotranspiration is important for irrigation scheduling, water resources management and planning, and other agricultural water management issues. In the present paper, the capabilities of generalized neuro-fuzzy models were evaluated for estimating reference evapotranspiration using two separate sets of weather data from humid and non-humid regions of Spain and Iran. In this way, the data from some weather stations in the Basque Country and the Valencia region (Spain) were used for training the neuro-fuzzy models (in humid and non-humid regions, respectively), and subsequently the data from these regions were pooled to evaluate the generalization capability of a general neuro-fuzzy model in humid and non-humid regions. The developed models were tested at stations in Iran located in humid and non-humid regions. The obtained results showed the capability of the generalized neuro-fuzzy model to estimate reference evapotranspiration in different climatic zones. Global GNF models calibrated using both non-humid and humid data were found to successfully estimate ET0 in both non-humid and humid regions of Iran (the lowest MAE values are about 0.23 mm for non-humid Iranian regions and 0.12 mm for humid regions). The non-humid GNF models calibrated using non-humid data performed much better than the humid GNF models calibrated using humid data in the non-humid region, while the humid GNF model gave better estimates in the humid region.
Examination of a Bifactor Model of Obsessive-Compulsive Symptom Dimensions.
Olatunji, Bunmi O; Ebesutani, Chad; Abramowitz, Jonathan S
2017-01-01
Although obsessive-compulsive (OC) symptoms are observed along four dimensions (contamination, responsibility for harm, order/symmetry, and unacceptable thoughts), the structure of the dimensions remains unclear. The current study evaluated a bifactor model of OC symptoms among those with and without obsessive-compulsive disorder (OCD). The goals were (a) to evaluate if OC symptoms should be conceptualized as unidimensional or whether distinct dimensions should be interpreted and (b) to use structural equation modeling to examine the convergence of the OC dimensions above and beyond a general dimension with related criteria. Results revealed that a bifactor model fit the data well and that OC symptoms were influenced by a general dimension and by four dimensions. Measurement invariance of the bifactor model was also supported among those with and without OCD. However, the general OC dimension accounted for only half of the variability in OC symptoms, with the remaining variability accounted for by distinct dimensions. Despite evidence of multidimensionality, the dimensions were unreliable after covarying for the general OC dimension. However, the four dimensions did significantly converge with a latent OC spectrum factor above and beyond the general OC dimension. The implications of these findings for conceptualizing the structure of OCD are discussed. © The Author(s) 2015.
ERIC Educational Resources Information Center
Hannon, Peggy; Umble, Karl E.; Alexander, Lorraine; Francisco, Don; Steckler, Allan; Tudor, Gai; Upshaw, Vaughn
2002-01-01
Student evaluations of an online public health curriculum developed using the instructional models of Gagne and Laurillard indicated that students were generally satisfied with the experience. However, some dissatisfaction with the feedback and guidance they received supported Laurillard's model. Comments revealed an aspect of the learning…
ERIC Educational Resources Information Center
Allodi, Mara Westling
2010-01-01
This paper defines a broad model of the psychosocial climate in educational settings. The model was developed from a general theory of learning environments, on a theory of human values and on empirical studies of children's evaluations of their schools. The contents of the model are creativity, stimulation, achievement, self-efficacy,…
Evaluation of the CORDEX-Africa multi-RCM hindcast: systematic model errors
NASA Astrophysics Data System (ADS)
Kim, J.; Waliser, Duane E.; Mattmann, Chris A.; Goodale, Cameron E.; Hart, Andrew F.; Zimdars, Paul A.; Crichton, Daniel J.; Jones, Colin; Nikulin, Grigory; Hewitson, Bruce; Jack, Chris; Lennard, Christopher; Favre, Alice
2014-03-01
Monthly-mean precipitation, mean (TAVG), maximum (TMAX) and minimum (TMIN) surface air temperatures, and cloudiness from the CORDEX-Africa regional climate model (RCM) hindcast experiment are evaluated for model skill and systematic biases. All RCMs simulate basic climatological features of these variables reasonably, but systematic biases also occur across these models. All RCMs show higher fidelity in simulating precipitation for the west part of Africa than for the east part, and for the tropics than for northern Sahara. Interannual variation in the wet season rainfall is better simulated for the western Sahel than for the Ethiopian Highlands. RCM skill is higher for TAVG and TMAX than for TMIN, and regionally, for the subtropics than for the tropics. RCM skill in simulating cloudiness is generally lower than for precipitation or temperatures. For all variables, multi-model ensemble (ENS) generally outperforms individual models included in ENS. An overarching conclusion in this study is that some model biases vary systematically for regions, variables, and metrics, posing difficulties in defining a single representative index to measure model fidelity, especially for constructing ENS. This is an important concern in climate change impact assessment studies because most assessment models are run for specific regions/sectors with forcing data derived from model outputs. Thus, model evaluation and ENS construction must be performed separately for regions, variables, and metrics as required by specific analysis and/or assessments. Evaluations using multiple reference datasets reveal that cross-examination, quality control, and uncertainty estimates of reference data are crucial in model evaluations.
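Model-evaluation metrics of the kind used above (systematic bias, error magnitude, pattern correlation) are straightforward to compute against a reference dataset. The sketch below shows the three most common ones on hypothetical monthly precipitation values; a CORDEX-style evaluation would apply these per region, variable, season, and reference dataset, which is the paper's point about metric-dependent skill.

```python
import math, statistics

def bias(sim, obs):
    """Mean signed error: systematic over- or under-estimation."""
    return statistics.fmean(s - o for s, o in zip(sim, obs))

def rmse(sim, obs):
    """Root-mean-square error: typical error magnitude."""
    return math.sqrt(statistics.fmean((s - o) ** 2 for s, o in zip(sim, obs)))

def corr(sim, obs):
    """Pearson correlation: agreement in pattern/variability."""
    ms, mo = statistics.fmean(sim), statistics.fmean(obs)
    num = sum((s - ms) * (o - mo) for s, o in zip(sim, obs))
    den = math.sqrt(sum((s - ms) ** 2 for s in sim) *
                    sum((o - mo) ** 2 for o in obs))
    return num / den

# Hypothetical monthly precipitation (mm/day): one RCM vs a reference dataset.
obs = [2.1, 3.4, 5.0, 6.2, 4.8, 3.0]
sim = [1.8, 3.9, 4.6, 6.8, 5.1, 2.5]
print(round(bias(sim, obs), 3), round(rmse(sim, obs), 3),
      round(corr(sim, obs), 3))
```

A model can score well on one metric and poorly on another (e.g. small bias but weak correlation), which is why the paper argues against a single representative skill index.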
NASA Astrophysics Data System (ADS)
Frost, Andrew J.; Thyer, Mark A.; Srikanthan, R.; Kuczera, George
2007-07-01
Multi-site simulation of hydrological data is required for drought risk assessment of large multi-reservoir water supply systems. In this paper, a general Bayesian framework is presented for the calibration and evaluation of multi-site hydrological data at annual timescales. Models included within this framework are the hidden Markov model (HMM) and the widely used lag-1 autoregressive (AR(1)) model. These models are extended by the inclusion of a Box-Cox transformation and a spatial correlation function in a multi-site setting. Parameter uncertainty is evaluated using Markov chain Monte Carlo techniques. Models are evaluated by their ability to reproduce a range of important extreme statistics and compared using Bayesian model selection techniques which evaluate model probabilities. The case study, using multi-site annual rainfall data situated within catchments which contribute to Sydney's main water supply, provided the following results. Firstly, in terms of model probabilities and diagnostics, the inclusion of the Box-Cox transformation was preferred. Secondly, the AR(1) and HMM performed similarly, while some other proposed AR(1)/HMM models with regionally pooled parameters had greater posterior probability than these two models. The practical significance of parameter and model uncertainty was illustrated using a case study involving drought security analysis for urban water supply. It was shown that ignoring parameter uncertainty resulted in a significant overestimate of reservoir yield and an underestimation of system vulnerability to severe drought.
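The two model components added in the framework above, the Box-Cox transformation and a lag-1 autoregressive structure, are easy to demonstrate. The sketch below is purely illustrative (forward simulation rather than Bayesian calibration, and single-site rather than multi-site): it applies the Box-Cox transform and checks that a simulated AR(1) series recovers its lag-1 autocorrelation.

```python
import math, random, statistics

def boxcox(x, lam):
    """Box-Cox transform; reduces skewness before fitting the AR(1)."""
    return (x ** lam - 1) / lam if lam != 0 else math.log(x)

def simulate_ar1(n, phi, mu, sigma, seed=7):
    """Simulate a lag-1 autoregressive series (in the transformed domain)."""
    random.seed(seed)
    z = [mu]
    for _ in range(n - 1):
        z.append(mu + phi * (z[-1] - mu) + random.gauss(0, sigma))
    return z

print(round(boxcox(100.0, 0.5), 2))  # (sqrt(100) - 1) / 0.5 = 18.0

series = simulate_ar1(5000, phi=0.3, mu=10.0, sigma=1.0)
m = statistics.fmean(series)
num = sum((a - m) * (b - m) for a, b in zip(series, series[1:]))
den = sum((a - m) ** 2 for a in series)
print(round(num / den, 2))  # sample lag-1 autocorrelation, near phi = 0.3
```

In the paper's Bayesian setting, phi, the Box-Cox parameter, and the spatial correlation are all sampled jointly by MCMC rather than fixed, which is what makes the parameter-uncertainty comparison possible.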
Impact of acute care surgery to departmental productivity.
Barnes, Stephen L; Cooper, Christopher J; Coughenour, Jeffrey P; MacIntyre, Allan D; Kessel, James W
2011-10-01
The face of trauma surgery is rapidly evolving with a paradigm shift toward acute care surgery (ACS). The formal development of ACS has been viewed by some general surgeons as a threat to their practice. We sought to evaluate the impact of a new division of ACS on both departmental productivity and provider satisfaction at a University Level I Trauma Center. Two-year retrospective analysis of annual work relative value unit (wRVU) productivity, operative volume, and FTEs before and after establishment of an ACS division at a University Level I trauma center. Provider satisfaction was measured using a 10-point scale. Analysis was completed using Microsoft Excel, with a p value less than 0.05 considered significant. The change to an ACS model resulted in a 94% increase in total wRVU production (78% evaluation and management, 122% operative; p<0.05) for ACS, whereas general surgery wRVU production increased 8% (-15% evaluation and management, 14% operative; p<0.05). Operative productivity was substantial after the transition to ACS, with 129% and 44% increases (p<0.05) in operative and elective case loads, respectively. The decline in overall general surgery operative volume was attributed to a reduction in emergent cases. Establishment of the ACS model necessitated one additional FTE. Job satisfaction substantially improved with the ACS model while allowing general surgery a more focused practice. The ACS practice model significantly enhances provider productivity and job satisfaction when compared with trauma alone. Fears of a productivity impact to the nontrauma general surgeon were not realized.
Raymond L. Czaplewski
1973-01-01
A generalized, non-linear population dynamics model of an ecosystem is used to investigate the direction of selective pressures upon a mutant by studying the competition between parent and mutant populations. The model has the advantages of considering selection as operating on the phenotype, of retaining the interaction of the mutant population with the ecosystem as a...
NASA Astrophysics Data System (ADS)
Walker, Ernest; Chen, Xinjia; Cooper, Reginald L.
2010-04-01
An arbitrarily accurate approach is used to determine the bit-error rate (BER) performance of generalized asynchronous DS-CDMA systems in Gaussian noise with Rayleigh fading. In this paper and its sequel, new theoretical work is contributed that substantially enhances existing performance analysis formulations. Major contributions include substantial reduction of computational complexity, including a priori BER accuracy bounding, and an analytical approach that facilitates performance evaluation for systems with arbitrary spectral spreading distributions and non-uniform transmission delay distributions. Using prior results, augmented by these enhancements, a generalized DS-CDMA system model is constructed and used to evaluate BER performance in a variety of scenarios. In this paper, the generalized system model was used to evaluate the performance of both Walsh-Hadamard (WH) and Walsh-Hadamard-seeded zero-correlation-zone (WH-ZCZ) coding. The selection of these codes was informed by the observation that WH codes contain N spectral spreading values (0 to N - 1), one for each code sequence, while WH-ZCZ codes contain only two spectral spreading values (N/2 - 1, N/2), where N is the sequence length in chips. Since these codes span the spectral spreading range for DS-CDMA coding, by invoking an induction argument, the generalization of the system model is sufficiently supported. The results in this paper and the sequel support the claim that an arbitrarily accurate performance analysis for DS-CDMA systems can be evaluated over the full range of binary coding with minimal computational complexity.
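As a baseline sanity check for the kind of BER evaluation described above, the closed-form single-user BPSK error rates in AWGN and in flat Rayleigh fading are sketched below. These are textbook formulas, far simpler than the paper's generalized asynchronous multi-user DS-CDMA analysis, and are included only to illustrate how fading degrades BER at a given mean Eb/N0.

```python
import math

def ber_bpsk_awgn(eb_n0):
    """BPSK bit-error rate in AWGN: Q(sqrt(2*Eb/N0)) = 0.5*erfc(sqrt(Eb/N0))."""
    return 0.5 * math.erfc(math.sqrt(eb_n0))

def ber_bpsk_rayleigh(eb_n0):
    """BPSK bit-error rate averaged over flat Rayleigh fading,
    with eb_n0 the mean Eb/N0."""
    return 0.5 * (1 - math.sqrt(eb_n0 / (1 + eb_n0)))

# BER falls exponentially in AWGN but only inversely under Rayleigh fading.
for db in (0, 10, 20):
    g = 10 ** (db / 10)  # mean Eb/N0 as a linear ratio
    print(db, ber_bpsk_awgn(g), ber_bpsk_rayleigh(g))
```

Multi-user DS-CDMA analysis adds multiple-access interference on top of these baselines, which is where the computational complexity the paper attacks comes from.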
NASA Astrophysics Data System (ADS)
Wei, Y.; Chen, X.
2017-12-01
We present a first description and evaluation of the IAP Atmospheric Aerosol Chemistry Model (IAP-AACM), which has been integrated into the earth system model CAS-ESM. This makes it possible to study the interaction of clouds and aerosols through two-way coupling with the IAP Atmospheric General Circulation Model (IAP-AGCM). The model has a nested global-regional grid based on the Global Environmental Atmospheric Transport Model (GEATM) and the Nested Air Quality Prediction Modeling System (NAQPMS). The AACM provides two optional gas chemistry schemes: the CBM-Z gas chemistry and a sulfur oxidation box scheme designed specifically for the CAS-ESM. The model, driven by the AGCM, has been applied to a 1-year simulation of tropospheric chemistry on both global and regional scales for 2014 and evaluated against various observational datasets, including aerosol precursor gas concentrations and aerosol mass and number concentrations. Furthermore, global budgets in the AACM are compared with those of other global aerosol models. Generally, the AACM simulations are within the range of other global aerosol model predictions, and the model shows reasonable agreement with observed gas and particle concentrations on both global and regional scales.
NASA Astrophysics Data System (ADS)
Sakuma, Jun; Wright, Rebecca N.
Privacy-preserving classification is the task of learning or training a classifier on the union of privately distributed datasets without sharing the datasets. The emphasis of existing studies in privacy-preserving classification has primarily been on the design of privacy-preserving versions of particular data mining algorithms. However, in classification problems, preprocessing and postprocessing (such as model selection or attribute selection) play a prominent role in achieving higher classification accuracy. In this paper, we show that the generalization error of classifiers in privacy-preserving classification can be securely evaluated without sharing prediction results. Our main technical contribution is a new generalized Hamming distance protocol that is universally applicable to the preprocessing and postprocessing of various privacy-preserving classification problems, such as model selection in support vector machines and attribute selection in naive Bayes classification.
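The quantity the protocol above evaluates securely is just the Hamming distance between a prediction vector and the true labels, which for 0/1 labels equals the misclassification count. The plain, insecure computation of that same quantity looks like this (the vectors are hypothetical); the paper's contribution is computing it without either party revealing its vector.

```python
def hamming(a, b):
    """Number of positions at which two equal-length vectors disagree.
    For 0/1 class labels this is the misclassification count, so the
    generalization (test) error is hamming(preds, truth) / len(truth)."""
    return sum(x != y for x, y in zip(a, b))

preds = [1, 0, 1, 1, 0, 1, 0, 0]  # classifier outputs on held-out data
truth = [1, 0, 0, 1, 0, 1, 1, 0]  # true labels held by the other party
print(hamming(preds, truth) / len(truth))  # 0.25 test-set error
```

For model selection, each candidate model's securely computed error can then be compared and the minimizer chosen, without any party learning the individual predictions.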
Advanced Actuation Systems Development. Volume 2
1989-08-01
and unloaded performance characteristics of a test specimen produced by General Dynamics Corporation as a feasibility model. The actuation system for changing the camber of the test specimen is unique and was evaluated with a series of input/output measurements. The testing verified the general…
Implementation of a Smeared Crack Band Model in a Micromechanics Framework
NASA Technical Reports Server (NTRS)
Pineda, Evan J.; Bednarcyk, Brett A.; Waas, Anthony M.; Arnold, Steven M.
2012-01-01
The smeared crack band theory is implemented within the generalized method of cells and high-fidelity generalized method of cells micromechanics models to capture progressive failure within the constituents of a composite material while retaining objectivity with respect to the size of the discretization elements used in the model. A repeating unit cell containing 13 randomly arranged fibers is modeled and subjected to a combination of transverse tension/compression and transverse shear loading. The implementation is verified against experimental data (where available) and an equivalent finite element model utilizing the same implementation of the crack band theory. To evaluate the performance of the crack band theory within a repeating unit cell that is more amenable to a multiscale implementation, a single fiber is modeled with the generalized method of cells and high-fidelity generalized method of cells using a relatively coarse subcell mesh, which is subjected to the same loading scenarios as the multiple-fiber repeating unit cell. The generalized method of cells and high-fidelity generalized method of cells models are validated against a very refined finite element model.
Investigating Island Evolution: A Galapagos-Based Lesson Using the 5E Instructional Model.
ERIC Educational Resources Information Center
DeFina, Anthony V.
2002-01-01
Introduces an inquiry-based lesson plan on evolution and the Galapagos Islands. Uses the 5E instructional model which includes phases of engagement, exploration, explanation, elaboration, and evaluation. Includes information on species for exploration and elaboration purposes, and a general rubric for student evaluation. (YDS)
ERIC Educational Resources Information Center
Marston, Doug; Muyskens, Paul; Lau, Matthew; Canter, Andrea
2003-01-01
This article describes the problem-solving model (PSM) used in the Minneapolis Public Schools to guide decisions regarding intervention in general education, special education referral, and evaluation for special education eligibility for high-incidence disabilities. Program evaluation indicates students received special education services earlier…
Decision Support Tool Evaluation Report for General NOAA Oil Modeling Environment(GNOME) Version 2.0
NASA Technical Reports Server (NTRS)
Spruce, Joseph P.; Hall, Callie; Zanoni, Vicki; Blonski, Slawomir; D'Sa, Eurico; Estep, Lee; Holland, Donald; Moore, Roxzana F.; Pagnutti, Mary; Terrie, Gregory
2004-01-01
NASA's Earth Science Applications Directorate evaluated the potential of NASA remote sensing data and modeling products to enhance the General NOAA Oil Modeling Environment (GNOME) decision support tool. NOAA's Office of Response and Restoration (OR&R) Hazardous Materials (HAZMAT) Response Division is interested in enhancing GNOME with near-realtime (NRT) NASA remote sensing products on oceanic winds and ocean circulation. The NASA SeaWinds sea surface wind and Jason-1 sea surface height NRT products have potential, as do sea surface temperature and reflectance products from the Moderate Resolution Imaging Spectroradiometer and sea surface reflectance products from Landsat and the Advanced Spaceborne Thermal Emission and Reflectance Radiometer. HAZMAT is also interested in the Advanced Circulation model and the Ocean General Circulation Model. Certain issues must be considered, including lack of data continuity, marginal data redundancy, and data formatting problems. Spatial resolution is an issue for near-shore GNOME applications. Additional work will be needed to incorporate NASA inputs into GNOME, including verification and validation of data products, algorithms, models, and NRT data.
Modelling and economic evaluation of forest biome shifts under climate change in Southwest Germany
Marc Hanewinkel; Susan Hummel; Dominik Cullmann
2010-01-01
We evaluated the economic effects of a predicted shift from Norway spruce (Picea abies) to European beech (Fagus sylvatica) for a forest area of 1.3 million ha in southwest Germany. The shift was modelled with a generalized linear model (GLM) by using presence/absence data from the National Forest Inventory in Baden-Wurttemberg...
ERIC Educational Resources Information Center
Boie, Ioana; Lopez, Anna L.; Sass, Daniel A.
2013-01-01
This study evaluated a model linking internalization and dieting behaviors in a sample ("n" = 499) of Latina/o and White college students. Analyses revealed that the scales were invariant across ethnic and gender groups and generally supported the invariance of the proposed model across these groups. Analyses also revealed no ethnic mean…
Training Evaluation: An Analysis of the Stakeholders' Evaluation Needs
ERIC Educational Resources Information Center
Guerci, Marco; Vinante, Marco
2011-01-01
Purpose: In recent years, the literature on program evaluation has examined multi-stakeholder evaluation, but training evaluation models and practices have not generally taken this problem into account. The aim of this paper is to fill this gap. Design/methodology/approach: This study identifies intersections between methodologies and approaches…
Cairney, John; Hay, John A; Faught, Brent E; Wade, Terrance J; Corna, Laurie; Flouris, Andreas
2005-10-01
To test a theoretical model linking developmental coordination disorder (DCD) to reduced physical activity (PA) through the mediating influence of generalized self-efficacy regarding PA. This was a cross-sectional investigation of students in grades 4 through 8 from 5 elementary schools in the Niagara region of Ontario, Canada (n=590). Motor proficiency was evaluated using the short-form Bruininks-Oseretsky Test of Motor Proficiency. Generalized self-efficacy was assessed using the Children's Self-Perceptions of Adequacy in and Predilection for Physical Activity scale, and PA levels were evaluated using a 61-item Participation Questionnaire. Structural equation modeling was used to test the influence of generalized self-efficacy on the relationship between DCD and PA. In this sample, 7.5% (n=44) of the children met the requirements for probable DCD. The effect of DCD on PA was mediated by generalized self-efficacy. In this model, 28% of the variance in children's PA was predicted by generalized self-efficacy and DCD. Our results suggest that children with DCD are less likely to be physically active and that generalized self-efficacy can account for a considerable proportion of this relationship. The implications for appropriate interventions to increase PA among children with DCD are discussed.
Watershed scale response to climate change--Yampa River Basin, Colorado
Hay, Lauren E.; Battaglin, William A.; Markstrom, Steven L.
2012-01-01
General Circulation Model simulations of future climate through 2099 project a wide range of possible scenarios. To determine the sensitivity and potential effect of long-term climate change on the freshwater resources of the United States, the U.S. Geological Survey Global Change study, "An integrated watershed scale response to global change in selected basins across the United States", was started in 2008. The long-term goal of this national study is to provide the foundation for hydrologically based climate change studies across the nation. Fourteen basins for which the Precipitation Runoff Modeling System has been calibrated and evaluated were selected as study sites. The Precipitation Runoff Modeling System is a deterministic, distributed-parameter watershed model developed to evaluate the effects of various combinations of precipitation, temperature, and land use on streamflow and general basin hydrology. Output from five General Circulation Model simulations and four emission scenarios was used to develop an ensemble of climate-change scenarios for each basin. These ensembles were simulated with the corresponding Precipitation Runoff Modeling System model. This fact sheet summarizes the hydrologic effect and sensitivity of the Precipitation Runoff Modeling System simulations to climate change for the Yampa River Basin at Steamboat Springs, Colorado.
ERIC Educational Resources Information Center
Scher, Julian M.
2010-01-01
Information Systems instructors are generally encouraged to introduce team projects into their pedagogy, with a consequential issue of objectively evaluating the performance of each individual team member. The concept of "freeloading" is well-known for team projects, and for this, and other reasons, a peer review process of team members,…
A risk evaluation model and its application in online retailing trustfulness
NASA Astrophysics Data System (ADS)
Ye, Ruyi; Xu, Yingcheng
2017-08-01
Building a general risk evaluation model in advance can improve the convenience, consistency, and comparability of repeated risk evaluations when those evaluations cover the same area and serve a similar purpose. One of the most convenient and common forms of risk evaluation model is an index system, consisting of several indexes, their corresponding weights, and a crediting method. This article proposes a method for building a risk evaluation index system that guarantees a proportional relationship between the resulting credit and the expected risk loss, and provides an application example from online retailing.
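A minimal sketch of how such an index system could produce a credit: each index receives a normalized grade and a weight, and the overall credit is the weighted sum. The index names, grades, and weights below are hypothetical illustrations, not values from the article.

```python
# Hypothetical index-system risk credit: grades in [0, 1], weights summing
# to 1. If each grade is proportional to that index's expected risk loss,
# the weighted-sum credit stays proportional to the total expected loss.

def risk_credit(grades, weights):
    """Weighted-sum credit for an index system.

    grades  -- dict of index name -> normalized grade in [0, 1]
    weights -- dict of index name -> weight (weights must sum to 1)
    """
    if abs(sum(weights.values()) - 1.0) > 1e-9:
        raise ValueError("weights must sum to 1")
    return sum(weights[k] * grades[k] for k in weights)

# Illustrative indexes for an online-retailing trust evaluation:
grades = {"delivery_reliability": 0.9, "payment_security": 0.8, "complaint_handling": 0.4}
weights = {"delivery_reliability": 0.3, "payment_security": 0.5, "complaint_handling": 0.2}
print(risk_credit(grades, weights))  # 0.75
```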
DOE Office of Scientific and Technical Information (OSTI.GOV)
van Rij, Jennifer A; Yu, Yi-Hsiang; Guo, Yi
This study explores and verifies the generalized body-modes method for evaluating the structural loads on a wave energy converter (WEC). Historically, WEC design methodologies have focused primarily on accurately evaluating hydrodynamic loads, while methodologies for evaluating structural loads have yet to be fully considered and incorporated into the WEC design process. As wave energy technologies continue to advance, however, it has become increasingly evident that an accurate evaluation of the structural loads will enable an optimized structural design, as well as the potential utilization of composites and flexible materials, and hence reduce WEC costs. Although there are many computational fluid dynamics, structural analyses and fluid-structure-interaction (FSI) codes available, the application of these codes is typically too computationally intensive to be practical in the early stages of the WEC design process. The generalized body-modes method, however, is a reduced order, linearized, frequency-domain FSI approach, performed in conjunction with the linear hydrodynamic analysis, with computation times that could realistically be incorporated into the WEC design process. The objective of this study is to verify the generalized body-modes approach in comparison to high-fidelity FSI simulations to accurately predict structural deflections and stress loads in a WEC. Two verification cases are considered, a free-floating barge and a fixed-bottom column. Details for both the generalized body-modes models and FSI models are first provided. Results for each of the models are then compared and discussed. Finally, based on the verification results obtained, future plans for incorporating the generalized body-modes method into the WEC simulation tool, WEC-Sim, and the overall WEC design process are discussed.
Durrett, Christine; Trull, Timothy J
2005-09-01
Two personality models are compared regarding their relationship with personality disorder (PD) symptom counts and with lifetime Axis I diagnoses. These models share 5 similar domains, and the Big 7 model also includes 2 domains assessing self-evaluation: positive and negative valence. The Big 7 model accounted for more variance in PDs than the 5-factor model, primarily because of the association of negative valence with most PDs. Although low-positive valence was associated with most Axis I diagnoses, the 5-factor model generally accounted for more variance in Axis I diagnoses than the Big 7 model. Some predicted associations between self-evaluation and psychopathology were not found, and unanticipated associations emerged. These findings are discussed regarding the utility of evaluative terms in clinical assessment.
NASA Technical Reports Server (NTRS)
Pineda, Evan J.; Bednarcyk, Brett A.; Waas, Anthony M.; Arnold, Steven M.
2012-01-01
The smeared crack band theory is implemented within the generalized method of cells and high-fidelity generalized method of cells micromechanics models to capture progressive failure within the constituents of a composite material while retaining objectivity with respect to the size of the discretization elements used in the model. A repeating unit cell containing 13 randomly arranged fibers is modeled and subjected to a combination of transverse tension/compression and transverse shear loading. The implementation is verified against experimental data (where available) and an equivalent finite element model utilizing the same implementation of the crack band theory. To evaluate the performance of the crack band theory within a repeating unit cell that is more amenable to a multiscale implementation, a single fiber is modeled with generalized method of cells and high-fidelity generalized method of cells using a relatively coarse subcell mesh, which is subjected to the same loading scenarios as the multiple-fiber repeating unit cell. The generalized method of cells and high-fidelity generalized method of cells models are validated against a very refined finite element model.
Cevenini, Gabriele; Barbini, Emanuela; Scolletta, Sabino; Biagioli, Bonizella; Giomarelli, Pierpaolo; Barbini, Paolo
2007-11-22
Popular predictive models for estimating morbidity probability after heart surgery are compared critically in a unitary framework. The study is divided into two parts. In the first part modelling techniques and intrinsic strengths and weaknesses of different approaches were discussed from a theoretical point of view. In this second part the performances of the same models are evaluated in an illustrative example. Eight models were developed: Bayes linear and quadratic models, k-nearest neighbour model, logistic regression model, Higgins and direct scoring systems and two feed-forward artificial neural networks with one and two layers. Cardiovascular, respiratory, neurological, renal, infectious and hemorrhagic complications were defined as morbidity. Training and testing sets each of 545 cases were used. The optimal set of predictors was chosen among a collection of 78 preoperative, intraoperative and postoperative variables by a stepwise procedure. Discrimination and calibration were evaluated by the area under the receiver operating characteristic curve and Hosmer-Lemeshow goodness-of-fit test, respectively. Scoring systems and the logistic regression model required the largest set of predictors, while Bayesian and k-nearest neighbour models were much more parsimonious. In testing data, all models showed acceptable discrimination capacities, however the Bayes quadratic model, using only three predictors, provided the best performance. All models showed satisfactory generalization ability: again the Bayes quadratic model exhibited the best generalization, while artificial neural networks and scoring systems gave the worst results. Finally, poor calibration was obtained when using scoring systems, k-nearest neighbour model and artificial neural networks, while Bayes (after recalibration) and logistic regression models gave adequate results. 
Although all the predictive models showed acceptable discrimination performance in the example considered, the Bayes and logistic regression models seemed better than the others, because they also had good generalization and calibration. The Bayes quadratic model seemed to be a convincing alternative to the much more usual Bayes linear and logistic regression models. It showed its capacity to identify a minimum core of predictors generally recognized as essential to pragmatically evaluate the risk of developing morbidity after heart surgery.
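Discrimination of the kind reported above is summarized by the area under the receiver operating characteristic curve. A minimal sketch (not the authors' code) is the rank-based form of the AUC: the probability that a randomly chosen morbidity case is scored higher than a randomly chosen non-case, with ties counting one half.

```python
# Rank-based ROC AUC: probability that a random positive case outscores a
# random negative case (ties count 0.5). Equivalent to the area under the
# ROC curve, without constructing the curve explicitly.

def roc_auc(scores, labels):
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy predicted morbidity probabilities and observed outcomes:
scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.3]
labels = [1, 1, 0, 1, 0, 0]
print(roc_auc(scores, labels))  # 8/9, one positive is outscored by one negative
```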
NASA Astrophysics Data System (ADS)
Divayana, D. G. H.; Adiarta, A.; Abadi, I. B. G. S.
2018-01-01
The aim of this research was to create an initial design of a CSE-UCLA evaluation model modified with the Weighted Product method for evaluating digital library services at Computer Colleges in Bali. The method used was the developmental research method, following the Borg and Gall design. The research conducted earlier in the month produced a rough sketch of the Weighted Product based CSE-UCLA evaluation model; this design provides a general overview of the stages of the model as used to optimize digital library services at the Computer Colleges in Bali.
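The Weighted Product scoring step that such a model relies on can be sketched as follows. The criteria, weights, and ratings below are hypothetical illustrations, not the study's data; the formula itself is the standard WP method, where each alternative's score is the product of its criterion values raised to normalized weights, with negative exponents for cost criteria.

```python
# Standard Weighted Product (WP) scoring: S_i = prod_j x_ij ** w_j, with
# weights normalized to sum to 1 and cost criteria given negative exponents.
# Relative preference is V_i = S_i / sum_k S_k.

def weighted_product(values, weights, is_cost):
    total = sum(weights)
    w = [wi / total for wi in weights]            # normalize weights
    s = 1.0
    for x, wi, cost in zip(values, w, is_cost):
        s *= x ** (-wi if cost else wi)           # cost criterion: negative exponent
    return s

# Two hypothetical digital-library services rated on (quality, speed, cost):
a = weighted_product([4, 3, 2], [5, 3, 2], [False, False, True])
b = weighted_product([3, 4, 3], [5, 3, 2], [False, False, True])
print(a / (a + b), b / (a + b))  # relative preferences V_a, V_b
```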
A General Accelerated Degradation Model Based on the Wiener Process.
Liu, Le; Li, Xiaoyang; Sun, Fuqiang; Wang, Ning
2016-12-06
Accelerated degradation testing (ADT) is an efficient tool to conduct material service reliability and safety evaluations by analyzing performance degradation data. Traditional stochastic process models are mainly for linear or linearization degradation paths. However, those methods are not applicable for the situations where the degradation processes cannot be linearized. Hence, in this paper, a general ADT model based on the Wiener process is proposed to solve the problem for accelerated degradation data analysis. The general model can consider the unit-to-unit variation and temporal variation of the degradation process, and is suitable for both linear and nonlinear ADT analyses with single or multiple acceleration variables. The statistical inference is given to estimate the unknown parameters in both constant stress and step stress ADT. The simulation example and two real applications demonstrate that the proposed method can yield reliable lifetime evaluation results compared with the existing linear and time-scale transformation Wiener processes in both linear and nonlinear ADT analyses.
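A minimal simulation sketch of a Wiener-process degradation path with a nonlinear time scale, in the spirit of the general model described above (the functional form Lambda(t) = t**b and all parameter values are illustrative assumptions, not the paper's):

```python
import random

# Wiener degradation with nonlinear time scale Lambda(t) = t**b:
# increments over [t0, t1] are Normal(mu * dLam, sigma**2 * dLam), where
# dLam = t1**b - t0**b. With b != 1 the mean path is nonlinear in t but
# linear in the transformed time Lambda(t).

def degradation_path(mu, sigma, b, t_end, n_steps, rng):
    xs, x = [0.0], 0.0
    for i in range(1, n_steps + 1):
        t0, t1 = (i - 1) * t_end / n_steps, i * t_end / n_steps
        d_lam = t1**b - t0**b
        x += mu * d_lam + sigma * (d_lam ** 0.5) * rng.gauss(0, 1)
        xs.append(x)
    return xs

rng = random.Random(42)
path = degradation_path(mu=2.0, sigma=0.3, b=1.5, t_end=4.0, n_steps=200, rng=rng)
# Crude drift estimate from the endpoint: X(T) / Lambda(T) should be near mu.
print(path[-1] / 4.0**1.5)
```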
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cimpoesu, Dorin, E-mail: cdorin@uaic.ro; Stoleriu, Laurentiu; Stancu, Alexandru
2013-12-14
We propose a generalized Stoner-Wohlfarth (SW) type model to describe various experimentally observed angular dependencies of the switching field in non-single-domain magnetic particles. Because the nonuniform magnetic states are generally characterized by complicated spin configurations with no simple analytical description, we maintain the macrospin hypothesis and phenomenologically include the effects of nonuniformities only in the anisotropy energy, preserving as much as possible the elegance of the SW model, the concept of the critical curve, and its geometric interpretation. We compare the results obtained with our model with full micromagnetic simulations in order to evaluate the performance and limits of our approach.
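For reference, the baseline that such a generalized model departs from is the classic uniform-rotation SW switching field, whose angular dependence is given by the astroid formula. The sketch below computes that standard result only; it does not implement the paper's modified anisotropy energy.

```python
import math

# Classic Stoner-Wohlfarth switching field (uniform rotation, uniaxial
# anisotropy), in units of the anisotropy field H_K = 2K/(mu0*Ms). psi is
# the angle between the applied field and the easy axis; the curve traced
# out in field space is the SW astroid.

def sw_switching_field(psi_rad):
    c = abs(math.cos(psi_rad)) ** (2.0 / 3.0)
    s = abs(math.sin(psi_rad)) ** (2.0 / 3.0)
    return (c + s) ** -1.5

print(sw_switching_field(0.0))          # 1.0: field along the easy axis
print(sw_switching_field(math.pi / 4))  # 0.5: minimum, at 45 degrees
print(sw_switching_field(math.pi / 2))  # 1.0: field along the hard axis
```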
The general situation, (but exemplified in urban areas), where a significant degree of sub-grid variability (SGV) exists in grid models poses problems when comparing gridbased air quality modeling results with observations. Typically, grid models ignore or parameterize processes ...
NASA Technical Reports Server (NTRS)
Poole, L. R.; Huckins, E. K., III
1972-01-01
A general theory on mathematical modeling of elastic parachute suspension lines during the unfurling process was developed. Massless-spring modeling of suspension-line elasticity was evaluated in detail. For this simple model, equations which govern the motion were developed and numerically integrated. The results were compared with flight test data. In most regions, agreement was satisfactory. However, poor agreement was obtained during periods of rapid fluctuations in line tension.
Quasar microlensing models with constraints on the Quasar light curves
NASA Astrophysics Data System (ADS)
Tie, S. S.; Kochanek, C. S.
2018-01-01
Quasar microlensing analyses implicitly generate a model of the variability of the source quasar. The implied source variability may be unrealistic, yet its likelihood is generally not evaluated. We used the damped random walk (DRW) model for quasar variability to evaluate the likelihood of the source variability and applied the revised algorithm to a microlensing analysis of the lensed quasar RX J1131-1231. We compared estimates of the size of the quasar disc and the average stellar mass of the lens galaxy with and without applying the DRW likelihoods for the source variability model and found no significant effect on the estimated physical parameters. The most likely explanation is that unrealistic source light-curve models are generally associated with poor microlensing fits that already make a negligible contribution to the probability distributions of the derived parameters.
Editorial: Cognitive Architectures, Model Comparison and AGI
NASA Astrophysics Data System (ADS)
Lebiere, Christian; Gonzalez, Cleotilde; Warwick, Walter
2010-12-01
Cognitive Science and Artificial Intelligence share compatible goals of understanding and possibly generating broadly intelligent behavior. In order to determine if progress is made, it is essential to be able to evaluate the behavior of complex computational models, especially those built on general cognitive architectures, and compare it to benchmarks of intelligent behavior such as human performance. Significant methodological challenges arise, however, when trying to extend approaches used to compare model and human performance from tightly controlled laboratory tasks to complex tasks involving more open-ended behavior. This paper describes a model comparison challenge built around a dynamic control task, the Dynamic Stocks and Flows. We present and discuss distinct approaches to evaluating performance and comparing models. Lessons drawn from this challenge are discussed in light of the challenge of using cognitive architectures to achieve Artificial General Intelligence.
Requirements Modeling with the Aspect-oriented User Requirements Notation (AoURN): A Case Study
NASA Astrophysics Data System (ADS)
Mussbacher, Gunter; Amyot, Daniel; Araújo, João; Moreira, Ana
The User Requirements Notation (URN) is a recent ITU-T standard that supports requirements engineering activities. The Aspect-oriented URN (AoURN) adds aspect-oriented concepts to URN, creating a unified framework that allows for scenario-based, goal-oriented, and aspect-oriented modeling. AoURN is applied to the car crash crisis management system (CCCMS), modeling its functional and non-functional requirements (NFRs). AoURN generally models all use cases, NFRs, and stakeholders as individual concerns and provides general guidelines for concern identification. AoURN handles interactions between concerns, capturing their dependencies and conflicts as well as the resolutions. We present a qualitative comparison of aspect-oriented techniques for scenario-based and goal-oriented requirements engineering. An evaluation carried out based on the metrics adapted from literature and a task-based evaluation suggest that AoURN models are more scalable than URN models and exhibit better modularity, reusability, and maintainability.
Selecting Meteorological Input for the Global Modeling Initiative Assessments
NASA Technical Reports Server (NTRS)
Strahan, Susan; Douglass, Anne; Prather, Michael; Coy, Larry; Hall, Tim; Rasch, Phil; Sparling, Lynn
1999-01-01
The Global Modeling Initiative (GMI) science team has developed a three dimensional chemistry and transport model (CTM) to evaluate the impact of the exhaust of supersonic aircraft on the stratosphere. An important goal of the GMI is to test modules for numerical transport, photochemical integration, and model dynamics within a common framework. This work is focussed on the dependence of the overall assessment on the wind and temperature fields used by the CTM. Three meteorological data sets for the stratosphere were available to GMI: the National Center for Atmospheric Research Community Climate Model (CCM2), the Goddard Earth Observing System Data Assimilation System (GEOS-DAS), and the Goddard Institute for Space Studies general circulation model (GISS-2'). Objective criteria were established by the GMI team to evaluate which of these three data sets provided the best representation of trace gases in the stratosphere today. Tracer experiments were devised to test various aspects of model transport. Stratospheric measurements of long-lived trace gases were selected as a test of the CTM transport. This presentation describes the criteria used in grading the meteorological fields and the resulting choice of wind fields to be used in the GMI assessment. This type of objective model evaluation will lead to a higher level of confidence in these assessments. We suggest that the diagnostic tests shown here be used to augment traditional general circulation model evaluation methods.
Wu, Z J; Xu, B; Jiang, H; Zheng, M; Zhang, M; Zhao, W J; Cheng, J
2016-08-20
Objective: To investigate the application of the United States Environmental Protection Agency (EPA) inhalation risk assessment model, the Singapore semi-quantitative risk assessment model, and the occupational hazards risk assessment index method to occupational health risk in enterprises using dimethylformamide (DMF) in an area of Jiangsu, China, and to put forward related risk control measures. Methods: Industries involving DMF exposure in Jiangsu Province were chosen as the evaluation objects in 2013 and three risk assessment models were used in the evaluation. EPA inhalation risk assessment model: HQ = EC/RfC; Singapore semi-quantitative risk assessment model: Risk = (HR × ER)^(1/2); occupational hazards risk assessment index = 2^(health effect level) × 2^(exposure ratio) × operation condition level. Results: The hazard quotients (HQ > 1) from the EPA inhalation risk assessment model suggested that all the workshops (dry method, wet method, and printing) and work positions (pasting, burdening, unreeling, rolling, assisting) were high risk. The Singapore semi-quantitative risk assessment model indicated that the risk levels of the dry method, wet method, and printing workshops were 3.5 (high), 3.5 (high), and 2.8 (general), and the position risk levels of pasting, burdening, unreeling, rolling, and assisting were 4 (high), 4 (high), 2.8 (general), 2.8 (general), and 2.8 (general). The occupational hazards risk assessment index method gave position risk indexes for pasting, burdening, unreeling, rolling, and assisting of 42 (high), 33 (high), 23 (middle), 21 (middle), and 22 (middle). The results of the Singapore semi-quantitative risk assessment model and the occupational hazards risk assessment index method were similar, while the EPA inhalation risk assessment model indicated that all workshops and positions were high risk.
Conclusion: The occupational hazards risk assessment index method fully considers health effects, exposure, and operating conditions and can comprehensively and accurately evaluate occupational health risk caused by DMF.
NASA Technical Reports Server (NTRS)
Miller, R. D.; Rogers, J. T.
1975-01-01
General requirements for dynamic loads analyses are described. The indicial lift growth function unsteady subsonic aerodynamic representation is reviewed, and the FLEXSTAB CPS is evaluated with respect to these general requirements. The effects of residual flexibility techniques on dynamic loads analyses are also evaluated using a simple dynamic model.
scoringRules - A software package for probabilistic model evaluation
NASA Astrophysics Data System (ADS)
Lerch, Sebastian; Jordan, Alexander; Krüger, Fabian
2016-04-01
Models in the geosciences are generally surrounded by uncertainty, and being able to quantify this uncertainty is key to good decision making. Accordingly, probabilistic forecasts in the form of predictive distributions have become popular over the last decades. With the proliferation of probabilistic models arises the need for decision-theoretically principled tools to evaluate the appropriateness of models and forecasts in a generalized way. Various scoring rules have been developed over the past decades to address this demand. Proper scoring rules are functions S(F,y) which evaluate the accuracy of a forecast distribution F, given that an outcome y was observed. As such, they make it possible to compare alternative models, a crucial ability given the variety of theories, data sources and statistical specifications that is available in many situations. This poster presents the software package scoringRules for the statistical programming language R, which contains functions to compute popular scoring rules such as the continuous ranked probability score for a variety of distributions F that come up in applied work. Two main classes are parametric distributions like normal, t, or gamma distributions, and distributions that are not known analytically, but are indirectly described through a sample of simulation draws. For example, Bayesian forecasts produced via Markov Chain Monte Carlo take this form. In this way, the scoringRules package provides a framework for generalized model evaluation that includes both Bayesian and classical parametric models. The scoringRules package aims to be a convenient dictionary-like reference for computing scoring rules. We offer state-of-the-art implementations of several known (but not routinely applied) formulas, and implement closed-form expressions that were previously unavailable. Whenever more than one implementation variant exists, we offer statistically principled default choices.
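For a normal predictive distribution, the continuous ranked probability score has a well-known closed form (the standard Gneiting-Raftery expression, of the kind scoringRules implements). It is sketched here in plain Python rather than through the R package:

```python
import math

# Closed-form CRPS for a normal forecast N(mu, sigma^2) at observation y:
# CRPS = sigma * [ z*(2*Phi(z) - 1) + 2*phi(z) - 1/sqrt(pi) ],
# where z = (y - mu)/sigma, and phi/Phi are the standard normal pdf/cdf.
# Lower is better; a proper score rewards both calibration and sharpness.

def crps_normal(mu, sigma, y):
    z = (y - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)
    cdf = 0.5 * (1 + math.erf(z / math.sqrt(2)))
    return sigma * (z * (2 * cdf - 1) + 2 * pdf - 1 / math.sqrt(math.pi))

# A perfectly centered forecast still pays for its spread:
print(crps_normal(0.0, 1.0, 0.0))  # ~0.2337
print(crps_normal(0.0, 0.5, 0.0))  # half as much: score is linear in sigma at y = mu
```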
Focus on Learning: A Report on Reorganizing General and Special Education in New York City.
ERIC Educational Resources Information Center
Fruchter, Norm; And Others
This report is the result of a year-long evaluation of special education in New York City (New York) and presents major recommendations for reorganizing general and special education. It proposes a school-based model with an integrated general/special education system, and use of an enrichment allocation from merged special and general education…
[Decision modeling for economic evaluation of health technologies].
de Soárez, Patrícia Coelho; Soares, Marta Oliveira; Novaes, Hillegonda Maria Dutilh
2014-10-01
Most economic evaluations that participate in decision-making processes for incorporation and financing of technologies of health systems use decision models to assess the costs and benefits of the compared strategies. Despite the large number of economic evaluations conducted in Brazil, there is a pressing need to conduct an in-depth methodological study of the types of decision models and their applicability in our setting. The objective of this literature review is to contribute to the knowledge and use of decision models in the national context of economic evaluations of health technologies. This article presents general definitions about models and concerns with their use; it describes the main models: decision trees, Markov chains, micro-simulation, simulation of discrete and dynamic events; it discusses the elements involved in the choice of model; and exemplifies the models addressed in national economic evaluation studies of diagnostic and therapeutic preventive technologies and health programs.
Systems of attitudes towards production in the pork industry. A cross-national study.
Sørensen, Bjarne Taulo; Barcellos, Marcia Dutra de; Olsen, Nina Veflen; Verbeke, Wim; Scholderer, Joachim
2012-12-01
Existing research on public attitudes towards agricultural production systems is largely descriptive, abstracting from the processes through which members of the general public generate their evaluations of such systems. The present paper adopts a systems perspective on such evaluations, understanding them as embedded into a wider attitude system that consists of attitudes towards objects of different abstraction levels, ranging from personal value orientations over general socio-political attitudes to evaluations of specific characteristics of agricultural production systems. It is assumed that evaluative affect propagates through the system in such a way that the system becomes evaluatively consistent and operates as a schema for the generation of evaluative judgments. In the empirical part of the paper, the causal structure of an attitude system from which people derive their evaluations of pork production systems was modelled. The analysis was based on data from a cross-national survey involving 1931 participants from Belgium, Denmark, Germany and Poland. The survey questionnaire contained measures of personal value orientations and attitudes towards environment and nature, industrial food production, food and the environment, technological progress, animal welfare, local employment and local economy. In addition, the survey included a conjoint task by which participants' evaluations of the importance of production system attributes were measured. The data were analysed by means of causal search algorithms and structural equation models. The results suggest that evaluative judgments of the importance of pork production system attributes are generated in a schematic manner, driven by personal value orientations. 
The effect of personal value orientations was strong and largely unmediated by attitudes of an intermediate level of generality, suggesting that the dependent variables in the particular attitude system that was modelled here can be understood as value judgments in a literal sense. Copyright © 2012. Published by Elsevier Ltd.
An evaluation of soil moisture models for countermine application
NASA Astrophysics Data System (ADS)
Mason, George L.
2004-09-01
The focus of this study is the evaluation of emerging soil moisture models as they apply to infrared, radar, and acoustic sensors within the scope of countermine operations. Physical, chemical, and biological processes changing the signature of the ground are considered. The available models were not run in-house but were evaluated on the basis of the theory by which they were constructed and the supporting documentation. The study was conducted between September and October of 2003 and represents a subset of existing models. The objective was to identify those models suited for simulation, define the general constraints of the models, and summarize the emerging functionalities that would support sensor modeling for mine detection.
A computer program for condensing heat exchanger performance in the presence of noncondensable gases
NASA Technical Reports Server (NTRS)
Yendler, Boris
1994-01-01
A computer model has been developed which evaluates the performance of a heat exchanger. This model is general enough to be used to evaluate many heat exchanger geometries and a number of different operating conditions. The film approach is used to describe condensation in the presence of noncondensables. The model is also easily expanded to include other effects like fog formation or suction.
ERIC Educational Resources Information Center
Lee, Linda
2011-01-01
The policy discourse on improving student achievement has shifted from student outcomes to focusing on evaluating teacher effectiveness using standardized test scores. A major urban newspaper released a public database that ranked teachers' effectiveness using Value-Added Modeling. Teachers, who are generally marginalized, were given the…
ERIC Educational Resources Information Center
Rosales, Rocío; Gongola, Leah; Homlitas, Christa
2015-01-01
A multiple baseline design across participants was used to evaluate the effects of video modeling with embedded instructions on training teachers to implement 3 preference assessments. Each assessment was conducted with a confederate learner or a child with autism during generalization probes. All teachers met the predetermined mastery criterion,…
A Formative Evaluation of the Children, Youth, and Families at Risk Coaching Model
ERIC Educational Resources Information Center
Olson, Jonathan R.; Smith, Burgess; Hawkey, Kyle R.; Perkins, Daniel F.; Borden, Lynne M.
2016-01-01
In this article, we describe the results of a formative evaluation of a coaching model designed to support recipients of funding through the Children, Youth, and Families at Risk (CYFAR) initiative. Results indicate that CYFAR coaches draw from a variety of types of coaching and that CYFAR principal investigators (PIs) are generally satisfied with…
Python tools for rapid development, calibration, and analysis of generalized groundwater-flow models
NASA Astrophysics Data System (ADS)
Starn, J. J.; Belitz, K.
2014-12-01
National-scale water-quality data sets for the United States have been available for several decades; however, groundwater models to interpret these data are available for only a small percentage of the country. Generalized models may be adequate to explain and project groundwater-quality trends at the national scale by using regional scale models (defined as watersheds at or between the HUC-6 and HUC-8 levels). Coast-to-coast data such as the National Hydrologic Dataset Plus (NHD+) make it possible to extract the basic building blocks for a model anywhere in the country. IPython notebooks have been developed to automate the creation of generalized groundwater-flow models from the NHD+. The notebook format allows rapid testing of methods for model creation, calibration, and analysis. Capabilities within the Python ecosystem greatly speed up the development and testing of algorithms. GeoPandas is used for very efficient geospatial processing. Raster processing includes the Geospatial Data Abstraction Library and image processing tools. Model creation is made possible through Flopy, a versatile input and output writer for several MODFLOW-based flow and transport model codes. Interpolation, integration, and map plotting from the standard Python tool stack are also used, making the notebook a comprehensive platform on which to build and evaluate general models. Models with alternative boundary conditions, number of layers, and cell spacing can be tested against one another and evaluated by using water-quality data. Novel calibration criteria were developed by comparing modeled heads to land-surface and surface-water elevations. Information, such as predicted age distributions, can be extracted from general models and tested for its ability to explain water-quality trends. Groundwater ages then can be correlated with horizontal and vertical hydrologic position, a relation that can be used for statistical assessment of likely groundwater-quality conditions.
Convolution with age distributions can be used to quickly ascertain likely future water-quality conditions. Although these models are admittedly very general and are still being tested, the hope is that they will be useful for answering questions related to water quality at the regional scale.
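The convolution of an age distribution with an input history can be sketched in a few lines of NumPy. Everything below is illustrative: the exponential age distribution, the 15-year mean age, and the input-concentration history are assumptions for the example, not values from the study.

```python
import numpy as np

# Hypothetical sketch: convolve a groundwater age distribution with a
# historical input-concentration series to estimate the concentration
# now arriving at a well. All values are illustrative.
years = np.arange(60)                       # travel times, in years

mean_age = 15.0                             # assumed mean groundwater age
age_pdf = np.exp(-years / mean_age)         # exponential age distribution
age_pdf /= age_pdf.sum()                    # normalize to a discrete pdf

# Assumed input history (oldest first): rising, then declining source term
input_conc = np.where(years < 30, 0.5 * years, 15.0 - 0.3 * (years - 30))

# Well concentration = age-weighted average of past inputs
# (input_conc[::-1] aligns "t years ago" with travel time t)
well_conc = float(np.sum(age_pdf * input_conc[::-1]))
print(well_conc)
```

Because the result is a weighted average of past inputs, swapping in a different assumed age distribution (e.g., a piston-flow or gamma model) changes only the `age_pdf` line.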
Managing for efficiency in health care: the case of Greek public hospitals.
Mitropoulos, Panagiotis; Mitropoulos, Ioannis; Sissouras, Aris
2013-12-01
This paper evaluates the efficiency of public hospitals with two alternative conceptual models. One model targets resource usage directly to assess production efficiency, while the other model incorporates financial results to assess economic efficiency. Performance analysis of these models was conducted in two stages. In stage one, we utilized data envelopment analysis to obtain the efficiency score of each hospital, while in stage two we took into account the influence of the operational environment on efficiency by regressing those scores on explanatory variables that concern the performance of hospital services. We applied these methods to evaluate 96 general hospitals in the Greek national health system. The results indicate that, although the average efficiency scores in both models have remained relatively stable compared to past assessments, internal changes in hospital performances do exist. This study provides a clear framework for policy implications to increase the overall efficiency of general hospitals.
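The first-stage efficiency scoring can be illustrated with a minimal input-oriented CCR data envelopment analysis solved as a linear program. The four hospitals and their input/output figures below are invented, and this sketch does not reproduce the study's actual model specification or second-stage regression.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y, j0):
    """Input-oriented CCR efficiency score of unit j0.

    X: (n_units, n_inputs) inputs; Y: (n_units, n_outputs) outputs."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.zeros(1 + n)
    c[0] = 1.0                                   # minimize theta
    # inputs:  X^T lambda <= theta * x_j0
    A_in = np.hstack([-X[j0].reshape(-1, 1), X.T])
    # outputs: Y^T lambda >= y_j0
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([np.zeros(m), -Y[j0]]),
                  bounds=[(0, None)] * (1 + n), method="highs")
    return res.fun

# Invented data: 4 hospitals, inputs = (beds, staff), output = cases treated
X = np.array([[20., 30.], [40., 50.], [30., 30.], [50., 80.]])
Y = np.array([[100.], [150.], [160.], [180.]])
scores = [dea_ccr_input(X, Y, j) for j in range(len(X))]
print(scores)
```

A score of 1 marks a unit on the efficient frontier; scores below 1 give the proportional input contraction a unit would need to reach it. A second stage would then regress these scores on environmental variables.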
Zhou, Xiangrong; Xu, Rui; Hara, Takeshi; Hirano, Yasushi; Yokoyama, Ryujiro; Kanematsu, Masayuki; Hoshi, Hiroaki; Kido, Shoji; Fujita, Hiroshi
2014-07-01
The shapes of the inner organs are important information for medical image analysis. Statistical shape modeling provides a way of quantifying and measuring shape variations of the inner organs in different patients. In this study, we developed a universal scheme that can be used for building the statistical shape models for different inner organs efficiently. This scheme combines the traditional point distribution modeling with a group-wise optimization method based on a measure called minimum description length to provide a practical means for 3D organ shape modeling. In experiments, the proposed scheme was applied to the building of five statistical shape models for hearts, livers, spleens, and right and left kidneys by use of 50 cases of 3D torso CT images. The performance of these models was evaluated by three measures: model compactness, model generalization, and model specificity. The experimental results showed that the constructed shape models have good "compactness" and satisfactory "generalization" performance for different organ shape representations; however, the "specificity" of these models should be improved in the future.
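The "compactness" measure for a point distribution model can be sketched as follows: build principal modes of variation from corresponding landmark vectors and measure the cumulative variance captured by the first k modes. The random shapes below are stand-ins for the aligned 3D organ surfaces used in the study; the 50-shape, 30-landmark setup is an assumption for illustration.

```python
import numpy as np

# Sketch: a point distribution model from corresponding landmarks, and
# its compactness curve. Shapes are random stand-ins for real surfaces.
rng = np.random.default_rng(0)
n_shapes, n_landmarks = 50, 30
shapes = rng.normal(size=(n_shapes, n_landmarks * 3))  # flattened (x,y,z)

mean_shape = shapes.mean(axis=0)
centered = shapes - mean_shape

# Principal modes of variation via SVD of the centered shape matrix
_, svals, _ = np.linalg.svd(centered, full_matrices=False)
variances = svals**2 / (n_shapes - 1)

# Compactness: fraction of total variance captured by the first k modes
compactness = np.cumsum(variances) / variances.sum()
print(compactness[:5])
```

A more compact model reaches a given fraction of the total variance with fewer modes; generalization and specificity require held-out fitting and sampled-shape comparisons, respectively, and are not shown here.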
Program Impact Evaluations: An Introduction for Managers of Title VII Projects. A Draft Guidebook.
ERIC Educational Resources Information Center
Bissell, Joan S.
Intended to assist administrators in the planning, management, and utilization of evaluation, this guidebook is designed as an introduction and supplement to other evaluation materials for bilingual education programs being developed under federal sponsorship, including evaluation models for Title VII projects. General information is provided on…
2015-03-01
entrance were evaluated on their ability to reduce potential impacts of waves and currents on wetlands. Study results indicated all three proposed...transport developed were used in the evaluation of proposed solutions. The preliminary modeling results helped to assess general sediment pattern...Corps of Engineers (USACE), Buffalo District, is conducting a study to evaluate shoreline protection measures for coastal wetlands at Braddock Bay
Application of SIGGS to Project PRIME: A General Systems Approach to Evaluation of Mainstreaming.
ERIC Educational Resources Information Center
Frick, Ted
The use of the systems approach in educational inquiry is not new, and the models of input/output, input/process/product, and cybernetic systems have been widely used. The general systems model is an extension of all these, adding the dimension of environmental influence on the system as well as system influence on the environment. However, if the…
2001-10-25
Image Analysis aims to develop model-based computer analysis and visualization methods for showing focal and general abnormalities of lung ventilation and perfusion based on a sequence of digital chest fluoroscopy frames collected with the Dynamic Pulmonary Imaging technique [18,5,17,6]. We have proposed and evaluated a multiresolutional method with an explicit ventilation model based on pyramid images for ventilation analysis. We have further extended the method for ventilation analysis to pulmonary perfusion. This paper focuses on the clinical evaluation of our method for
Miñano Pérez, Pablo; Castejón Costa, Juan-Luis; Gilar Corbí, Raquel
2012-03-01
As a result of studies examining factors involved in the learning process, various structural models have been developed to explain the direct and indirect effects that occur between the variables in these models. The objective was to evaluate a structural model of cognitive and motivational variables predicting academic achievement, including general intelligence, academic self-concept, goal orientations, effort and learning strategies. The sample comprised 341 Spanish students in the first year of compulsory secondary education. Different tests and questionnaires were used to evaluate each variable, and Structural Equation Modelling (SEM) was applied to contrast the relationships of the initial model. The model proposed had a satisfactory fit, and all the hypothesised relationships were significant. General intelligence was the variable most able to explain academic achievement. Also important was the direct influence of academic self-concept on achievement, goal orientations and effort, as well as the mediating ability of effort and learning strategies between academic goals and final achievement.
Daly, Shaun C; Deal, Rebecca A; Rinewalt, Daniel E; Francescatti, Amanda B; Luu, Minh B; Millikan, Keith W; Anderson, Mary C; Myers, Jonathan A
2014-04-01
The purpose of our study was to determine the predictive impact of individual academic measures for the matriculation of senior medical students into a general surgery residency. Academic records were evaluated for third-year medical students (n = 781) at a single institution between 2004 and 2011. Cohorts were defined by student matriculation into either a general surgery residency program (n = 58) or a non-general surgery residency program (n = 723). Multivariate logistic regression was performed to evaluate independently significant academic measures. Clinical evaluation raw scores were predictive of general surgery matriculation (P = .014). In addition, multivariate modeling showed lower United States Medical Licensing Examination Step 1 scores to be independently associated with matriculation into general surgery (P = .007). Superior clinical aptitude is independently associated with general surgical matriculation. This contrasts with the negative association between United States Medical Licensing Examination Step 1 scores and general surgery matriculation. Recognizing this, surgical clerkship directors can offer opportunities for continued surgical education to students showing high clinical aptitude, increasing their likelihood of surgical matriculation. Copyright © 2014 Elsevier Inc. All rights reserved.
The design, analysis and experimental evaluation of an elastic model wing
NASA Technical Reports Server (NTRS)
Cavin, R. K., III; Thisayakorn, C.
1974-01-01
An elastic orbiter model was developed to evaluate the effectiveness of aeroelasticity computer programs. The elasticity properties were introduced by constructing beam-like straight wings for the wind tunnel model. A standard influence coefficient mathematical model was used to estimate aeroelastic effects analytically. In general, good agreement was obtained between the empirical and analytical estimates of the deformed shape. However, in the static aeroelasticity case, it was found that the physical wing exhibited less bending and more twist than was predicted by theory.
NASA Technical Reports Server (NTRS)
Jeracki, R. J.; Mitchell, G. A.
1981-01-01
The performance of lower-speed, 5-foot-diameter model general aviation propellers was tested in the Lewis wind tunnel. Performance was evaluated for various levels of airfoil technology and activity factor. The difference was associated with inadequate modeling of blade and spinner losses for propellers with round-shank blade designs. Suggested concepts for improvement are: (1) advanced blade shapes (airfoils and sweep); (2) tip devices (proplets); (3) integrated propeller/nacelles; and (4) composites. Several advanced aerodynamic concepts were evaluated in the Lewis wind tunnel. Results show that high propeller performance can be obtained to at least Mach 0.8.
NASA Technical Reports Server (NTRS)
Brooks, George W.
1985-01-01
The options for the design, construction, and testing of a dynamic model of the space station were evaluated. Since the definition of the space station structure is still evolving, the Initial Operating Capacity (IOC) reference configuration was used as the general guideline. The results of the studies treat: general considerations of the need for and use of a dynamic model; factors which deal with the model design and construction; and a proposed system for supporting the dynamic model in the planned Large Spacecraft Laboratory.
On the applicability of integrated circuit technology to general aviation orientation estimation
NASA Technical Reports Server (NTRS)
Debra, D. B.; Tashker, M. G.
1976-01-01
Criteria for the significant value of the panel instruments used in general aviation were examined, and kinematic equations were added for comparison. An instrument survey was performed to establish the present state of the art in linear and angular accelerometers, pressure transducers, and magnetometers. A very preliminary evaluation was done of the computers available for data evaluation and estimator mechanization. The mathematical model of a light twin aircraft employed in the evaluation was documented, and the results of the sensor survey and the design studies were presented.
SIMULATION MODEL FOR WATERSHED MANAGEMENT PLANNING. VOLUME 2. MODEL USER MANUAL
This report provides a user manual for the hydrologic, nonpoint source pollution simulation of the generalized planning model for evaluating forest and farming management alternatives. The manual contains an explanation of application of specific code and indicates changes that s...
A full year evaluation of the CALIOPE-EU air quality modeling system over Europe for 2004
NASA Astrophysics Data System (ADS)
Pay, M. T.; Piot, M.; Jorba, O.; Gassó, S.; Gonçalves, M.; Basart, S.; Dabdub, D.; Jiménez-Guerrero, P.; Baldasano, J. M.
The CALIOPE-EU high-resolution air quality modeling system, namely WRF-ARW/HERMES-EMEP/CMAQ/BSC-DREAM8b, is developed and applied to Europe (12 km × 12 km, 1 h). Model performance is tested in terms of air quality levels and the reproducibility of their dynamics on a yearly basis. The present work describes a quantitative evaluation of gas phase species (O3, NO2 and SO2) and particulate matter (PM2.5 and PM10) against ground-based measurements from the EMEP (European Monitoring and Evaluation Programme) network for the year 2004. The evaluation is based on statistical measures. Simulated O3 achieves satisfactory performance for both daily mean and daily maximum concentrations, especially in summer, with annual mean correlations of 0.66 and 0.69, respectively. Mean normalized errors fall within the recommendations proposed by the United States Environmental Protection Agency (US-EPA). The general trends and daily variations of primary pollutants (NO2 and SO2) are satisfactory. Daily mean concentrations of NO2 correlate well with observations (annual correlation r = 0.67) but tend to be underestimated. For SO2, mean concentrations are well simulated (mean bias = 0.5 μg m⁻³) with relatively high annual mean correlation (r = 0.60), although peaks are generally overestimated. The dynamics of PM2.5 and PM10 are well reproduced (0.49 < r < 0.62), but mean concentrations remain systematically underestimated. Deficiencies in particulate matter source characterization are discussed. Also, the spatially distributed statistics and the general patterns for each pollutant over Europe are examined. The model performances are compared with other European studies. While O3 statistics generally remain lower than those obtained by the other considered studies, statistics for NO2, SO2, PM2.5 and PM10 present higher scores than most models.
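The evaluation statistics used in this kind of model-measurement comparison (correlation, mean bias, mean normalized error) follow the standard definitions and can be computed as below; the observed and modeled values are invented for illustration, not taken from the study.

```python
import numpy as np

# Illustrative model-evaluation statistics for paired observed/modeled
# concentrations (values are made up, units arbitrary, e.g. ug/m3).
obs = np.array([30., 45., 50., 38., 60.])   # observed concentrations
mod = np.array([28., 50., 47., 40., 55.])   # modeled concentrations

r = np.corrcoef(obs, mod)[0, 1]                 # Pearson correlation
mean_bias = np.mean(mod - obs)                  # mean bias (model - obs)
mne = np.mean(np.abs(mod - obs) / obs) * 100.0  # mean normalized error, %
print(r, mean_bias, mne)
```

Agencies such as the US-EPA publish recommended ranges for normalized error and bias; a full evaluation would compute these per station and season before aggregating.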
Natural hazard modeling and uncertainty analysis [Chapter 2
Matthew Thompson; Jord J. Warmink
2017-01-01
Modeling can play a critical role in assessing and mitigating risks posed by natural hazards. These modeling efforts generally aim to characterize the occurrence, intensity, and potential consequences of natural hazards. Uncertainties surrounding the modeling process can have important implications for the development, application, evaluation, and interpretation of...
Communications, Navigation, and Surveillance Models in ACES: Design Implementation and Capabilities
NASA Technical Reports Server (NTRS)
Kubat, Greg; Vandrei, Don; Satapathy, Goutam; Kumar, Anil; Khanna, Manu
2006-01-01
Presentation objectives include: a) Overview of the ACES/CNS System Models Design and Integration; b) Configuration Capabilities available for Models and Simulations using ACES with CNS Modeling; c) Descriptions of recently added, Enhanced CNS Simulation Capabilities; and d) General Concept Ideas that Utilize CNS Modeling to Enhance Concept Evaluations.
Role of Animal Models in Coronary Stenting.
Iqbal, Javaid; Chamberlain, Janet; Francis, Sheila E; Gunn, Julian
2016-02-01
Coronary angioplasty initially employed balloon dilatation only. This technique revolutionized the treatment of coronary artery disease, although outcomes were compromised by acute vessel closure, late constrictive remodeling, and restenosis due to neointimal proliferation. These processes were studied in animal models, which contributed to understanding the biology of endovascular arterial injury. Coronary stents overcame acute recoil, and subsequent improvements in design and metallurgy led to the development of drug-eluting stents and bioresorbable scaffolds. These devices now undergo computer modeling and benchtop and animal testing before evaluation in clinical trials. Animal models, including rabbit, sheep, dog, and pig, are available, all with individual benefits and limitations. In smaller mammals, such as the mouse and rabbit, the target for stenting is generally the aorta; in larger animals, such as the pig, it is generally the coronary artery. The pig coronary stenting model is a gold standard for evaluating safety, but insights into biomechanical properties, the biology of stenting, and efficacy in controlling neointimal proliferation can also be gained. Intra-coronary imaging modalities such as intravascular ultrasound and optical coherence tomography allow precise serial evaluation in vivo, and recent developments in genetically modified animal models of atherosclerosis provide realistic test beds for future stents and scaffolds.
Patients' evaluations of European general practice--revisited after 11 years.
Petek, Davorina; Künzi, Beat; Kersnik, Janko; Szecsenyi, Joachim; Wensing, Michel
2011-12-01
In the last decade many things have changed in healthcare systems, primary care practices and populations. To describe evaluations of general practice care by patients with a chronic illness in 2009 and compare these with a previous study done in 1998. A descriptive analysis of patients' evaluations, using data from the European practice assessment Cardio study on cardiovascular patients in eight European countries in 2009. We compared these evaluations with a subgroup of patients with self-defined chronic illness from the study in 1998, using a linear regression model. Patients' evaluation of general practice using the EUROPEP questionnaire. The EUROPEP is a 23-item validated measure of patient evaluations of general practice care. In 2009, 7472 patients from 251 practices participated in the study with an overall response rate of 49.6%. The percentage of patients with positive evaluations (good/excellent) was 80% or higher for all items, except for the waiting time. More positive evaluations were found in older patients, patients with a longer attachment to the practice, patients with a higher self-evaluation of their health, patients with fewer mental health problems and less pain/discomfort. The comparison between 1998 and 2009 showed no overall trends for all countries combined. Whereas English patients became somewhat more positive about general practice in 2009, German patients became slightly less positive, although still more positive than English patients. Overall, the patients' evaluations of general practice were very positive in family practice care in the years 1998 and 2009. Trends over the years need to be interpreted carefully.
ERIC Educational Resources Information Center
Kavgaoglu, Derya; Alci, Bülent
2016-01-01
The goal of this research, carried out in reputable dedicated call centres within the Turkish telecommunication sector, is to evaluate competence-based curricula designed by means of internal funding, using Stufflebeam's context, input, process, product (CIPP) model. In the research, a general scanning pattern in the scope of…
ERIC Educational Resources Information Center
Portowitz, Adena; Peppler, Kylie A.; Downton, Mike
2014-01-01
This article reports on the practice and evaluation of a music education model, In Harmony, which utilizes new technologies and current theories of learning to mediate the music learning experience. In response to the needs of twenty-first century learners, the educational software programs Teach, Learn, Evaluate! and Impromptu served as central…
ERIC Educational Resources Information Center
Hsiao, Yu-Yu; Kwok, Oi-Man; Lai, Mark H. C.
2018-01-01
Path models with observed composites based on multiple items (e.g., mean or sum score of the items) are commonly used to test interaction effects. Under this practice, researchers generally assume that the observed composites are measured without errors. In this study, we reviewed and evaluated two alternative methods within the structural…
Blangiardo, Marta; Finazzi, Francesco; Cameletti, Michela
2016-08-01
Exposure to high levels of air pollutant concentration is known to be associated with respiratory problems which can translate into higher morbidity and mortality rates. The link between air pollution and population health has mainly been assessed considering air quality and hospitalisation or mortality data. However, this approach limits the analysis to individuals characterised by severe conditions. In this paper we evaluate the link between air pollution and respiratory diseases using general practice drug prescriptions for chronic respiratory diseases, which allows conclusions to be drawn about the general population. We propose a two-stage statistical approach: in the first stage we specify a space-time model to estimate the monthly NO2 concentration integrating several data sources characterised by different spatio-temporal resolution; in the second stage we link the concentration to the β2-agonists prescribed monthly by general practices in England and we model the prescription rates through a small area approach. Copyright © 2016 Elsevier Ltd. All rights reserved.
A Structural Equation Model of HIV-related Symptoms, Depressive Symptoms, and Medication Adherence.
Yoo-Jeong, Moka; Waldrop-Valverde, Drenna; McCoy, Katryna; Ownby, Raymond L
2016-05-01
Adherence to combined antiretroviral therapy (cART) remains critical in management of HIV infection. This study evaluated depression as a potential mechanism by which HIV-related symptoms affect medication adherence and explored if particular clusters of HIV symptoms are susceptible to this mechanism. Baseline data from a multi-visit intervention study were analyzed among 124 persons living with HIV (PLWH). A bifactor model showed two clusters of HIV-related symptom distress: general HIV-related symptoms and gastrointestinal (GI) symptoms. Structural equation modeling showed that both general HIV-related symptoms and GI symptoms were related to higher levels of depressive symptoms, and higher levels of depressive symptoms were related to lower levels of medication adherence. Although general HIV-related symptoms and GI symptoms were not directly related to adherence, they were indirectly associated with adherence via depression. The findings highlight the importance of early recognition and evaluation of symptoms of depression, as well as the underlying physical symptoms that might cause depression, to improve medication adherence.
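The indirect (mediated) pathway can be illustrated with a toy two-step regression on simulated data. A full SEM estimates all paths jointly with latent variables; the variable names, effect sizes, and sample below are assumptions for the sketch, not the study's estimates.

```python
import numpy as np

# Toy mediation sketch: symptoms -> depression -> adherence, on simulated
# data (not the study's). The indirect effect is the product a * b.
rng = np.random.default_rng(1)
n = 124
symptoms = rng.normal(size=n)
depression = 0.6 * symptoms + rng.normal(scale=0.5, size=n)
adherence = -0.4 * depression + rng.normal(scale=0.5, size=n)

def slope(x, y):
    """OLS slope of y on x (with intercept)."""
    X = np.column_stack([np.ones_like(x), x])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

a = slope(symptoms, depression)    # path: symptoms -> depression
b = slope(depression, adherence)   # path: depression -> adherence
indirect = a * b                   # symptoms' indirect effect on adherence
print(a, b, indirect)
```

In practice the b path would be estimated controlling for symptoms, and the significance of a * b assessed with bootstrap confidence intervals; both refinements are omitted here for brevity.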
PMID: 27695710
NASA Astrophysics Data System (ADS)
Adebiyi, S. J.; Adebesin, B. O.; Ikubanni, S. O.; Joshua, B. W.
2017-05-01
Empirical models of the ionosphere, such as the International Reference Ionosphere (IRI) model, play a vital role in evaluating the environmental effect on the operation of space-based communication and navigation technologies. The IRI extended to Plasmasphere (IRI-Plas) model can be adjusted with external data to update its electron density profile while still maintaining the overall integrity of the model representations. In this paper, the performance of the total electron content (TEC) assimilation option of the IRI-Plas at two equatorial stations, Jicamarca, Peru (geographic: 12°S, 77°W, dip angle 0.8°) and Cachoeira Paulista, Brazil (geographic: 22.7°S, 45°W, dip angle -26°), is examined during quiet and disturbed conditions. TEC, F2 layer critical frequency (foF2), and peak height (hmF2) predicted when the model is operated without external input were used as a baseline in our model evaluation. Results indicate that TEC predicted by the assimilation option generally produced smaller estimation errors compared to the "no extra input" option during quiet and disturbed conditions. Generally, the error is smaller at the equatorial trough than near the crest for both quiet and disturbed days. With the assimilation option, there is a substantial improvement in storm-time estimations compared with quiet-time predictions. The improvement is, however, independent of the storm's severity. Furthermore, the modeled foF2 and hmF2 are generally poor with TEC assimilation, particularly the hmF2 prediction, at the two locations during both quiet and disturbed conditions. Consequently, the IRI-Plas model assimilated with TEC values alone may not be sufficient where more realistic instantaneous values of peak parameters are required.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Raymond H.; Truax, Ryan A.; Lankford, David A.
Solid-phase iron concentrations and generalized composite surface complexation models were used to evaluate procedures in determining uranium sorption on oxidized aquifer material at a proposed U in situ recovery (ISR) site. At the proposed Dewey Burdock ISR site in South Dakota, USA, oxidized aquifer material occurs downgradient of the U ore zones. Solid-phase Fe concentrations did not explain our batch sorption test results,though total extracted Fe appeared to be positively correlated with overall measured U sorption. Batch sorption test results were used to develop generalized composite surface complexation models that incorporated the full genericsorption potential of each sample, without detailedmore » mineralogiccharacterization. The resultant models provide U sorption parameters (site densities and equilibrium constants) for reactive transport modeling. The generalized composite surface complexation sorption models were calibrated to batch sorption data from three oxidized core samples using inverse modeling, and gave larger sorption parameters than just U sorption on the measured solidphase Fe. These larger sorption parameters can significantly influence reactive transport modeling, potentially increasing U attenuation. Because of the limited number of calibration points, inverse modeling required the reduction of estimated parameters by fixing two parameters. The best-fit models used fixed values for equilibrium constants, with the sorption site densities being estimated by the inversion process. While these inverse routines did provide best-fit sorption parameters, local minima and correlated parameters might require further evaluation. Despite our limited number of proxy samples, the procedures presented provide a valuable methodology to consider for sites where metal sorption parameters are required. 
Furthermore, these sorption parameters can be used in reactive transport modeling to assess downgradient metal attenuation, especially when no other calibration data are available, such as at proposed U ISR sites.« less
Johnson, Raymond H.; Truax, Ryan A.; Lankford, David A.; ...
2016-02-03
Solid-phase iron concentrations and generalized composite surface complexation models were used to evaluate procedures in determining uranium sorption on oxidized aquifer material at a proposed U in situ recovery (ISR) site. At the proposed Dewey Burdock ISR site in South Dakota, USA, oxidized aquifer material occurs downgradient of the U ore zones. Solid-phase Fe concentrations did not explain our batch sorption test results,though total extracted Fe appeared to be positively correlated with overall measured U sorption. Batch sorption test results were used to develop generalized composite surface complexation models that incorporated the full genericsorption potential of each sample, without detailedmore » mineralogiccharacterization. The resultant models provide U sorption parameters (site densities and equilibrium constants) for reactive transport modeling. The generalized composite surface complexation sorption models were calibrated to batch sorption data from three oxidized core samples using inverse modeling, and gave larger sorption parameters than just U sorption on the measured solidphase Fe. These larger sorption parameters can significantly influence reactive transport modeling, potentially increasing U attenuation. Because of the limited number of calibration points, inverse modeling required the reduction of estimated parameters by fixing two parameters. The best-fit models used fixed values for equilibrium constants, with the sorption site densities being estimated by the inversion process. While these inverse routines did provide best-fit sorption parameters, local minima and correlated parameters might require further evaluation. Despite our limited number of proxy samples, the procedures presented provide a valuable methodology to consider for sites where metal sorption parameters are required. 
Furthermore, these sorption parameters can be used in reactive transport modeling to assess downgradient metal attenuation, especially when no other calibration data are available, such as at proposed U ISR sites.
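A minimal sketch of the inversion idea described above: hold the equilibrium constant fixed and estimate only the site density against batch sorption data. This assumes a one-site Langmuir-style isotherm and hypothetical numbers; the study's actual generalized composite models involve multiple site types and geochemical speciation.

```python
import math

def langmuir_sorbed(c_aq, site_density, k_eq):
    """Sorbed concentration from a one-site Langmuir-style isotherm.
    c_aq: aqueous concentration; site_density: total sorption sites;
    k_eq: equilibrium constant (held fixed during the inversion)."""
    return site_density * k_eq * c_aq / (1.0 + k_eq * c_aq)

def calibrate_site_density(batch_data, k_eq, grid):
    """Grid-search inversion: with k_eq fixed (as in the study's
    best-fit models), find the site density minimizing the sum of
    squared residuals against batch sorption measurements."""
    best_sd, best_sse = None, math.inf
    for sd in grid:
        sse = sum((langmuir_sorbed(c, sd, k_eq) - s) ** 2
                  for c, s in batch_data)
        if sse < best_sse:
            best_sd, best_sse = sd, sse
    return best_sd

# Hypothetical batch data: (aqueous, sorbed) pairs generated from a
# known site density so the inversion can be checked.
truth_sd, k = 0.8, 2.0
batch = [(c, langmuir_sorbed(c, truth_sd, k)) for c in (0.1, 0.5, 1.0, 2.0)]
grid = [i * 0.05 for i in range(1, 41)]  # candidate densities 0.05 .. 2.0
print(calibrate_site_density(batch, k, grid))  # recovers 0.8
```

With real batch data the residual surface can have local minima and correlated parameters, which is why the abstract flags those as needing further evaluation.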
Transportation Impact Evaluation System
DOT National Transportation Integrated Search
1979-11-01
This report specifies a framework for spatial analysis and the general modelling steps required. It also suggests available urban and regional data sources, along with some typical existing urban and regional models. The goal is to develop a computer...
NASA Technical Reports Server (NTRS)
Bundick, W. T.
1985-01-01
The application of the Generalized Likelihood Ratio technique to the detection and identification of aircraft control element failures has been evaluated in a linear digital simulation of the longitudinal dynamics of a B-737 aircraft. Simulation results show that the technique has potential but that the effects of wind turbulence and Kalman filter model errors are problems which must be overcome.
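In simplified form, the Generalized Likelihood Ratio test above detects a step bias in filter innovations by maximizing the likelihood ratio over candidate failure onsets. The sketch below uses hypothetical residuals, not the B-737 simulation:

```python
def glr_mean_shift(residuals, sigma):
    """GLR statistic for a step change in the mean of nominally
    zero-mean Gaussian innovations. For each candidate onset k the
    unknown bias is replaced by its maximum-likelihood estimate,
    giving 2*ln(LR) = S_k^2 / (m * sigma^2), with S_k the partial
    sum of the last m = n - k residuals."""
    n = len(residuals)
    best_stat, best_onset = 0.0, None
    for k in range(n):
        s = sum(residuals[k:])
        m = n - k
        stat = s * s / (m * sigma * sigma)
        if stat > best_stat:
            best_stat, best_onset = stat, k
    return best_stat, best_onset

# Innovations with a simulated control-element failure at step 50:
# a constant bias appears in the residual sequence.
clean = [0.0] * 50
faulty = [1.5] * 50
stat, onset = glr_mean_shift(clean + faulty, sigma=1.0)
print(stat, onset)  # large statistic, onset at 50
```

In practice, turbulence and Kalman filter model errors (the problems the abstract notes) inflate the residual variance and blur the onset estimate, so a detection threshold must be tuned against false alarms.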
Detailed model for practical pulverized coal furnaces and gasifiers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, P.J.; Smoot, L.D.
1989-08-01
This study has been supported by a consortium of nine industrial and governmental sponsors. Work was initiated on May 1, 1985 and completed August 31, 1989. The central objective of this work was to develop, evaluate and apply a practical combustion model for utility boilers, industrial furnaces and gasifiers. Key accomplishments have included: Development of an advanced first-generation, computer model for combustion in three dimensional furnaces; development of a new first generation fouling and slagging submodel; detailed evaluation of an existing NO{sub x} submodel; development and evaluation of an improved radiation submodel; preparation and distribution of a three-volume final report:more » (a) Volume 1: General Technical Report; (b) Volume 2: PCGC-3 User's Manual; (c) Volume 3: Data Book for Evaluation of Three-Dimensional Combustion Models; and organization of a user's workshop on the three-dimensional code. The furnace computer model developed under this study requires further development before it can be applied generally to all applications; however, it can be used now by specialists for many specific applications, including non-combusting systems and combusting geseous systems. A new combustion center was organized and work was initiated to continue the important research effort initiated by this study. 212 refs., 72 figs., 38 tabs.« less
ERIC Educational Resources Information Center
Chan, James L.; Snyder, Gerald E.
Ways in which the external financial disclosures by universities may evaluate institutional economic viability are demonstrated. It is argued that the evaluation should take into account the effect of inflation and activity level. The evaluation model requires several years' information about revenues (general operating fund), the impact of…
Guisan, Antoine; Edwards, T.C.; Hastie, T.
2002-01-01
An important statistical development of the last 30 years has been the advance in regression analysis provided by generalized linear models (GLMs) and generalized additive models (GAMs). Here we introduce a series of papers prepared within the framework of an international workshop entitled: Advances in GLMs/GAMs modeling: from species distribution to environmental management, held in Riederalp, Switzerland, 6-11 August 2001. We first discuss some general uses of statistical models in ecology, as well as provide a short review of several key examples of the use of GLMs and GAMs in ecological modeling efforts. We next present an overview of GLMs and GAMs, and discuss some of their related statistics used for predictor selection, model diagnostics, and evaluation. Included is a discussion of several new approaches applicable to GLMs and GAMs, such as ridge regression, an alternative to stepwise selection of predictors, and methods for the identification of interactions by a combined use of regression trees and several other approaches. We close with an overview of the papers and how we feel they advance our understanding of their application to ecological modeling. © 2002 Elsevier Science B.V. All rights reserved.
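The GLM-versus-GAM distinction drawn above can be caricatured in a few lines: a straight-line least-squares fit (a GLM with identity link) versus a crude piecewise-constant smoother standing in for a GAM term. Real GAMs use penalized splines rather than bin means; this is only meant to show why a flexible smooth can track a nonlinear species response that a line cannot:

```python
def ols_line(xs, ys):
    """Least-squares line: the one-predictor GLM with identity link."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return lambda x: a + b * x

def binned_smoother(xs, ys, nbins=5):
    """Crude stand-in for a GAM smooth term: bin means, which can
    follow nonlinear response shapes the line misses."""
    lo, hi = min(xs), max(xs)
    width = (hi - lo) / nbins
    sums = [[0.0, 0] for _ in range(nbins)]
    for x, y in zip(xs, ys):
        i = min(int((x - lo) / width), nbins - 1)
        sums[i][0] += y
        sums[i][1] += 1
    means = [s / c if c else 0.0 for s, c in sums]
    return lambda x: means[min(int((x - lo) / width), nbins - 1)]

xs = [i / 10 for i in range(21)]   # gradient 0.0 .. 2.0
ys = [x * x for x in xs]           # nonlinear response
sse = lambda f: sum((f(x) - y) ** 2 for x, y in zip(xs, ys))
print(sse(ols_line(xs, ys)) > sse(binned_smoother(xs, ys)))  # True
```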
A Systematic Review of Health Economics Simulation Models of Chronic Obstructive Pulmonary Disease.
Zafari, Zafar; Bryan, Stirling; Sin, Don D; Conte, Tania; Khakban, Rahman; Sadatsafavi, Mohsen
2017-01-01
Many decision-analytic models with varying structures have been developed to inform resource allocation in chronic obstructive pulmonary disease (COPD). The objective was to review COPD models for their adherence to best-practice modeling recommendations and their assumptions regarding important aspects of the natural history of COPD. A systematic search of English articles reporting on the development or application of a decision-analytic model in COPD was performed in MEDLINE, Embase, and citations within reviewed articles. Studies were summarized and evaluated on the basis of their adherence to the Consolidated Health Economic Evaluation Reporting Standards. They were also evaluated for the underlying assumptions about disease progression, heterogeneity, comorbidity, and treatment effects. Forty-nine models of COPD were included. Decision trees and Markov models were the most popular techniques (43 studies). Quality of reporting and adherence to the guidelines were generally high, especially in more recent publications. Disease progression was modeled through clinical staging in most studies. Although most studies (n = 43) had incorporated some aspects of COPD heterogeneity, only 8 reported the results across subgroups. Only 2 evaluations explicitly considered the impact of comorbidities. Treatment effect had mostly been modeled (n = 20) as both reduction in exacerbation rate and improvement in lung function. Many COPD models have been developed, generally with similar structural elements. COPD is highly heterogeneous, and comorbid conditions play an important role in its burden. These important aspects, however, have not been adequately addressed in most of the published models. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Towards a Generalizable Time Expression Model for Temporal Reasoning in Clinical Notes
Velupillai, Sumithra; Mowery, Danielle L.; Abdelrahman, Samir; Christensen, Lee; Chapman, Wendy W
2015-01-01
Accurate temporal identification and normalization is imperative for many biomedical and clinical tasks such as generating timelines and identifying phenotypes. A major natural language processing challenge is developing and evaluating a generalizable temporal modeling approach that performs well across corpora and institutions. Our long-term goal is to create such a model. We initiate our work on reaching this goal by focusing on temporal expression (TIMEX3) identification. We present a systematic approach to 1) generalize existing solutions for automated TIMEX3 span detection, and 2) assess similarities and differences by various instantiations of TIMEX3 models applied on separate clinical corpora. When evaluated on the 2012 i2b2 and the 2015 Clinical TempEval challenge corpora, our approach is successful: we achieve competitive results for automated classification, and we identify similarities and differences in TIMEX3 modeling that will be informative in the development of a simplified, general temporal model. PMID:26958265
Gomez, Rapson; Watson, Shaun D.
2017-01-01
For the Social Phobia Scale (SPS) and the Social Interaction Anxiety Scale (SIAS) together, this study examined support for a bifactor model, and also the internal consistency reliability and external validity of the factors in this model. Participants (N = 526) were adults from the general community who completed the SPS and SIAS. Confirmatory factor analysis (CFA) of their ratings indicated good support for the bifactor model. For this model, the loadings for all but six items were higher on the general factor than the specific factors. The three positively worded items had negligible loadings on the general factor. The general factor explained most of the common variance in the SPS and SIAS, and demonstrated good model-based internal consistency reliability (omega hierarchical) and a strong association with fear of negative evaluation and extraversion. The practical implications of the findings for the utilization of the SPS and SIAS, and the theoretical and clinical implications for social anxiety are discussed. PMID:28210232
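The omega hierarchical coefficient reported above for the general factor can be computed directly from bifactor loadings: the squared sum of general-factor loadings divided by the model-implied total-score variance. The loadings below are hypothetical for a six-item toy, not the SPS/SIAS estimates:

```python
def omega_hierarchical(general_loadings, specific_loadings, uniquenesses):
    """Model-based reliability of the general factor in a bifactor
    solution. specific_loadings is a list of loading lists, one per
    specific factor; uniquenesses are the item residual variances."""
    gen = sum(general_loadings) ** 2
    spec = sum(sum(ls) ** 2 for ls in specific_loadings)
    total = gen + spec + sum(uniquenesses)
    return gen / total

# Hypothetical standardized loadings: one general factor over six
# items, two specific factors over three items each.
general = [0.7, 0.6, 0.7, 0.6, 0.5, 0.6]
specific = [[0.3, 0.3, 0.3], [0.2, 0.3, 0.2]]
unique = [0.2] * 6
print(round(omega_hierarchical(general, specific, unique), 3))
```

A value near 1 means the general factor accounts for nearly all reliable total-score variance, the pattern the study reports for social anxiety.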
[Impact of a training model for the Child Development Evaluation Test in primary care].
Rizzoli-Córdoba, Antonio; Delgado-Ginebra, Ismael; Cruz-Ortiz, Leopoldo Alfonso; Baqueiro-Hernández, César Iván; Martain-Pérez, Itzamara Jacqueline; Palma-Tavera, Josuha Alexander; Villasís-Keever, Miguel Ángel; Reyes-Morales, Hortensia; O'Shea-Cuevas, Gabriel; Aceves-Villagrán, Daniel; Carrasco-Mendoza, Joaquín; Antillón-Ocampo, Fátima Adriana; Villagrán-Muñoz, Víctor Manuel; Halley-Castillo, Elizabeth; Vargas-López, Guillermo; Muñoz-Hernández, Onofre
The Child Development Evaluation (CDE) Test is a screening tool designed and validated in Mexico for the early detection of child developmental problems. Professionals who will administer the test in primary care facilities must first acquire knowledge about the test in order to generate reliable results. The aim of this work was to evaluate the impact of a training model for primary care workers from different professions by comparing knowledge acquired during the training course. The study used a before/after design, with participation in a CDE Test training course as the intervention. The course took place in six Mexican states from October to December 2013, and the same questions were used before and after. There were 394 participants. Distribution by professional profile was as follows: general physicians 73.4%, nursing 7.7%, psychology 7.1%, nutrition 6.1%, and other professions 5.6%. The questions with the lowest correct-answer rates concerned the scoring of the CDE Test. In the initial evaluation, 64.9% obtained a score lower than 20, compared with 1.8% in the final evaluation; only 1.8% passed initially, compared with 75.15% in the final evaluation. The proposed model allows participants to acquire general knowledge about the CDE Test. To improve results in future courses, the scoring and interpretation of the test should be reinforced during training, together with prior reading of the materials by participants. Copyright © 2015 Hospital Infantil de México Federico Gómez. Publicado por Masson Doyma México S.A. All rights reserved.
Precipitation-runoff modeling system; user's manual
Leavesley, G.H.; Lichty, R.W.; Troutman, B.M.; Saindon, L.G.
1983-01-01
The concepts, structure, theoretical development, and data requirements of the precipitation-runoff modeling system (PRMS) are described. The precipitation-runoff modeling system is a modular-design, deterministic, distributed-parameter modeling system developed to evaluate the impacts of various combinations of precipitation, climate, and land use on streamflow, sediment yields, and general basin hydrology. Basin response to normal and extreme rainfall and snowmelt can be simulated to evaluate changes in water balance relationships, flow regimes, flood peaks and volumes, soil-water relationships, sediment yields, and groundwater recharge. Parameter-optimization and sensitivity analysis capabilities are provided to fit selected model parameters and evaluate their individual and joint effects on model output. The modular design provides a flexible framework for continued model system enhancement and hydrologic modeling research and development. (Author's abstract)
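The per-unit water-balance bookkeeping a system like PRMS performs can be reduced to a single-bucket toy: storage fills with precipitation, spills quickflow when full, and drains baseflow linearly. PRMS itself chains many interacting storages per hydrologic response unit; parameters and units here are arbitrary:

```python
def bucket_runoff(precip, capacity=50.0, k_baseflow=0.1):
    """Toy soil 'bucket': spills quickflow above capacity and drains
    a linear baseflow each step. Returns the flow series and the
    water left in storage, so mass balance can be checked."""
    storage, flows = 0.0, []
    for p in precip:
        storage += p
        quick = max(0.0, storage - capacity)   # saturation excess
        storage -= quick
        base = k_baseflow * storage            # linear-reservoir drainage
        storage -= base
        flows.append(quick + base)
    return flows, storage

precip = [0, 10, 40, 30, 0, 0, 5]
flows, final_storage = bucket_runoff(precip)
# Water balance: total precip == total flow + water left in the bucket
print(abs(sum(precip) - (sum(flows) + final_storage)) < 1e-9)  # True
```

Closing the water balance this way is exactly the kind of check the parameter-optimization and sensitivity tools in PRMS are built around.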
A Model of Instructional Supervision That Meets Today's Needs.
ERIC Educational Resources Information Center
Beck, John J.; Seifert, Edward H.
1983-01-01
The proposed Instructional Technologist Model is based on a closed loop feedback system allowing for continuous monitoring of teachers by expert instructional technologists. Principals are thereby released for instructional evaluation and general educational management. (MJL)
Nie, Z Q; Ou, Y Q; Zhuang, J; Qu, Y J; Mai, J Z; Chen, J M; Liu, X Q
2016-05-01
Conditional and unconditional logistic regression analyses are commonly used in case-control studies, whereas the Cox proportional hazards model is often used in survival data analysis. Most of the literature refers only to main-effect models; however, generalized linear models differ from general linear models, and interaction comprises both multiplicative and additive interaction. The former is only of statistical significance, whereas the latter has biological significance. In this paper, macros were written in SAS 9.4 to calculate the contrast ratio, the attributable proportion due to interaction, and the synergy index while computing the interaction terms of logistic and Cox regressions; Wald, delta, and profile-likelihood confidence intervals were used to evaluate additive interaction, for reference in big-data analysis in clinical epidemiology and in analyses of genetic multiplicative and additive interactions.
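The three additive-interaction measures named above have closed forms in terms of the relative risks (or odds ratios) for joint and single exposures. A sketch with Rothman's standard formulas; the SAS macros additionally produce the confidence intervals, which are omitted here:

```python
def additive_interaction(rr11, rr10, rr01):
    """Additive-interaction measures from relative risks for joint
    exposure (rr11) and each exposure alone (rr10, rr01), with the
    doubly unexposed group as reference:
      RERI = rr11 - rr10 - rr01 + 1   (relative excess risk / contrast)
      AP   = RERI / rr11              (attributable proportion)
      S    = (rr11 - 1) / ((rr10 - 1) + (rr01 - 1))  (synergy index)"""
    reri = rr11 - rr10 - rr01 + 1.0
    ap = reri / rr11
    s = (rr11 - 1.0) / ((rr10 - 1.0) + (rr01 - 1.0))
    return reri, ap, s

# A purely additive joint effect: RERI = 0, AP = 0, S = 1
print(additive_interaction(3.5, 2.0, 2.5))  # (0.0, 0.0, 1.0)
```

Departures from (0, 0, 1) indicate additive interaction, the biologically meaningful kind the abstract distinguishes from multiplicative interaction.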
Real-time simulation of biological soft tissues: a PGD approach.
Niroomandi, S; González, D; Alfaro, I; Bordeu, F; Leygue, A; Cueto, E; Chinesta, F
2013-05-01
We introduce here a novel approach for the numerical simulation of nonlinear, hyperelastic soft tissues at the kilohertz feedback rates necessary for haptic rendering. This approach is based upon the use of proper generalized decomposition (PGD) techniques, a generalization of proper orthogonal decomposition (POD). Proper generalized decomposition techniques can be considered a means of a priori model order reduction and provide a physics-based meta-model without the need for prior computer experiments. The suggested strategy is thus composed of an offline phase, in which a general meta-model is computed, and an online evaluation phase in which results are obtained in real time. Results are provided that show the potential of the proposed technique, together with benchmark tests that show the accuracy of the method. Copyright © 2013 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Kozlovská, Mária; Struková, Zuzana
2013-06-01
Several factors should be considered by the owner and general contractor in the process of contractor and subcontractor selection and evaluation. The paper reviews recent models intended to guide general contractors in the subcontractor selection process and in the evaluation of different contractors during project execution. Moreover, the paper considers the impact of different contractors' performance on the overall level of occupational health and safety culture on site. It deals with the factors influencing the safety performance of contractors during construction and analyses methods for assessing the safety performance of construction contractors. The results of contractors' safety performance evaluation could be a useful tool in motivating contractors to achieve better safety outcomes, or could inform owners' or general contractors' decision making about contractors' suitability for future contract works.
Karnon, Jonathan; Haji Ali Afzali, Hossein
2014-06-01
Modelling in economic evaluation is an unavoidable fact of life. Cohort-based state transition models are most common, though discrete event simulation (DES) is increasingly being used to implement more complex model structures. The benefits of DES relate to the greater flexibility around the implementation and population of complex models, which may provide more accurate or valid estimates of the incremental costs and benefits of alternative health technologies. The costs of DES relate to the time and expertise required to implement and review complex models, when perhaps a simpler model would suffice. The costs are not borne solely by the analyst, but also by reviewers. In particular, modelled economic evaluations are often submitted to support reimbursement decisions for new technologies, for which detailed model reviews are generally undertaken on behalf of the funding body. This paper reports the results from a review of published DES-based economic evaluations. Factors underlying the use of DES were defined, and the characteristics of applied models were considered, to inform options for assessing the potential benefits of DES in relation to each factor. Four broad factors underlying the use of DES were identified: baseline heterogeneity, continuous disease markers, time varying event rates, and the influence of prior events on subsequent event rates. If relevant individual-level data are available, representation of the four factors is likely to improve model validity, and it is possible to assess the importance of their representation in individual cases. A thorough model performance evaluation is required to overcome the costs of DES from the users' perspective, but few of the reviewed DES models reported such a process. More generally, further direct, empirical comparisons of complex models with simpler models would better inform the benefits of DES to implement more complex models, and the circumstances in which such benefits are most likely.
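Two of the four factors identified above (baseline heterogeneity and prior events influencing subsequent rates) are exactly what an individual-level DES can represent and a cohort Markov model cannot. A minimal event-queue sketch with deterministic toy rates, not a health-economic model:

```python
import heapq

def simulate_patients(n_patients, event_gap, risk_decay):
    """Minimal discrete event simulation: each patient starts with an
    individual time-to-first-event (baseline heterogeneity), and every
    event multiplies the patient's risk by risk_decay, stretching the
    gap to the next event (prior events shaping subsequent rates)."""
    # Heap of (event_time, patient_id, risk), ordered by time.
    queue = [(event_gap / (1 + i), i, 1.0) for i in range(n_patients)]
    heapq.heapify(queue)
    events, horizon = 0, 10.0
    while queue:
        t, pid, risk = heapq.heappop(queue)
        if t > horizon:
            break
        events += 1
        risk *= risk_decay                       # event changes future hazard
        heapq.heappush(queue, (t + event_gap / risk, pid, risk))
    return events

print(simulate_patients(3, 2.0, 0.5))  # 6 events occur before the horizon
```

A cohort model would instead track state-occupancy fractions at fixed cycles, losing the per-patient event history; that loss is the validity cost the review weighs against DES's implementation cost.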
Albuquerque De Almeida, Fernando; Al, Maiwenn; Koymans, Ron; Caliskan, Kadir; Kerstens, Ankie; Severens, Johan L
2018-04-01
Describing the general and methodological characteristics of decision-analytical models used in the economic evaluation of early warning systems for the management of chronic heart failure patients and performing a quality assessment of their methodological characteristics is expected to provide concise and useful insight to inform the future development of decision-analytical models in the field of heart failure management. Areas covered: The literature on decision-analytical models for the economic evaluation of early warning systems for the management of chronic heart failure patients was systematically reviewed. Nine electronic databases were searched through the combination of synonyms for heart failure and sensitive filters for cost-effectiveness and early warning systems. Expert commentary: The retrieved models show some variability with regards to their general study characteristics. Overall, they display satisfactory methodological quality, even though some points could be improved, namely on the consideration and discussion of any competing theories regarding model structure and disease progression, identification of key parameters and the use of expert opinion, and uncertainty analyses. A comprehensive definition of early warning systems and further research under this label should be pursued. To improve the transparency of economic evaluation publications, authors should make available detailed technical information regarding the published models.
A School Finance Computer Simulation Model
ERIC Educational Resources Information Center
Boardman, Gerald R.
1974-01-01
Presents a description of the computer simulation model developed by the National Educational Finance Project for use by States in planning and evaluating alternative approaches for State support programs. Provides a general introduction to the model, a program operation overview, a sample run, and some conclusions. (Author/WM)
Two statistical approaches, weighted regression on time, discharge, and season and generalized additive models, have recently been used to evaluate water quality trends in estuaries. Both models have been used in similar contexts despite differences in statistical foundations and...
G. Thirel; V. Andreassian; C. Perrin; J.-N. Audouy; L. Berthet; Pamela Edwards; N. Folton; C. Furusho; A. Kuentz; J. Lerat; G. Lindstrom; E. Martin; T. Mathevet; R. Merz; J. Parajka; D. Ruelland; J. Vaze
2015-01-01
Testing hydrological models under changing conditions is essential to evaluate their ability to cope with changing catchments and their suitability for impact studies. With this perspective in mind, a workshop dedicated to this issue was held at the 2013 General Assembly of the International Association of Hydrological Sciences (IAHS) in Göteborg, Sweden, in July 2013...
ERIC Educational Resources Information Center
Osmanoglu, Hasan; Üzüm, Hanifi
2018-01-01
The purpose of this study was to evaluate the service quality of the hotels which are provided sport tourism by athletes according to some variables. The research was conducted with cross-sectional research method as one of the general survey models and relational screening model. Target group of the study also constituted the sample group. This…
The Cattell-Horn-Carroll Model of Cognition for Clinical Assessment
ERIC Educational Resources Information Center
Jewsbury, Paul A.; Bowden, Stephen C.; Duff, Kevin
2017-01-01
The Cattell-Horn-Carroll (CHC) model is a comprehensive model of the major dimensions of individual differences that underlie performance on cognitive tests. Studies evaluating the generality of the CHC model across test batteries, age, gender, and culture were reviewed and found to be overwhelmingly supportive. However, less research is available…
Generalized math model for simulation of high-altitude balloon systems
NASA Technical Reports Server (NTRS)
Nigro, N. J.; Elkouh, A. F.; Hinton, D. E.; Yang, J. K.
1985-01-01
Balloon systems have proved to be a cost-effective means for conducting research experiments (e.g., infrared astronomy) in the earth's atmosphere. The purpose of this paper is to present a generalized mathematical model that can be used to simulate the motion of these systems once they have attained float altitude. The resulting form of the model is such that the pendulation and spin motions of the system are uncoupled and can be analyzed independently. The model is evaluated by comparing the simulation results with data obtained from an actual balloon system flown by NASA.
Evaluation methodology for query-based scene understanding systems
NASA Astrophysics Data System (ADS)
Huster, Todd P.; Ross, Timothy D.; Culbertson, Jared L.
2015-05-01
In this paper, we are proposing a method for the principled evaluation of scene understanding systems in a query-based framework. We can think of a query-based scene understanding system as a generalization of typical sensor exploitation systems where instead of performing a narrowly defined task (e.g., detect, track, classify, etc.), the system can perform general user-defined tasks specified in a query language. Examples of this type of system have been developed as part of DARPA's Mathematics of Sensing, Exploitation, and Execution (MSEE) program. There is a body of literature on the evaluation of typical sensor exploitation systems, but the open-ended nature of the query interface introduces new aspects to the evaluation problem that have not been widely considered before. In this paper, we state the evaluation problem and propose an approach to efficiently learn about the quality of the system under test. We consider the objective of the evaluation to be to build a performance model of the system under test, and we rely on the principles of Bayesian experiment design to help construct and select optimal queries for learning about the parameters of that model.
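The Bayesian experiment-design idea above, selecting the queries most informative about the performance-model parameters, can be sketched with a two-hypothesis toy. Query names and likelihoods are invented; the expected information gain of a query is its mutual information with the unknown parameter:

```python
import math

def entropy(p):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def expected_info_gain(prior, likelihoods):
    """Mutual information between the unknown system-quality
    hypothesis and a query's pass/fail outcome.
    likelihoods[i] = P(pass | hypothesis i)."""
    p_pass = sum(pr * lk for pr, lk in zip(prior, likelihoods))
    gain = entropy(prior)
    for outcome_p, lks in ((p_pass, likelihoods),
                           (1 - p_pass, [1 - l for l in likelihoods])):
        if outcome_p > 0:
            post = [pr * lk / outcome_p for pr, lk in zip(prior, lks)]
            gain -= outcome_p * entropy(post)
    return gain

# Two hypotheses about the system under test; choose the query whose
# outcome is expected to reveal the most about which one holds.
prior = [0.5, 0.5]
queries = {"easy": [0.99, 0.95],           # both hypotheses usually pass
           "discriminative": [0.9, 0.1]}   # outcome separates them
best = max(queries, key=lambda q: expected_info_gain(prior, queries[q]))
print(best)  # discriminative
```

Iterating this selection, then updating the posterior with the observed outcome, is the efficient-learning loop the evaluation methodology is after.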
Evaluation of generalized degrees of freedom for sparse estimation by replica method
NASA Astrophysics Data System (ADS)
Sakata, A.
2016-12-01
We develop a method to evaluate the generalized degrees of freedom (GDF) for linear regression with sparse regularization. The GDF is a key factor in model selection, and thus its evaluation is useful in many modelling applications. An analytical expression for the GDF is derived using the replica method in the large-system-size limit with random Gaussian predictors. The resulting formula has a universal form that is independent of the type of regularization, providing us with a simple interpretation. Within the framework of replica symmetric (RS) analysis, GDF has a physical meaning as the effective fraction of non-zero components. The validity of our method in the RS phase is supported by the consistency of our results with previous mathematical results. The analytical results in the RS phase are calculated numerically using the belief propagation algorithm.
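For the lasso specifically, GDF has a well-known closed form consistent with the "effective fraction of non-zero components" reading above: the degrees of freedom equal the expected number of non-zero estimated coefficients (Zou, Hastie and Tibshirani, 2007). A sketch for the orthogonal-predictor case, where the lasso reduces to soft thresholding:

```python
def soft_threshold(z, lam):
    """Lasso coefficient estimate for orthonormal predictors."""
    if z > lam:
        return z - lam
    if z < -lam:
        return z + lam
    return 0.0

def gdf_estimate(y, lam):
    """Unbiased GDF estimate for the lasso with orthonormal
    predictors: the count of non-zero estimated coefficients."""
    beta = [soft_threshold(yi, lam) for yi in y]
    return sum(1 for b in beta if b != 0.0)

y = [3.0, -0.4, 1.2, 0.1, -2.5]
print(gdf_estimate(y, lam=1.0))  # 3 coefficients survive thresholding
```

Plugging this GDF into a criterion such as Cp or AIC is the model-selection use the abstract refers to; the replica analysis extends the evaluation to correlated, high-dimensional settings.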
Automation of Ocean Product Metrics
2008-09-30
Presented in: Ocean Sciences 2008 Conf., 5 Mar 2008. Shriver, J., J. D. Dykes, and J. Fabre: Automation of Operational Ocean Product Metrics. Presented in 2008 EGU General Assembly, 14 April 2008. ...processing (multiple data cuts per day) and multiple-nested models. Routines for generating automated evaluations of model forecast statistics will be developed and pre-existing tools will be collected to create a generalized tool set, which will include user-interface tools to the metrics data
Halliwell, Emma; Dittmar, Helga
2005-09-01
This study investigates the effect of social comparisons with media models on women's body image based on either self-evaluation or self-improvement motives. Ninety-eight women, for whom appearance was a relevant comparison dimension, viewed advertisements that did, or did not, feature idealised models, after being prompted to engage in self-evaluation or self-improvement comparisons. The results indicate that, when focusing on self-evaluation, comparisons with thin models are associated with higher body-focused anxiety than viewing no model advertisements. In contrast, when focusing on self-improvement, comparisons with thin models are not associated with higher body-focused anxiety than viewing no models. Furthermore, women's general tendency to engage in social comparisons moderated the effects of self-evaluative comparisons with models, so that women who did not habitually engage in social comparisons were most strongly affected. It is suggested that motive for social comparison may explain previous inconsistencies in the experimental exposure literature and warrants more careful attention in future research.
Effects of viscous pressure on warm inflationary generalized cosmic Chaplygin gas model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sharif, M.; Saleem, Rabia, E-mail: msharif.math@pu.edu.pk, E-mail: rabiasaleem1988@yahoo.com
This paper is devoted to studying the effects of bulk viscous pressure on an inflationary generalized cosmic Chaplygin gas model using an FRW background. The matter contents of the universe are assumed to be inflaton and imperfect fluid. We evaluate inflaton fields, potentials and entropy density for variable as well as constant dissipation and bulk viscous coefficients in weak as well as high dissipative regimes during the intermediate era. In order to discuss inflationary perturbations, we evaluate entropy density, scalar (tensor) power spectra, their corresponding spectral indices, tensor-scalar ratio and running of spectral index in terms of the inflaton, which are constrained using recent Planck, WMAP7 and Bicep2 probes.
NASA Astrophysics Data System (ADS)
Wi, S.; Freeman, S.; Brown, C.
2017-12-01
This study presents a general approach to developing computational models of human-hydrologic systems where human modification of hydrologic surface processes is significant or dominant. A river basin system is represented by a network of human-hydrologic response units (HHRUs) identified based on locations where river regulation happens (e.g., reservoir operation and diversions). Natural and human processes in HHRUs are simulated in a holistic framework that integrates component models representing rainfall-runoff, river routing, reservoir operation, flow diversion and water use processes. We illustrate the approach in a case study of the Cutzamala water system (CWS) in Mexico, a complex inter-basin water transfer system supplying the Mexico City Metropolitan Area (MCMA). The human-hydrologic system model for the CWS (CUTZSIM) is evaluated in terms of streamflow and reservoir storages measured across the CWS and water supplied to the MCMA. The CUTZSIM improves the representation of hydrology and river-operation interaction and, in so doing, advances evaluation of system-wide water management consequences under altered climatic and demand regimes. The integrated modeling framework enables evaluation and simulation of model errors throughout the river basin, including errors in representation of the human component processes. Heretofore, model error evaluation, predictive error intervals and the resultant improved understanding have been limited to hydrologic processes. The general framework represents an initial step towards fuller understanding and prediction of the many and varied processes that determine the hydrologic fluxes and state variables in real river basins.
The three principles of action: a Pavlovian-instrumental transfer hypothesis
Cartoni, Emilio; Puglisi-Allegra, Stefano; Baldassarre, Gianluca
2013-01-01
Pavlovian conditioned stimuli can influence instrumental responding, an effect called Pavlovian-instrumental transfer (PIT). During the last decade, PIT has been subdivided into two types: specific PIT and general PIT, each having its own neural substrates. Specific PIT happens when a conditioned stimulus (CS) associated with a reward enhances an instrumental response directed to the same reward. Under general PIT, instead, the CS enhances a response directed to a different reward. While important progress has been made in identifying the neural substrates, the functions of specific and general PIT and how they interact with instrumental responses are still not clear. In the experimental paradigm that distinguishes specific and general PIT, an effect of PIT inhibition has also been observed that awaits explanation. Here we propose a hypothesis that links these three PIT effects (specific PIT, general PIT and PIT inhibition) to three aspects of action evaluation. These three aspects, which we call “principles of action”, are: context, efficacy, and utility. In goal-directed behavior, an agent has to evaluate whether the context is suitable to accomplish the goal, the efficacy of its action in getting the goal, and the utility of the goal itself: we suggest that each of the three PIT effects is related to one of these aspects of action evaluation. In particular, we link specific PIT with the estimation of efficacy, general PIT with the evaluation of utility, and PIT inhibition with the adequacy of context. We also provide a latent cause Bayesian computational model that exemplifies this hypothesis. This hypothesis and the model provide a new framework and new predictions to advance knowledge about PIT functioning and its role in animal adaptation. PMID:24312025
Evaluation of one dimensional analytical models for vegetation canopies
NASA Technical Reports Server (NTRS)
Goel, Narendra S.; Kuusk, Andres
1992-01-01
The SAIL model for one-dimensional homogeneous vegetation canopies has been modified to include the specular reflectance and hot spot effects. This modified model and the Nilson-Kuusk model are evaluated by comparing the reflectances given by them against those given by a radiosity-based computer model, Diana, for a set of canopies, characterized by different leaf area index (LAI) and leaf angle distribution (LAD). It is shown that for homogeneous canopies, the analytical models are generally quite accurate in the visible region, but not in the infrared region. For architecturally realistic heterogeneous canopies of the type found in nature, these models fall short. These shortcomings are quantified.
Relations among storage, yield, and instream flow
NASA Astrophysics Data System (ADS)
Vogel, Richard M.; Sieber, Jack; Archfield, Stacey A.; Smith, Mark P.; Apse, Colin D.; Huber-Lee, Annette
2007-05-01
An extensive literature documents relations between reservoir storage capacity and water supply yield, and the properties of instream flow needed to support downstream aquatic ecosystems. However, the literature that evaluates the impact of reservoir operating rules on instream flow properties is limited to a few site-specific studies, and as a result, few general conclusions can be drawn to date. This study adapts the existing generalized Water Evaluation And Planning (WEAP) model to enable general explorations of relations between reservoir storage, instream flow, and water supply yield for a wide class of reservoirs and operating rules. Generalized relationships among these variables document the types of instream flow policies that, when combined with drought management strategies, are likely to provide compromise solutions in ecological and human negotiations over water for different sized reservoir systems. The concept of a seasonal ecodeficit/ecosurplus is introduced for evaluating the impact of reservoir regulation on ecological flow regimes.
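The seasonal ecodeficit/ecosurplus idea can be made concrete with a small sketch. This is one plausible formalization under stated assumptions, not the study's implementation: the ecodeficit is taken here as the shortfall of regulated flow below the unregulated natural flow, normalized by the seasonal natural flow volume, and the flow series are invented.

```python
# Illustrative sketch only: ecodeficit/ecosurplus as the normalized
# shortfall (or excess) of regulated flow relative to natural flow over
# a season. Series and normalization are assumptions for illustration.

def ecodeficit(natural, regulated):
    """Fraction of seasonal natural flow volume lost under regulation."""
    shortfall = sum(max(n - r, 0.0) for n, r in zip(natural, regulated))
    total = sum(natural)
    return shortfall / total if total > 0 else 0.0

def ecosurplus(natural, regulated):
    """Fraction of seasonal natural flow volume added under regulation."""
    excess = sum(max(r - n, 0.0) for n, r in zip(natural, regulated))
    total = sum(natural)
    return excess / total if total > 0 else 0.0
```

For monthly flows natural = [10, 12, 8] and regulated = [6, 12, 10], the ecodeficit is 4/30 and the ecosurplus 2/30: regulation both withholds and adds water relative to the natural regime, which is why the two measures are reported as a pair.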
ERIC Educational Resources Information Center
Kamis-Gould, Edna; And Others
1991-01-01
A model for quality assurance (QA) in psychiatric hospitals is described. Its functions (general QA, utilization review, clinical records, evaluation, management information systems, risk management, and infection control), subfunctions, and corresponding staffing requirements are reviewed. This model was designed to foster standardization in QA…
NASA Astrophysics Data System (ADS)
Chen, Miawjane; Yan, Shangyao; Wang, Sin-Siang; Liu, Chiu-Lan
2015-02-01
An effective project schedule is essential for enterprises to increase their efficiency of project execution, to maximize profit, and to minimize wastage of resources. Heuristic algorithms have been developed to efficiently solve the complicated multi-mode resource-constrained project scheduling problem with discounted cash flows (MRCPSPDCF) that characterizes real problems. However, the solutions obtained in past studies have been approximate and are difficult to evaluate in terms of optimality. In this study, a generalized network flow model, embedded in a time-precedence network, is proposed to formulate the MRCPSPDCF with payment at activity completion times. Mathematically, the model is formulated as an integer network flow problem with side constraints, which can be solved to optimality using existing mathematical programming software. To evaluate the model performance, numerical tests are performed. The test results indicate that the model could be a useful planning tool for project scheduling in the real world.
Hobbs, Brian P.; Sargent, Daniel J.; Carlin, Bradley P.
2014-01-01
Assessing between-study variability in the context of conventional random-effects meta-analysis is notoriously difficult when incorporating data from only a small number of historical studies. In order to borrow strength, historical and current data are often assumed to be fully homogeneous, but this can have drastic consequences for power and Type I error if the historical information is biased. In this paper, we propose empirical and fully Bayesian modifications of the commensurate prior model (Hobbs et al., 2011) extending Pocock (1976), and evaluate their frequentist and Bayesian properties for incorporating patient-level historical data using general and generalized linear mixed regression models. Our proposed commensurate prior models lead to preposterior admissible estimators that facilitate bias-variance trade-offs other than those offered by pre-existing methodologies for incorporating historical data from a small number of historical studies. We also provide a sample analysis of a colon cancer trial comparing time-to-disease progression using a Weibull regression model. PMID:24795786
Inquiry-Oriented Learning Material to Increase General Physics Competence Achievement
ERIC Educational Resources Information Center
Sinuraya, Jurubahasa
2016-01-01
This study aims to produce inquiry-oriented general physics learning material to improve student learning outcomes. Development steps of the learning materials were adapted from the design model of Dick and Carey. The development consists of three phases: planning, development, and formative evaluation and revision. Implementation of formative…
ERIC Educational Resources Information Center
Haebara, Tomokazu
When several ability scales in item response models are separately derived from different test forms administered to different samples of examinees, these scales must be equated to a common scale because their units and origins are arbitrarily determined and generally different from scale to scale. A general method for equating logistic ability…
Yock, Adam D; Rao, Arvind; Dong, Lei; Beadle, Beth M; Garden, Adam S; Kudchadker, Rajat J; Court, Laurence E
2014-05-01
The purpose of this work was to develop and evaluate the accuracy of several predictive models of variation in tumor volume throughout the course of radiation therapy. Nineteen patients with oropharyngeal cancers were imaged daily with CT-on-rails for image-guided alignment per an institutional protocol. The daily volumes of 35 tumors in these 19 patients were determined and used to generate (1) a linear model in which tumor volume changed at a constant rate, (2) a general linear model that utilized the power fit relationship between the daily and initial tumor volumes, and (3) a functional general linear model that identified and exploited the primary modes of variation between time series describing the changing tumor volumes. Primary and nodal tumor volumes were examined separately. The accuracy of these models in predicting daily tumor volumes was compared with that of static and linear reference models using leave-one-out cross-validation. In predicting the daily volume of primary tumors, the general linear model and the functional general linear model were more accurate than the static reference model by 9.9% (range: -11.6% to 23.8%) and 14.6% (range: -7.3% to 27.5%), respectively, and were more accurate than the linear reference model by 14.2% (range: -6.8% to 40.3%) and 13.1% (range: -1.5% to 52.5%), respectively. In predicting the daily volume of nodal tumors, only the 14.4% (range: -11.1% to 20.5%) improvement in accuracy of the functional general linear model relative to the static reference model was statistically significant. A general linear model and a functional general linear model trained on data from a small population of patients can predict the primary tumor volume throughout the course of radiation therapy with greater accuracy than standard reference models. These more accurate models may increase the prognostic value of information about the tumor garnered from pretreatment computed tomography images and facilitate improved treatment management.
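The "power fit relationship between the daily and initial tumor volumes" can be sketched as an ordinary least-squares fit of V_day ≈ a·V0^b in log-log space. This is a hypothetical illustration under stated assumptions; the data, function names, and fitting choice are invented, and the authors' actual procedure is not reproduced here.

```python
import math

# Hypothetical sketch: fit V_day ~ a * V0**b by ordinary least squares
# on log-transformed volumes. Data and fitting choice are assumptions.

def fit_power(v0_list, vday_list):
    """Least-squares fit of log(vday) = log(a) + b*log(v0); returns (a, b)."""
    xs = [math.log(v) for v in v0_list]
    ys = [math.log(v) for v in vday_list]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return math.exp(my - b * mx), b

def predict(a, b, v0):
    """Predicted daily volume for a tumor with initial volume v0."""
    return a * v0 ** b
```

Fitting in log space keeps the model linear in its parameters, which is what makes a power relationship usable inside a general linear model.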
Durkin, Michael J; Feng, Qianxi; Warren, Kyle; Lockhart, Peter B; Thornhill, Martin H; Munshi, Kiraat D; Henderson, Rochelle R; Hsueh, Kevin; Fraser, Victoria J
2018-05-01
The purpose of this study was to assess dental antibiotic prescribing trends over time, to quantify the number and types of antibiotics dentists prescribe inappropriately, and to estimate the excess health care costs of inappropriate antibiotic prescribing, using a large cohort of general dentists in the United States. We used a quasi-Poisson regression model to analyze trends in antibiotic prescriptions written by general dentists between January 1, 2013, and December 31, 2015, using data from Express Scripts Holding Company, a large pharmacy benefits manager. We evaluated antibiotic duration and appropriateness for general dentists. Appropriateness was evaluated by reviewing the antibiotic prescribed and the duration of the prescription. Overall, the number and rate of antibiotic prescriptions written by general dentists remained stable in our cohort. During the 3-year study period, approximately 14% of antibiotic prescriptions were deemed inappropriate, based on the antibiotic prescribed, the antibiotic treatment duration, or both. The quasi-Poisson regression model, which adjusted for the number of beneficiaries covered, revealed a small but statistically significant decrease in the monthly rate of inappropriate antibiotic prescriptions of 0.32% (95% confidence interval, 0.14% to 0.50%; P = .001). Overall antibiotic prescribing practices among general dentists in this cohort remained stable over time, while the rate of inappropriate antibiotic prescriptions decreased slightly. By these authors' definition of appropriate antibiotic prescription choice and duration, inappropriate antibiotic prescriptions are common (14% of all antibiotic prescriptions) among general dentists. Further analyses using chart review, administrative data sets, or other approaches are needed to better evaluate antibiotic prescribing practices among dentists. Copyright © 2018 American Dental Association. Published by Elsevier Inc. All rights reserved.
Shernof, David J.; Ruzek, Erik A.; Sannella, Alexander J.; Schorr, Roberta Y.; Sanchez-Wall, Lina; Bressler, Denise M.
2017-01-01
The purpose of this study was to evaluate a model for considering general and specific elements of student experience in a gateway course in undergraduate Financial Accounting at a large university on the East Coast, USA. Specifically, the study evaluated a bifactor analytic strategy including a general factor of student classroom experience, conceptualized as student engagement as rooted in flow theory, as well as factors representing specific dimensions of experience. The study further evaluated the association between these general and specific factors and both student classroom practices and educational outcomes. The sample of students (N = 407) in two cohorts of the undergraduate financial accounting course participated in the Experience Sampling Method (ESM) measuring students' classroom practices, perceptions, engagement, and perceived learning throughout the one-semester course. Course grade information was also collected. Results showed that a two-level bifactor model fit the data better than two traditional (i.e., non-bifactor) models and also avoided the significant multicollinearity of the traditional models. In addition to student engagement (the general factor), specific dimensions of classroom experience in the bifactor model at the within-student level included intrinsic motivation, academic intensity, salience, and classroom self-esteem. At the between-student level, specific aspects included work orientation, learning orientation, classroom self-esteem, and disengagement. Multilevel Structural Equation Modeling (MSEM) demonstrated that sitting in the front of the classroom (compared to sitting in the back), taking notes, active listening, and working on problems during class had a positive effect on within-student variation in student engagement and attention. Engagement, in turn, predicted perceived learning.
With respect to between-student effects, the tendency to sit in front seats had a significant effect on student engagement, which in turn had a significant effect on perceived learning and course grades. A significant indirect relationship of seating and active learning strategies on learning and course grade as mediated by student engagement was found. Support for the general aspect of student classroom experience was interpreted with flow theory and suggested the need for additional research. Findings also suggested that active learning strategies are associated with positive learning outcomes even in educational environments where possibilities for action are relatively constrained. PMID:28663733
Brown, Timothy A.; Naragon-Gainey, Kristin
2013-01-01
The triple vulnerability model (Barlow, 2000, 2002) posits that three vulnerabilities contribute to the etiology of emotional disorders: (1) general biological vulnerability (i.e., dimensions of temperament such as neuroticism and extraversion); (2) general psychological vulnerability (i.e., perceived control over life stress and emotional states); and (3) disorder-specific psychological vulnerability (e.g., thought-action fusion for obsessive-compulsive disorder, OCD). Despite the prominence of this model, a comprehensive empirical evaluation has not yet been undertaken. The current study used structural equation modeling to test the triple vulnerability model in a large clinical sample (N = 700), focusing on vulnerabilities for depression, social phobia, generalized anxiety disorder (GAD), and OCD. Specifically, we examined the incremental prediction of each level of the triple vulnerability model for each disorder, with the following putative disorder-specific psychological vulnerabilities: thought-action fusion (TAF) for OCD, dysfunctional attitudes (DAS) for depression, and intolerance of uncertainty (IoU) for GAD. In the final model that included all three levels of vulnerabilities, neuroticism had significant direct effects on all four disorder constructs, and extraversion was inversely associated with depression and social phobia. However, perceived control was significantly associated with GAD and OCD only. Of the disorder-specific psychological vulnerabilities, TAF was significantly and specifically related to OCD. In contrast, DAS and IoU were not significant predictors of depression and GAD, respectively, instead contributing to other disorders. The results are discussed in regard to structural models of the emotional disorders and the various roles of general and specific vulnerability dimensions in the onset, severity, and temporal course of psychopathology. PMID:23611077
Weeks, Justin W
2015-01-01
Wang, Hsu, Chiu, and Liang (2012, Journal of Anxiety Disorders, 26, 215-224) recently proposed a hierarchical model of social interaction anxiety and depression to account for both the commonalities and distinctions between these conditions. In the present paper, this model was extended to more broadly encompass the symptoms of social anxiety disorder, and replicated in a large unselected, undergraduate sample (n = 585). Structural equation modeling (SEM) and hierarchical regression analyses were employed. Negative affect and positive affect were conceptualized as general factors shared by social anxiety and depression; fear of negative evaluation (FNE) and disqualification of positive social outcomes were operationalized as specific factors, and fear of positive evaluation (FPE) was operationalized as a factor unique to social anxiety. This extended hierarchical model explicates structural relationships among these factors, in which the higher-level, general factors (i.e., high negative affect and low positive affect) represent vulnerability markers of both social anxiety and depression, and the lower-level factors (i.e., FNE, disqualification of positive social outcomes, and FPE) are the dimensions of specific cognitive features. Results from SEM and hierarchical regression analyses converged in support of the extended model. FPE is further supported as a key symptom that differentiates social anxiety from depression.
NASA Astrophysics Data System (ADS)
Sokolov, V.; Loh, C. H.; Wen, K. L.
When evaluating the local site influence on seismic ground motion, in certain cases (e.g., building code provisions) it is sufficient to describe the variety of soil conditions by a small number of generalized site classes. The site classification system that is widely used at present is based on the properties of the top 30 m of the soil column, disregarding the characteristics of the deeper geology. Six site categories are defined on the basis of averaged shear-wave velocity, namely: A - hard rock; B - rock; C - very dense or stiff soil; D - stiff soil; E - soft soil; F - soils requiring special studies. Generalized site amplification curves were developed for several site classes in the western US (Boore and Joyner, 1997) and Greece (Klimis et al., 1999) using available geotechnical data from near-surface boreholes. We propose to evaluate the amplification functions as the ratios between the spectra of real earthquake recordings and the spectra modeled for "very hard rock" (VHR) conditions. The VHR spectra (regional source scaling and attenuation models) are constructed on the basis of ground motion records. The approach allows, on the one hand, analysis of all obtained records; on the other hand, it is possible to test the applicability of the spectral model used. Moreover, the uncertainty of site response may be evaluated and described in terms of random-variable characteristics to be considered in seismic hazard analysis. The results of applying the approach are demonstrated for the case of the Taiwan region. The characteristics of the site amplification functions (mean values and standard deviations) were determined and analyzed in the frequency range 0.2-13 Hz for site classes B and C using recordings of the 1999 Chi-Chi, Taiwan, earthquake (M=7.6), strong aftershocks (M=6.8), and several earthquakes (M < 6.5) that occurred in the region in 1995-1998.
When comparing the empirical amplification functions resulting from the Taiwan data with those proposed for the western US, it has been shown that, for both class B and class C, the US amplification functions exhibit lower values than the Taiwan class B function for frequencies of 1-8 Hz. The Hellenic class C amplification shows, in general, a similar shape and amplitude to that evaluated for the Taiwan region. Thus, generalized site amplification curves should also be considered region-dependent functions.
Rosales, Rocío; Gongola, Leah; Homlitas, Christa
2015-01-01
A multiple baseline design across participants was used to evaluate the effects of video modeling with embedded instructions on training teachers to implement 3 preference assessments. Each assessment was conducted with a confederate learner or a child with autism during generalization probes. All teachers met the predetermined mastery criterion, and 2 of the 3 demonstrated skill maintenance at 1-month follow-up.
Modeling and Performance Evaluation of Backoff Misbehaving Nodes in CSMA/CA Networks
2012-08-01
Zhuo Lu; Wenye Wang ... misbehaving nodes can obtain, we define and study two general classes of backoff misbehavior: continuous misbehavior, which keeps manipulating the backoff ... misbehavior sporadically. Our approach is to introduce a new performance metric, namely order gain, to characterize the performance benefits of misbehaving
ERIC Educational Resources Information Center
Yang, Chongming; Nay, Sandra; Hoyle, Rick H.
2010-01-01
Lengthy scales or testlets pose certain challenges for structural equation modeling (SEM) if all the items are included as indicators of a latent construct. Three general approaches to modeling lengthy scales in SEM (parceling, latent scoring, and shortening) have been reviewed and evaluated. A hypothetical population model is simulated containing…
ERIC Educational Resources Information Center
Feingold, Alan
2009-01-01
The use of growth-modeling analysis (GMA)--including hierarchical linear models, latent growth models, and general estimating equations--to evaluate interventions in psychology, psychiatry, and prevention science has grown rapidly over the last decade. However, an effect size associated with the difference between the trajectories of the…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nieves-Chinchilla, T.; Linton, M. G.; Hidalgo, M. A.
We present an analytical model to describe magnetic flux-rope topologies. When these structures are observed embedded in Interplanetary Coronal Mass Ejections (ICMEs) with a depressed proton temperature, they are called Magnetic Clouds (MCs). Our model extends the circular-cylindrical concept of Hidalgo et al. by introducing a general form for the radial dependence of the current density. This generalization provides information on the force distribution inside the flux rope in addition to the usual parameters of MC geometrical information and orientation. The generalized model provides flexibility for implementation in 3D MHD simulations. Here, we evaluate its performance in the reconstruction of MCs from in situ observations. Four Earth-directed ICME events, observed by the Wind spacecraft, are used to validate the technique. The events are selected from the ICME Wind list, with the magnetic obstacle boundaries chosen consistently with the magnetic field and plasma in situ observations and with a new parameter (EPP, the Electron Pitch angle distribution Parameter) which quantifies the bidirectionality of the plasma electrons. The goodness of fit is evaluated with a single correlation parameter to enable comparative analysis of the events. In general, at first glance, the model fits the selected events very well. However, a detailed analysis of events with signatures of significant compression indicates the need to explore geometries other than the circular-cylindrical. An extension of our current modeling framework to account for such non-circular CMEs will be presented in a forthcoming publication.
A general health policy model: update and applications.
Kaplan, R M; Anderson, J P
1988-01-01
This article describes the development of a General Health Policy Model that can be used for program evaluation, population monitoring, clinical research, and policy analysis. An important component of the model, the Quality of Well-being scale (QWB), combines preference-weighted measures of symptoms and functioning to provide a numerical point-in-time expression of well-being, ranging from 0 for death to 1.0 for asymptomatic optimum functioning. The level of wellness at particular points in time is governed by the prognosis (transition rates or probabilities) generated by the underlying disease or injury under different treatment (control) variables. Well-years result from integrating the level of wellness, or health-related quality of life, over the life expectancy. Several issues relevant to the application of the model are discussed. It is suggested that a quality of life measure need not have separate components for social and mental health. Social health has been difficult to define; social support may be a poor criterion for resource allocation; and some evidence suggests that aspects of mental health are captured by the general measure. Although it has been suggested that measures of child health should differ from those used for adults, we argue that a separate conceptualization of child health creates new problems for policy analysis. After offering several applications of the model for the evaluation of prevention programs, we conclude that many of the advantages of general measures have been overlooked and should be given serious consideration in future studies. PMID:3384669
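The well-years computation described above (integrating the level of wellness over the life expectancy) reduces, for piecewise-constant health states, to a sum of level × duration. A minimal sketch, with an invented state sequence:

```python
# Minimal sketch of the well-years calculation: QWB levels run from
# 0 (death) to 1 (asymptomatic optimum functioning); integrating a
# piecewise-constant wellness trajectory is a sum of level * duration.
# The state sequences used below are invented for illustration.

def well_years(states):
    """states: iterable of (qwb_level, years_at_that_level) pairs."""
    total = 0.0
    for level, years in states:
        if not (0.0 <= level <= 1.0) or years < 0:
            raise ValueError("QWB levels lie in [0, 1]; durations are nonnegative")
        total += level * years
    return total
```

For example, ten years at full function, five years at QWB 0.7, and two years at QWB 0.5 yield 14.5 well-years, which is how the model converts a prognosis into a single policy-comparable quantity.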
Treatment model in children with speech disorders and its therapeutic efficiency.
Barberena, Luciana; Keske-Soares, Márcia; Cervi, Taís; Brandão, Mariane
2014-07-01
Introduction: Speech articulation disorders affect the intelligibility of speech. Studies on therapeutic models show the effectiveness of communication treatment. Objective: To analyze the progress achieved by treatment with the ABAB-Withdrawal and Multiple Probes Model in children with different degrees of phonological disorders. Methods: The diagnosis of speech articulation disorder was determined by speech and hearing evaluation and complementary tests. The subjects of this research were eight children, with a mean age of 5 years, 5 months. The children were distributed into four groups according to the degree of their phonological disorders, based on the percentage of correct consonants, as follows: severe, moderate to severe, mild to moderate, and mild. The phonological treatment applied was the ABAB-Withdrawal and Multiple Probes Model. The development of the therapy by generalization was observed through comparison of the contrastive and distinctive-features analyses at the moments of evaluation and reevaluation. Results: The following types of generalization were found: to items not used in the treatment (other words), to another position in the word, within a sound class, to other classes of sounds, and to another syllable structure. Conclusion: The different types of generalization studied showed the expansion of production and proper use of therapy-trained targets in other contexts or untrained environments. Therefore, the analysis of generalizations proved to be an important criterion for measuring therapeutic efficacy.
ERIC Educational Resources Information Center
Piekny, Jeanette; Maehler, Claudia
2013-01-01
According to Klahr's (2000, 2005; Klahr & Dunbar, 1988) Scientific Discovery as Dual Search model, inquiry processes require three cognitive components: hypothesis generation, experimentation, and evidence evaluation. The aim of the present study was to investigate (a) when the ability to evaluate perfect covariation, imperfect covariation,…
Differences in Student Evaluations of Limited-Term Lecturers and Full-Time Faculty
ERIC Educational Resources Information Center
Cho, Jeong-Il; Otani, Koichiro; Kim, B. Joon
2014-01-01
This study compared student evaluations of teaching (SET) for limited-term lecturers (LTLs) and full-time faculty (FTF) using a Likert-scaled survey administered to students (N = 1,410) at the end of university courses. Data were analyzed using a general linear regression model to investigate the influence of multi-dimensional evaluation items on…
ERIC Educational Resources Information Center
Hill, Benjamin D.; Musso, Mandi; Jones, Glenn N.; Pella, Russell D.; Gouvier, Wm. Drew
2013-01-01
A psychometric evaluation on the measurement of self-report anxiety and depression using the Beck Depression Inventory (BDI-II), State Trait Anxiety Inventory, Form-Y (STAI-Y), and the Personality Assessment Inventory (PAI) was performed using a sample of 534 generally young adults seeking psychoeducational evaluation at a university-based clinic.…
Evaluating targeted interventions via meta-population models with multi-level mixing.
Feng, Zhilan; Hill, Andrew N; Curns, Aaron T; Glasser, John W
2017-05-01
Among the several means by which heterogeneity can be modeled, Levins' (1969) meta-population approach preserves the most analytical tractability, a virtue to the extent that generality is desirable. When model populations are stratified, contacts among their respective sub-populations must be described. Using a simple meta-population model, Feng et al. (2015) showed that mixing among sub-populations, as well as heterogeneity in characteristics affecting sub-population reproduction numbers, must be considered when evaluating public health interventions to prevent or control infectious disease outbreaks. They employed the convex combination of preferential within- and proportional among-group contacts first described by Nold (1980) and subsequently generalized by Jacquez et al. (1988). As the utility of meta-population modeling depends on more realistic mixing functions, the authors added preferential contacts between parents and children and among co-workers (Glasser et al., 2012). Here they further generalize this function by including preferential contacts between grandparents and grandchildren, but omit workplace contacts. They also describe a general multi-level mixing scheme, provide three two-level examples, and apply two of them. In their first application, the authors describe age- and gender-specific patterns in face-to-face conversations (Mossong et al., 2008), proxies for contacts by which respiratory pathogens might be transmitted, that are consistent with everyday experience. This suggests that meta-population models with inter-generational mixing could be employed to evaluate prolonged school-closures, a proposed pandemic mitigation measure that could expose grandparents, and other elderly surrogate caregivers for working parents, to infectious children. 
In their second application, the authors use a meta-population SEIR model stratified by 7 age groups and 50 states plus the District of Columbia, to compare actual with optimal vaccination during the 2009-2010 influenza pandemic in the United States. They also show that vaccination efforts could have been adjusted month-to-month during the fall of 2009 to ensure maximum impact. Such applications inspire confidence in the reliability of meta-population modeling in support of public health policymaking. Published by Elsevier Inc.
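The convex-combination mixing function of Nold (1980), as generalized by Jacquez et al. (1988), can be sketched for the simple one-level case: group i reserves a fraction of its contacts for its own group and distributes the remainder in proportion to the contacts other groups make available. This is an illustrative reconstruction under stated assumptions (group sizes, activities, and preference fractions are invented); the authors' multi-level scheme is not reproduced.

```python
# One-level preferred mixing (Nold 1980 / Jacquez et al. 1988 style):
# eps[i] is group i's within-group preference; the remaining contacts
# are allocated proportionally to the contacts other groups offer.
# All parameter values here are hypothetical.

def mixing_matrix(eps, activity, sizes):
    """c[i][j]: probability that a contact of group i is with group j."""
    n = len(eps)
    avail = [(1.0 - eps[k]) * activity[k] * sizes[k] for k in range(n)]
    total = sum(avail)
    c = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            c[i][j] = (1.0 - eps[i]) * avail[j] / total
            if i == j:
                c[i][j] += eps[i]
    return c
```

Each row sums to one by construction, and setting all preferences to zero recovers purely proportionate mixing; the multi-level generalizations described above layer further preference terms (parent-child, grandparent-grandchild) on the same convex-combination idea.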
Global Gridded Crop Model Evaluation: Benchmarking, Skills, Deficiencies and Implications.
NASA Technical Reports Server (NTRS)
Muller, Christoph; Elliott, Joshua; Chryssanthacopoulos, James; Arneth, Almut; Balkovic, Juraj; Ciais, Philippe; Deryng, Delphine; Folberth, Christian; Glotter, Michael; Hoek, Steven;
2017-01-01
Crop models are increasingly used to simulate crop yields at the global scale, but so far there is no general framework on how to assess model performance. Here we evaluate the simulation results of 14 global gridded crop modeling groups that have contributed historic crop yield simulations for maize, wheat, rice and soybean to the Global Gridded Crop Model Intercomparison (GGCMI) of the Agricultural Model Intercomparison and Improvement Project (AgMIP). Simulation results are compared to reference data at global, national and grid cell scales and we evaluate model performance with respect to time series correlation, spatial correlation and mean bias. We find that global gridded crop models (GGCMs) show mixed skill in reproducing time series correlations or spatial patterns at the different spatial scales. Generally, maize, wheat and soybean simulations of many GGCMs are capable of reproducing larger parts of observed temporal variability (time series correlation coefficients (r) of up to 0.888 for maize, 0.673 for wheat and 0.643 for soybean at the global scale) but rice yield variability cannot be well reproduced by most models. Yield variability can be well reproduced for most major producing countries by many GGCMs and for all countries by at least some. A comparison with gridded yield data and a statistical analysis of the effects of weather variability on yield variability shows that the ensemble of GGCMs can explain more of the yield variability than an ensemble of regression models for maize and soybean, but not for wheat and rice. We identify future research needs in global gridded crop modeling and for all individual crop modeling groups. In the absence of a purely observation-based benchmark for model evaluation, we propose that the best performing crop model per crop and region establishes the benchmark for all others, and modelers are encouraged to investigate how crop model performance can be increased. 
We make our evaluation system accessible to all crop modelers so that other modeling groups can also test their model performance against the reference data and the GGCMI benchmark.
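The skill measures named in the abstract (time series correlation and mean bias) are simple to compute; a minimal sketch with made-up national yield series, not GGCMI reference data:

```python
import numpy as np

def evaluate(simulated, observed):
    """Time-series correlation (r) and mean bias of simulated vs observed yields."""
    sim = np.asarray(simulated, dtype=float)
    obs = np.asarray(observed, dtype=float)
    r = np.corrcoef(sim, obs)[0, 1]   # Pearson correlation over the time series
    bias = np.mean(sim - obs)         # positive bias = systematic overestimation
    return r, bias

# Hypothetical national maize yields (t/ha) over six seasons
obs = [5.1, 4.8, 5.6, 5.0, 4.3, 5.4]
sim = [5.3, 4.9, 5.5, 5.2, 4.6, 5.8]
r, bias = evaluate(sim, obs)
```

Spatial correlation works the same way, with grid cells (rather than years) as the paired observations.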
Jarnevich, Catherine S.; Young, Nicholas E; Sheffels, Trevor R.; Carter, Jacoby; Systma, Mark D.; Talbert, Colin
2017-01-01
Invasive species provide a unique opportunity to evaluate factors controlling biogeographic distributions; we can consider introduction success as an experiment testing the suitability of environmental conditions. Predicting potential distributions of spreading species is not easy, and forecasting potential distributions under changing climate is even more difficult. Using the globally invasive coypu (Myocastor coypus [Molina, 1782]), also known as nutria, we evaluate and compare the utility of a simplistic ecophysiologically based model and a correlative model to predict current and future distribution. The ecophysiological model was based on winter temperature relationships with coypu survival. We developed correlative statistical models using the Software for Assisted Habitat Modeling and biologically relevant climate data with a global extent. We applied the ecophysiologically based model to several global circulation model (GCM) predictions for mid-century. We used global coypu introduction data to evaluate these models and to explore a hypothesized physiological limitation, finding general agreement with the known coypu distribution locally and globally and support for an upper thermal tolerance threshold. GCM-based results showed variability in predicted coypu distribution among GCMs, but general agreement in increasing suitable area in the USA. Our methods highlighted the dynamic nature of the edges of the coypu distribution due to climate non-equilibrium, and the uncertainty associated with forecasting future distributions. Areas deemed suitable habitat, especially those on the edge of the current known range, could be used for early detection of the spread of coypu populations for management purposes. Combining approaches can be beneficial for predicting potential distributions of invasive species now and in the future and for exploring hypotheses about factors controlling distributions.
Traa, Marjan J; Braeken, Johan; De Vries, Jolanda; Roukema, Jan A; Slooter, Gerrit D; Crolla, Rogier M P H; Borremans, Monique P M; Den Oudsten, Brenda L
2015-09-01
This study evaluated the following: (a) levels of sexual, marital, and general life functioning for both patients and partners; (b) interdependence between both members of the couple; and (c) longitudinal change in sexual, marital, and general life functioning and longitudinal stress-spillover effects in these three domains from a dyadic perspective. Couples (n = 102) completed the Maudsley Marital Questionnaire preoperatively and 3 and 6 months postoperatively. Mean scores were compared with norm scores. A multivariate general linear model and a multivariate latent difference score - structural equation modeling (LDS-SEM), which took into account actor and partner effects, were evaluated. Patients and partners reported lower sexual, mostly similar marital, and higher general life functioning compared with norm scores. Moderate to high within-dyad associations were found. The LDS-SEM model mostly showed actor effects. Yet the longitudinal change in the partners' sexual functioning was determined not only by their own preoperative sexual functioning but also by that of the patient. Preoperative sexual functioning did not spill over to the other two domains for patients and partners, whereas the patients' preoperative general life functioning influenced postoperative change in marital and sexual functioning. Health care professionals should examine potential sexual problems but have to be aware that these problems may not spill over to the marital and general life domains. In contrast, low functioning in the general life domain may spill over to the marital and sexual domains. The interdependence between patients and partners implies that a couple-based perspective (e.g., couple-based interventions/therapies) to coping with cancer is needed. Copyright © 2015 John Wiley & Sons, Ltd.
Diagnosing Alzheimer's disease: a systematic review of economic evaluations.
Handels, Ron L H; Wolfs, Claire A G; Aalten, Pauline; Joore, Manuela A; Verhey, Frans R J; Severens, Johan L
2014-03-01
The objective of this study is to systematically review the literature on economic evaluations of interventions for the early diagnosis of Alzheimer's disease (AD) and related disorders and to describe their general and methodological characteristics. We focused on the diagnostic aspects of the decision models to assess the applicability of existing decision models for the evaluation of the recently revised diagnostic research criteria for AD. PubMed and the National Institute for Health Research Economic Evaluation database were searched for English-language publications related to economic evaluations on diagnostic technologies. Trial-based economic evaluations were assessed using the Consensus on Health Economic Criteria list. Modeling studies were assessed using the framework for quality assessment of decision-analytic models. The search retrieved 2109 items, from which eight decision-analytic modeling studies and one trial-based economic evaluation met all eligibility criteria. Diversity among the study objective and characteristics was considerable and, despite considerable methodological quality, several flaws were indicated. Recommendations were focused on diagnostic aspects and the applicability of existing models for the evaluation of recently revised diagnostic research criteria for AD. Copyright © 2014 The Alzheimer's Association. Published by Elsevier Inc. All rights reserved.
Green, Colin; Shearer, James; Ritchie, Craig W; Zajicek, John P
2011-01-01
To consider the methods available to model Alzheimer's disease (AD) progression over time to inform on the structure and development of model-based evaluations, and the future direction of modelling methods in AD. A systematic search of the health care literature was undertaken to identify methods to model disease progression in AD. Modelling methods are presented in a descriptive review. The literature search identified 42 studies presenting methods or applications of methods to model AD progression over time. The review identified 10 general modelling frameworks available to empirically model the progression of AD as part of a model-based evaluation. Seven of these general models are statistical models predicting progression of AD using a measure of cognitive function. The main concerns with models are on model structure, around the limited characterization of disease progression, and on the use of a limited number of health states to capture events related to disease progression over time. None of the available models have been able to present a comprehensive model of the natural history of AD. Although helpful, there are serious limitations in the methods available to model progression of AD over time. Advances are needed to better model the progression of AD and the effects of the disease on peoples' lives. Recent evidence supports the need for a multivariable approach to the modelling of AD progression, and indicates that a latent variable analytic approach to characterising AD progression is a promising avenue for advances in the statistical development of modelling methods. Copyright © 2011 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Evaluating the uncertainty of input quantities in measurement models
NASA Astrophysics Data System (ADS)
Possolo, Antonio; Elster, Clemens
2014-06-01
The Guide to the Expression of Uncertainty in Measurement (GUM) gives guidance about how values and uncertainties should be assigned to the input quantities that appear in measurement models. This contribution offers a concrete proposal for how that guidance may be updated in light of the advances in the evaluation and expression of measurement uncertainty that were made in the course of the twenty years that have elapsed since the publication of the GUM, and also considering situations that the GUM does not yet contemplate. Our motivation is the ongoing conversation about a new edition of the GUM. While generally we favour a Bayesian approach to uncertainty evaluation, we also recognize the value that other approaches may bring to the problems considered here, and focus on methods for uncertainty evaluation and propagation that are widely applicable, including to cases that the GUM has not yet addressed. In addition to Bayesian methods, we discuss maximum-likelihood estimation, robust statistical methods, and measurement models where values of nominal properties play the same role that input quantities play in traditional models. We illustrate these general-purpose techniques in concrete examples, employing data sets that are realistic but that also are of conveniently small sizes. The supplementary material available online lists the R computer code that we have used to produce these examples (stacks.iop.org/Met/51/3/339/mmedia). Although we strive to stay close to clause 4 of the GUM, which addresses the evaluation of uncertainty for input quantities, we depart from it as we review the classes of measurement models that we believe are generally useful in contemporary measurement science. 
We also considerably expand and update the treatment that the GUM gives to Type B evaluations of uncertainty: reviewing the state-of-the-art, disciplined approach to the elicitation of expert knowledge, and its encapsulation in probability distributions that are usable in uncertainty propagation exercises. In this we deviate markedly and emphatically from the GUM Supplement 1, which gives pride of place to the Principle of Maximum Entropy as a means to assign probability distributions to input quantities.
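As a concrete illustration of propagating input-quantity distributions through a measurement model, here is a minimal Monte Carlo sketch in the spirit of the GUM supplements. The measurement model, distributions, and numbers are made up and are not one of the paper's examples:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000

# Measurement model: resistance R = V / I, with probability distributions
# assigned to the input quantities (values are illustrative only)
V = rng.normal(5.0, 0.02, N)     # volts; Gaussian, e.g. from a Type A evaluation
I = rng.uniform(0.99, 1.01, N)   # amperes; rectangular, a typical Type B assignment

R = V / I
estimate = R.mean()               # measured value
std_uncertainty = R.std(ddof=1)   # standard uncertainty of R
```

The same machinery applies unchanged to nonlinear models where the GUM's first-order propagation formula would be inaccurate.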
Argumentation in Science Education: A Model-Based Framework
ERIC Educational Resources Information Center
Bottcher, Florian; Meisert, Anke
2011-01-01
The goal of this article is threefold: First, the theoretical background for a model-based framework of argumentation to describe and evaluate argumentative processes in science education is presented. Based on the general model-based perspective in cognitive science and the philosophy of science, it is proposed to understand arguments as reasons…
Theoretical and software considerations for nonlinear dynamic analysis
NASA Technical Reports Server (NTRS)
Schmidt, R. J.; Dodds, R. H., Jr.
1983-01-01
In the finite element method for structural analysis, it is generally necessary to discretize the structural model into a very large number of elements to accurately evaluate displacements, strains, and stresses. As the complexity of the model increases, the number of degrees of freedom can easily exceed the capacity of present-day software systems. Improvements to structural analysis software, including more efficient use of existing hardware and improved structural modeling techniques, are discussed. One modeling technique used successfully in static linear and nonlinear analysis is multilevel substructuring. This research extends multilevel substructure modeling to dynamic analysis and defines the requirements for a general-purpose software system capable of efficient nonlinear dynamic analysis. The multilevel substructuring technique is presented, the analytical formulations and computational procedures for dynamic analysis and nonlinear mechanics are reviewed, and an approach to the design and implementation of a general-purpose structural software system is presented.
Oscillations, neural computations and learning during wake and sleep.
Penagos, Hector; Varela, Carmen; Wilson, Matthew A
2017-06-01
Learning and memory theories consider sleep and the reactivation of waking hippocampal neural patterns to be crucial for the long-term consolidation of memories. Here we propose that precisely coordinated representations across brain regions allow the inference and evaluation of causal relationships to train an internal generative model of the world. This training starts during wakefulness and strongly benefits from sleep because its recurring nested oscillations may reflect compositional operations that facilitate a hierarchical processing of information, potentially including behavioral policy evaluations. This suggests that an important function of sleep activity is to provide conditions conducive to general inference, prediction and insight, which contribute to a more robust internal model that underlies generalization and adaptive behavior. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Khorram, S.
1977-01-01
Results are presented of a study intended to develop a general location-specific remote-sensing procedure for watershed-wide estimation of water loss to the atmosphere by evaporation and transpiration. The general approach involves a stepwise sequence of required information definition (input data), appropriate sample design, mathematical modeling, and evaluation of results. More specifically, the remote sensing-aided system developed to evaluate evapotranspiration employs a basic two-stage two-phase sample of three information resolution levels. Based on the discussed design, documentation, and feasibility analysis to yield timely, relatively accurate, and cost-effective evapotranspiration estimates on a watershed or subwatershed basis, work is now proceeding to implement this remote sensing-aided system.
NASA Astrophysics Data System (ADS)
Bouaziz, Laurène; de Boer-Euser, Tanja; Brauer, Claudia; Drogue, Gilles; Fenicia, Fabrizio; Grelier, Benjamin; de Niel, Jan; Nossent, Jiri; Pereira, Fernando; Savenije, Hubert; Thirel, Guillaume; Willems, Patrick
2016-04-01
International collaboration between institutes and universities is a promising way to reach consensus on hydrological model development. The education, experience and expert knowledge of the hydrological community have resulted in the development of a great variety of model concepts, calibration methods and analysis techniques. Although comparison studies are very valuable for international cooperation, they often do not lead to clear new insights into the relevance of the modelled processes. We hypothesise that this is partly caused by model complexity and by the comparison methods used, which focus on good overall performance instead of on specific events. We propose an approach that focuses on the evaluation of specific events. Eight international research groups calibrated their model for the Ourthe catchment in Belgium (1607 km2) and carried out a validation in time for the Ourthe (i.e. on two different periods, one of them in blind mode for the modellers) and a validation in space for nested and neighbouring catchments of the Meuse in completely blind mode. For each model, the same protocol was followed and an ensemble of best-performing parameter sets was selected. Signatures were first used to assess model performance in the different catchments during validation. Comparison of the models was then followed by evaluation of selected events, including low flows, high flows and the transition from low to high flows. While the models show rather similar performances based on general metrics (i.e. Nash-Sutcliffe Efficiency), clear differences can be observed for specific events. While most models are able to simulate high flows well, large differences are observed during low flows and in the ability to capture the first peaks after drier months. The transferability of model parameters to neighbouring and nested catchments is assessed as an additional measure in the model evaluation.
This suggested approach helps to select, among competing model alternatives, the most suitable model for a specific purpose.
Final Report on the Multicultural/Diversity Assessment Project.
ERIC Educational Resources Information Center
Ambrosio, Anthony L.
The Emporia State University Multicultural/Diversity Project developed a set of assessment instruments and a model evaluation plan to assess multicultural/diversity (MCD) outcomes in teacher education and general education programs. Assessment instruments and techniques were constructed to evaluate the impact of coursework on student attitudes,…
A re-evaluation of PETROTOX for predicting acute and chronic toxicity of petroleum substances.
Redman, Aaron D; Parkerton, Thomas F; Leon Paumen, Miriam; Butler, Josh D; Letinski, Daniel J; den Haan, Klass
2017-08-01
The PETROTOX model was developed to perform aquatic hazard assessment of petroleum substances based on substance composition. The model relies on the hydrocarbon block method, which is widely used for conducting petroleum substance risk assessments, providing further justification for evaluating model performance. Previous work described this model and provided a preliminary calibration and validation using acute toxicity data for a limited set of petroleum substances. The objective of the present study was to re-evaluate PETROTOX using expanded data covering both acute and chronic toxicity endpoints for invertebrates, algae, and fish across a wider range of petroleum substances. The results indicated that recalibration of 2 model parameters was required, namely, the algal critical target lipid body burden and the log octanol-water partition coefficient (KOW) limit, used to account for reduced bioavailability of hydrophobic constituents. Acute predictions from the updated model were compared with observed toxicity data and found to generally be within a factor of 3 for algae and invertebrates, but fish toxicity was overestimated. Chronic predictions were generally within a factor of 5 of empirical data. Furthermore, PETROTOX predicted acute and chronic hazard classifications that were consistent or conservative in 93% and 84% of comparisons, respectively. The PETROTOX model is considered suitable for the purpose of characterizing petroleum substance hazard in substance classification and risk assessments. Environ Toxicol Chem 2017;36:2245-2252. © 2017 SETAC.
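The hydrocarbon block method mentioned above aggregates constituent toxicity by toxic-unit summation. A minimal sketch of that idea follows; the block names, concentrations, and effect concentrations are invented for illustration and are not PETROTOX's calibrated values:

```python
# (block name, dissolved concentration in mg/L, predicted LC50 of that block in mg/L)
blocks = [
    ("C9 aromatics", 0.30, 2.0),
    ("C10-C12 aliphatics", 0.05, 0.5),
    ("C13+ aliphatics", 0.01, 0.2),
]

# Toxic units: additivity is assumed across hydrocarbon blocks
tu = sum(conc / lc50 for _, conc, lc50 in blocks)

# A summed toxic unit of 1 or more predicts acute effects
mixture_toxic = tu >= 1.0
```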
Zhang, Jing; Liang, Lichen; Anderson, Jon R; Gatewood, Lael; Rottenberg, David A; Strother, Stephen C
2008-01-01
As functional magnetic resonance imaging (fMRI) becomes widely used, the demand for evaluation of fMRI processing pipelines and validation of fMRI analysis results is increasing rapidly. The current NPAIRS package, an IDL-based fMRI processing pipeline evaluation framework, lacks system interoperability and the ability to evaluate general linear model (GLM)-based pipelines using prediction metrics. Thus, it cannot fully evaluate fMRI analytical software modules such as FSL.FEAT and NPAIRS.GLM. To overcome these limitations, a Java-based fMRI processing pipeline evaluation system was developed. It integrated YALE (a machine learning environment) into Fiswidgets (an fMRI software environment) to obtain system interoperability and applied an algorithm to measure GLM prediction accuracy. The results demonstrated that the system can evaluate fMRI processing pipelines with univariate GLM and multivariate canonical variates analysis (CVA)-based models on real fMRI data, based on prediction accuracy (classification accuracy) and statistical parametric image (SPI) reproducibility. In addition, a preliminary study was performed in which four fMRI processing pipelines with GLM and CVA modules, such as FSL.FEAT and NPAIRS.CVA, were evaluated with the system. The results indicated that (1) the system can compare different fMRI processing pipelines with heterogeneous models (NPAIRS.GLM, NPAIRS.CVA and FSL.FEAT) and rank their performance by automatic performance scoring, and (2) the ranking of pipeline performance is highly dependent on the preprocessing operations. These results suggest that the system will be of value for the comparison, validation, standardization and optimization of functional neuroimaging software packages and fMRI processing pipelines.
Presentation of Atomic Structure in Turkish General Chemistry Textbooks
ERIC Educational Resources Information Center
Niaz, Mansoor; Costu, Bayram
2009-01-01
Research in science education has recognized the importance of teaching atomic structure within a history and philosophy of science perspective. The objective of this study is to evaluate general chemistry textbooks published in Turkey based on the eight criteria developed in previous research. Criteria used referred to the atomic models of…
Issues in Proposing a General Model of the Effects of Divorce on Children.
ERIC Educational Resources Information Center
Kurdek, Lawrence A.
1993-01-01
Responds to previous article by Amato on children's adjustment to divorce. Focuses on two aspects of Amato's review: the mechanics of the review (perspectives advanced, criteria used to evaluate generated hypotheses, and accuracy of conclusions) and critical comments raised about the existing literature and in particular the proposed general model…
Evaluating the Ocean Component of the US Navy Earth System Model
NASA Astrophysics Data System (ADS)
Zamudio, L.
2017-12-01
Ocean currents, temperature, and salinity observations are used to evaluate the ocean component of the US Navy Earth System Model. The ocean and atmosphere components of the system are an eddy-resolving (1/12.5° equatorial resolution) version of the HYbrid Coordinate Ocean Model (HYCOM), and a T359L50 version of the NAVy Global Environmental Model (NAVGEM), respectively. The system was integrated in hindcast mode and the ocean results are compared against unassimilated observations, a stand-alone version of HYCOM, and the Generalized Digital Environment Model ocean climatology. The different observation types used in the system evaluation are: drifting buoys, temperature profiles, salinity profiles, and acoustical proxies (mixed layer depth, sonic layer depth, below layer gradient, and acoustical trapping). To evaluate the system's performance in each different metric, a scorecard is used to translate the system's errors into scores, which provide an indication of the system's skill in both space and time.
Long short-term memory for speaker generalization in supervised speech separation
Chen, Jitong; Wang, DeLiang
2017-01-01
Speech separation can be formulated as learning to estimate a time-frequency mask from acoustic features extracted from noisy speech. For supervised speech separation, generalization to unseen noises and unseen speakers is a critical issue. Although deep neural networks (DNNs) have been successful in noise-independent speech separation, DNNs are limited in modeling a large number of speakers. To improve speaker generalization, a separation model based on long short-term memory (LSTM) is proposed, which naturally accounts for the temporal dynamics of speech. Systematic evaluation shows that the proposed model substantially outperforms a DNN-based model on unseen speakers and unseen noises in terms of objective speech intelligibility. Analyzing LSTM internal representations reveals that LSTM captures long-term speech contexts. The LSTM model is also more advantageous for low-latency speech separation: even without future frames, it performs better than the DNN model with future frames. The proposed model represents an effective approach for speaker- and noise-independent speech separation. PMID:28679261
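The time-frequency masking formulation in the first sentence can be illustrated with the ideal ratio mask (IRM), a common training target that a separation model such as the LSTM described here learns to estimate. This numpy sketch uses synthetic magnitude spectrograms, not the paper's model or data, and simplifies by treating magnitudes as additive:

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up magnitude spectrograms: (time frames, frequency bins)
speech = rng.rayleigh(1.0, size=(100, 257))
noise = rng.rayleigh(0.5, size=(100, 257))
mixture = speech + noise  # simplification: magnitudes assumed additive

# Ideal ratio mask: per-bin speech energy fraction, in [0, 1]
irm = speech**2 / (speech**2 + noise**2)

# Applying the mask to the mixture attenuates noise-dominated bins
separated = irm * mixture
```

In the supervised setting, the network sees only features of the mixture and is trained to predict `irm`; the masked mixture is then resynthesized into a waveform.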
Yu-Kang, Tu
2016-12-01
Network meta-analysis for multiple treatment comparisons has been a major development in evidence synthesis methodology. The validity of a network meta-analysis, however, can be threatened by inconsistency in evidence within the network. One particular issue is how to directly evaluate the inconsistency between direct and indirect evidence with regard to the effect difference between two treatments. A Bayesian node-splitting model was first proposed, and a similar frequentist side-splitting model has been put forward recently. Yet assigning the inconsistency parameter to one or the other of the two treatments, or splitting the parameter symmetrically between the two treatments, can yield different results when multi-arm trials are involved in the evaluation. We aimed to show that a side-splitting model can be viewed as a special case of the design-by-treatment interaction model, and that different parameterizations correspond to different design-by-treatment interactions. We demonstrated how to evaluate the side-splitting model using the arm-based generalized linear mixed model, and an example data set was used to compare results from the arm-based models with those from the contrast-based models. The three parameterizations of side-splitting make slightly different assumptions: the symmetrical method assumes that both treatments in a treatment contrast contribute to inconsistency between direct and indirect evidence, whereas the other two parameterizations assume that only one of the two treatments contributes to this inconsistency. With this understanding in mind, meta-analysts can then make a choice about how to implement the side-splitting method for their analysis. Copyright © 2016 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
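The direct-versus-indirect inconsistency discussed above can be illustrated with the simple two-step calculation for a single loop (often attributed to Bucher); the side-splitting model generalizes this idea within a full network model. The effect estimates below are made up:

```python
import math

# Hypothetical log-odds-ratio estimates as (effect, standard error)
d_AB_direct = (0.50, 0.15)   # A vs B, from head-to-head trials
d_AC = (0.80, 0.12)          # A vs C
d_BC = (0.20, 0.10)          # B vs C

# Indirect estimate of A vs B through the common comparator C
indirect = d_AC[0] - d_BC[0]
se_indirect = math.sqrt(d_AC[1]**2 + d_BC[1]**2)

# Inconsistency: difference between direct and indirect evidence
diff = d_AB_direct[0] - indirect
se_diff = math.sqrt(d_AB_direct[1]**2 + se_indirect**2)
z = diff / se_diff  # |z| > 1.96 would flag significant inconsistency
```

The parameterization question the abstract raises arises because, in a full network with multi-arm trials, this inconsistency term can be attached to either treatment of the contrast or split between them.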
Local influence for generalized linear models with missing covariates.
Shi, Xiaoyan; Zhu, Hongtu; Ibrahim, Joseph G
2009-12-01
In the analysis of missing data, sensitivity analyses are commonly used to check the sensitivity of the parameters of interest with respect to the missing data mechanism and other distributional and modeling assumptions. In this article, we formally develop a general local influence method to carry out sensitivity analyses of minor perturbations to generalized linear models in the presence of missing covariate data. We examine two types of perturbation schemes (the single-case and global perturbation schemes) for perturbing various assumptions in this setting. We show that the metric tensor of a perturbation manifold provides useful information for selecting an appropriate perturbation. We also develop several local influence measures to identify influential points and test model misspecification. Simulation studies are conducted to evaluate our methods, and real datasets are analyzed to illustrate the use of our local influence measures.
Using modeling and rehearsal to teach fire safety to children with autism.
Garcia, David; Dukes, Charles; Brady, Michael P; Scott, Jack; Wilson, Cynthia L
2016-09-01
We evaluated the efficacy of an instructional procedure to teach young children with autism to evacuate settings and notify an adult during a fire alarm. A multiple baseline design across children showed that an intervention that included modeling, rehearsal, and praise was effective in teaching fire safety skills. Safety skills generalized to novel settings and maintained during a 5-week follow-up in both training and generalization settings. © 2016 Society for the Experimental Analysis of Behavior.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buechler, Elizabeth D.; Pallin, Simon B.; Boudreaux, Philip R.
The indoor air temperature and relative humidity in residential buildings significantly affect material moisture durability, HVAC system performance, and occupant comfort. Therefore, indoor climate data is generally required to define boundary conditions in numerical models that evaluate envelope durability and equipment performance. However, indoor climate data obtained from field studies is influenced by weather, occupant behavior and internal loads, and is generally unrepresentative of the residential building stock. Likewise, whole-building simulation models typically neglect stochastic variables and yield deterministic results that are applicable to only a single home in a specific climate. The
Graph modeling systems and methods
Neergaard, Mike
2015-10-13
An apparatus and a method for vulnerability and reliability modeling are provided. The method generally includes constructing a graph model of a physical network using a computer, the graph model including a plurality of terminating vertices to represent nodes in the physical network, a plurality of edges to represent transmission paths in the physical network, and a non-terminating vertex to represent a non-nodal vulnerability along a transmission path in the physical network. The method additionally includes evaluating the vulnerability and reliability of the physical network using the constructed graph model, wherein the vulnerability and reliability evaluation includes a determination of whether each terminating and non-terminating vertex represents a critical point of failure. The method can be utilized to evaluate a wide variety of networks, including power grid infrastructures, communication network topologies, and fluid distribution systems.
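The idea of flagging critical points of failure in such a graph model can be sketched using articulation points (vertices whose removal disconnects the graph) as the criticality criterion; this is one plausible reading, not necessarily the patented method, and the toy network below is invented:

```python
import networkx as nx

# Toy physical network: vertices are nodes, edges are transmission paths
G = nx.Graph()
G.add_edges_from([
    ("plant", "sub1"), ("plant", "sub2"),
    ("sub1", "sub2"),                  # redundant loop: neither substation is critical
    ("sub2", "feeder"), ("feeder", "load"),
])

# An articulation point is a single point of failure for connectivity
critical = set(nx.articulation_points(G))
```

Here the looped vertices survive any single removal, while the chain to the load does not, so `critical` contains "sub2" and "feeder".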
New approach to effective diffusion coefficient evaluation in the nanostructured two-phase media
NASA Astrophysics Data System (ADS)
Lyashenko, Yu. O.; Liashenko, O. Y.; Morozovich, V. V.
2018-03-01
The most widely used basic and combined models for evaluating the effective diffusion parameters of an inhomogeneous two-phase zone are reviewed. A new combined effective-medium model is analyzed for the description of diffusion processes in two-phase zones. In this model the effective diffusivity depends on the growth kinetic coefficients of each phase, the volume fractions of the phases, and an additional parameter that generally characterizes the structure type of the two-phase zone. Our combined model describes two-phase zone evolution in binary systems based on consideration of the diffusion fluxes through both phases. The Lattice Monte Carlo method was used to test the validity of different phenomenological models for evaluating the effective diffusivity in nanostructured two-phase zones with different structural morphologies.
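Two classical limiting cases bracket the combined effective-medium models referred to above: transport through the phases in parallel (arithmetic mean of the diffusivities) and in series (harmonic mean). A minimal sketch with made-up phase diffusivities:

```python
# Hypothetical phase diffusivities (m^2/s) and volume fraction of phase 1
D1, D2 = 1.0e-14, 4.0e-15
f1 = 0.6
f2 = 1.0 - f1

# Parallel arrangement (layers aligned with the flux): upper bound
D_parallel = f1 * D1 + f2 * D2

# Series arrangement (layers perpendicular to the flux): lower bound
D_series = 1.0 / (f1 / D1 + f2 / D2)
```

Any structure-dependent effective-medium estimate, including the combined model described in the abstract, should fall between these two bounds; the structure parameter effectively selects where in that interval the medium sits.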
Testing and Analytical Modeling for Purging Process of a Cryogenic Line
NASA Technical Reports Server (NTRS)
Hedayat, A.; Mazurkivich, P. V.; Nelson, M. A.; Majumdar, A. K.
2015-01-01
To gain confidence in developing analytical models of the purging process for the cryogenic main propulsion systems of an upper stage, two test series were conducted. The test article, an inclined line 3.35 m long and 20 cm in diameter, was filled with liquid hydrogen (LH2) or gaseous hydrogen (GH2) and then purged with gaseous helium (GHe). A total of 10 tests were conducted. The influences of GHe flow rates and initial temperatures were evaluated. The Generalized Fluid System Simulation Program (GFSSP), an in-house general-purpose fluid system analyzer, was utilized to model and simulate selected tests.
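As a rough cross-check on such purge tests, a well-mixed, constant-density dilution model gives the residual fraction of the original gas as an exponential in the number of line-volume exchanges. This is a back-of-envelope sketch, not the GFSSP formulation; the flow rate and times in the usage are illustrative:

```python
import math

def residual_fraction(purge_flow_m3_per_s, line_volume_m3, elapsed_s):
    """Idealized well-mixed purge model: C(t)/C0 = exp(-Q*t/V), i.e. the
    residual fraction of the original gas after Q*t/V volume exchanges."""
    return math.exp(-purge_flow_m3_per_s * elapsed_s / line_volume_m3)

# Internal volume of the test article: a 3.35 m line, 20 cm in diameter.
line_volume = math.pi * (0.20 / 2) ** 2 * 3.35   # about 0.105 m^3
```

Under this idealization, three volume exchanges already dilute the original gas below 5%; a plug-flow purge would clear the line faster still, so real behavior sits between the two limits.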
Probabilistic modeling of the indoor climates of residential buildings using EnergyPlus
Buechler, Elizabeth D.; Pallin, Simon B.; Boudreaux, Philip R.; ...
2017-04-25
The indoor air temperature and relative humidity in residential buildings significantly affect material moisture durability, HVAC system performance, and occupant comfort. Therefore, indoor climate data is generally required to define boundary conditions in numerical models that evaluate envelope durability and equipment performance. However, indoor climate data obtained from field studies is influenced by weather, occupant behavior and internal loads, and is generally unrepresentative of the residential building stock. Likewise, whole-building simulation models typically neglect stochastic variables and yield deterministic results that are applicable to only a single home in a specific climate. The…
COMPUTERIZED SHAWNEE LIME/LIMESTONE SCRUBBING MODEL USERS MANUAL
The manual gives a general description of a computerized model for estimating design and cost of lime or limestone scrubber systems for flue gas desulfurization (FGD). It supplements PB80-123037 by extending the number of scrubber options which can be evaluated. It includes spray...
Predictive Models and Tools for Assessing Chemicals under the Toxic Substances Control Act (TSCA)
EPA has developed databases and predictive models to help evaluate the hazard, exposure, and risk of chemicals released to the environment and how workers, the general public, and the environment may be exposed to and affected by them.
Messer, Benjamin M.; Roca, Maite; Chu, Zhen T.; Vicatos, Spyridon; Kilshtain, Alexandra Vardi; Warshel, Arieh
2009-01-01
Evaluating the free energy landscape of proteins and the corresponding functional aspects presents a major challenge for computer simulation approaches. This challenge is due to the complexity of the landscape and the enormous computer time needed for converging simulations. The use of simplified coarse-grained (CG) folding models offers an effective way of sampling the landscape, but such a treatment may not give the correct description of the effect of the actual protein residues. A general way around this problem, put forward in our earlier work (Fan et al, Theor Chem Acc (1999) 103:77-80), uses the CG model as a reference potential for free energy calculations of different properties of the explicit model. This method is refined and extended here, focusing on improving the electrostatic treatment and on demonstrating key applications. These applications include: evaluation of changes in folding energy upon mutation, calculation of transition-state binding free energies (which are crucial for rational enzyme design), evaluation of the catalytic landscape, and simulation of the time-dependent response to pH changes. Furthermore, the general potential of our approach in overcoming major challenges in studies of structure-function correlation in proteins is discussed. PMID:20052756
Marshall, Andrew J; Evanovich, Emma K; David, Sarah Jo; Mumma, Gregory H
2018-01-17
High comorbidity rates among emotional disorders have led researchers to examine transdiagnostic factors that may contribute to shared psychopathology. Bifactor models provide a unique method for examining transdiagnostic variables by modelling the common and unique factors within measures. Previous findings suggest that the bifactor model of the Depression Anxiety and Stress Scale (DASS) may provide a method for examining transdiagnostic factors within emotional disorders. This study aimed to replicate the bifactor model of the DASS, a multidimensional measure of psychological distress, within a US adult sample and provide initial estimates of the reliability of the general and domain-specific factors. Furthermore, this study hypothesized that Worry, a theorized transdiagnostic variable, would show stronger relations to general emotional distress than domain-specific subscales. Confirmatory factor analysis was used to evaluate the bifactor model structure of the DASS in 456 US adult participants (279 females and 177 males, mean age 35.9 years) recruited online. The DASS bifactor model fitted well (CFI = 0.98; RMSEA = 0.05). The General Emotional Distress factor accounted for most of the reliable variance in item scores. Domain-specific subscales accounted for modest portions of reliable variance in items after accounting for the general scale. Finally, structural equation modelling indicated that Worry was strongly predicted by the General Emotional Distress factor. The DASS bifactor model is generalizable to a US community sample and General Emotional Distress, but not domain-specific factors, strongly predict the transdiagnostic variable Worry.
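The statement that the general factor accounts for most of the reliable variance is conventionally quantified with coefficient omega-hierarchical for a bifactor model. A minimal sketch with hypothetical standardized loadings (the numbers and factor names below are illustrative, not the study's estimates):

```python
def omega_hierarchical(items):
    """Omega-hierarchical for a bifactor model: the proportion of total
    score variance attributable to the general factor. `items` is a list of
    (general_loading, specific_loading, specific_factor_name) tuples for
    standardized items, each loading on exactly one specific factor."""
    var_general = sum(g for g, _, _ in items) ** 2
    factors = {f for _, _, f in items}
    var_specific = sum(
        sum(s for _, s, f2 in items if f2 == f) ** 2 for f in factors
    )
    var_residual = sum(1.0 - g * g - s * s for g, s, _ in items)
    return var_general / (var_general + var_specific + var_residual)

# Hypothetical 6-item example: three "dep" items and three "anx" items.
items = [(0.7, 0.3, "dep")] * 3 + [(0.7, 0.3, "anx")] * 3
omega_h = omega_hierarchical(items)
```

An omega-hierarchical near 0.8, as in this toy example, would indicate that the general factor dominates the reliable variance, consistent with the pattern the study reports for the DASS.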
The MSFC UNIVAC 1108 EXEC 8 simulation model
NASA Technical Reports Server (NTRS)
Williams, T. G.; Richards, F. M.; Weatherbee, J. E.; Paul, L. K.
1972-01-01
A model is presented which simulates the MSFC Univac 1108 multiprocessor system. The hardware/operating system is described to enable a good statistical measurement of the system behavior. The performance of the 1108 is evaluated by performing twenty-four different experiments designed to locate system bottlenecks and also to test the sensitivity of system throughput with respect to perturbation of the various Exec 8 scheduling algorithms. The model is implemented in the general purpose system simulation language and the techniques described can be used to assist in the design, development, and evaluation of multiprocessor systems.
Hüls, Anke; Frömke, Cornelia; Ickstadt, Katja; Hille, Katja; Hering, Johanna; von Münchhausen, Christiane; Hartmann, Maria; Kreienbrock, Lothar
2017-01-01
Antimicrobial resistance in livestock is a matter of general concern. To develop hygiene measures and methods for resistance prevention and control, epidemiological studies on a population level are needed to detect factors associated with antimicrobial resistance in livestock holdings. In general, regression models are used to describe these relationships between environmental factors and resistance outcome. Besides the study design, the correlation structures of the different outcomes of antibiotic resistance and structural zero measurements on the resistance outcome as well as on the exposure side are challenges for the epidemiological model building process. The use of appropriate regression models that acknowledge these complexities is essential to assure valid epidemiological interpretations. The aims of this paper are (i) to explain the model building process by comparing several competing models for count data (negative binomial model, quasi-Poisson model, zero-inflated model, and hurdle model) and (ii) to compare these models using data from a cross-sectional study on antibiotic resistance in animal husbandry. These goals are essential to evaluate which model is most suitable to identify potential prevention measures. The dataset used as an example in our analyses was generated initially to study the prevalence and associated factors for the appearance of cefotaxime-resistant Escherichia coli in 48 German fattening pig farms. For each farm, the outcome was the count of samples with resistant bacteria. There was almost no overdispersion and only moderate evidence of excess zeros in the data. Our analyses show that it is essential to evaluate regression models in studies analyzing the relationship between environmental factors and antibiotic resistance in livestock. After model comparison based on evaluation of model predictions, Akaike information criterion, and Pearson residuals, the hurdle model was judged to be the most appropriate model here. PMID:28620609
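As a sketch of the comparison logic (not the authors' workflow; the counts below are made up), a hurdle Poisson model pairs a Bernoulli part for zero versus nonzero farms with a zero-truncated Poisson for the positive counts, and candidate models are then ranked by AIC:

```python
import math

def poisson_loglik(counts, lam):
    """Log-likelihood of an ordinary Poisson(lam) model."""
    return sum(-lam + y * math.log(lam) - math.lgamma(y + 1) for y in counts)

def hurdle_loglik(counts, p_zero, lam):
    """Hurdle Poisson: Bernoulli(p_zero) for the zero hurdle plus a
    zero-truncated Poisson(lam) for the positive counts."""
    ll = 0.0
    for y in counts:
        if y == 0:
            ll += math.log(p_zero)
        else:
            ll += (math.log(1.0 - p_zero) - lam + y * math.log(lam)
                   - math.lgamma(y + 1) - math.log(1.0 - math.exp(-lam)))
    return ll

def aic(loglik, k):
    return 2.0 * k - 2.0 * loglik

counts = [0, 0, 0, 1, 2, 2, 3, 5, 0, 1, 4, 0]   # resistant samples per farm

# Ordinary Poisson MLE.
lam_pois = sum(counts) / len(counts)
aic_pois = aic(poisson_loglik(counts, lam_pois), 1)

# Hurdle MLEs: p_zero is the observed zero fraction, and lam solves
# lam / (1 - exp(-lam)) = mean of the positive counts (bisection).
positives = [y for y in counts if y > 0]
target = sum(positives) / len(positives)
a, b = 1e-6, 50.0
for _ in range(200):
    mid = (a + b) / 2.0
    if mid / (1.0 - math.exp(-mid)) < target:
        a = mid
    else:
        b = mid
lam_hurdle = (a + b) / 2.0
p_zero = counts.count(0) / len(counts)
aic_hurdle = aic(hurdle_loglik(counts, p_zero, lam_hurdle), 2)
```

With this many zeros the hurdle model attains the lower AIC despite its extra parameter, which mirrors the kind of evidence the paper uses to prefer the hurdle model.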
NASA Astrophysics Data System (ADS)
Steen-Larsen, H. C.; Risi, C.; Werner, M.; Yoshimura, K.; Masson-Delmotte, V.
2017-01-01
The skills of isotope-enabled general circulation models are evaluated against atmospheric water vapor isotopes. We have combined in situ observations of surface water vapor isotopes spanning multiple field seasons (2010, 2011, and 2012) from the top of the Greenland Ice Sheet (NEEM site: 77.45°N, 51.05°W, 2484 m above sea level) with observations from the marine boundary layer of the North Atlantic and Arctic Ocean (Bermuda Islands 32.26°N, 64.88°W, year: 2012; south coast of Iceland 63.83°N, 21.47°W, year: 2012; South Greenland 61.21°N, 47.17°W, year: 2012; Svalbard 78.92°N, 11.92°E, year: 2014). This allows us to benchmark the ability to simulate the daily water vapor isotope variations from five different simulations using isotope-enabled general circulation models. Our model-data comparison documents clear isotope biases both on top of the Greenland Ice Sheet (1-11‰ for δ18O and 4-19‰ for d-excess depending on model and season) and in the marine boundary layer (maximum differences for the following: Bermuda δ18O = 1‰, d-excess = 3‰; South coast of Iceland δ18O = 2‰, d-excess = 5‰; South Greenland δ18O = 4‰, d-excess = 7‰; Svalbard δ18O = 2‰, d-excess = 7‰). We find that the simulated isotope biases are not just explained by simulated biases in temperature and humidity. Instead, we argue that these isotope biases are related to a poor simulation of the spatial structure of the marine boundary layer water vapor isotopic composition. Furthermore, we specifically show that the marine boundary layer water vapor isotopes of the Baffin Bay region show strong influence on the water vapor isotopes at the NEEM deep ice core-drilling site in northwest Greenland. Our evaluation of the simulations using isotope-enabled general circulation models also documents wide intermodel spatial variability in the Arctic. This stresses the importance of a coordinated water vapor isotope-monitoring network in order to discriminate amongst these model behaviors.
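The d-excess values quoted above follow the standard definition relative to the global meteoric water line; a one-line sketch:

```python
def d_excess(delta_D, delta_18O):
    """Deuterium excess in per mil: d = dD - 8 * d18O (Dansgaard, 1964)."""
    return delta_D - 8.0 * delta_18O

# A simulated d18O bias with dD unchanged maps into d-excess with slope -8,
# which is one reason d18O and d-excess biases need not scale together.
example = d_excess(-150.0, -20.0)
```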
NASA Astrophysics Data System (ADS)
Zaherpour, Jamal; Gosling, Simon N.; Mount, Nick; Müller Schmied, Hannes; Veldkamp, Ted I. E.; Dankers, Rutger; Eisner, Stephanie; Gerten, Dieter; Gudmundsson, Lukas; Haddeland, Ingjerd; Hanasaki, Naota; Kim, Hyungjun; Leng, Guoyong; Liu, Junguo; Masaki, Yoshimitsu; Oki, Taikan; Pokhrel, Yadu; Satoh, Yusuke; Schewe, Jacob; Wada, Yoshihide
2018-06-01
Global-scale hydrological models are routinely used to assess water scarcity, flood hazards and droughts worldwide. Recent efforts to incorporate anthropogenic activities in these models have enabled more realistic comparisons with observations. Here we evaluate simulations from an ensemble of six models participating in the second phase of the Inter-Sectoral Impact Model Inter-comparison Project (ISIMIP2a). We simulate monthly runoff in 40 catchments, spatially distributed across eight global hydrobelts. The performance of each model and the ensemble mean is examined with respect to their ability to replicate observed mean and extreme runoff under human-influenced conditions. Application of a novel integrated evaluation metric to quantify the models’ ability to simulate time series of monthly runoff suggests that the models generally perform better in the wetter equatorial and northern hydrobelts than in drier southern hydrobelts. When model outputs are temporally aggregated to assess mean annual and extreme runoff, the models perform better. Nevertheless, we find a general trend in the majority of models towards the overestimation of mean annual runoff and all indicators of upper and lower extreme runoff. The models struggle to capture the timing of the seasonal cycle, particularly in northern hydrobelts, while in southern hydrobelts the models struggle to reproduce the magnitude of the seasonal cycle. It is noteworthy that over all hydrological indicators, the ensemble mean fails to perform better than any individual model—a finding that challenges the commonly held perception that model ensemble estimates deliver superior performance over individual models. The study highlights the need for continued model development and improvement. It also suggests that caution should be taken when summarising the simulations from a model ensemble based upon its mean output.
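The comparison of the ensemble mean against individual members can be illustrated with any skill score; the paper uses a novel integrated metric, but the widely used Nash-Sutcliffe efficiency (NSE) shows the mechanics. The toy data below are invented, and in this example the members' errors happen to cancel so the ensemble mean wins, which is exactly the behavior the study did not find for its real catchments:

```python
def nse(sim, obs):
    """Nash-Sutcliffe efficiency: 1 is perfect, <= 0 means the simulation
    is no better than always predicting the observed mean."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((s - o) ** 2 for s, o in zip(sim, obs))
    return 1.0 - sse / sum((o - mean_obs) ** 2 for o in obs)

def ensemble_mean(*members):
    """Member-wise average of several equal-length model time series."""
    return [sum(vals) / len(vals) for vals in zip(*members)]

obs = [3.0, 5.0, 9.0, 6.0, 2.0]   # observed monthly runoff (toy values)
m1 = [2.5, 5.5, 8.0, 6.5, 2.5]
m2 = [4.0, 4.0, 11.0, 5.0, 1.0]
scores = {"m1": nse(m1, obs), "m2": nse(m2, obs),
          "ensemble": nse(ensemble_mean(m1, m2), obs)}
```

Averaging also smooths peaks and troughs, which is consistent with the extreme-runoff behavior described above.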
Evaluating a health behaviour model for persons with and without an intellectual disability.
Brehmer-Rinderer, B; Zigrovic, L; Weber, G
2014-06-01
Based on the idea of the Common Sense Model of Illness Representations by Leventhal as well as Lohaus's concepts of health and illness, a health behaviour model was designed to explain health behaviours applied by persons with intellectual disabilities (ID). The key proposal of this model is that the way someone understands the concepts of health, illness and disability influences the way they perceive themselves and which behavioural approaches they take toward them. To test this model and explain health differences between the general population and persons with ID, 230 people with ID and a comparative sample of 533 persons without ID were included in this Austrian study. Data were collected on general socio-demographics, personal perceptions of illness and disability, perceptions of oneself and health-related behaviours. Psychometric analysis of the instruments used showed that they were valid and reliable and hence can provide a valuable tool for studying health-related issues in persons with and without ID. With respect to the testing of the suggested health model, two latent variables were defined in accordance with the theory. The general model fit was evaluated by calculating different absolute and descriptive fit indices. Most indices indicated an acceptable model fit for all samples. This study presents the first attempt to explore the systematic differences in health behaviour between people with and without ID based on a suggested health model. Limitations of the study as well as implications for practice and future research are discussed. © 2013 MENCAP and International Association of the Scientific Study of Intellectual and Developmental Disabilities and John Wiley & Sons Ltd.
Yunjun Yao; Shunlin Liang; Xianglan Li; Shaomin Liu; Jiquan Chen; Xiaotong Zhang; Kun Jia; Bo Jiang; Xianhong Xie; Simon Munier; Meng Liu; Jian Yu; Anders Lindroth; Andrej Varlagin; Antonio Raschi; Asko Noormets; Casimiro Pio; Georg Wohlfahrt; Ge Sun; Jean-Christophe Domec; Leonardo Montagnani; Magnus Lund; Moors Eddy; Peter D. Blanken; Thomas Grunwald; Sebastian Wolf; Vincenzo Magliulo
2016-01-01
The latent heat flux (LE) between the terrestrial biosphere and atmosphere is a major driver of the global hydrological cycle. In this study, we evaluated LE simulations by 45 general circulation models (GCMs) in the Coupled Model Intercomparison Project Phase 5 (CMIP5) by a comparison...
ERIC Educational Resources Information Center
Reeb, Roger N.; Snow-Hill, Nyssa L.; Folger, Susan F.; Steel, Anne L.; Stayton, Laura; Hunt, Charles A.; O'Koon, Bernadette; Glendening, Zachary
2017-01-01
This article presents the Psycho-Ecological Systems Model (PESM)--an integrative conceptual model rooted in General Systems Theory (GST). PESM was developed to inform and guide the development, implementation, and evaluation of transdisciplinary (and multilevel) community-engaged scholarship (e.g., a participatory community action research project…
Exponentiated power Lindley distribution.
Ashour, Samir K; Eltehiwy, Mahmoud A
2015-11-01
A new generalization of the Lindley distribution was recently proposed by Ghitany et al. [1], called the power Lindley distribution. Another generalization was introduced by Nadarajah et al. [2], named the generalized Lindley distribution. This paper proposes a further generalization of the Lindley distribution that encompasses both. We refer to this new generalization as the exponentiated power Lindley distribution. The new distribution is important since it contains as special sub-models some widely used distributions in addition to the above two models, such as the Lindley distribution among many others. It also provides more flexibility for analyzing complex real data sets. We study some statistical properties of the new distribution and discuss maximum likelihood estimation of the distribution parameters. Least squares estimation is also used to evaluate the parameters. Three algorithms are proposed for generating random data from the proposed distribution. An application of the model to a real data set is analyzed using the new distribution, which shows that the exponentiated power Lindley distribution can be used quite effectively in analyzing real lifetime data.
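Following the construction described (exponentiating the power Lindley cdf), the distribution function can be sketched as below. The parameterization mirrors the cited papers as we recall them, and should be checked against the article's own notation:

```python
import math

def epl_cdf(x, alpha, beta, theta):
    """CDF of the exponentiated power Lindley distribution:
    G(x) = [1 - (1 + beta*x**alpha / (beta + 1)) * exp(-beta*x**alpha)]**theta
    for x > 0. theta = 1 recovers the power Lindley distribution, alpha = 1
    the generalized (exponentiated) Lindley, and alpha = theta = 1 the
    plain Lindley(beta) distribution."""
    if x <= 0.0:
        return 0.0
    z = beta * x ** alpha
    return (1.0 - (1.0 + z / (beta + 1.0)) * math.exp(-z)) ** theta
```

One simple way to generate random data from it is to draw u ~ Uniform(0, 1) and solve G(x) = u numerically; whether this coincides with any of the paper's three proposed algorithms is not asserted here.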
Nikjou, A; Sadeghi, M
2018-06-01
The 123I radionuclide (T1/2 = 13.22 h, EC = 100%) is one of the most potent gamma emitters for nuclear medicine. In this study, the cyclotron production of this radionuclide via different nuclear reactions, namely 121Sb(α,2n), 122Te(d,n), 123Te(p,n), 124Te(p,2n), 124Xe(p,2n), 127I(p,5n) and 127I(d,6n), was investigated. The effects of various phenomenological nuclear level density models, such as the Fermi gas model (FGM), the back-shifted Fermi gas model (BSFGM), the generalized superfluid model (GSM) and the enhanced generalized superfluid model (EGSM), as well as three microscopic level density models, were evaluated for predicting cross sections and production yields. The SRIM code was used to obtain the target thickness. The 123I excitation functions of the reactions were calculated using the TALYS-1.8 and EMPIRE-3.2 nuclear codes and with data taken from the TENDL-2015 database, and finally the theoretical calculations were compared with reported experimental measurements taken from the EXFOR database. Copyright © 2018 Elsevier Ltd. All rights reserved.
The evaluation of the OSGLR algorithm for restructurable controls
NASA Technical Reports Server (NTRS)
Bonnice, W. F.; Wagner, E.; Hall, S. R.; Motyka, P.
1986-01-01
The detection and isolation of commercial aircraft control surface and actuator failures using the orthogonal series generalized likelihood ratio (OSGLR) test was evaluated. The OSGLR algorithm was chosen as the most promising algorithm based on a preliminary evaluation of three failure detection and isolation (FDI) algorithms (the detection filter, the generalized likelihood ratio test, and the OSGLR test) and a survey of the literature. One difficulty of analytic FDI techniques, and the OSGLR algorithm in particular, is their sensitivity to modeling errors. Therefore, methods of improving the robustness of the algorithm were examined, with the incorporation of age-weighting into the algorithm being the most effective approach, significantly reducing the sensitivity of the algorithm to modeling errors. The steady-state implementation of the algorithm based on a single cruise linear model was evaluated using a nonlinear simulation of a C-130 aircraft. A number of off-nominal no-failure flight conditions including maneuvers, nonzero flap deflections, different turbulence levels and steady winds were tested. Based on the no-failure decision functions produced by off-nominal flight conditions, the failure detection performance at the nominal flight condition was determined. The extension of the algorithm to a wider flight envelope by scheduling the linear models used by the algorithm on dynamic pressure and flap deflection was also considered. Since simply scheduling the linear models over the entire flight envelope is unlikely to be adequate, scheduling of the steady-state implementation of the algorithm was briefly investigated.
Yu, Lei; Kang, Jian
2009-09-01
This research aims to explore the feasibility of using computer-based models to predict the soundscape quality evaluation of potential users in urban open spaces at the design stage. With the data from large scale field surveys in 19 urban open spaces across Europe and China, the importance of various physical, behavioral, social, demographical, and psychological factors for the soundscape evaluation has been statistically analyzed. Artificial neural network (ANN) models have then been explored at three levels. It has been shown that for both subjective sound level and acoustic comfort evaluation, a general model for all the case study sites is less feasible due to the complex physical and social environments in urban open spaces; models based on individual case study sites perform well but the application range is limited; and specific models for certain types of location/function would be reliable and practical. The performance of acoustic comfort models is considerably better than that of sound level models. Based on the ANN models, soundscape quality maps can be produced and this has been demonstrated with an example.
Generalized linear mixed models with varying coefficients for longitudinal data.
Zhang, Daowen
2004-03-01
The routinely assumed parametric functional form in the linear predictor of a generalized linear mixed model for longitudinal data may be too restrictive to represent true underlying covariate effects. We relax this assumption by representing these covariate effects by smooth but otherwise arbitrary functions of time, with random effects used to model the correlation induced by among-subject and within-subject variation. Due to the usually intractable integration involved in evaluating the quasi-likelihood function, the double penalized quasi-likelihood (DPQL) approach of Lin and Zhang (1999, Journal of the Royal Statistical Society, Series B 61, 381-400) is used to estimate the varying coefficients and the variance components simultaneously by representing a nonparametric function by a linear combination of fixed effects and random effects. A scaled chi-squared test based on the mixed model representation of the proposed model is developed to test whether an underlying varying coefficient is a polynomial of a certain degree. We evaluate the performance of the procedures through simulation studies and illustrate their application with Indonesian children infectious disease data.
Influence of Wind Model Performance on Wave Forecasts of the Naval Oceanographic Office
NASA Astrophysics Data System (ADS)
Gay, P. S.; Edwards, K. L.
2017-12-01
Significant discrepancies between the Naval Oceanographic Office's significant wave height (SWH) predictions and observations have been noted in some model domains. The goal of this study is to evaluate these discrepancies and identify to what extent inaccuracies in the wind predictions may explain inaccuracies in SWH predictions. A one-year time series of data is evaluated at various locations in Southern California and eastern Florida. Correlations are generally quite good, ranging from 73% at Pendleton to 88% at both Santa Barbara, California, and Cape Canaveral, Florida. Correlations for month-long periods off Southern California drop off significantly in late spring through early autumn - less so off eastern Florida - likely due to weaker local wind seas and generally smaller SWH in addition to the influence of remotely-generated swell, which may not propagate accurately into and through the wave models. The results of this study suggest that it is likely that a change in meteorological and/or oceanographic conditions explains the change in model performance, partially as a result of a seasonal reduction in wind model performance in the summer months.
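The quoted agreement figures are correlation coefficients between predicted and observed SWH over year-long and month-long windows; a minimal sketch of that computation (variable and function names are ours):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def windowed_r(pred, obs, window):
    """Correlations over consecutive non-overlapping windows, e.g. to see
    the seasonal drop-off over month-long periods."""
    return [pearson_r(pred[i:i + window], obs[i:i + window])
            for i in range(0, len(pred) - window + 1, window)]
```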
NASA Astrophysics Data System (ADS)
Tadesse, T.; Bayissa, Y. A.; Demisse, G. B.; Wardlow, B.
2017-12-01
The National Drought Mitigation Center (NDMC), funded by NASA, has developed a new tool for predicting general vegetation condition called the "Vegetation Outlook for the Greater Horn of Africa (VegOut-GHA)." In this study, the 2015/16 drought across the GHA, considered one of the worst in decades across the region, was assessed and evaluated using the VegOut-GHA models and products. The VegOut-GHA maps (hindsight prediction maps) for the growing season (June-September) were generated to predict a standardized seasonal greenness (SSG) based on seasonally integrated normalized difference vegetation index (a measure that represents a general indicator of relative vegetation health within a growing season). The vegetation condition outlooks were made at 10-day, 1-month, 2-month, and 3-month lead times in hindsight and compared to the observed values of the SSG. The VegOut-GHA model was evaluated and compared to crop yield and other satellite-derived data (e.g., standardized seasonal precipitation based on "Enhancing National Climate Services (ENACTS)" datasets for the GHA). Thus, the VegOut-GHA model and its evaluation results will be discussed based on the 2015/2016 drought season in the region. These preliminary results suggest an opportunity to improve management of drought risk in agriculture and food security.
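The SSG is, in essence, a per-pixel z-score of the seasonally integrated NDVI against its multi-year record; a sketch of that standardization (the exact VegOut-GHA formulation may differ in detail):

```python
import statistics

def standardized_seasonal_greenness(season_integrated_ndvi, history):
    """Z-score of this season's integrated NDVI for a pixel against the
    multi-year record of the same seasonal integral. Positive values mean
    greener than average; strongly negative values indicate drought stress."""
    mu = statistics.mean(history)
    sigma = statistics.stdev(history)
    return (season_integrated_ndvi - mu) / sigma
```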
Geochemistry of the Madison and Minnelusa aquifers in the Black Hills area, South Dakota
Naus, Cheryl A.; Driscoll, Daniel G.; Carter, Janet M.
2001-01-01
The Madison and Minnelusa aquifers are two of the most important aquifers in the Black Hills area because of utilization for water supplies and important influences on surface-water resources resulting from large springs and streamflow-loss zones. Examination of geochemical information provides a better understanding of the complex flow systems within these aquifers and interactions between the aquifers. Major-ion chemistry in both aquifers is dominated by calcium and bicarbonate near outcrop areas, with basinward evolution towards various other water types. The most notable differences in major-ion chemistry between the Madison and Minnelusa aquifers are in concentrations of sulfate within the Minnelusa aquifer. Sulfate concentrations increase dramatically near a transition zone where dissolution of anhydrite is actively occurring. Water chemistry for the Madison and Minnelusa aquifers is controlled by reactions among calcite, dolomite, and anhydrite. Saturation indices for gypsum, calcite, and dolomite for most samples in both the Madison and Minnelusa aquifers are indicative of the occurrence of dedolomitization. Because water in the Madison aquifer remains undersaturated with respect to gypsum, even at the highest sulfate concentrations, upward leakage into the overlying Minnelusa aquifer has potential to drive increased dissolution of anhydrite in the Minnelusa Formation. Isotopic information is used to evaluate ground-water flowpaths, ages, and mixing conditions for the Madison and Minnelusa aquifers. Distinctive patterns exist in the distribution of stable isotopes of oxygen and hydrogen in precipitation for the Black Hills area, with isotopically lighter precipitation generally occurring at higher elevations and latitudes.
Distributions of δ18O in ground water are consistent with spatial patterns in recharge areas, with isotopically lighter δ18O values in the Madison aquifer resulting from generally higher elevation recharge sources, relative to the Minnelusa aquifer. Three conceptual models, which are simplifications of lumped-parameter models, are considered for evaluation of mixing conditions and general ground-water ages. For a simple slug-flow model, which assumes no mixing, measured tritium concentrations in ground water can be related through a first-order decay equation to estimated concentrations at the time of recharge. Two simplified mixing models that assume equal proportions of annual recharge over a range of years also are considered. An "immediate-arrival" model is used to conceptually represent conditions in outcrop areas and a "time-delay" model is used for locations removed from outcrops, where delay times for earliest arrival of ground water generally would be expected. Because of limitations associated with estimating tritium input and gross simplifying assumptions of equal annual recharge and thorough mixing conditions, the conceptual models are used only for general evaluation of mixing conditions and approximation of age ranges. Headwater springs, which are located in or near outcrop areas, have the highest tritium concentrations, which is consistent with the immediate-arrival mixing model. Tritium concentrations for many wells are very low, or nondetectable, indicating general applicability of the time-delay conceptual model for locations beyond outcrop areas, where artesian conditions generally occur. Concentrations for artesian springs generally are higher than for wells, which indicates generally shorter delay times resulting from preferential flowpaths that typically are associated with artesian springs.
In the Rapid City area, a distinct division of isotopic values for the Madison aquifer corresponds with distinguishing δ18O signatures for nearby streams, where large streamflow recharge occurs. Previous dye testing in this area documented rapid ground-water flow (timeframe of weeks) from a streamflow-loss zone to sites located several miles away. These results are used to ill…
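For the slug-flow (no-mixing) conceptual model described above, the first-order decay relation can be made concrete. Tritium's half-life is taken here as roughly 12.32 years; the report's actual input values are not reproduced:

```python
import math

T_HALF_YEARS = 12.32                      # approximate tritium half-life
DECAY = math.log(2.0) / T_HALF_YEARS      # first-order decay constant (1/yr)

def tritium_at_recharge(measured_tu, age_years):
    """Slug-flow model: invert C = C0 * exp(-lambda * t) to estimate the
    tritium concentration (in tritium units) at the time of recharge."""
    return measured_tu * math.exp(DECAY * age_years)
```

A sample measured at 5 TU that recharged one half-life ago implies about 10 TU at recharge; conversely, very low or nondetectable tritium in wells is consistent with the long delay times the abstract attributes to the time-delay model.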
ERIC Educational Resources Information Center
Sun, Sanjun
2012-01-01
Accurate assessment of a text's level of translation difficulty is critical for translator training and accreditation, translation research, and the language industry as well. Traditionally, people rely on their general impression to gauge a text's translation difficulty level. If the evaluation process is to be more effective and the…
Explicit criteria for prioritization of cataract surgery
Ma Quintana, José; Escobar, Antonio; Bilbao, Amaia
2006-01-01
Background Consensus techniques have been used previously to create explicit criteria to prioritize cataract extraction; however, the appropriateness of the intervention was not included explicitly in previous studies. We developed a prioritization tool for cataract extraction according to the RAND method. Methods Criteria were developed using a modified Delphi panel judgment process. A panel of 11 ophthalmologists was assembled. Ratings were analyzed regarding the level of agreement among panelists. We studied the effect of all variables on the final panel score using general linear and logistic regression models. Priority scoring systems were developed by means of optimal scaling and general linear models. The explicit criteria developed were summarized by means of regression tree analysis. Results Eight variables were considered to create the indications. Of the 310 indications that the panel evaluated, 22.6% were considered high priority, 52.3% intermediate priority, and 25.2% low priority. Agreement was reached for 31.9% of the indications and disagreement for 0.3%. Logistic regression and general linear models showed that the preoperative visual acuity of the cataractous eye, visual function, and anticipated visual acuity postoperatively were the most influential variables. Alternative and simple scoring systems were obtained by optimal scaling and general linear models where the previous variables were also the most important. The decision tree also shows the importance of the previous variables and the appropriateness of the intervention. Conclusion Our results showed acceptable validity as an evaluation and management tool for prioritizing cataract extraction. It also provides easy algorithms for use in clinical practice. PMID:16512893
On the factor structure of the Rosenberg (1965) General Self-Esteem Scale.
Alessandri, Guido; Vecchione, Michele; Eisenberg, Nancy; Łaguna, Mariola
2015-06-01
Since its introduction, the Rosenberg General Self-Esteem Scale (RGSE, Rosenberg, 1965) has been one of the most widely used measures of global self-esteem. We conducted 4 studies to investigate (a) the goodness-of-fit of a bifactor model positing a general self-esteem (GSE) factor and 2 specific factors grouping positive (MFP) and negative items (MFN) and (b) different kinds of validity of the GSE, MFN, and MFP factors of the RGSE. In the first study (n = 11,028), the fit of the bifactor model was compared with those of 9 alternative models proposed in the literature for the RGSE. In Study 2 (n = 357), the external validities of GSE, MFP, and MFN were evaluated using objective grade point average data and multimethod measures of prosociality, aggression, and depression. In Study 3 (n = 565), the across-rater robustness of the bifactor model was evaluated. In Study 4, measurement invariance of the RGSE was further supported across samples in 3 European countries, Serbia (n = 1,010), Poland (n = 699), and Italy (n = 707), and in the United States (n = 1,192). All in all, psychometric findings corroborate the value and the robustness of the bifactor structure and its substantive interpretation. (c) 2015 APA, all rights reserved.
Armeni, Patrizio; Compagni, Amelia; Longo, Francesco
2014-08-01
Multiprofessional primary care models promise to deliver better care and reduce waste. This study evaluates the impact of such a model, the primary care unit (PCU), on three outcomes. A multilevel analysis within a "pre- and post-PCU" study design and a cross-sectional analysis were conducted on 215 PCUs located in the Emilia-Romagna region in Italy. Seven dimensions captured a set of processes and services characterizing a well-functioning PCU, or its degree of vitality. The impact of each dimension on outcomes was evaluated. The analyses show that certain dimensions of PCU vitality (i.e., the possibility for general practitioners to meet and share patients) can lead to better outcomes. However, dimensions related to the interaction and the joint work of general practitioners with other professionals tend not to have a significant or positive impact. This suggests that more effort needs to be invested to realize all the potential benefits of the PCU's multiprofessional approach to care. © The Author(s) 2014.
2009-12-01
[Table-of-contents and figure-caption fragments from the source document: "Swiss Cheese" model; errors and violations; Figure 5, Reason's Swiss Cheese Model (After: Reason, 1990, p. 208); Figure 6, the HFACS Swiss Cheese Model.] Reason's (1990) book, Human Error, is generally regarded as the seminal work on the subject.
On the Model-Based Bootstrap with Missing Data: Obtaining a "P"-Value for a Test of Exact Fit
ERIC Educational Resources Information Center
Savalei, Victoria; Yuan, Ke-Hai
2009-01-01
Evaluating the fit of a structural equation model via bootstrap requires a transformation of the data so that the null hypothesis holds exactly in the sample. For complete data, such a transformation was proposed by Beran and Srivastava (1985) for general covariance structure models and applied to structural equation modeling by Bollen and Stine…
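For complete data, the Beran and Srivastava (1985) transformation rotates the centered data so that its sample covariance equals the model-implied covariance, making the null hypothesis of exact fit hold in the sample before resampling. A minimal NumPy sketch of that rotation (the model-implied matrix is supplied as a placeholder input; fitting the structural model itself is out of scope here):

```python
import numpy as np

def sym_power(A, p):
    """Symmetric matrix power via eigendecomposition (A must be SPD)."""
    w, V = np.linalg.eigh(A)
    return V @ np.diag(w ** p) @ V.T

def bollen_stine_transform(X, sigma_model):
    """Rotate centered data so its sample covariance equals sigma_model,
    i.e., the null hypothesis of exact fit holds in the transformed data."""
    Xc = X - X.mean(axis=0)
    S = np.cov(Xc, rowvar=False)
    return Xc @ sym_power(S, -0.5) @ sym_power(sigma_model, 0.5)
```

Bootstrap samples are then drawn from the transformed data, and the reference distribution of the fit statistic is built from refits to those samples.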
Towards a covariance matrix of CAB model parameters for H(H2O)
NASA Astrophysics Data System (ADS)
Scotta, Juan Pablo; Noguere, Gilles; Damian, José Ignacio Marquez
2017-09-01
Preliminary results on the uncertainties of the CAB model thermal scattering law for hydrogen bound in light water are presented. They were obtained by coupling the nuclear data code CONRAD with the molecular dynamics simulation code GROMACS. The Generalized Least Squares method was used to adjust the model parameters on evaluated data and to generate covariance matrices between the CAB model parameters.
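The Generalized Least Squares step can be illustrated for the linear case: given a sensitivity matrix G of the model observables with respect to its parameters and a covariance matrix V of the evaluated data, the adjusted parameters and their covariance follow in closed form. This is only the generic GLS update, not the CONRAD/GROMACS coupling itself:

```python
import numpy as np

def gls_adjust(G, y, V):
    """Linear GLS: parameters theta minimizing
    (y - G theta)^T V^{-1} (y - G theta), plus their covariance matrix."""
    Vi = np.linalg.inv(V)
    M = np.linalg.inv(G.T @ Vi @ G)     # parameter covariance matrix
    theta = M @ (G.T @ Vi @ y)          # adjusted parameters
    return theta, M
```

The returned matrix M is exactly the kind of covariance matrix between model parameters that the abstract describes generating.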
Testing and Analytical Modeling for Purging Process of a Cryogenic Line
NASA Technical Reports Server (NTRS)
Hedayat, A.; Mazurkivich, P. V.; Nelson, M. A.; Majumdar, A. K.
2015-01-01
To gain confidence in developing analytical models of the purging process for the cryogenic main propulsion systems of an upper stage, two test series were conducted. The test article, an inclined line 3.35 m long and 20 cm in diameter, was filled with liquid or gaseous hydrogen and then purged with gaseous helium (GHe). A total of 10 tests were conducted. The influences of GHe flow rates and initial temperatures were evaluated. The Generalized Fluid System Simulation Program (GFSSP), an in-house general-purpose fluid system analyzer computer program, was utilized to model and simulate selected tests. The test procedures, modeling descriptions, and results are presented.
NASA Astrophysics Data System (ADS)
Kaiser, Christopher; Hendricks, Johannes; Righi, Mattia; Jöckel, Patrick
2016-04-01
The reliability of aerosol radiative forcing estimates from climate models depends on the accuracy of the simulated global aerosol distribution and composition, as well as on the models' representation of the aerosol-cloud and aerosol-radiation interactions. To help improve on previous modeling studies, we recently developed the new aerosol microphysics submodel MADE3, which explicitly tracks particle mixing state in the Aitken, accumulation, and coarse mode size ranges. We implemented MADE3 into the global atmospheric chemistry general circulation model EMAC and evaluated it by comparing simulated aerosol properties to observations. Compared properties include continental near-surface aerosol component concentrations and size distributions, continental and marine aerosol vertical profiles, and nearly global aerosol optical depth. Recent studies have shown the specific importance of aerosol vertical profiles for determination of the aerosol radiative forcing. Therefore, our focus here is on the evaluation of simulated vertical profiles. The observational data are taken from campaigns between 1990 and 2011 over the Pacific Ocean, over North and South America, and over Europe. The datasets include black carbon and total aerosol mass mixing ratios, as well as aerosol particle number concentrations. Compared to other models, EMAC with MADE3 yields good agreement with the observations, despite a generally high bias in the simulated mass mixing ratio profiles. BC concentrations in the upper troposphere are overestimated by many models; with MADE3 in EMAC, we find better agreement of the simulated BC profiles with HIPPO data than the multi-model average of the models that took part in the AeroCom project. There is an interesting difference between the profiles from individual campaigns and more "climatological" datasets.
For instance, compared to spatially and temporally localized campaigns, the model simulates a more continuous decline in both total aerosol and black carbon mass mixing ratio with altitude than found in the observations. In contrast, measured profiles from the HIPPO project are qualitatively captured well. Similar conclusions hold for the comparison of simulated and measured aerosol particle number concentrations. On the one hand, these results exemplify the difficulty in evaluating the representativeness of the simulated global climatological state of the aerosol by means of comparison with individually measured vertical profiles. On the other hand, they highlight the value of aircraft campaigns with large spatial and temporal coverage for model evaluation.
Generalized One-Band Model Based on Zhang-Rice Singlets for Tetragonal CuO.
Hamad, I J; Manuel, L O; Aligia, A A
2018-04-27
Tetragonal CuO (T-CuO) has attracted attention because of its structure, similar to that of the cuprates. It has recently been proposed as a compound whose study could settle the long debate about the proper microscopic modeling of cuprates. In this work, we rigorously derive an effective one-band generalized t-J model for T-CuO, based on orthogonalized Zhang-Rice singlets, and estimate its parameters from previous ab initio calculations. By means of the self-consistent Born approximation, we then evaluate the spectral function and the quasiparticle dispersion for a single hole doped in antiferromagnetically ordered half-filled T-CuO. Our predictions show very good agreement with angle-resolved photoemission spectra and with theoretical multiband results. We conclude that a generalized t-J model remains the minimal Hamiltonian for a correct description of single-hole dynamics in cuprates.
Evaluation of natural language processing systems: Issues and approaches
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guida, G.; Mauri, G.
This paper encompasses two main topics: a broad and general analysis of the issue of performance evaluation of NLP systems, and a report on a specific approach developed by the authors and tested on a sample case. More precisely, it first presents a brief survey of the major works in the area of NLP systems evaluation. Then, after introducing the notion of the life cycle of an NLP system, it focuses on the concept of performance evaluation and analyzes the scope and the major problems of the investigation. The tools generally used within computer science to assess the quality of a software system are briefly reviewed, and their applicability to the task of evaluating NLP systems is discussed. Particular attention is devoted to the concepts of efficiency, correctness, reliability, and adequacy, and to how all of them basically fail to capture the peculiar features of performance evaluation of an NLP system. Two main approaches to performance evaluation are then introduced, namely black-box-based and model-based, and their most important characteristics are presented. Finally, a specific model for performance evaluation proposed by the authors is illustrated, and the results of an experiment with a sample application are reported. The paper concludes with a discussion of research perspectives, open problems, and the importance of performance evaluation to industrial applications.
Model verification of mixed dynamic systems. [POGO problem in liquid propellant rockets
NASA Technical Reports Server (NTRS)
Chrostowski, J. D.; Evensen, D. A.; Hasselman, T. K.
1978-01-01
A parameter-estimation method is described for verifying the mathematical model of mixed (combined interactive components from various engineering fields) dynamic systems against pertinent experimental data. The model verification problem is divided into two separate parts: defining a proper model and evaluating the parameters of that model. The main idea is to use differences between measured and predicted behavior (response) to adjust automatically the key parameters of a model so as to minimize response differences. To achieve the goal of modeling flexibility, the method combines the convenience of automated matrix generation with the generality of direct matrix input. The equations of motion are treated in first-order form, allowing for nonsymmetric matrices, modeling of general networks, and complex-mode analysis. The effectiveness of the method is demonstrated for an example problem involving a complex hydraulic-mechanical system.
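The core idea above, automatically adjusting key parameters so that predicted response matches measured response, is at its simplest an iterative least-squares problem. A toy Gauss-Newton sketch with a finite-difference Jacobian follows; the exponential model and the data are invented stand-ins, not the hydraulic-mechanical system of the paper:

```python
import numpy as np

def gauss_newton(residual, theta0, n_iter=30, h=1e-6):
    """Minimize sum(residual(theta)**2) by Gauss-Newton iterations with a
    finite-difference Jacobian: the 'adjust key parameters to minimize
    response differences' idea in miniature."""
    theta = np.asarray(theta0, dtype=float)
    for _ in range(n_iter):
        r = residual(theta)
        J = np.empty((r.size, theta.size))
        for j in range(theta.size):
            tp = theta.copy()
            tp[j] += h
            J[:, j] = (residual(tp) - r) / h
        # Solve the linearized least-squares subproblem for the step.
        theta = theta - np.linalg.lstsq(J, r, rcond=None)[0]
    return theta

# Synthetic "measured" response generated from a known model.
t = np.linspace(0.0, 2.0, 50)
measured = 3.0 * np.exp(-1.5 * t)
model = lambda th: th[0] * np.exp(-th[1] * t)
theta = gauss_newton(lambda th: model(th) - measured, [2.0, 1.0])
```

The paper's method works on first-order equations of motion with possibly nonsymmetric matrices; the sketch only conveys the response-matching loop.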
The number of chemicals with limited toxicological information for chemical safety decision-making has accelerated alternative model development, which often are evaluated via referencing animal toxicology studies. In vivo studies are generally considered the standard for hazard ...
Alternative models developed for estimating acute systemic toxicity are generally evaluated using in vivo LD50 values. However, in vivo acute systemic toxicity studies can produce variable results, even when conducted according to accepted test guidelines. This variability can ma...
Applications of Diagnostic Classification Models: A Literature Review and Critical Commentary
ERIC Educational Resources Information Center
Sessoms, John; Henson, Robert A.
2018-01-01
Diagnostic classification models (DCMs) classify examinees based on the skills they have mastered given their test performance. This classification enables targeted feedback that can inform remedial instruction. Unfortunately, applications of DCMs have been criticized (e.g., no validity support). Generally, these evaluations have been brief and…
Resource Manual for Teacher Training Programs in Economics.
ERIC Educational Resources Information Center
Saunders, Phillip, Ed.; And Others
This resource manual uses a general systems model for educational planning, instruction, and evaluation to describe a college introductory economics course. The goal of the manual is to help beginning or experienced instructors teach more effectively. The model components include needs, goals, objectives, constraints, planning and strategy,…
Airframe noise prediction evaluation
NASA Technical Reports Server (NTRS)
Yamamoto, Kingo J.; Donelson, Michael J.; Huang, Shumei C.; Joshi, Mahendra C.
1995-01-01
The objective of this study is to evaluate the accuracy and adequacy of current airframe noise prediction methods using available airframe noise measurements from flight tests of a narrow-body transport (DC-9) and a wide-body transport (DC-10), in addition to scale-model test data. General features of the airframe noise from these aircraft and models are outlined. The results of the assessment of two airframe noise prediction methods, Fink's and Munson's, against flight test data from these aircraft and scale-model wind tunnel test data are presented. These methods were extensively evaluated against measured data from several configurations, including clean, slat deployed, landing gear deployed, flap deployed, and landing configurations of both the DC-9 and DC-10. They were also assessed against a limited number of scale-model configurations. The evaluation was conducted in terms of overall sound pressure level (OASPL), tone-corrected perceived noise level (PNLT), and one-third-octave band sound pressure level (SPL).
NASA Astrophysics Data System (ADS)
Vora, V. P.; Mahmassani, H. S.
2002-02-01
This work proposes and implements a comprehensive evaluation framework to document the telecommuter, organizational, and societal impacts of telecommuting through telecommuting programs. Evaluation processes and materials within the outlined framework are also proposed and implemented. As the first component of the evaluation process, the executive survey is administered within a public sector agency. The survey data are examined through exploratory analysis and compared to a previous survey of private sector executives. The ordinal probit, dynamic probit, and dynamic generalized ordinal probit (DGOP) models of telecommuting adoption are calibrated to identify factors which significantly influence executive adoption preferences and to test the robustness of such factors. The public sector DGOP model of executive willingness to support telecommuting under different program scenarios is compared with an equivalent private sector DGOP model. Through the telecommuting program, a case study of telecommuting travel impacts is performed to further substantiate the research.
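The dynamic generalized ordinal probit machinery is involved, but its basic building block, category probabilities from a latent index and ordered thresholds, is compact. A minimal sketch (the index value and cut-points below are placeholders, not estimates from the study):

```python
from math import erf, sqrt

def Phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def ordered_probit_probs(xb, cuts):
    """Category probabilities for an ordinal probit model: latent index
    xb (e.g., a linear predictor) and ordered thresholds cuts give
    P(y = k) = Phi(cut_k - xb) - Phi(cut_{k-1} - xb)."""
    edges = [float("-inf")] + list(cuts) + [float("inf")]

    def F(z):
        if z == float("inf"):
            return 1.0
        if z == float("-inf"):
            return 0.0
        return Phi(z)

    return [F(edges[k + 1] - xb) - F(edges[k] - xb)
            for k in range(len(edges) - 1)]
```

Estimation then maximizes the likelihood of observed adoption categories over the coefficients in xb and the cut-points; the generalized variant lets cut-points shift with covariates.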
Evaluation of Aerosol-cloud Interaction in the GISS Model E Using ARM Observations
NASA Technical Reports Server (NTRS)
DeBoer, G.; Bauer, S. E.; Toto, T.; Menon, Surabi; Vogelmann, A. M.
2013-01-01
Observations from the US Department of Energy's Atmospheric Radiation Measurement (ARM) program are used to evaluate the ability of the NASA GISS ModelE global climate model to reproduce observed interactions between aerosols and clouds. Included in the evaluation are comparisons of basic meteorology and aerosol properties, droplet activation, effective radius parameterizations, and surface-based evaluations of aerosol-cloud interactions (ACI). Differences between the simulated and observed ACI are generally large, but these differences may result partially from the vertical distribution of aerosol in the model rather than from the representation of the physical processes governing the interactions between aerosols and clouds. Compared to the current observations, ModelE often features elevated droplet concentrations for a given aerosol concentration, indicating that the activation parameterizations used may be too aggressive. Additionally, effective radius parameterizations commonly used in models were tested against ARM observations, and no parameterization was clearly superior for the cases reviewed here. This lack of consensus is demonstrated to result in potentially large, statistically significant differences in surface radiative budgets, should one parameterization be chosen over another.
Testing the Simple Biosphere model (SiB) using point micrometeorological and biophysical data
NASA Technical Reports Server (NTRS)
Sellers, P. J.; Dorman, J. L.
1987-01-01
The suitability of the Simple Biosphere (SiB) model of Sellers et al. (1986) for calculation of the surface fluxes for use within general circulation models is assessed. The structure of the SiB model is described, and its performance is evaluated in terms of its ability to realistically and accurately simulate biophysical processes over a number of test sites, including Ruthe (Germany), South Carolina (U.S.), and Central Wales (UK), for which point biophysical and micrometeorological data were available. The model produced simulations of the energy balances of barley, wheat, maize, and Norway Spruce sites over periods ranging from 1 to 40 days. Generally, it was found that the model reproduced time series of latent, sensible, and ground-heat fluxes and surface radiative temperature comparable with the available data.
Terrain modeling for microwave landing system
NASA Technical Reports Server (NTRS)
Poulose, M. M.
1991-01-01
A powerful analytical approach for evaluating the terrain effects on a microwave landing system (MLS) is presented. The approach combines a multiplate model with a powerful and exhaustive ray tracing technique and an accurate formulation for estimating the electromagnetic fields due to the antenna array in the presence of terrain. Both uniform theory of diffraction (UTD) and impedance UTD techniques have been employed to evaluate these fields. Innovative techniques are introduced at each stage to make the model versatile to handle most general terrain contours and also to reduce the computational requirement to a minimum. The model is applied to several terrain geometries, and the results are discussed.
A Catchment-Based Land Surface Model for GCMs and the Framework for its Evaluation
NASA Technical Reports Server (NTRS)
Ducharen, A.; Koster, R. D.; Suarez, M. J.; Kumar, P.
1998-01-01
A new GCM-scale land surface modeling strategy that explicitly accounts for subgrid soil moisture variability and its effects on evaporation and runoff is now being explored. In a break from traditional modeling strategies, the continental surface is disaggregated into a mosaic of hydrological catchments, with boundaries that are not dictated by a regular grid but by topography. Within each catchment, the variability of soil moisture is deduced from TOPMODEL equations with a special treatment of the unsaturated zone. This paper gives an overview of this new approach and presents the general framework for its off-line evaluation over North America.
ERIC Educational Resources Information Center
Bond, James T.; And Others
The first of two volumes, this document reports an evaluation of Project Developmental Continuity (PDC), a Head Start demonstration project initiated in 1974 to develop program models which enhance children's social competence by fostering developmental continuity from preschool through the early elementary years. In general, the impact of program…
Evaluating a Policing Strategy Intended to Disrupt an Illicit Street-Level Drug Market
ERIC Educational Resources Information Center
Corsaro, Nicholas; Brunson, Rod K.; McGarrell, Edmund F.
2010-01-01
The authors examined a strategic policing initiative that was implemented in a high crime Nashville, Tennessee neighborhood by utilizing a mixed-methodological evaluation approach in order to provide (a) a descriptive process assessment of program fidelity; (b) an interrupted time-series analysis relying upon generalized linear models; (c)…
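The interrupted time-series component can be sketched as segmented regression with a level-shift dummy and a post-intervention trend term. Here plain OLS stands in for the generalized linear models of the study, and the setup is purely illustrative:

```python
import numpy as np

def interrupted_ts_fit(y, t0):
    """OLS segmented regression for an interrupted time series:
    intercept, pre-intervention trend, level shift at time t0, and
    post-intervention change in trend."""
    n = len(y)
    t = np.arange(n, dtype=float)
    post = (t >= t0).astype(float)
    X = np.column_stack([np.ones(n), t, post, post * (t - t0)])
    beta, *_ = np.linalg.lstsq(X, np.asarray(y, dtype=float), rcond=None)
    return beta  # [intercept, trend, level_change, trend_change]
```

A significant negative level_change or trend_change after the initiative's start date is the kind of evidence such evaluations look for, usually with count-appropriate error models (e.g., Poisson) rather than OLS.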
A multigear protocol for sampling crayfish assemblages in Gulf of Mexico coastal streams
William R. Budnick; William E. Kelso; Susan B. Adams; Michael D. Kaller
2018-01-01
Identifying an effective protocol for sampling crayfish in streams that vary in habitat and physical/chemical characteristics has proven problematic. We evaluated an active, combined-gear (backpack electrofishing and dipnetting) sampling protocol in 20 Coastal Plain streams in Louisiana. Using generalized linear models and rarefaction curves, we evaluated environmental...
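Hypergeometric rarefaction, the expected species richness in a random subsample of n individuals, underlies the rarefaction curves mentioned above. A self-contained sketch (the count vector is invented example data):

```python
from math import comb

def rarefied_richness(counts, n):
    """Expected species richness in a random subsample of n individuals
    drawn without replacement (classic hypergeometric rarefaction).
    counts[i] is the number of individuals of species i in the sample."""
    N = sum(counts)
    # For each species: 1 - P(none of its Ni individuals is sampled).
    return sum(1.0 - comb(N - Ni, n) / comb(N, n) for Ni in counts)
```

Evaluating this over a range of n traces the rarefaction curve, which lets gear combinations be compared at equal sampling effort.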
Risk in fire management decisionmaking: techniques and criteria
Gail Blatternberger; William F. Hyde; Thomas J. Mills
1984-01-01
In the past, decisionmaking in wildland fire management generally has not included a full consideration of the risk and uncertainty that is inherent in evaluating alternatives. Fire management policies in some Federal land management agencies now require risk evaluation. The model for estimating the economic efficiency of fire program alternatives is the minimization...
The Interface Between Theory and Data in Structural Equation Models
Grace, James B.; Bollen, Kenneth A.
2006-01-01
Structural equation modeling (SEM) holds the promise of providing natural scientists the capacity to evaluate complex multivariate hypotheses about ecological systems. Building on its predecessors, path analysis and factor analysis, SEM allows for the incorporation of both observed and unobserved (latent) variables into theoretically based probabilistic models. In this paper we discuss the interface between theory and data in SEM and the use of an additional variable type, the composite, for representing general concepts. In simple terms, composite variables specify the influences of collections of other variables and can be helpful in modeling general relationships of the sort commonly of interest to ecologists. While long recognized as a potentially important element of SEM, composite variables have received very limited use, in part because of a lack of theoretical consideration, but also because of difficulties that arise in parameter estimation when using conventional solution procedures. In this paper we present a framework for discussing composites and demonstrate how the use of partially reduced form models can help to overcome some of the parameter estimation and evaluation problems associated with models containing composites. Diagnostic procedures for evaluating the most appropriate and effective use of composites are illustrated with an example from the ecological literature. It is argued that an ability to incorporate composite variables into structural equation models may be particularly valuable in the study of natural systems, where concepts are frequently multifaceted and the influences of suites of variables are often of interest.
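In the partially reduced form described above, a composite's weights can be identified from the regression of the response on its component variables; the composite is then their fitted linear combination, carrying the collective influence of the block as a single variable. A deliberately bare-bones NumPy sketch of that idea (not a full SEM estimator, and the data in the usage are invented):

```python
import numpy as np

def composite_weights(X, y):
    """Identify composite weights from the reduced-form regression of the
    response y on the component variables X; the composite is the fitted
    linear combination X @ w, summarizing the block's joint influence."""
    Xc = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(Xc, y, rcond=None)
    w = beta[1:]                # component weights (intercept dropped)
    return w, X @ w             # weights and composite scores
```

In a real SEM, the response would itself be latent and the estimation simultaneous; the sketch only shows why composites need the reduced form to pin down their weights.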
Haji Ali Afzali, Hossein; Gray, Jodi; Karnon, Jonathan
2013-04-01
Decision analytic models play an increasingly important role in the economic evaluation of health technologies. Given uncertainties around the assumptions used to develop such models, several guidelines have been published to identify and assess 'best practice' in the model development process, including general modelling approach (e.g., time horizon), model structure, input data and model performance evaluation. This paper focuses on model performance evaluation. In the absence of a sufficient level of detail around model performance evaluation, concerns regarding the accuracy of model outputs, and hence the credibility of such models, are frequently raised. Following presentation of its components, a review of the application and reporting of model performance evaluation is presented. Taking cardiovascular disease as an illustrative example, the review investigates the use of face validity, internal validity, external validity, and cross model validity. As a part of the performance evaluation process, model calibration is also discussed and its use in applied studies investigated. The review found that the application and reporting of model performance evaluation across 81 studies of treatment for cardiovascular disease was variable. Cross-model validation was reported in 55 % of the reviewed studies, though the level of detail provided varied considerably. We found that very few studies documented other types of validity, and only 6 % of the reviewed articles reported a calibration process. Considering the above findings, we propose a comprehensive model performance evaluation framework (checklist), informed by a review of best-practice guidelines. This framework provides a basis for more accurate and consistent documentation of model performance evaluation. This will improve the peer review process and the comparability of modelling studies. 
Recognising the fundamental role of decision analytic models in informing public funding decisions, the proposed framework should usefully inform guidelines for preparing submissions to reimbursement bodies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schnell, J. L.; Prather, M. J.; Josse, B.
Here we test the current generation of global chemistry–climate models in their ability to simulate observed, present-day surface ozone. Models are evaluated against hourly surface ozone from 4217 stations in North America and Europe that are averaged over 1° × 1° grid cells, allowing commensurate model–measurement comparison. Models are generally biased high during all hours of the day and in all regions. Most models simulate the shape of regional summertime diurnal and annual cycles well, correctly matching the timing of hourly (~ 15:00 local time (LT)) and monthly (mid-June) peak surface ozone abundance. The amplitude of these cycles is less successfully matched. The observed summertime diurnal range (~ 25 ppb) is underestimated in all regions by about 7 ppb, and the observed seasonal range (~ 21 ppb) is underestimated by about 5 ppb except in the most polluted regions, where it is overestimated by about 5 ppb. The models generally match the pattern of the observed summertime ozone enhancement, but they overestimate its magnitude in most regions. Most models capture the observed distribution of extreme episode sizes, correctly showing that about 80 % of individual extreme events occur in large-scale, multi-day episodes of more than 100 grid cells. The models also match the observed linear relationship between episode size and a measure of episode intensity, which shows increases in ozone abundance by up to 6 ppb for larger-sized episodes. Lastly, we conclude that the skill of the models evaluated here provides confidence in their projections of future surface ozone.
Tropical disturbances in relation to general circulation modeling
NASA Technical Reports Server (NTRS)
Estoque, M. A.
1982-01-01
The initial results of an evaluation of the performance of the Goddard Laboratory of Atmospheric Simulation general circulation model depicting the tropical atmosphere during the summer are presented. Because the results show the existence of tropical wave disturbances throughout the tropics, the characteristics of synoptic disturbances over Africa were studied and a synoptic case study of a selected disturbance in this area was conducted. It is shown that the model is able to reproduce wave type synoptic disturbances in the tropics. The findings show that, in one of the summers simulated, the disturbances are predominantly closed vortices; in another summer, the predominant disturbances are open waves.
NASA Astrophysics Data System (ADS)
Aldrin, John C.; Annis, Charles; Sabbagh, Harold A.; Lindgren, Eric A.
2016-02-01
A comprehensive approach to NDE and SHM characterization error (CE) evaluation is presented that follows the framework of the 'ahat-versus-a' regression analysis for POD assessment. Characterization capability evaluation is typically more complex than current POD evaluations and thus requires engineering and statistical expertise in the model-building process to ensure all key effects and interactions are addressed. Justifying the statistical model choice with its underlying assumptions is key. Several sizing case studies are presented with detailed evaluations of the most appropriate statistical model for each data set. A model-assisted approach is introduced to help assess the reliability of NDE and SHM characterization capability under a wide range of part, environmental, and damage conditions. Best practices for using models are presented for both an eddy current NDE sizing case study and a vibration-based SHM case study. The results of these studies highlight the feasibility of the general protocol, emphasize the importance of evaluating key application characteristics prior to the study, and demonstrate an approach to quantifying the role of varying SHM sensor durability and environmental conditions in characterization performance.
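In the 'ahat-versus-a' framework, a linear model of measured response (ahat) versus true flaw size (a) is converted into a probability of detection through the decision threshold. A minimal pure-Python sketch assuming normal errors; the data points below are invented:

```python
from math import erf, sqrt

def Phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def fit_ahat_vs_a(a, ahat):
    """Ordinary least squares fit of ahat = b0 + b1*a + noise."""
    n = len(a)
    ma, mh = sum(a) / n, sum(ahat) / n
    b1 = (sum((x - ma) * (y - mh) for x, y in zip(a, ahat))
          / sum((x - ma) ** 2 for x in a))
    b0 = mh - b1 * ma
    resid = [y - (b0 + b1 * x) for x, y in zip(a, ahat)]
    sigma = sqrt(sum(r * r for r in resid) / (n - 2))  # residual std. dev.
    return b0, b1, sigma

def pod(a, b0, b1, sigma, a_dec):
    """POD(a): probability the measured response exceeds the decision
    threshold a_dec under the fitted normal-error model."""
    return 1.0 - Phi((a_dec - (b0 + b1 * a)) / sigma)
```

Characterization error evaluation extends this by asking not just whether a flaw is detected but how accurately its size is reported, which is why richer statistical models are needed.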
Evaluating Economic Impacts of Expanded Global Wood Energy Consumption with the USFPM/GFPM Model
Peter J. Ince; Andrew Kramp; Kenneth E. Skog
2012-01-01
A U.S. forest sector market module was developed within the general Global Forest Products Model. The U.S. module tracks regional timber markets, timber harvests by species group, and timber product outputs in greater detail than does the global model. This hybrid approach provides detailed regional market analysis for the United States while retaining the...
An Assessment of the Nonparametric Approach for Evaluating the Fit of Item Response Models
ERIC Educational Resources Information Center
Liang, Tie; Wells, Craig S.; Hambleton, Ronald K.
2014-01-01
As item response theory has been more widely applied, investigating the fit of a parametric model becomes an important part of the measurement process. There is a lack of promising solutions to the detection of model misfit in IRT. Douglas and Cohen introduced a general nonparametric approach, RISE (Root Integrated Squared Error), for detecting…
NASA Astrophysics Data System (ADS)
Bessagnet, Bertrand; Pirovano, Guido; Mircea, Mihaela; Cuvelier, Cornelius; Aulinger, Armin; Calori, Giuseppe; Ciarelli, Giancarlo; Manders, Astrid; Stern, Rainer; Tsyro, Svetlana; García Vivanco, Marta; Thunis, Philippe; Pay, Maria-Teresa; Colette, Augustin; Couvidat, Florian; Meleux, Frédérik; Rouïl, Laurence; Ung, Anthony; Aksoyoglu, Sebnem; María Baldasano, José; Bieser, Johannes; Briganti, Gino; Cappelletti, Andrea; D'Isidoro, Massimo; Finardi, Sandro; Kranenburg, Richard; Silibello, Camillo; Carnevale, Claudio; Aas, Wenche; Dupont, Jean-Charles; Fagerli, Hilde; Gonzalez, Lucia; Menut, Laurent; Prévôt, André S. H.; Roberts, Pete; White, Les
2016-10-01
The EURODELTA III exercise has facilitated a comprehensive intercomparison and evaluation of chemistry transport model performances. Participating models performed calculations for four 1-month periods in different seasons in the years 2006 to 2009, allowing the influence of different meteorological conditions on model performances to be evaluated. The exercise was performed with strict requirements for the input data, with few exceptions. As a consequence, most of the differences in the outputs can be attributed to differences in model formulations of chemical and physical processes. The models were evaluated mainly for background rural stations in Europe. The performance was assessed in terms of bias, root mean square error and correlation with respect to the concentrations of air pollutants (NO2, O3, SO2, PM10 and PM2.5), as well as key meteorological variables. Though most meteorological parameters were prescribed, some variables, like the planetary boundary layer (PBL) height and the vertical diffusion coefficient, were derived in the model preprocessors and can partly explain the spread in model results. In general, the daytime PBL height is underestimated by all models. The largest variability of predicted PBL is observed over the ocean and seas. For ozone, this study shows the importance of proper boundary conditions for accurate model calculations, and hence for the regime of the gas- and particle-phase chemistry. The models show similar and quite good performance for nitrogen dioxide, whereas they struggle to accurately reproduce measured sulfur dioxide concentrations (for which the agreement with observations is the poorest). In general, the models provide a close-to-observations map of particulate matter (PM2.5 and PM10) concentrations over Europe, with correlations in the range 0.4-0.7 and a systematic underestimation reaching -10 µg m-3 for PM10. The highest concentrations are much more underestimated, particularly in wintertime.
Further evaluation of the mean diurnal cycles of PM reveals a general model tendency to overestimate the effect of the PBL height rise on PM levels in the morning, while the intensity of the afternoon chemistry, which leads to the formation of secondary species, is underestimated. This results in larger modelled PM diurnal variations than observed for all seasons. The models tend to be too sensitive to the daily variation of the PBL. All in all, in most cases model performances are more influenced by the model setup than by the season. The good representation of the temporal evolution of wind speed is most responsible for the models' skill in reproducing the daily variability of pollutant concentrations (e.g. the development of peak episodes), while the reconstruction of the PBL diurnal cycle seems to play a larger role in driving the corresponding pollutant diurnal cycle and hence determines the presence of systematic positive and negative biases detectable on a daily basis.
Nuthmann, Antje; Einhäuser, Wolfgang; Schütz, Immo
2017-01-01
Since the turn of the millennium, a large number of computational models of visual salience have been put forward. How best to evaluate a given model's ability to predict where human observers fixate in images of real-world scenes remains an open research question. Assessing the role of spatial biases is a challenging issue; this is particularly true when we consider the tendency for high-salience items to appear in the image center, combined with a tendency to look straight ahead ("central bias"). This problem is further exacerbated in the context of model comparisons, because some, but not all, models implicitly or explicitly incorporate a center preference to improve performance. To address this and other issues, we propose to combine a priori parcellation of scenes with generalized linear mixed models (GLMM), building upon previous work. With this method, we can explicitly model the central bias of fixation by including a central-bias predictor in the GLMM. A second predictor captures how well the saliency model predicts human fixations, above and beyond the central bias. By-subject and by-item random effects account for individual differences and differences across scene items, respectively. Moreover, we can directly assess whether a given saliency model performs significantly better than others. In this article, we describe the data processing steps required by our analysis approach. In addition, we demonstrate the GLMM analyses by evaluating the performance of different saliency models on a new eye-tracking corpus. To facilitate the application of our method, we make the open-source Python toolbox "GridFix" available.
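As a simplified sketch of the idea behind this analysis (using simulated data in place of real fixations, and a plain logistic regression instead of the full GLMM with by-subject and by-item random effects), one can fit a fixation model with both a central-bias predictor and a saliency predictor and inspect the saliency coefficient "above and beyond" the central bias. All variable names and effect sizes here are hypothetical.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical data: for each grid cell of each scene, whether it was
# fixated (1/0), its saliency score, and its distance from the image
# centre (the "central bias" predictor).
n = 2000
dist_center = rng.uniform(0, 1, n)   # normalised distance from centre
saliency = rng.uniform(0, 1, n)      # saliency-model score of the cell
# Simulate fixations driven by both central bias and saliency.
logit = 1.5 - 3.0 * dist_center + 2.0 * saliency
p = 1 / (1 + np.exp(-logit))
fixated = rng.binomial(1, p)

X = np.column_stack([dist_center, saliency])
model = LogisticRegression().fit(X, fixated)
beta_dist, beta_sal = model.coef_[0]

# A positive saliency coefficient indicates the saliency model predicts
# fixations above and beyond the central bias (negative distance effect).
print(beta_dist, beta_sal)
```

In the full approach described in the abstract, the random-effects structure matters for valid inference; this sketch only illustrates how the two fixed-effect predictors separate the central bias from genuine saliency-driven prediction.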
Kazadi Mbamba, Christian; Flores-Alsina, Xavier; John Batstone, Damien; Tait, Stephan
2016-09-01
The focus of modelling in wastewater treatment is shifting from single unit to plant-wide scale. Plant-wide modelling approaches provide opportunities to study the dynamics and interactions of different transformations in water and sludge streams. Towards developing more general and robust simulation tools applicable to a broad range of wastewater engineering problems, this paper evaluates a plant-wide model built with sub-models from the Benchmark Simulation Model No. 2-P (BSM2-P) with an improved/expanded physico-chemical framework (PCF). The PCF includes a simple and validated equilibrium approach describing ion speciation and ion pairing, with kinetic multiple-mineral precipitation. Model performance is evaluated against data sets from a full-scale wastewater treatment plant, assessing its capability to describe water and sludge lines across the treatment process under steady-state operation. With default rate kinetic and stoichiometric parameters, a good general agreement is observed between the full-scale datasets and the simulated results under steady-state conditions. Simulation results show differences between measured and modelled phosphorus as small as 4-15% (relative) throughout the entire plant. Dynamic influent profiles were generated using a calibrated influent generator and were used to study the effect of long-term influent dynamics on plant performance. Model-based analysis shows that minerals precipitation strongly influences composition in the anaerobic digesters, but also affects nutrient loading across the entire plant. A projected implementation of nutrient recovery by struvite crystallization (model scenario only) reduced the phosphorus content in the treatment plant influent (via centrate recycling) considerably and thus decreased phosphorus in the treated outflow by up to 43%.
Overall, the evaluated plant-wide model is able to jointly describe the physico-chemical and biological processes, and is advocated for future use as a tool for design, performance evaluation and optimization of whole wastewater treatment plants. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Smith, O. E.; Adelfang, S. I.; Tubbs, J. D.
1982-01-01
A five-parameter bivariate gamma distribution (BGD) having two shape parameters, two location parameters, and a correlation parameter is investigated. This general BGD is expressed as a double series and as a single series of the modified Bessel function. It reduces to the known special case for equal shape parameters. Practical functions for computer evaluation of the general BGD and of special cases are presented. Applications to wind gust modeling for the ascent flight of the space shuttle are illustrated.
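The special case with equal shape parameters mentioned above corresponds to the classical Kibble-type bivariate gamma, whose density involves the modified Bessel function. A minimal numerical sketch of that textbook form (unit scale, shape k, correlation rho — the parameter values and the density form are standard assumptions for illustration, not taken from the paper) is:

```python
import numpy as np
from scipy.special import iv, gamma as gamma_fn
from scipy.integrate import quad
from scipy.stats import gamma as gamma_dist

def kibble_bgd_pdf(x, y, k, rho):
    # Kibble's bivariate gamma density: equal shape parameters k, unit
    # scale, correlation rho, expressed via the modified Bessel function
    # I_{k-1} (standard textbook form, assumed here for illustration).
    z = 2.0 * np.sqrt(rho * x * y) / (1.0 - rho)
    return ((x * y) ** ((k - 1) / 2.0) * rho ** (-(k - 1) / 2.0)
            / (gamma_fn(k) * (1.0 - rho))
            * np.exp(-(x + y) / (1.0 - rho)) * iv(k - 1, z))

# Sanity check: integrating out y must recover the Gamma(k) marginal.
k, rho, x0 = 2.5, 0.4, 1.3
marginal, _ = quad(lambda y: kibble_bgd_pdf(x0, y, k, rho), 0, np.inf)
print(marginal, gamma_dist.pdf(x0, k))
```

The marginal check is a useful way to validate any computer implementation of such a density before using it for applications like gust modeling.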
Explicit robust schemes for implementation of general principal value-based constitutive models
NASA Technical Reports Server (NTRS)
Arnold, S. M.; Saleeb, A. F.; Tan, H. Q.; Zhang, Y.
1993-01-01
The issue of developing effective and robust schemes to implement general hyperelastic constitutive models is addressed. To this end, special purpose functions are used to symbolically derive, evaluate, and automatically generate the associated FORTRAN code for the explicit forms of the corresponding stress function and material tangent stiffness tensors. These explicit forms are valid for the entire deformation range. The analytical form of these explicit expressions is given here for the case in which the strain-energy potential is taken as a nonseparable polynomial function of the principal stretches.
On the sensitivity of mesoscale models to surface-layer parameterization constants
NASA Astrophysics Data System (ADS)
Garratt, J. R.; Pielke, R. A.
1989-09-01
The Colorado State University standard mesoscale model is used to evaluate the sensitivity of one-dimensional (1D) and two-dimensional (2D) fields to differences in surface-layer parameterization “constants”. Such differences reflect the range in published values of the von Karman constant, Monin-Obukhov stability functions and the temperature roughness length at the surface. The sensitivity of 1D boundary-layer structure and 2D sea-breeze intensity is generally less than that found in published comparisons of turbulence closure schemes.
Consolidation of data base for Army generalized missile model
NASA Technical Reports Server (NTRS)
Klenke, D. J.; Hemsch, M. J.
1980-01-01
Data from plume interaction tests, nose mounted canard configuration tests, and high angle of attack tests on the Army Generalized Missile model are consolidated in a computer program which makes them readily accessible for plotting, listing, and evaluation. The program is written in FORTRAN and will run on an ordinary minicomputer. It has the capability of retrieving any coefficient from the existing DATAMAN tapes and displaying it in tabular or plotted form. Comparisons of data taken in several wind tunnels and of data with the predictions of Program MISSILE2 are also presented.
Luo, Lingyun; Tong, Ling; Zhou, Xiaoxi; Mejino, Jose L V; Ouyang, Chunping; Liu, Yongbin
2017-11-01
Organizing the descendants of a concept under a particular semantic relationship may be rather arbitrarily carried out during the manual creation processes of large biomedical terminologies, resulting in imbalances in relationship granularity. This work aims to propose scalable models towards systematically evaluating the granularity balance of semantic relationships. We first utilize "parallel concepts set (PCS)" and two features (the length and the strength) of the paths between PCSs to design the general evaluation models, based on which we propose eight concrete evaluation models generated by two specific types of PCSs: single concept set and symmetric concepts set. We then apply those concrete models to the IS-A relationship in FMA and SNOMED CT's Body Structure subset, as well as to the Part-Of relationship in FMA. Moreover, without loss of generality, we conduct two additional rounds of applications on the Part-Of relationship after removing length redundancies and strength redundancies sequentially. Finally, we perform automatic evaluation on the imbalances detected after the final round to identify missing concepts, misaligned relations and inconsistencies. For the IS-A relationship, 34 missing concepts, 80 misalignments and 18 redundancies in FMA as well as 28 missing concepts, 114 misalignments and 1 redundancy in SNOMED CT were uncovered. In addition, 6,801 instances of imbalances for the Part-Of relationship in FMA were also identified, including 3,246 redundancies. After removing those redundancies from FMA, the total number of Part-Of imbalances was dramatically reduced to 327, including 51 missing concepts, 294 misaligned relations, and 36 inconsistencies. Manual curation performed by the FMA project leader confirmed the effectiveness of our method in identifying curation errors. 
In conclusion, the granularity balance of hierarchical semantic relationship is a valuable property to check for ontology quality assurance, and the scalable evaluation models proposed in this study are effective in fulfilling this task, especially in auditing relationships with sub-hierarchies, such as the seldom evaluated Part-Of relationship. Copyright © 2017 Elsevier Inc. All rights reserved.
Land Surface Verification Toolkit (LVT) - A Generalized Framework for Land Surface Model Evaluation
NASA Technical Reports Server (NTRS)
Kumar, Sujay V.; Peters-Lidard, Christa D.; Santanello, Joseph; Harrison, Ken; Liu, Yuqiong; Shaw, Michael
2011-01-01
Model evaluation and verification are key in improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation called the Land surface Verification Toolkit (LVT) are described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed and other model and reanalysis datasets in their native formats. In addition to the traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics and scale decomposition techniques that provide novel ways for performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it also supports hydrological data products from other, non-LIS environments. In addition, the analysis of diagnostics from various computational subsystems of LIS, including data assimilation, optimization and uncertainty estimation, is supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model data fusion for hydrological applications. The evolving capabilities of the LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Platania, P., E-mail: platania@ifp.cnr.it; Figini, L.; Farina, D.
The purpose of this work is the optical modeling and physical performance evaluation of the JT-60SA ECRF launcher system. The beams have been simulated with the electromagnetic code GRASP® and used as input for ECCD calculations performed with the beam tracing code GRAY, which is capable of modeling propagation, absorption and current drive of an EC Gaussian beam with general astigmatism. Full details of the optical analysis have been taken into account to model the launched beams. Inductive and advanced reference scenarios have been analysed for physical evaluations in the full poloidal and toroidal steering ranges for two slightly different layouts of the launcher system.
Translational Animal Models of Atopic Dermatitis for Preclinical Studies
Martel, Britta C.; Lovato, Paola; Bäumer, Wolfgang; Olivry, Thierry
2017-01-01
There is a medical need to develop new treatments for patients suffering from atopic dermatitis (AD). To improve the discovery and testing of novel treatments, relevant animal models for AD are needed. Generally, these animal models mimic different aspects of the pathophysiology of human AD, such as skin barrier defects and a Th2 immune bias with additional Th1 and Th22, and in some populations Th17, activation. However, the pathomechanistic characterization and pharmacological validation of these animal models are generally incomplete. In this paper, we review animal models of AD in the context of preclinical use and their possible translation to the human disease. Most of these models use mice, but we also critically evaluate dog models of AD, as increasing information on disease mechanisms shows their likely relevance for the human disease. PMID:28955179
PAN AIR modeling studies. [higher order panel method for aircraft design
NASA Technical Reports Server (NTRS)
Towne, M. C.; Strande, S. M.; Erickson, L. L.; Kroo, I. M.; Enomoto, F. Y.; Carmichael, R. L.; Mcpherson, K. F.
1983-01-01
PAN AIR is a computer program that predicts subsonic or supersonic linear potential flow about arbitrary configurations. The code's versatility and generality afford numerous possibilities for modeling flow problems. Although this generality provides great flexibility, it also means that studies are required to establish the dos and don'ts of modeling. The purpose of this paper is to describe and evaluate a variety of methods for modeling flows with PAN AIR. The areas discussed are effects of panel density, internal flow modeling, forebody modeling in subsonic flow, propeller slipstream modeling, effect of wake length, wing-tail-wake interaction, effect of trailing-edge paneling on the Kutta condition, well- and ill-posed boundary-value problems, and induced-drag calculations. These nine topics address problems that are of practical interest to the users of PAN AIR.
Best Statistical Distribution of flood variables for Johor River in Malaysia
NASA Astrophysics Data System (ADS)
Salarpour Goodarzi, M.; Yusop, Z.; Yusof, F.
2012-12-01
A complex flood event is always characterized by a few characteristics such as flood peak, flood volume, and flood duration, which might be mutually correlated. This study explored the statistical distribution of peakflow, flood duration and flood volume at the Rantau Panjang gauging station on the Johor River in Malaysia. Hourly data were recorded for 45 years. The data were analysed based on the water year (July-June). Five distributions, namely Log-Normal, Generalized Pareto, Log-Pearson, Normal and Generalized Extreme Value (GEV), were used to model the distribution of all three variables. Anderson-Darling and Kolmogorov-Smirnov goodness-of-fit tests were used to evaluate the best fit. Goodness-of-fit tests at the 5% level of significance indicate that all the models can be used to model the distribution of peakflow, flood duration and flood volume. However, the Generalized Pareto distribution is found to be the most suitable model when tested with the Anderson-Darling test, while the Kolmogorov-Smirnov test suggests that GEV is the best for peakflow. The results of this research can be used to improve flood frequency analysis. (Figure: Comparison between the Generalized Extreme Value, Generalized Pareto and Log-Pearson distributions in the cumulative distribution function of peakflow.)
Evaluation of Generation Alternation Models in Evolutionary Robotics
NASA Astrophysics Data System (ADS)
Oiso, Masashi; Matsumura, Yoshiyuki; Yasuda, Toshiyuki; Ohkura, Kazuhiro
For efficient implementation of Evolutionary Algorithms (EAs) in a desktop grid computing environment, we propose a new generation alternation model called Grid-Oriented-Deletion (GOD), based on comparison with conventional techniques. In previous research, generation alternation models have generally been evaluated using test functions. However, their exploration performance on real problems such as Evolutionary Robotics (ER) has not been made very clear yet. We therefore investigate the relationship between the exploration performance of an EA on an ER problem and its generation alternation model. We applied four generation alternation models to Evolutionary Multi-Robotics (EMR), a package-pushing problem, to investigate their exploration performance. The results show that GOD is more effective than the other conventional models.
Chronic heart failure management in Australia -- time for general practice centred models of care?
Scott, Ian; Jackson, Claire
2013-05-01
Chronic heart failure (CHF) is an increasingly prevalent problem within ageing populations and accounts for thousands of hospitalisations and deaths annually in Australia. Disease management programs for CHF (CHF-DMPs) aim to optimise care, with the predominant model being cardiologist-led, hospital-based multidisciplinary clinics with cardiac nurse outreach. However, findings from contemporary observational studies and clinical trials raise uncertainty around the effectiveness and sustainability of traditional CHF-DMPs in real-world clinical practice. We suggest an alternative model of care that involves general practitioners with a special interest in CHF liaising with, and being up-skilled by, specialists within community-based, multidisciplinary general practice settings. Preliminary data from trials evaluating primary care based CHF-DMPs are encouraging, and further studies are underway comparing this model of care with traditional hospital-based, specialist-led CHF-DMPs. Results of studies of similar primary care models targeting diabetes and other chronic diseases suggest potential for its application to CHF.
ERIC Educational Resources Information Center
Sideridis, Georgios D.; Tsaousis, Ioannis; Al-harbi, Khaleel A.
2015-01-01
The purpose of the present study was to extend the model of measurement invariance by simultaneously estimating invariance across multiple populations in the dichotomous instrument case using multi-group confirmatory factor analytic and multiple indicator multiple causes (MIMIC) methodologies. Using the Arabic version of the General Aptitude Test…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Feijoo, M.; Mestre, F.; Castagnaro, A.
This study evaluates the potential effect of climate change on dry-bean production in Argentina, combining climate models, a crop productivity model and a yield response model estimating the effect of climate variables on crop yields. The study was carried out in the northern agricultural regions of Jujuy, Salta, Santiago del Estero and Tucuman, which include the largest areas of Argentina where dry beans are grown as a high-input crop. The paper combines the output from a crop model with different techniques of analysis. The scenarios used in this study were generated from the output of two General Circulation Models (GCMs): the Goddard Institute for Space Studies model (GISS) and the Canadian Climate Change Model (CCCM). The study also includes a preliminary evaluation of the potential changes in monetary returns, taking into account the possible variability of yields and prices, using mean-Gini stochastic dominance (MGSD). The results suggest that large climate change may have a negative impact on the Argentine agriculture sector, due to the high relevance of this product in the export sector. The magnitude of the negative effect depends on the variety of dry bean and on the General Circulation Model scenarios considered for doubled levels of atmospheric carbon dioxide.
2013-01-01
Background The present study aimed to develop an artificial neural network (ANN) based prediction model for cardiovascular autonomic (CA) dysfunction in the general population. Methods We analyzed a previous dataset based on a population sample consisting of 2,092 individuals aged 30–80 years. The prediction models were derived from an exploratory set using ANN analysis. Performances of these prediction models were evaluated in the validation set. Results Univariate analysis indicated that 14 risk factors showed statistically significant association with CA dysfunction (P < 0.05). The mean area under the receiver-operating curve was 0.762 (95% CI 0.732–0.793) for the prediction model developed using ANN analysis. The mean sensitivity, specificity, and positive and negative predictive values of the prediction model were 0.751, 0.665, 0.330 and 0.924, respectively. All HL statistics were less than 15.0. Conclusion ANN is an effective tool for developing prediction models with high value for predicting CA dysfunction among the general population. PMID:23902963
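The derive-on-an-exploratory-set, evaluate-on-a-validation-set design can be sketched with a small feed-forward network and an AUC score. The data below are a synthetic stand-in (14 simulated risk factors and a binary outcome), and the network architecture is a hypothetical choice, not the one used in the study.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
# Synthetic stand-in: 14 risk factors and a binary CA-dysfunction label
# for a sample of 2,092 individuals, with a linear signal added.
n, n_features = 2092, 14
X = rng.normal(size=(n, n_features))
w = rng.normal(size=n_features)
p = 1 / (1 + np.exp(-0.8 * (X @ w)))
y = rng.binomial(1, p)

# Exploratory (training) / validation split, mirroring the study design.
X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.3,
                                          random_state=0)
ann = MLPClassifier(hidden_layer_sizes=(8,), max_iter=1000,
                    random_state=0).fit(X_tr, y_tr)

# Area under the receiver-operating curve on the validation set.
auc = roc_auc_score(y_va, ann.predict_proba(X_va)[:, 1])
print(round(auc, 3))
```

Sensitivity, specificity, and predictive values follow from thresholding `predict_proba` and tabulating the resulting confusion matrix.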
NASA Technical Reports Server (NTRS)
Douglass, Anne R.; Stolarski, Richard S.; Steenrod, Steven; Pawson, Steven
2003-01-01
One key application of atmospheric chemistry and transport models is prediction of the response of ozone and other constituents to various natural and anthropogenic perturbations. These include changes in composition, such as the previous rise and recent decline in emission of man-made chlorofluorocarbons, changes in aerosol loading due to volcanic eruption, and changes in solar forcing. Comparisons of hindcast model results for the past few decades with observations are a key element of model evaluation and provide a sense of the reliability of model predictions. The 25-year data set from the Total Ozone Mapping Spectrometers is a cornerstone of such model evaluation. Here we report evaluation of a three-dimensional multi-decadal simulation of stratospheric composition. Meteorological fields for this off-line calculation are taken from a 50-year simulation of a general circulation model. Model fields are compared with observations from TOMS and also with observations from the Stratospheric Aerosol and Gas Experiment (SAGE), Microwave Limb Sounder (MLS), Cryogenic Limb Array Etalon Spectrometer (CLAES), and the Halogen Occultation Experiment (HALOE). This overall evaluation will emphasize the spatial, seasonal, and interannual variability of the simulation compared with observed atmospheric variability.
NASA Astrophysics Data System (ADS)
Bourqui, M.; Charriere, M. K. M.; Bolduc, C.
2016-12-01
This talk presents a case of a learning-by-doing approach used by the Climanosco organisation to produce research-based information written in a language accessible to a large public. In this model, engagement (the "doing") of members of the general public, alongside climate scientists, is fostered at various levels of this production of knowledge. In particular, this engagement plays a key role in our extended peer-review process, as non-scientific referees are asked to review the accessibility of manuscripts for a large public. Members of the general public also participate in the scientific inquiry by inviting scientists to write on a particular topic or by co-authoring articles. Importantly, their participation, side-by-side with climate scientists, allows them to naturally raise their climate literacy (the "learning"). This model was tested in the context of a scientific challenge organised for the launch of Climanosco, in which climate scientists were invited to re-frame their research for the general public. This competition started in the fall of 2015 and was due to end in September 2016. It led to 11 published articles and engaged the participation of 24 members of the general public. Six non-scientists served on the jury alongside six climate scientists and evaluated the 11 articles. Their perceived increase in climate knowledge, as evaluated through a survey, will be presented in this talk. One important challenge now is to evaluate the potential of this model to support the teaching of climate sciences at schools. For that purpose, we are starting a dialogue with various teachers in several countries. Progress on this front will also be discussed in this talk.
The modelling of heat, mass and solute transport in solidification systems
NASA Technical Reports Server (NTRS)
Voller, V. R.; Brent, A. D.; Prakash, C.
1989-01-01
The aim of this paper is to explore the range of possible one-phase models of binary alloy solidification. Starting from a general two-phase description, based on the two-fluid model, three limiting cases are identified which result in one-phase models of binary systems. Each of these models can be readily implemented in standard single phase flow numerical codes. Differences between predictions from these models are examined. In particular, the effects of the models on the predicted macro-segregation patterns are evaluated.
Yap, Keong; Gibbs, Amy L; Francis, Andrew J P; Schuster, Sharynn E
2016-01-01
The Bivalent Fear of Evaluation (BFOE) model of social anxiety proposes that fear of negative evaluation (FNE) and fear of positive evaluation (FPE) play distinct roles in social anxiety. Research is, however, lacking on how FPE relates to perfectionism and how these constructs interact to predict social anxiety. Participants were 382 individuals from the general community, including an oversampling of individuals with social anxiety. Measures of FPE, FNE, perfectionism, and social anxiety were administered. Results were mostly consistent with the predictions of the BFOE model and showed that, after accounting for confounding variables, FPE correlated negatively with high standards but positively with maladaptive perfectionism. FNE was also positively correlated with maladaptive perfectionism, but there was no significant relationship between FNE and high standards. Also consistent with the BFOE model, both FNE and FPE significantly moderated the relationship between maladaptive perfectionism and social anxiety, with the relationship strengthened at high levels of FPE and FNE. These findings provide additional support for the BFOE model, and implications are discussed.
Saat, Mohd Rapik; Barkan, Christopher P L
2011-05-15
North American railways offer safe and generally the most economical means of long-distance transport of hazardous materials. Nevertheless, in the event of a train accident, releases of these materials can pose substantial risk to human health, property or the environment. The majority of railway shipments of hazardous materials are in tank cars. Improving the safety design of these cars to make them more robust in accidents generally increases their weight, thereby reducing their capacity and consequent transportation efficiency. This paper presents a generalized tank car safety design optimization model that addresses this tradeoff. The optimization model enables evaluation of each element of tank car safety design, independently and in combination with one another. We present the optimization model by identifying a set of Pareto-optimal solutions for a baseline tank car design in a bicriteria decision problem. This model provides a quantitative framework for a rational decision-making process involving tank car safety design enhancements to reduce the risk of transporting hazardous materials. Copyright © 2011 Elsevier B.V. All rights reserved.
Abstraction and model evaluation in category learning.
Vanpaemel, Wolf; Storms, Gert
2010-05-01
Thirty previously published data sets, from seminal category learning tasks, are reanalyzed using the varying abstraction model (VAM). Unlike a prototype-versus-exemplar analysis, which focuses on extreme levels of abstraction only, a VAM analysis also considers the possibility of partial abstraction. Whereas most data sets support no abstraction when only the extreme possibilities are considered, we show that evidence for abstraction can be provided using the broader view on abstraction provided by the VAM. The present results generalize earlier demonstrations of partial abstraction (Vanpaemel & Storms, 2008), in which only a small number of data sets was analyzed. Following the dominant modus operandi in category learning research, Vanpaemel and Storms evaluated the models on their best fit, a practice known to ignore the complexity of the models under consideration. In the present study, in contrast, model evaluation relies not only on the maximum likelihood but also on the marginal likelihood, which is sensitive to model complexity. Finally, using a large recovery study, it is demonstrated that, across the 30 data sets, complexity differences between the models in the VAM family are small. This indicates that a (computationally challenging) complexity-sensitive model evaluation method is uncalled for, and that the use of a (computationally straightforward) complexity-insensitive model evaluation method is justified.
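The contrast between complexity-insensitive (best-fit/maximum-likelihood) and complexity-sensitive evaluation can be illustrated with the Bayesian information criterion, a common rough approximation to the negative log marginal likelihood. The toy models below (two nested Gaussians) are an illustration of the general principle, not the VAM family itself.

```python
import numpy as np

def bic(loglik, k, n):
    # BIC = -2*loglik + k*ln(n): penalizes extra parameters; lower is
    # better. It approximates -2 times the log marginal likelihood.
    return -2 * loglik + k * np.log(n)

rng = np.random.default_rng(3)
data = rng.normal(loc=0.0, scale=1.0, size=100)
mu = data.mean()

# Model A: Gaussian with fixed unit variance, one free parameter (mean).
ll_a = np.sum(-0.5 * np.log(2 * np.pi) - 0.5 * (data - mu) ** 2)

# Model B: Gaussian with free mean and variance, two free parameters.
sigma2 = data.var()
ll_b = np.sum(-0.5 * np.log(2 * np.pi * sigma2)
              - 0.5 * (data - mu) ** 2 / sigma2)

# By maximum likelihood alone, the more flexible Model B always wins
# (ll_b >= ll_a); the BIC complexity penalty can reverse the ranking.
print(bic(ll_a, 1, data.size), bic(ll_b, 2, data.size))
```

When, as the recovery study above found for the VAM family, complexity differences between candidate models are small, the two evaluation methods converge and the cheaper best-fit comparison suffices.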
Continuum-mechanics-based rheological formulation for debris flow
Chen, Cheng-lung; Ling, Chi-Hai
1993-01-01
This paper aims to assess the validity of the generalized viscoplastic fluid (GVF) model in the light of both the classical relative-viscosity versus concentration relation and the dimensionless stress versus shear-rate squared relations based on kinetic theory, thereby addressing how to evaluate the rheological parameters of the GVF model using Bagnold's data.
A General Set of Procedures for Constructivist Instructional Design: The New R2D2 Model.
ERIC Educational Resources Information Center
Willis, Jerry; Wright, Kristen Egeland
2000-01-01
Describes the R2D2 (Reflective, Recursive Design and Development) model of constructivist instructional design. Highlights include participatory teams; progressive problem solution; phronesis, or contextual understanding; dissemination, including summative evaluation; and a new paradigm that shifts from the industrial age to the information age.…
A General Multivariate Latent Growth Model with Applications to Student Achievement
ERIC Educational Resources Information Center
Bianconcini, Silvia; Cagnone, Silvia
2012-01-01
The evaluation of the formative process in the University system has been assuming an ever increasing importance in the European countries. Within this context, the analysis of student performance and capabilities plays a fundamental role. In this work, the authors propose a multivariate latent growth model for studying the performances of a…
ERIC Educational Resources Information Center
Dalton, William Edward
Described is a project designed to make government lessons and economics more appealing to sixth-grade students by having them set up and run a model city. General preparation procedures and set-up of the project, specific lesson plans, additional activities, and project evaluation are examined. An actual 3-dimensional model city was set up on…
Studios Abroad: A Challenge in Innovative Pedagogy
ERIC Educational Resources Information Center
Macedo, Joseli
2017-01-01
Study abroad programs offer a unique opportunity to evaluate pedagogic models. The role of studios in design and planning pedagogy has been examined. However, how the general framework of a studio supports other pedagogic models has not been widely discussed. This article assesses a series of urban planning and design studios conducted abroad to…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grove, John W.
2016-08-16
The xRage code supports a variety of hydrodynamic equation of state (EOS) models. In practice these are generally accessed in the executing code via a pressure-temperature based table look up. This document will describe the various models supported by these codes and provide details on the algorithms used to evaluate the equation of state.
ERIC Educational Resources Information Center
Raykov, Tenko
2011-01-01
Interval estimation of intraclass correlation coefficients in hierarchical designs is discussed within a latent variable modeling framework. A method accomplishing this aim is outlined, which is applicable in two-level studies where participants (or generally lower-order units) are clustered within higher-order units. The procedure can also be…
Assessing the New Competencies for Resident Education: A Model from an Emergency Medicine Program.
ERIC Educational Resources Information Center
Reisdorff, Earl J.; Hayes, Oliver W.; Carlson, Dale J.; Walker, Gregory L.
2001-01-01
Based on the experience of Michigan State University's emergency medicine residency program, proposes a practical method for modifying an existing student evaluation format. The model provides a template other programs could use in assessing residents' acquisition of the knowledge, skills, and attitudes reflected in the six general competencies…
A Digital Computer Simulation of Cardiovascular and Renal Physiology.
ERIC Educational Resources Information Center
Tidball, Charles S.
1979-01-01
Presents the physiological MACPEE, one of a family of digital computer simulations used in Canada and Great Britain. A general description of the model is provided, along with a sample of computer output format, options for making interventions, advanced capabilities, an evaluation, and technical information for running a MAC model. (MA)
Derivation and definition of a linear aircraft model
NASA Technical Reports Server (NTRS)
Duke, Eugene L.; Antoniewicz, Robert F.; Krambeer, Keith D.
1988-01-01
A linear aircraft model for a rigid aircraft of constant mass flying over a flat, nonrotating earth is derived and defined. The derivation makes no assumptions of reference trajectory or vehicle symmetry. The linear system equations are derived and evaluated along a general trajectory and include both aircraft dynamics and observation variables.
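A generic numerical-linearization sketch, not the report's closed-form derivation: the A and B matrices of a linear model can be approximated by central-difference Jacobians of nonlinear dynamics f(x, u) along a trajectory point. The damped-pendulum dynamics below are an invented stand-in for an aircraft model.

```python
import numpy as np

def linearize(f, x0, u0, eps=1e-6):
    """Central-difference Jacobians of x_dot = f(x, u) about (x0, u0)."""
    n, m = len(x0), len(u0)
    A = np.zeros((n, n))
    B = np.zeros((n, m))
    for j in range(n):
        dx = np.zeros(n); dx[j] = eps
        A[:, j] = (f(x0 + dx, u0) - f(x0 - dx, u0)) / (2 * eps)
    for j in range(m):
        du = np.zeros(m); du[j] = eps
        B[:, j] = (f(x0, u0 + du) - f(x0, u0 - du)) / (2 * eps)
    return A, B

# Toy stand-in dynamics: damped pendulum with a torque input.
def f(x, u):
    theta, omega = x
    return np.array([omega, -9.81 * np.sin(theta) - 0.1 * omega + u[0]])

A, B = linearize(f, np.array([0.0, 0.0]), np.array([0.0]))
# Near the origin: A is approximately [[0, 1], [-9.81, -0.1]], B approximately [[0], [1]]
```

The same recipe applies at any reference point, which mirrors the abstract's point that the linear system can be evaluated along a general trajectory rather than only at a symmetric trim condition.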
Bayesian Models for Astrophysical Data Using R, JAGS, Python, and Stan
NASA Astrophysics Data System (ADS)
Hilbe, Joseph M.; de Souza, Rafael S.; Ishida, Emille E. O.
2017-05-01
This comprehensive guide to Bayesian methods in astronomy enables hands-on work by supplying complete R, JAGS, Python, and Stan code, to use directly or to adapt. It begins by examining the normal model from both frequentist and Bayesian perspectives and then progresses to a full range of Bayesian generalized linear and mixed or hierarchical models, as well as additional types of models such as ABC and INLA. The book provides code that is largely unavailable elsewhere and includes details on interpreting and evaluating Bayesian models. Initial discussions offer models in synthetic form so that readers can easily adapt them to their own data; later the models are applied to real astronomical data. The consistent focus is on hands-on modeling, analysis of data, and interpretations that address scientific questions. A must-have for astronomers, its concrete approach will also be attractive to researchers in the sciences more generally.
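As a flavor of the normal model the book opens with (though this is not code from the book), the Bayesian treatment with known variance has a closed-form conjugate posterior for the mean; the prior settings and synthetic data below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(loc=2.0, scale=1.0, size=50)   # synthetic data; noise sd assumed known

mu0, tau0, sigma = 0.0, 10.0, 1.0   # weak normal prior on the mean; known sigma
n = len(y)

# Conjugate normal-normal update: precisions add, the posterior mean is
# a precision-weighted average of prior mean and data.
post_prec = 1.0 / tau0**2 + n / sigma**2
post_var = 1.0 / post_prec
post_mean = post_var * (mu0 / tau0**2 + y.sum() / sigma**2)
```

With a weak prior the posterior mean sits essentially on the sample mean, which is why the frequentist and Bayesian answers for this model nearly coincide, the starting point the book uses before moving to hierarchical models.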
Evaluation of an urban land surface scheme over a tropical suburban neighborhood
NASA Astrophysics Data System (ADS)
Harshan, Suraj; Roth, Matthias; Velasco, Erik; Demuzere, Matthias
2017-07-01
The present study evaluates the performance of the SURFEX (TEB/ISBA) urban land surface parametrization scheme in offline mode over a suburban area of Singapore. Model performance (diurnal and seasonal characteristics) is investigated using measurements of energy balance fluxes, surface temperatures of individual urban facets, and canyon air temperature collected during an 11-month period. Model performance is best for predicting net radiation and sensible heat fluxes (both are slightly overpredicted during daytime), but weaker for latent heat (underpredicted during daytime) and storage heat fluxes (significantly underpredicted daytime peaks and nighttime storage). Daytime surface temperatures are generally overpredicted, particularly those containing horizontal surfaces such as roofs and roads. This result, together with those for the storage heat flux, point to the need for a better characterization of the thermal and radiative characteristics of individual urban surface facets in the model. Significant variation exists in model behavior between dry and wet seasons, the latter generally being better predicted. The simple vegetation parametrization used is inadequate to represent seasonal moisture dynamics, sometimes producing unrealistically dry conditions.
NASA Astrophysics Data System (ADS)
Albertson, C. W.
1982-03-01
A 1/12th scale model of the Curved Surface Test Apparatus (CSTA), which will be used to study aerothermal loads and evaluate Thermal Protection Systems (TPS) on a fuselage-type configuration in the Langley 8-Foot High Temperature Structures Tunnel (8 ft HTST), was tested in the Langley 7-Inch Mach 7 Pilot Tunnel. The purpose of the tests was to study the overall flow characteristics and define an envelope for testing the CSTA in the 8 ft HTST. Wings were tested on the scaled CSTA model to select a wing configuration with the most favorable characteristics for conducting TPS evaluations for curved and intersecting surfaces. The results indicate that the CSTA and selected wing configuration can be tested at angles of attack up to 15.5 and 10.5 degrees, respectively. The base pressure for both models was at the expected low level for most test conditions. Results generally indicate that the CSTA and wing configuration will provide a useful test bed for aerothermal loads and thermal structural concept evaluation over a broad range of flow conditions in the 8 ft HTST.
Where are the food animal veterinarian shortage areas anyway?
Wang, Tong; Hennessy, David A; O'Connor, Annette M
2012-05-01
In 2010 the United States implemented the Veterinary Medicine Loan Repayment Program (VMLRP) to address perceived regional shortages in certain veterinary occupations, including food animal practice. With county as the unit of analysis, this paper describes a pair of models to evaluate factors associated with being designated a private practice shortage area in 2010. One model is used to explain food animal veterinarian location choices so as to provide an objective evaluation of comparative shortage. The other model seeks to explain the counties chosen as shortage areas. Model results are then used to evaluate the program. On the whole the program appears to perform quite well. For several states, however, VMLRP shortage designations are inconsistent with the food animal veterinarian location model. Comparative shortage is generally more severe in states that have no VMLRP designated private practice shortage counties than in states that do. Copyright © 2012 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Neggers, Roel
2016-04-01
Boundary-layer schemes have always formed an integral part of General Circulation Models (GCMs) used for numerical weather and climate prediction. The spatial and temporal scales associated with boundary-layer processes and clouds are typically much smaller than those at which GCMs are discretized, which makes their representation through parameterization a necessity. The need for generally applicable boundary-layer parameterizations has motivated many scientific studies, which in effect has created its own active research field in the atmospheric sciences. Of particular interest has been the evaluation of boundary-layer schemes at "process-level". This means that parameterized physics are studied in isolation from the larger-scale circulation, using prescribed forcings and excluding any upscale interaction. Although feedbacks are thus prevented, the benefit is an enhanced model transparency, which might aid an investigator in identifying model errors and understanding model behavior. The popularity and success of the process-level approach is demonstrated by the many past and ongoing model inter-comparison studies that have been organized by initiatives such as GCSS/GASS. A common thread in the results of these studies is that although most schemes somehow manage to capture first-order aspects of boundary layer cloud fields, there certainly remains room for improvement in many areas. Only too often are boundary layer parameterizations still found to be at the heart of problems in large-scale models, negatively affecting forecast skills of NWP models or causing uncertainty in numerical predictions of future climate. How to break this parameterization "deadlock" remains an open problem. This presentation attempts to give an overview of the various existing methods for the process-level evaluation of boundary-layer physics in large-scale models.
This includes i) idealized case studies, ii) longer-term evaluation at permanent meteorological sites (the testbed approach), and iii) process-level evaluation at climate time-scales. The advantages and disadvantages of each approach will be identified and discussed, and some thoughts about possible future developments will be given.
VISCOPLASTIC FLUID MODEL FOR DEBRIS FLOW ROUTING.
Chen, Cheng-lung
1986-01-01
This paper describes how a generalized viscoplastic fluid model, which was developed based on non-Newtonian fluid mechanics, can be successfully applied to routing a debris flow down a channel. The one-dimensional dynamic equations developed for unsteady clear-water flow can be used for debris flow routing if the flow parameters, such as the momentum (or energy) correction factor and the resistance coefficient, can be accurately evaluated. The writer's generalized viscoplastic fluid model can be used to express such flow parameters in terms of the rheological parameters for debris flow in wide channels. A preliminary analysis of the theoretical solutions reveals the importance of the flow behavior index and the so-called modified Froude number for uniformly progressive flow in snout profile modeling.
Dynamics of a prey-predator system under Poisson white noise excitation
NASA Astrophysics Data System (ADS)
Pan, Shan-Shan; Zhu, Wei-Qiu
2014-10-01
The classical Lotka-Volterra (LV) model is a well-known mathematical model for prey-predator ecosystems. In the present paper, a pulse-type version of the stochastic LV model, in which the effect of a random natural environment is modeled as Poisson white noise, is investigated by using the stochastic averaging method. The averaged generalized Itô stochastic differential equation and Fokker-Planck-Kolmogorov (FPK) equation are derived for the prey-predator ecosystem driven by Poisson white noise. An approximate stationary solution for the averaged generalized FPK equation is obtained by using the perturbation method. The effect of the prey self-competition parameter ε2s on ecosystem behavior is evaluated. The analytical result is confirmed by corresponding Monte Carlo (MC) simulation.
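A crude Monte Carlo sketch of a Lotka-Volterra system perturbed by Poisson white noise, approximated as compound Poisson jumps in an Euler scheme. This is not the paper's stochastic-averaging method, and all parameter values are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
a, b, c, d = 1.0, 1.0, 1.0, 1.0    # classical LV rates (equilibrium at x = y = 1)
lam, jump_sd = 5.0, 0.02           # Poisson arrival rate and jump magnitude (invented)
dt, steps = 0.001, 20000

x, y = 1.2, 0.8                    # prey, predator
for _ in range(steps):
    n_jumps = rng.poisson(lam * dt)                       # jumps arriving this step
    noise = rng.normal(0.0, jump_sd, size=n_jumps).sum()  # compound Poisson increment
    x_new = x + x * (a - b * y) * dt + x * noise          # noise perturbs prey growth
    y_new = y + y * (d * x - c) * dt
    x, y = max(x_new, 1e-9), max(y_new, 1e-9)             # keep populations positive
```

Averaging ensembles of such runs is the simulation side against which the paper's approximate stationary FPK solution would be checked.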
Impact of lakes and wetlands on present and future boreal climate
NASA Astrophysics Data System (ADS)
Poutou, E.; Krinner, G.; Genthon, C.
2002-12-01
The role of lakes and wetlands in present-day high-latitude climate is quantified using a general circulation model of the atmosphere. The atmospheric model includes a lake module, which is presented and validated. Seasonal and spatial wetland distribution is calculated as a function of the hydrological budget of the wetlands themselves and of the continental soil whose runoff feeds them. Wetland extent is simulated and discussed both in simulations forced by observed climate and in general circulation model simulations. In off-line simulations forced by ECMWF reanalyses, the lake model correctly simulates observed lake ice durations, while wetland extent is somewhat underestimated in the boreal regions. Coupled to the general circulation model, the lake model yields satisfactory ice durations, although the climate model's biases affect the modeled lake ice conditions. Boreal wetland extents are overestimated in the general circulation model because simulated precipitation is too high. The impact of inundated surfaces on the simulated climate is strongest in summer, when these surfaces are ice-free. Wetlands appear to play a more important role than lakes in cooling the boreal regions in summer and in humidifying the atmosphere. The role of lakes and wetlands in future climate change is evaluated by analyzing simulations of present and future climate with and without prescribed inland water bodies.
Results of the Housing Building Condition Evaluation Survey at the University of Georgia.
ERIC Educational Resources Information Center
Casey, John M.
A complete campus building condition evaluation survey was conducted at the University of Georgia in 1989 and results for the housing department were analyzed. The survey design was based on a model developed by Harlan Bareither at the University of Illinois that separates building deficiencies into seven general headings. Data were collected at…
NASA Astrophysics Data System (ADS)
Shen, Hong; Liu, Wen-xing; Zhou, Xue-yun; Zhou, Li-ling; Yu, Long-Kun
2018-02-01
In order to thoroughly understand the characteristics of the aperture-averaging effect of atmospheric scintillation in terrestrial optical wireless communication, and to provide references for the engineering design and performance evaluation of optical systems employed in the atmosphere, we theoretically derived a general analytic expression for the aperture-averaging factor of atmospheric scintillation and numerically investigated its characteristics under different propagation conditions. The limitations of the commonly used approximate formula for the aperture-averaging factor are discussed; the results showed that this formula is not applicable for small receiving apertures under non-uniform turbulence links. Numerical calculations showed that the aperture-averaging factor of atmospheric scintillation follows an exponential-decline model for small receiving apertures under non-uniform turbulent links, and a general expression for this model is given. The model provides guidance for evaluating the aperture-averaging effect in terrestrial optical wireless communication.
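A hypothetical illustration of fitting an exponential-decline model A(D) = exp(-D/D0) to aperture-averaging data; the functional form as used here and all numbers are assumptions for the sketch, not the paper's derived expression.

```python
import numpy as np

# Invented aperture diameters (m) and synthetic "measured" averaging factors.
D = np.array([0.01, 0.02, 0.05, 0.10])
A = np.array([0.92, 0.85, 0.66, 0.44])

# If A(D) is approximately exp(-D / D0), then log A is linear in D with slope -1/D0.
slope, intercept = np.polyfit(D, np.log(A), 1)
D0 = -1.0 / slope   # fitted decay scale of the averaging factor
```

A practitioner would compare such a fitted decay scale across links to judge how quickly increasing the receiver aperture suppresses scintillation.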
A computational model of the cognitive impact of decorative elements on the perception of suspense
NASA Astrophysics Data System (ADS)
Delatorre, Pablo; León, Carlos; Gervás, Pablo; Palomo-Duarte, Manuel
2017-10-01
Suspense is a key narrative issue in terms of emotional gratification, influencing the way in which the audience experiences a story. Virtually all narrative media use suspense as a strategy for reader engagement, regardless of technology or genre. Being such an important narrative component, suspense has been tackled by computational creativity in a number of automatic storytelling systems. These systems are mainly based on narrative theories and generally lack a cognitive approach involving empathy or the emotional impact of the environment. With this idea in mind, this paper reports on a computational model of the influence of decorative elements on suspense. It has been developed as part of a more general proposal for plot generation based on cognitive aspects. In order to test and parameterise the model, an evaluation based on textual stories and an evaluation based on a 3D virtual environment were run. In both cases, results suggest a direct influence of the emotional perception of decorative objects on the suspense of a scene.
NASA Technical Reports Server (NTRS)
Schnell, J. L.; Prather, M. J.; Josse, B.; Naik, V.; Horowitz, L. W.; Cameron-Smith, P.; Bergmann, D.; Zeng, G.; Plummer, D. A.; Sudo, K.;
2015-01-01
We test the current generation of global chemistry-climate models in their ability to simulate observed, present-day surface ozone. Models are evaluated against hourly surface ozone from 4217 stations in North America and Europe that are averaged over 1 degree by 1 degree grid cells, allowing commensurate model-measurement comparison. Models are generally biased high during all hours of the day and in all regions. Most models simulate the shape of regional summertime diurnal and annual cycles well, correctly matching the timing of hourly (approximately 15:00 local time (LT)) and monthly (mid-June) peak surface ozone abundance. The amplitude of these cycles is less successfully matched. The observed summertime diurnal range (approximately 25 ppb) is underestimated in all regions by about 7 ppb, and the observed seasonal range (approximately 21 ppb) is underestimated by about 5 ppb except in the most polluted regions, where it is overestimated by about 5 ppb. The models generally match the pattern of the observed summertime ozone enhancement, but they overestimate its magnitude in most regions. Most models capture the observed distribution of extreme episode sizes, correctly showing that about 80 percent of individual extreme events occur in large-scale, multi-day episodes of more than 100 grid cells. The models also match the observed linear relationship between episode size and a measure of episode intensity, which shows increases in ozone abundance by up to 6 ppb for larger-sized episodes. We conclude that the skill of the models evaluated here provides confidence in their projections of future surface ozone.
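The kind of mean-bias and diurnal-range comparison described above can be sketched on synthetic hourly ozone series; the sinusoidal "observations" and "model" below are invented, with the model made biased high and its diurnal cycle flattened.

```python
import numpy as np

hours = np.arange(24)
phase = (hours - 9) * np.pi / 12                 # puts the peak at 15:00 LT
obs = 30.0 + 12.5 * np.sin(phase)                # synthetic observed ozone (ppb)
model = obs + 8.0 - 3.5 * np.sin(phase)          # high bias, flattened diurnal cycle

bias = (model - obs).mean()                      # mean model-measurement bias (ppb)
obs_range = obs.max() - obs.min()                # observed diurnal range (ppb)
model_range = model.max() - model.min()          # modeled diurnal range (ppb)
```

In a real evaluation these statistics would be computed per grid cell from station averages, then aggregated by region and season as in the study.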
Vaughan, Brett
2018-01-01
Clinical teaching evaluations are common in health profession education programs to ensure students are receiving a quality clinical education experience. Questionnaires students use to evaluate their clinical teachers have been developed in professions such as medicine and nursing. The development of a questionnaire that is specifically for the osteopathy on-campus, student-led clinic environment is warranted. Previous work developed the 30-item Osteopathy Clinical Teaching Questionnaire. The current study utilised Rasch analysis to investigate the construct validity of the Osteopathy Clinical Teaching Questionnaire and provide evidence for the validity argument through fit to the Rasch model. Senior osteopathy students at four institutions in Australia, New Zealand and the United Kingdom rated their clinical teachers using the Osteopathy Clinical Teaching Questionnaire. Three hundred and ninety-nine valid responses were received and the data were evaluated for fit to the Rasch model. Reliability estimates (Cronbach's alpha and McDonald's omega) were also computed for the final model. The initial analysis demonstrated the data did not fit the Rasch model. Accordingly, modifications to the questionnaire were made, including removing items, removing person responses, and rescoring one item. The final model contained 12 items and fit to the Rasch model was adequate. Support for unidimensionality was demonstrated through both the principal components analysis/t-test and the Cronbach's alpha and McDonald's omega reliability estimates. Analysis of the questionnaire using McDonald's omega hierarchical supported a general factor (quality of clinical teaching in osteopathy). The evidence for unidimensionality and the presence of a general factor support the calculation of a total score for the questionnaire as a sufficient statistic.
Further work is now required to investigate the reliability of the 12-item Osteopathy Clinical Teaching Questionnaire to provide evidence for the validity argument.
Bustos-Vázquez, Eduardo; Fernández-Niño, Julián Alfredo; Astudillo-Garcia, Claudia Iveth
2017-04-01
Self-rated health is an individual and subjective conceptualization involving the intersection of biological, social and psychological factors. It provides an invaluable and unique evaluation of a person's general health status. To propose and evaluate a simple conceptual model to understand self-rated health and its relationship to multimorbidity, disability and depressive symptoms in Mexican older adults. We conducted a cross-sectional study based on a national representative sample of 8,874 adults of 60 years of age and older. Self-perception of a positive health status was determined according to a Likert-type scale based on the question: "What do you think is your current health status?" Intermediate variables included multimorbidity, disability and depressive symptoms, as well as dichotomous exogenous variables (sex, having a partner, participation in decision-making and poverty). The proposed conceptual model was validated using a general structural equation model with a logit link function for positive self-rated health. A direct association was found between multimorbidity and positive self-rated health (OR=0.48; 95% CI: 0.42-0.55), disability and positive self-rated health (OR=0.35; 95% CI: 0.30-0.40), depressive symptoms and positive self-rated health (OR=0.38; 95% CI: 0.34-0.43). The model also validated indirect associations between disability and depressive symptoms (OR=2.25; 95% CI: 2.01- 2.52), multimorbidity and depressive symptoms (OR=1.79; 95% CI: 1.61-2.00) and multimorbidity and disability (OR=1.98; 95% CI: 1.78-2.20). A parsimonious theoretical model was empirically evaluated, which enabled identifying direct and indirect associations with positive self-rated health.
Additional Research Needs to Support the GENII Biosphere Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Napier, Bruce A.; Snyder, Sandra F.; Arimescu, Carmen
In the course of evaluating the current parameter needs for the GENII Version 2 code (Snyder et al. 2013), areas of possible improvement for both the data and the underlying models have been identified. As the data review was implemented, PNNL staff identified areas where the models can be improved, both to accommodate the locally significant pathways identified and to incorporate newer models. These areas comprise general data needs for the existing models and improved formulations for the pathway models.
Automated real time constant-specificity surveillance for disease outbreaks.
Wieland, Shannon C; Brownstein, John S; Berger, Bonnie; Mandl, Kenneth D
2007-06-13
For real time surveillance, detection of abnormal disease patterns is based on a difference between patterns observed, and those predicted by models of historical data. The usefulness of outbreak detection strategies depends on their specificity; the false alarm rate affects the interpretation of alarms. We evaluate the specificity of five traditional models: autoregressive, Serfling, trimmed seasonal, wavelet-based, and generalized linear. We apply each to 12 years of emergency department visits for respiratory infection syndromes at a pediatric hospital, finding that the specificity of the five models was almost always a non-constant function of the day of the week, month, and year of the study (p < 0.05). We develop an outbreak detection method, called the expectation-variance model, based on generalized additive modeling to achieve a constant specificity by accounting for not only the expected number of visits, but also the variance of the number of visits. The expectation-variance model achieves constant specificity on all three time scales, as well as earlier detection and improved sensitivity compared to traditional methods in most circumstances. Modeling the variance of visit patterns enables real-time detection with known, constant specificity at all times. With constant specificity, public health practitioners can better interpret the alarms and better evaluate the cost-effectiveness of surveillance systems.
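A minimal sketch of the expectation-variance idea: modeling both the mean and the variance of visit counts per stratum (here, day of week) lets a fixed z-score threshold hold the nominal false-alarm rate roughly constant across strata. The counts, variances, and threshold below are invented; the study's actual models are generalized additive models fit to historical data.

```python
import numpy as np

weekday_mu = np.array([40.0, 42.0, 41.0, 43.0, 45.0, 25.0, 20.0])   # Mon..Sun expected visits
weekday_var = np.array([50.0, 52.0, 51.0, 55.0, 60.0, 20.0, 15.0])  # per-day visit variance

Z = 2.33   # ~1% one-sided false-alarm rate under a normal approximation

def alarm(count, dow):
    """Fire when the count exceeds mean + Z standard deviations for that weekday."""
    threshold = weekday_mu[dow] + Z * np.sqrt(weekday_var[dow])
    return bool(count > threshold)
```

Because the threshold scales with each stratum's own variance, a quiet Sunday and a busy Friday trigger at the same nominal specificity, which is the property the traditional fixed-threshold methods lack.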
A Logic Model for Evaluating the Academic Health Department.
Erwin, Paul Campbell; McNeely, Clea S; Grubaugh, Julie H; Valentine, Jennifer; Miller, Mark D; Buchanan, Martha
2016-01-01
Academic Health Departments (AHDs) are collaborative partnerships between academic programs and practice settings. While case studies have informed our understanding of the development and activities of AHDs, there has been no formal published evaluation of AHDs, either singularly or collectively. Developing a framework for evaluating AHDs has potential to further aid our understanding of how these relationships may matter. In this article, we present a general theory of change, in the form of a logic model, for how AHDs impact public health at the community level. We then present a specific example of how the logic model has been customized for a specific AHD. Finally, we end with potential research questions on the AHD based on these concepts. We conclude that logic models are valuable tools, which can be used to assess the value and ultimate impact of the AHD.
NASA Technical Reports Server (NTRS)
Strahan, Susan E.; Douglass, Anne R.; Einaudi, Franco (Technical Monitor)
2001-01-01
The Global Modeling Initiative (GMI) Team developed objective criteria for model evaluation in order to identify the best representation of the stratosphere. This work created a method to quantitatively and objectively discriminate between different models. In the original GMI study, 3 different meteorological data sets were used to run an offline chemistry and transport model (CTM). Observationally-based grading criteria were derived and applied to these simulations and various aspects of stratospheric transport were evaluated; grades were assigned. Here we report on the application of the GMI evaluation criteria to CTM simulations integrated with a new assimilated wind data set and a new general circulation model (GCM) wind data set. The Finite Volume Community Climate Model (FV-CCM) is a new GCM developed at Goddard which uses the NCAR CCM physics and the Lin and Rood advection scheme. The FV-Data Assimilation System (FV-DAS) is a new data assimilation system which uses the FV-CCM as its core model. One-year CTM simulations at 2.5 degrees longitude by 2 degrees latitude resolution were run for each wind data set. We present the evaluation of temperature and annual transport cycles in the lower and middle stratosphere in the two new CTM simulations. We include an evaluation of high latitude transport which was not part of the original GMI criteria. Grades for the new simulations will be compared with those assigned during the original GMI evaluations and areas of improvement will be identified.
Berian, Julia R; Zhou, Lynn; Hornor, Melissa A; Russell, Marcia M; Cohen, Mark E; Finlayson, Emily; Ko, Clifford Y; Robinson, Thomas N; Rosenthal, Ronnie A
2017-12-01
Surgical quality datasets can be better tailored toward older adults. The American College of Surgeons (ACS) NSQIP Geriatric Surgery Pilot collected risk factors and outcomes in 4 geriatric-specific domains: cognition, decision-making, function, and mobility. This study evaluated the contributions of geriatric-specific factors to risk adjustment in modeling 30-day outcomes and geriatric-specific outcomes (postoperative delirium, new mobility aid use, functional decline, and pressure ulcers). Using ACS NSQIP Geriatric Surgery Pilot data (January 2014 to December 2016), 7 geriatric-specific risk factors were evaluated for selection in 14 logistic models (morbidities/mortality) in general-vascular and orthopaedic surgery subgroups. Hierarchical models evaluated 4 geriatric-specific outcomes, adjusting for hospitals-level effects and including Bayesian-type shrinkage, to estimate hospital performance. There were 36,399 older adults who underwent operations at 31 hospitals in the ACS NSQIP Geriatric Surgery Pilot. Geriatric-specific risk factors were selected in 10 of 14 models in both general-vascular and orthopaedic surgery subgroups. After risk adjustment, surrogate consent (odds ratio [OR] 1.5; 95% CI 1.3 to 1.8) and use of a mobility aid (OR 1.3; 95% CI 1.1 to 1.4) increased the risk for serious morbidity or mortality in the general-vascular cohort. Geriatric-specific factors were selected in all 4 geriatric-specific outcomes models. Rates of geriatric-specific outcomes were: postoperative delirium in 12.1% (n = 3,650), functional decline in 42.9% (n = 13,000), new mobility aid in 29.7% (n = 9,257), and new or worsened pressure ulcers in 1.7% (n = 527). Geriatric-specific risk factors are important for patient-centered care and contribute to risk adjustment in modeling traditional and geriatric-specific outcomes. 
To provide optimal patient care for older adults, surgical datasets should collect measures that address cognition, decision-making, mobility, and function. Copyright © 2017 American College of Surgeons. All rights reserved.
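The "Bayesian-type shrinkage" used to estimate hospital performance can be illustrated with a minimal empirical-Bayes sketch: each hospital's raw event rate is pulled toward the overall rate, with low-volume hospitals pulled hardest. The counts, volumes, and prior weight `k` below are hypothetical, and the actual Pilot models also adjust for patient-level risk factors.

```python
import numpy as np

# Hypothetical per-hospital delirium counts and case volumes
events = np.array([5, 40, 12, 90])
cases = np.array([50, 400, 80, 700])

overall_rate = events.sum() / cases.sum()
raw_rates = events / cases

# Shrinkage weight grows with hospital volume; k (an assumed prior
# "sample size") controls how strongly small hospitals are pooled
# toward the overall rate.
k = 100.0
w = cases / (cases + k)
shrunk = w * raw_rates + (1 - w) * overall_rate
```

Each shrunken estimate lands between the hospital's raw rate and the overall rate, which stabilizes performance rankings for low-volume hospitals.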
NASA Astrophysics Data System (ADS)
Dakhlaoui, H.; Ruelland, D.; Tramblay, Y.; Bargaoui, Z.
2017-07-01
To evaluate the impact of climate change on water resources at the catchment scale, not only future projections of climate are necessary but also robust rainfall-runoff models that must be fairly reliable under changing climate conditions. The aim of this study was thus to assess the robustness of three conceptual rainfall-runoff models (GR4J, HBV and IHACRES) on five basins in northern Tunisia under long-term climate variability, in the light of available future climate scenarios for this region. The robustness of the models was evaluated using a differential split sample test based on a climate classification of the observation period that simultaneously accounted for precipitation and temperature conditions. The study catchments include the main hydrographical basins in northern Tunisia, which produce most of the surface water resources in the country. A 30-year period (1970-2000) was used to capture a wide range of hydro-climatic conditions. The calibration was based on the Kling-Gupta Efficiency (KGE) criterion, while model transferability was evaluated based on the Nash-Sutcliffe efficiency criterion and volume error. The three hydrological models were shown to behave similarly under climate variability. The models simulated the runoff pattern better when transferred to wetter and colder conditions than to drier and warmer ones. It was shown that their robustness became unacceptable when climate conditions involved a decrease of more than 25% in annual precipitation and an increase of more than +1.75 °C in annual mean temperatures. The reduction in model robustness may be partly due to the climate dependence of some parameters. When compared to precipitation and temperature projections in the region, the limits of transferability obtained in this study are generally respected in the short and medium term.
For long-term projections under the most pessimistic greenhouse gas emission scenarios, the limits of transferability are generally not respected, which may hamper the use of conceptual models for hydrological projections in northern Tunisia.
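The two efficiency criteria named above have standard closed forms; a small sketch of the common formulations (the original KGE of Gupta et al., without later modifications) is:

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 means the
    simulation is no better than the mean of the observations."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def kge(obs, sim):
    """Kling-Gupta efficiency from correlation (r), variability ratio
    (alpha) and bias ratio (beta); 1 is a perfect simulation."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    r = np.corrcoef(obs, sim)[0, 1]
    alpha = sim.std() / obs.std()
    beta = sim.mean() / obs.mean()
    return 1.0 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)
```

In a differential split-sample test, a model is calibrated on one climate subperiod (e.g., maximizing KGE) and these scores are recomputed on a contrasting subperiod to measure transferability.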
Zheng, Xueying; Qin, Guoyou; Tu, Dongsheng
2017-05-30
Motivated by the analysis of quality of life data from a clinical trial on early breast cancer, we propose in this paper a generalized partially linear mean-covariance regression model for longitudinal proportional data, which are bounded in a closed interval. Cholesky decomposition of the covariance matrix for within-subject responses and generalized estimation equations are used to estimate unknown parameters and the nonlinear function in the model. Simulation studies are performed to evaluate the performance of the proposed estimation procedures. Our new model is also applied to analyze the data from the cancer clinical trial that motivated this research. In comparison with available models in the literature, the proposed model does not require specific parametric assumptions on the density function of the longitudinal responses and the probability function of the boundary values, and can capture dynamic effects of time or other variables of interest on both the mean and covariance of the correlated proportional responses. Copyright © 2017 John Wiley & Sons, Ltd.
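The Cholesky-based covariance modeling used above can be illustrated on a small within-subject covariance matrix. The matrix below is hypothetical; in the paper's setting the factors of the decomposition are parameterized and estimated from data rather than computed from a known Σ.

```python
import numpy as np

# Hypothetical within-subject covariance for 3 repeated measures
Sigma = np.array([[1.00, 0.50, 0.25],
                  [0.50, 1.20, 0.60],
                  [0.25, 0.60, 1.50]])

C = np.linalg.cholesky(Sigma)   # Sigma = C @ C.T, C lower-triangular
d = np.diag(C)
L = C / d                       # unit lower-triangular factor
D = np.diag(d ** 2)             # innovation variances
# Modified Cholesky form: Sigma = L D L'. The rows of T = inv(L) hold
# the (generalized) autoregressive coefficients of each measurement
# on its predecessors, which is what makes the factors interpretable
# and easy to model with regressions.
T = np.linalg.inv(L)
```

Because D is diagonal and L has unit diagonal, both factors can be modeled unconstrained, which guarantees a positive-definite fitted covariance.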
Defibrillator/monitor/pacemakers.
2002-02-01
Defibrillator/monitors allow operators to assess and monitor a patient's ECG and, when necessary, deliver a defibrillating shock to the heart. When integral noninvasive pacing capability is added, the resulting device is referred to as a defibrillator/monitor/pacemaker. In this Update Evaluation, we present our findings for one newly evaluated model, the Philips Heartstream XL, and we summarize our findings for the seven previously evaluated models that are still on the market. (Our previous Evaluations were published in the May-June 1993, February 1998, and September 2000 issues of Health Devices.) Defibrillator/monitor/pacemakers are used for a variety of applications within the hospital, as well as by emergency medical services (EMS) personnel and others in the prehospital environment. To help both hospital-based and prehospital users select an appropriate model, we rate the models (1) for each of three in-hospital applications--general crash-cart use, in-hospital transport use, and in-hospital use by basic as well as advanced users--and (2) for prehospital (EMS) use. For in-hospital use, we recommend four of the evaluated models. These received either Preferred or Acceptable ratings for all the applications considered. For prehospital use, we found that five of the models will meet most organizations' needs.
Toward a Trust Evaluation Mechanism in the Social Internet of Things.
Truong, Nguyen Binh; Lee, Hyunwoo; Askwith, Bob; Lee, Gyu Myoung
2017-06-09
In the blooming era of the Internet of Things (IoT), trust has been accepted as a vital factor for provisioning secure, reliable and seamless communications and services. However, a large number of challenges remain unsolved due to the ambiguity of the concept of trust and the variety of divergent trust models in different contexts. In this research, we clarify the concept and definition of trust and provide a general conceptual model in the context of the Social IoT (SIoT) environment by breaking down the attributes influencing trust. We then propose a trust evaluation model called REK, comprising the triad of trust indicators (TIs) Reputation, Experience and Knowledge. The REK model covers multi-dimensional aspects of trust by incorporating heterogeneous information, from direct observation (as Knowledge TI) and personal experiences (as Experience TI) to global opinions (as Reputation TI). The associated evaluation models for the three TIs are also proposed and provisioned. We then develop an aggregation mechanism for deriving trust values as the final outcome of the REK evaluation model. We believe this article offers a better understanding of trust as well as several prospective approaches for trust evaluation in the SIoT environment.
Reconstruction of Twist Torque in Main Parachute Risers
NASA Technical Reports Server (NTRS)
Day, Joshua D.
2015-01-01
The reconstruction of twist torque in the Main Parachute Risers of the Capsule Parachute Assembly System (CPAS) has been successfully used to validate the conservative twist torque equations in the CPAS Model Memo. Reconstruction of basic, one-degree-of-freedom drop tests was used to create a functional process for the evaluation of more complex, rigid-body simulation. The roll, pitch, and yaw of the body, the fly-out angles of the parachutes, and the relative location of the parachutes to the body are inputs to the torque simulation. The data collected by the Inertial Measurement Unit (IMU) were used to calculate the true torque. The simulation then used photogrammetric and IMU data as inputs into the Model Memo equations. The results were then compared to the true torque results to validate the Model Memo equations. The Model Memo parameters were based on steel risers and will need to be re-evaluated for different materials. Photogrammetric data were found to be more accurate than the inertial data in accounting for the relative rotation between payload and cluster. The Model Memo equations were generally a good match and, when not matching, were generally conservative.
Description and Evaluation of GDEM-V 3.0
2009-02-06
Description and Evaluation of GDEM-V 3.0. Michael R. Carnes, Ocean Sciences Branch, Oceanography Division, February 6, 2009. The GDEM (Generalized Digital Environment Model) has served as
Evaluation of trends in wheat yield models
NASA Technical Reports Server (NTRS)
Ferguson, M. C.
1982-01-01
Trend terms in models for wheat yield in the U.S. Great Plains for the years 1932 to 1976 are evaluated. The subset of meteorological variables yielding the largest adjusted R² is selected using the method of leaps and bounds. Latent root regression is used to eliminate multicollinearities, and generalized ridge regression is used to introduce bias to provide stability in the data matrix. The regression model used provides for two trends in each of two models: a dependent model in which the trend line is piecewise continuous, and an independent model in which the trend line is discontinuous at the year of the slope change. It was found that the trend lines best describing the wheat yields consisted of combinations of increasing, decreasing, and constant trend: four combinations for the dependent model and seven for the independent model.
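Generalized ridge regression of the kind referenced above introduces a separate bias penalty per coefficient to stabilize a collinear data matrix. A minimal sketch (the data are illustrative, not the wheat-yield data):

```python
import numpy as np

def generalized_ridge(X, y, k):
    """Generalized ridge estimate with a separate penalty k_j per
    coefficient: beta = (X'X + diag(k))^{-1} X'y.
    k = 0 recovers ordinary least squares."""
    X, y, k = np.asarray(X, float), np.asarray(y, float), np.asarray(k, float)
    return np.linalg.solve(X.T @ X + np.diag(k), X.T @ y)
```

With near-collinear columns, X'X is ill-conditioned and the penalized solve remains stable at the cost of shrinking the coefficients toward zero.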
NASA Astrophysics Data System (ADS)
Miyakawa, Tomoki
2017-04-01
The global cloud/cloud-system resolving model NICAM and its new fully coupled version NICOCO are run on one of the world's top-tier supercomputers, the K computer. NICOCO couples the full-3D ocean component COCO of the general circulation model MIROC using the general-purpose coupler Jcup. We carried out multiple MJO simulations using NICAM and the new ocean-coupled version NICOCO to examine their extended-range MJO prediction skills and the impact of ocean coupling. NICAM performs excellently in terms of MJO prediction, maintaining a valid skill up to 27 days after the model is initialized (Miyakawa et al. 2014). As is the case in most global models, ocean coupling frees the model from being anchored by the observed SST and allows the model climate to drift further from reality compared to the atmospheric version of the model. Thus, it is important to evaluate the model bias, and in an initial value problem such as seasonal extended-range prediction, it is essential to be able to distinguish the actual signal from the early transition of the model from the observed state to its own climatology. Since NICAM is a highly resource-demanding model, evaluation and tuning of the model climatology (order of years) is challenging. Here we focus on the initial 100 days to estimate the early drift of the model, and subsequently evaluate the MJO prediction skills of NICOCO. Results show that in the initial 100 days, NICOCO forms a La Niña-like SST bias compared to observation, with a warmer Maritime Continent warm pool and a cooler equatorial central Pacific. The enhanced convection over the Maritime Continent associated with this bias projects onto the real-time multivariate MJO indices (RMM; Wheeler and Hendon 2004) and contaminates the MJO skill score. However, the bias does not appear to demolish the MJO signal severely.
The model maintains a valid MJO prediction skill up to nearly 4 weeks when evaluated after linearly removing the early drift component estimated from the 54 simulations. Furthermore, NICOCO outperforms NICAM by far if we focus on events associated with large oceanic signals.
A simple method for assessing occupational exposure via the one-way random effects model.
Krishnamoorthy, K; Mathew, Thomas; Peng, Jie
2016-11-01
A one-way random effects model is postulated for the log-transformed shift-long personal exposure measurements, where the random effect in the model represents an effect due to the worker. Simple closed-form confidence intervals are proposed for the relevant parameters of interest using the method of variance estimates recovery (MOVER). The performance of the confidence bounds is evaluated and compared with those based on the generalized confidence interval approach. Comparison studies indicate that the proposed MOVER confidence bounds are better than the generalized confidence bounds for the overall mean exposure and an upper percentile of the exposure distribution. The proposed methods are illustrated using a few examples involving industrial hygiene data.
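The MOVER step itself is a simple closed form: confidence limits for the components of a sum are combined into limits for the sum. A sketch for the log-scale overall mean exposure θ = μ + σ²/2 follows; all numbers are hypothetical, and in the paper's setting the limits for the variance component would come from chi-square-based intervals.

```python
import math

def mover_sum(est, lo, hi):
    """MOVER interval for a sum of parameters, given point estimates
    and individual two-sided confidence limits for each component."""
    total = sum(est)
    lower = total - math.sqrt(sum((e - l) ** 2 for e, l in zip(est, lo)))
    upper = total + math.sqrt(sum((u - e) ** 2 for e, u in zip(est, hi)))
    return lower, upper

# Hypothetical components: log-scale mean and half the total variance
est = [1.20, 0.35]   # mu_hat, 0.5 * (sb2_hat + sw2_hat)
lo = [0.95, 0.22]
hi = [1.45, 0.61]
L, U = mover_sum(est, lo, hi)
```

Exponentiating L and U would give limits for the mean exposure on the original measurement scale.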
Dynamic regulation of erythropoiesis: A computer model of general applicability
NASA Technical Reports Server (NTRS)
Leonard, J. I.
1979-01-01
A mathematical model for the control of erythropoiesis was developed based on the balance between oxygen supply and demand at a renal oxygen detector which controls erythropoietin release and red cell production. Feedback regulation of tissue oxygen tension is accomplished by adjustments of hemoglobin levels resulting from the output of a renal-bone marrow controller. Special consideration was given to the determinants of tissue oxygenation, including evaluation of the influence of blood flow, capillary diffusivity, oxygen uptake, and oxygen-hemoglobin affinity. A theoretical analysis of the overall control system is presented. Computer simulations of altitude hypoxia, red cell infusion, hyperoxia, and hemolytic anemia demonstrate the validity of the model for general human application in health and disease.
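The feedback structure described (oxygen detector, erythropoietin release, marrow production, hemoglobin, tissue oxygenation) can be caricatured as a one-state negative-feedback loop. All constants below are hypothetical and chosen only to show stable regulation toward an equilibrium, not to reproduce the paper's physiology.

```python
def simulate(h0=8.0, setpoint=15.0, gain=0.2, loss=0.05, dt=0.1, steps=5000):
    """Euler integration of dH/dt = gain*(setpoint - H) - loss*H.

    H stands in for hemoglobin; the controller raises EPO-driven
    production when oxygen delivery (proportional to H here) falls
    below the setpoint, while red cells are lost by senescence.
    """
    h = h0
    for _ in range(steps):
        production = gain * (setpoint - h)   # sensor + controller
        destruction = loss * h               # red cell senescence
        h += dt * (production - destruction)
    return h
```

The loop settles at the analytic equilibrium H* = gain*setpoint/(gain + loss), illustrating how the closed loop regulates hemoglobin below the raw setpoint.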
Gonen, Limor Dina
2016-01-01
The objective of this paper was to measure the private and social benefits resulting from technological advances in fertility treatment. An empirical model investigates the willingness-to-pay (WTP) for advances in the medical technology of in vitro fertilization (IVF) among the general public and among IVF patients in Israel. The empirical model's findings demonstrate that IVF patients and the general public value medical technology advances and have a positive WTP for them. The average WTP for IVF technology advances among IVF patients is US$3,116.9, whereas for the general public it is US$2,284.4. Available evidence suggests that advances in medical technology have delivered substantial benefits and appear to have contributed to improved wellbeing.
Schramm, Michael P.; Bevelhimer, Mark; Scherelis, Constantin
2017-02-04
The development of hydrokinetic energy technologies (e.g., tidal turbines) has raised concern over the potential impacts of underwater sound produced by hydrokinetic turbines on fish species likely to encounter these turbines. To assess the potential for behavioral impacts, we exposed four species of fish to varying intensities of recorded hydrokinetic turbine sound in a semi-natural environment. Although we tested freshwater species (redhorse suckers [Moxostoma spp.], freshwater drum [Aplodinotus grunniens], largemouth bass [Micropterus salmoides], and rainbow trout [Oncorhynchus mykiss]), these species are also representative of the hearing physiology and sensitivity of estuarine species that would be affected at tidal energy sites. Here, we evaluated changes in fish position relative to different intensities of turbine sound as well as trends in location over time with linear mixed-effects and generalized additive mixed models. We also evaluated changes in the proportion of near-source detections relative to sound intensity and exposure time with generalized linear mixed models and generalized additive models. Models indicated that redhorse suckers may respond to sustained turbine sound by increasing distance from the sound source. Freshwater drum models suggested a mixed response to turbine sound, and largemouth bass and rainbow trout models did not indicate any likely responses to turbine sound. Lastly, the findings highlight the importance for future research of utilizing accurate localization systems, different species, and validated sound transmission distances, and of considering different types of behavioral responses to different turbine designs and to the cumulative sound of arrays of multiple turbines.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mao, Jiafu; Phipps, S.J.; Pitman, A.J.
The CSIRO Mk3L climate system model, a reduced-resolution coupled general circulation model, has previously been described in this journal. The model is configured for millennium-scale or multiple-century-scale simulations. This paper reports the impact of replacing the relatively simple land surface scheme that is the default parameterisation in Mk3L with a sophisticated land surface model that simulates the terrestrial energy, water and carbon balance in a physically and biologically consistent way. An evaluation of the new model's near-surface climatology highlights strengths and weaknesses, but overall the atmospheric variables, including the near-surface air temperature and precipitation, are simulated well. The impact of the more sophisticated land surface model on existing variables is relatively small, but generally positive. More significantly, the new land surface scheme allows an examination of surface carbon-related quantities including net primary productivity, which adds significantly to the capacity of Mk3L. Overall, results demonstrate that this reduced-resolution climate model is a good foundation for exploring long-time-scale phenomena. The addition of the more sophisticated land surface model enables an exploration of important Earth System questions including land cover change and abrupt changes in terrestrial carbon storage.
Harrison, David A; Brady, Anthony R; Parry, Gareth J; Carpenter, James R; Rowan, Kathy
2006-05-01
To assess the performance of published risk prediction models in common use in adult critical care in the United Kingdom and to recalibrate these models in a large representative database of critical care admissions. Prospective cohort study. A total of 163 adult general critical care units in England, Wales, and Northern Ireland, during the period of December 1995 to August 2003. A total of 231,930 admissions, of which 141,106 met inclusion criteria and had sufficient data recorded for all risk prediction models. None. The published versions of the Acute Physiology and Chronic Health Evaluation (APACHE) II, APACHE II UK, APACHE III, Simplified Acute Physiology Score (SAPS) II, and Mortality Probability Models (MPM) II were evaluated for discrimination and calibration by means of a combination of appropriate statistical measures recommended by an expert steering committee. All models showed good discrimination (the c index varied from 0.803 to 0.832) but imperfect calibration. Recalibration of the models, which was performed by both the Cox method and re-estimating coefficients, led to improved discrimination and calibration, although all models still showed significant departures from perfect calibration. Risk prediction models developed in another country require validation and recalibration before being used to provide risk-adjusted outcomes within a new country setting. Periodic reassessment is beneficial to ensure calibration is maintained.
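Recalibration of the kind described above corrects a model's calibration in a new population without changing its discrimination. A minimal intercept-only sketch follows (a simplified stand-in for the full slope-and-intercept recalibration used in the study); the predictions and outcomes are hypothetical.

```python
import math

def recalibrate_intercept(p_pred, y, lo=-5.0, hi=5.0, iters=60):
    """Intercept-only recalibration: shift the predicted risks on the
    logit scale (by bisection) until the mean predicted risk equals
    the observed event rate in the new cohort."""
    logits = [math.log(p / (1 - p)) for p in p_pred]
    target = sum(y) / len(y)

    def mean_pred(delta):
        return sum(1 / (1 + math.exp(-(z + delta))) for z in logits) / len(logits)

    for _ in range(iters):
        mid = (lo + hi) / 2
        if mean_pred(mid) < target:   # mean_pred is increasing in delta
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2
```

A model that overpredicts mortality in the new cohort gets a negative shift; ranking of patients (and hence the c index) is unchanged.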
Mittal, Manish; Harrison, Donald L; Thompson, David M; Miller, Michael J; Farmer, Kevin C; Ng, Yu-Tze
2016-01-01
While the choice of analytical approach affects study results and their interpretation, there is no consensus to guide the choice of statistical approaches to evaluate public health policy change. This study compared and contrasted three statistical estimation procedures in the assessment of a U.S. Food and Drug Administration (FDA) suicidality warning, communicated in January 2008 and implemented in May 2009, on antiepileptic drug (AED) prescription claims. Longitudinal designs were utilized to evaluate Oklahoma (U.S. State) Medicaid claim data from January 2006 through December 2009. The study included 9289 continuously eligible individuals with prevalent diagnoses of epilepsy and/or psychiatric disorder. Segmented regression models using three estimation procedures [i.e., generalized linear models (GLM), generalized estimation equations (GEE), and generalized linear mixed models (GLMM)] were used to estimate trends of AED prescription claims across three time periods: before (January 2006-January 2008); during (February 2008-May 2009); and after (June 2009-December 2009) the FDA warning. All three statistical procedures estimated an increasing trend (P < 0.0001) in AED prescription claims before the FDA warning period. No procedures detected a significant change in trend during (GLM: -30.0%, 99% CI: -60.0% to 10.0%; GEE: -20.0%, 99% CI: -70.0% to 30.0%; GLMM: -23.5%, 99% CI: -58.8% to 1.2%) and after (GLM: 50.0%, 99% CI: -70.0% to 160.0%; GEE: 80.0%, 99% CI: -20.0% to 200.0%; GLMM: 47.1%, 99% CI: -41.2% to 135.3%) the FDA warning when compared to pre-warning period. Although the three procedures provided consistent inferences, the GEE and GLMM approaches accounted appropriately for correlation. Further, marginal models estimated using GEE produced more robust and valid population-level estimations. Copyright © 2016 Elsevier Inc. All rights reserved.
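The segmented (interrupted time-series) design shared by all three estimation procedures can be sketched with ordinary least squares on simulated monthly claims. The break months mirror the warning timeline above, but the effect sizes are hypothetical; GEE and GLMM would add a within-person correlation structure on top of the same design matrix.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(48)                    # months: Jan 2006 .. Dec 2009
during = (t >= 25).astype(float)     # Feb 2008: warning communicated
after = (t >= 41).astype(float)      # Jun 2009: warning implemented

# Segmented design: baseline level and trend, plus a level change and
# trend change at each break point.
X = np.column_stack([np.ones_like(t, dtype=float), t,
                     during, during * (t - 25),
                     after, after * (t - 41)])
true_beta = np.array([100.0, 2.0, -5.0, -1.0, 3.0, 0.5])  # hypothetical
y = X @ true_beta + rng.normal(0.0, 1.0, t.size)

beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
```

Testing the during/after level and trend coefficients against zero is what the study's GLM, GEE, and GLMM procedures each did, with different assumptions about the error structure.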
Predicting motor vehicle collisions using Bayesian neural network models: an empirical analysis.
Xie, Yuanchang; Lord, Dominique; Zhang, Yunlong
2007-09-01
Statistical models have frequently been used in highway safety studies. They can be utilized for various purposes, including establishing relationships between variables, screening covariates and predicting values. Generalized linear models (GLM) and hierarchical Bayes models (HBM) have been the most common types of model favored by transportation safety analysts. Over the last few years, researchers have proposed the back-propagation neural network (BPNN) model for modeling the phenomenon under study. Compared to GLMs and HBMs, BPNNs have received much less attention in highway safety modeling. The reasons are attributed to the complexity for estimating this kind of model as well as the problem related to "over-fitting" the data. To circumvent the latter problem, some statisticians have proposed the use of Bayesian neural network (BNN) models. These models have been shown to perform better than BPNN models while at the same time reducing the difficulty associated with over-fitting the data. The objective of this study is to evaluate the application of BNN models for predicting motor vehicle crashes. To accomplish this objective, a series of models was estimated using data collected on rural frontage roads in Texas. Three types of models were compared: BPNN, BNN and the negative binomial (NB) regression models. The results of this study show that in general both types of neural network models perform better than the NB regression model in terms of data prediction. Although the BPNN model can occasionally provide better or approximately equivalent prediction performance compared to the BNN model, in most cases its prediction performance is worse than the BNN model. 
In addition, the data fitting performance of the BPNN model is consistently worse than the BNN model, which suggests that the BNN model has better generalization abilities than the BPNN model and can effectively alleviate the over-fitting problem without significantly compromising the nonlinear approximation ability. The results also show that BNNs could be used for other useful analyses in highway safety, including the development of accident modification factors and for improving the prediction capabilities for evaluating different highway design alternatives.
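The negative binomial benchmark above belongs to the GLM family, which is typically fitted by iteratively reweighted least squares. As a simpler self-contained stand-in, the sketch below fits a Poisson log-linear crash-count model by IRLS (the NB model adds a dispersion parameter on top of this); the covariate and coefficients are hypothetical.

```python
import numpy as np

def poisson_irls(X, y, iters=25):
    """Fit a Poisson log-linear model by iteratively reweighted least
    squares (Fisher scoring); returns the coefficient vector."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        mu = np.exp(X @ beta)
        z = X @ beta + (y - mu) / mu      # working response
        WX = X * mu[:, None]              # Poisson weights: Var = mean
        beta = np.linalg.solve(X.T @ WX, WX.T @ z)
    return beta

# Simulated crash-like counts on a hypothetical exposure covariate
rng = np.random.default_rng(7)
x = np.linspace(0.0, 1.0, 200)
X = np.column_stack([np.ones_like(x), x])
y = rng.poisson(np.exp(X @ np.array([0.5, 1.2])))
beta_hat = poisson_irls(X, y)
```

At convergence the score equations X'(y - μ) = 0 hold, which is the fixed point the neural-network alternatives trade away in exchange for flexible nonlinear mean functions.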
Assessment of parametric uncertainty for groundwater reactive transport modeling
Shi, Xiaoqing; Ye, Ming; Curtis, Gary P.; Miller, Geoffery L.; Meyer, Philip D.; Kohler, Matthias; Yabusaki, Steve; Wu, Jichun
2014-01-01
The validity of using Gaussian assumptions for model residuals in uncertainty quantification of a groundwater reactive transport model was evaluated in this study. Least squares regression methods explicitly assume Gaussian residuals, and the assumption leads to Gaussian likelihood functions, model parameters, and model predictions. While the Bayesian methods do not explicitly require the Gaussian assumption, Gaussian residuals are widely used. This paper shows that the residuals of the reactive transport model are non-Gaussian, heteroscedastic, and correlated in time; characterizing them requires using a generalized likelihood function such as the formal generalized likelihood function developed by Schoups and Vrugt (2010). For the surface complexation model considered in this study for simulating uranium reactive transport in groundwater, parametric uncertainty is quantified using the least squares regression methods and Bayesian methods with both Gaussian and formal generalized likelihood functions. While the least squares methods and Bayesian methods with Gaussian likelihood function produce similar Gaussian parameter distributions, the parameter distributions of Bayesian uncertainty quantification using the formal generalized likelihood function are non-Gaussian. In addition, the predictive performance of the formal generalized likelihood function is superior to that of least squares regression and Bayesian methods with Gaussian likelihood function. The Bayesian uncertainty quantification is conducted using the differential evolution adaptive Metropolis (DREAM(ZS)) algorithm; as a Markov chain Monte Carlo (MCMC) method, it is a robust tool for quantifying uncertainty in groundwater reactive transport models. For the surface complexation model, the regression-based local sensitivity analysis and the Morris- and DREAM(ZS)-based global sensitivity analyses yield almost identical rankings of parameter importance.
The uncertainty analysis may help select appropriate likelihood functions, improve model calibration, and reduce predictive uncertainty in other groundwater reactive transport and environmental modeling.
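DREAM(ZS) is an adaptive multi-chain MCMC sampler; its basic building block is the Metropolis accept/reject step, sketched here for a one-parameter toy model with a Gaussian likelihood. Everything below is illustrative (a stand-in forward model, not the surface complexation model), and a generalized likelihood would replace `log_post` while leaving the sampler unchanged.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy forward model y = a*x with Gaussian residuals; true slope a = 2
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * x + rng.normal(0.0, 0.1, x.size)

def log_post(a, sigma=0.1):
    """Gaussian log-likelihood with a flat prior on a."""
    r = y - a * x
    return -0.5 * np.sum((r / sigma) ** 2)

# Random-walk Metropolis: the accept/reject core that DREAM(ZS)
# wraps with adaptive, multi-chain proposal generation.
chain = []
a, lp = 1.0, log_post(1.0)
for _ in range(5000):
    prop = a + rng.normal(0.0, 0.05)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        a, lp = prop, lp_prop
    chain.append(a)

posterior_mean = np.mean(chain[1000:])   # discard burn-in
```

The retained samples approximate the posterior of `a`; swapping in a heteroscedastic or autocorrelated likelihood changes only the target density, which is why the choice of likelihood function matters so much for the resulting parameter distributions.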
Mentoring for population health in general practice divisions.
Moss, John R; Mickan, Sharon M; Fuller, Jeffrey D; Procter, Nicholas G; Waters, Barb A; O'Rourke, Peter K
2006-02-01
This paper describes the implementation and evaluation of a three-way model of service development mentoring. This population health mentoring program was funded by the Commonwealth Department of Health and Ageing to enable staff from eight Divisions of General Practice in South Australia to gain a sound understanding of population health concepts relevant to their workplace. The distinguishing features of service development mentoring were that the learning was grounded within an individual's work setting and experience; there was an identified population health problem or issue confronting the Division of General Practice; and there was an expectation of enhanced organisational performance. A formal evaluation found a consensus among all learners that mentoring was a positive and worthwhile experience, where they had achieved what they had set out to do. Mentors found the model of learning agreeable and effective. Division executive officers recognised enhanced skills among their "learner" colleagues, and commented positively on the benefits to their organisations through the development of well researched and relevant projects, with the potential to improve the efficiency of their population health activities.
Generalized Fractional Derivative Anisotropic Viscoelastic Characterization.
Hilton, Harry H
2012-01-18
Isotropic linear and nonlinear fractional derivative constitutive relations are formulated and examined in terms of many-parameter generalized Kelvin models and are analytically extended to cover general anisotropic homogeneous or non-homogeneous as well as functionally graded viscoelastic material behavior. Equivalent integral constitutive relations, which are computationally more powerful, are derived from fractional differential ones, and the associated anisotropic temperature-moisture-degree-of-cure shift functions and reduced times are established. Approximate Fourier transform inversions for fractional derivative relations are formulated and their accuracy is evaluated. The efficacy of integer and fractional derivative constitutive relations is compared, and the preferential use of either characterization in analyzing isotropic and anisotropic real materials must be examined on a case-by-case basis. Approximate protocols for curve fitting analytical fractional derivative results to experimental data are formulated and evaluated.
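Fractional derivatives of the kind appearing in these constitutive relations are commonly evaluated numerically with the Grünwald-Letnikov series. A minimal sketch (uniform grid, zero history assumed before the first sample):

```python
import numpy as np

def gl_fractional_derivative(f, alpha, h):
    """Grünwald-Letnikov approximation of the order-alpha fractional
    derivative of samples f on a uniform grid with spacing h."""
    n = len(f)
    # Weights w_k = (-1)^k * C(alpha, k) via the standard recurrence
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - alpha) / k
    out = np.empty(n)
    for i in range(n):
        # Convolution of the weights with the sample history
        out[i] = np.dot(w[:i + 1], f[i::-1]) / h ** alpha
    return out
```

For alpha = 1 the weights collapse to a backward difference and for alpha = 0 the operator is the identity, so the scheme interpolates smoothly between "spring" and "dashpot" behavior, which is exactly what makes fractional Kelvin elements attractive for curve fitting.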
NASA Technical Reports Server (NTRS)
Frei, Allan; Nolin, Anne W.; Serreze, Mark C.; Armstrong, Richard L.; McGinnis, David L.; Robinson, David A.
2004-01-01
The purpose of this three-year study is to develop and evaluate techniques to estimate the range of potential hydrological impacts of climate change in mountainous areas. Three main objectives are set out in the proposal: (1) to develop and evaluate transfer functions linking tropospheric circulation to regional snowfall; (2) to evaluate a suite of General Circulation Models (GCMs) for use in estimating synoptic-scale circulation and the resultant regional snowfall; and (3) to estimate the range of potential hydrological impacts of changing climate in the two case study areas, the Upper Colorado River basin and the Catskill Mountains of southeastern New York State. Both regions provide water to large populations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aguilo Valentin, Miguel Alejandro
2016-07-01
This study presents a new nonlinear programming formulation for the solution of inverse problems. First, a general inverse problem formulation based on the compliance error functional is presented. The proposed error functional enables the computation of the Lagrange multipliers, and thus the first order derivative information, at the expense of just one model evaluation. Therefore, the calculation of the Lagrange multipliers does not require the solution of the computationally intensive adjoint problem. This leads to significant speedups for large-scale, gradient-based inverse problems.
Terkamo-Moisio, Anja; Kvist, Tarja; Laitila, Teuvo; Kangasniemi, Mari; Ryynänen, Olli-Pekka; Pietilä, Anna-Maija
2017-08-01
The debate about euthanasia is ongoing in several countries, including Finland. However, there is a lack of information on current attitudes toward euthanasia among the general Finnish public. The traditional model for predicting individuals' attitudes to euthanasia is based on their age, gender, educational level, and religiosity. However, a new evaluation of religiosity is needed due to the limited operationalization of this factor in previous studies. This study explores the connections between the factors of the traditional model and the attitudes toward euthanasia among the general public in the Finnish context. The Finnish public's attitudes toward euthanasia have become remarkably more positive over the last decade. Further research is needed on the factors that predict euthanasia attitudes. We suggest two different explanatory models for consideration: one that emphasizes the value of individual autonomy and another that approaches euthanasia from the perspective of fears of death or the process of dying.
Using Modeling and Rehearsal to Teach Fire Safety to Children with Autism
ERIC Educational Resources Information Center
Garcia, David; Dukes, Charles; Brady, Michael P.; Scott, Jack; Wilson, Cynthia L.
2016-01-01
We evaluated the efficacy of an instructional procedure to teach young children with autism to evacuate settings and notify an adult during a fire alarm. A multiple baseline design across children showed that an intervention that included modeling, rehearsal, and praise was effective in teaching fire safety skills. Safety skills generalized to…
Some Useful Cost-Benefit Criteria for Evaluating Computer-Based Test Delivery Models and Systems
ERIC Educational Resources Information Center
Luecht, Richard M.
2005-01-01
Computer-based testing (CBT) is typically implemented using one of three general test delivery models: (1) multiple fixed testing (MFT); (2) computer-adaptive testing (CAT); or (3) multistage testing (MST). This article reviews some of the real cost drivers associated with CBT implementation--focusing on item production costs, the costs…
Madeleine Eckmann; Jason Dunham; Edward J. Connor; Carmen A. Welch
2016-01-01
Many species living in deeper lentic ecosystems exhibit daily movements that cycle through the water column, generally referred to as diel vertical migration (DVM). In this study, we applied bioenergetics modelling to evaluate growth as a hypothesis to explain DVM by bull trout (Salvelinus confluentus) in a thermally stratified reservoir (Ross Lake...
ERIC Educational Resources Information Center
Han, Seong Won; Borgonovi, Francesca; Guerriero, Sonia
2018-01-01
This study examines between-country differences in the degree to which teachers' working conditions, salaries, and societal evaluations about desirable job characteristics are associated with students' teaching career expectations. Three-level hierarchical generalized linear models are employed to analyze cross-national data from the Programme for…
A Regional Climate Model Evaluation System based on Satellite and other Observations
NASA Astrophysics Data System (ADS)
Lean, P.; Kim, J.; Waliser, D. E.; Hall, A. D.; Mattmann, C. A.; Granger, S. L.; Case, K.; Goodale, C.; Hart, A.; Zimdars, P.; Guan, B.; Molotch, N. P.; Kaki, S.
2010-12-01
Regional climate models are a fundamental tool needed for downscaling global climate simulations and projections, such as those contributing to the Coupled Model Intercomparison Projects (CMIPs) that form the basis of the IPCC Assessment Reports. The regional modeling process provides the means to accommodate higher resolution and a greater complexity of Earth System processes. Evaluation of both the global and regional climate models against observations is essential to identify model weaknesses and to direct future model development efforts focused on reducing the uncertainty associated with climate projections. However, the lack of reliable observational data and the lack of formal tools are among the serious limitations to addressing these objectives. Recent satellite observations are particularly useful as they provide a wealth of information on many different aspects of the climate system, but due to their large volume and the difficulties associated with accessing and using the data, these datasets have been generally underutilized in model evaluation studies. Recognizing this problem, NASA JPL / UCLA is developing a model evaluation system to help make satellite observations, in conjunction with in-situ, assimilated, and reanalysis datasets, more readily accessible to the modeling community. The system includes a central database to store multiple datasets in a common format and codes for calculating predefined statistical metrics to assess model performance. This allows the time taken to compare model simulations with satellite observations to be reduced from weeks to days. Early results from the use of this new model evaluation system for evaluating regional climate simulations over California/western US regions will be presented.
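The "predefined statistical metrics" such an evaluation system precomputes are typically simple skill scores. A minimal sketch, assuming plain bias, RMSE, and correlation as representative metrics (the system's actual metric set is not listed in the abstract):

```python
import math

def eval_metrics(model, obs):
    """Basic model-vs-observation skill metrics over paired samples."""
    n = len(model)
    bias = sum(m - o for m, o in zip(model, obs)) / n
    rmse = math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / n)
    mm = sum(model) / n
    mo = sum(obs) / n
    cov = sum((m - mm) * (o - mo) for m, o in zip(model, obs))
    corr = cov / math.sqrt(sum((m - mm) ** 2 for m in model) *
                           sum((o - mo) ** 2 for o in obs))
    return {"bias": bias, "rmse": rmse, "corr": corr}

metrics = eval_metrics([1.0, 2.0, 3.0, 4.0], [1.1, 1.9, 3.2, 3.8])
```

In a real system these would be computed per grid cell and season against the common-format database described above.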
2017-11-01
model of the bridge piers, other related structures, and the adjacent channel. Data from the model provided a qualitative and quantitative evaluation of... [Figure-list residue; recoverable captions: Figure 38, Test 1 (30,000 cfs existing conditions) pre- minus post-test lidar survey; Figure 39, Test 7 (15,000 cfs original proposed conditions) pre- minus post-test lidar survey.]
A survey of Applied Psychological Services' models of the human operator
NASA Technical Reports Server (NTRS)
Siegel, A. I.; Wolf, J. J.
1979-01-01
A historical perspective is presented in terms of the major features and status of two families of computer simulation models in which the human operator plays the primary role. Both task-oriented and message-oriented models are included. Two other recent efforts are summarized which deal with visual information processing. They involve not whole-model development but a family of subroutines customized to add human aspects to existing models. A global diagram of the generalized model development/validation process is presented and related to 15 criteria for model evaluation.
He, Xin; Frey, Eric C
2006-08-01
Previously, we have developed a decision model for three-class receiver operating characteristic (ROC) analysis based on decision theory. The proposed decision model maximizes the expected decision utility under the assumption that incorrect decisions have equal utilities under the same hypothesis (equal error utility assumption). This assumption reduced the dimensionality of the "general" three-class ROC analysis and provided a practical figure-of-merit to evaluate the three-class task performance. However, it also limits the generality of the resulting model because the equal error utility assumption will not apply for all clinical three-class decision tasks. The goal of this study was to investigate the optimality of the proposed three-class decision model with respect to several other decision criteria. In particular, besides the maximum expected utility (MEU) criterion used in the previous study, we investigated the maximum-correctness (MC) (or minimum-error), maximum likelihood (ML), and Neyman-Pearson (N-P) criteria. We found that by making assumptions for both MEU and N-P criteria, all decision criteria lead to the previously proposed three-class decision model. As a result, this model maximizes the expected utility under the equal error utility assumption, maximizes the probability of making correct decisions, satisfies the N-P criterion in the sense that it maximizes the sensitivity of one class given the sensitivities of the other two classes, and the resulting ROC surface contains the maximum likelihood decision operating point. While the proposed three-class ROC analysis model is not optimal in the general sense due to the use of the equal error utility assumption, the range of criteria for which it is optimal increases its applicability for evaluating and comparing a range of diagnostic systems.
Walsh, Matthew M; Gluck, Kevin A; Gunzelmann, Glenn; Jastrzembski, Tiffany; Krusmark, Michael
2018-06-01
The spacing effect is among the most widely replicated empirical phenomena in the learning sciences, and its relevance to education and training is readily apparent. Yet successful applications of spacing effect research to education and training are rare. Computational modeling can provide the crucial link between a century of accumulated experimental data on the spacing effect and the emerging interest in using that research to enable adaptive instruction. In this paper, we review relevant literature and identify 10 criteria for rigorously evaluating computational models of the spacing effect. Five relate to evaluating the theoretic adequacy of a model, and five relate to evaluating its application potential. We use these criteria to evaluate a novel computational model of the spacing effect called the Predictive Performance Equation (PPE). PPE combines elements of earlier models of learning and memory, including the General Performance Equation, Adaptive Control of Thought-Rational, and the New Theory of Disuse, giving rise to a novel computational account of the spacing effect that performs favorably across the complete sets of theoretic and applied criteria. We implemented two other previously published computational models of the spacing effect and compared them to PPE using the theoretic and applied criteria as guides. Copyright © 2018 Cognitive Science Society, Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koch, G.S. Jr.; Howarth, R.J.; Schuenemeyer, J.H.
1981-02-01
We have developed a procedure that can help quadrangle evaluators to systematically summarize and use hydrogeochemical and stream sediment reconnaissance (HSSR) and occurrence data. Although we have not provided an independent estimate of uranium endowment, we have devised a methodology that will provide this independent estimate when additional calibration is done by enlarging the study area. Our statistical model for evaluation (system EVAL) ranks uranium endowment for each quadrangle. Because using this model requires experience in geology, statistics, and data analysis, we have also devised a simplified model, presented in the package SURE, a System for Uranium Resource Evaluation. We have developed and tested these models for the four quadrangles in southern Colorado that comprise the study area; to investigate their generality, the models should be applied to other quadrangles. Once they are calibrated with accepted uranium endowments for several well-known quadrangles, the models can be used to give independent estimates for less-known quadrangles. The point-oriented models structure the objective comparison of the quadrangles on the basis of: (1) Anomalies (a) derived from stream sediments, (b) derived from waters (stream, well, pond, etc.); (2) Geology (a) source rocks, as defined by the evaluator, (b) host rocks, as defined by the evaluator; and (3) Aerial radiometric anomalies.
Air Conditioning Modifications to AMG Buses
DOT National Transportation Integrated Search
1983-12-01
This report presents the documentation and evaluation of air conditioning system modifications devised by Miami (Florida) Metrobus and Los Angeles SCRTD for the AM General Model B bus. The objective of these modifications was to reduce the frequency ...
An Evaluation of Human Thermal Models for the Study of Immersion Hypothermia Protection Equipment
1979-10-12
exhibited by the five experimental observations, largely due to somatotype differences among the subjects. None of the individual responses is represented...not less than 35°C). A mathematical model capable of accurately simulating the thermal responses of a protected man in a cold environment would be an...flow) responses. The models are most generally expressed as a set of differential equations. Early models were solved using analog computers. The
Discrete adjoint of fractional step Navier-Stokes solver in generalized coordinates
NASA Astrophysics Data System (ADS)
Wang, Mengze; Mons, Vincent; Zaki, Tamer
2017-11-01
Optimization and control in transitional and turbulent flows require evaluation of gradients of the flow state with respect to the problem parameters. Using adjoint approaches, these high-dimensional gradients can be evaluated with a similar computational cost as the forward Navier-Stokes simulations. The adjoint algorithm can be obtained by discretizing the continuous adjoint Navier-Stokes equations or by deriving the adjoint to the discretized Navier-Stokes equations directly. The latter algorithm is necessary when the forward-adjoint relations must be satisfied to machine precision. In this work, our forward model is the fractional step solution to the Navier-Stokes equations in generalized coordinates, proposed by Rosenfeld, Kwak & Vinokur. We derive the corresponding discrete adjoint equations. We also demonstrate the accuracy of the combined forward-adjoint model, and its application to unsteady wall-bounded flows. This work has been partially funded by the Office of Naval Research (Grant N00014-16-1-2542).
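The forward-adjoint consistency the authors emphasize can be illustrated on a toy recursion: deriving the adjoint of the discretized model (rather than discretizing a continuous adjoint) reproduces finite-difference gradients to machine precision. A sketch with a scalar recursion standing in for the Navier-Stokes update:

```python
def forward(x0, a, n):
    """Forward model: n steps of the recursion x_{k+1} = a * x_k."""
    x = x0
    for _ in range(n):
        x = a * x
    return x

def adjoint_grad(x0, a, n):
    """Discrete adjoint sweep for the objective J(x0) = forward(x0)**2."""
    xN = forward(x0, a, n)
    lam = 2.0 * xN          # dJ/dxN seeds the adjoint variable
    for _ in range(n):
        lam = a * lam       # adjoint of each step x_{k+1} = a * x_k
    return lam              # dJ/dx0

g_adj = adjoint_grad(1.5, 0.9, 20)
eps = 1e-6
g_fd = (forward(1.5 + eps, 0.9, 20) ** 2
        - forward(1.5 - eps, 0.9, 20) ** 2) / (2 * eps)
```

Because the adjoint is derived from the discretized step itself, `g_adj` and `g_fd` agree to roundoff, which is the property needed when forward-adjoint relations must hold to machine precision.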
A General Reliability Model for Ni-BaTiO3-Based Multilayer Ceramic Capacitors
NASA Technical Reports Server (NTRS)
Liu, Donhang
2014-01-01
The evaluation of multilayer ceramic capacitors (MLCCs) with Ni electrodes and BaTiO3 dielectric material for potential space project applications requires an in-depth understanding of their reliability. A general reliability model for Ni-BaTiO3 MLCCs is developed and discussed. The model consists of three parts: a statistical distribution; an acceleration function that describes how a capacitor's reliability life responds to external stresses; and an empirical function that defines the contribution of the structural and constructional characteristics of a multilayer capacitor device, such as the number of dielectric layers N, dielectric thickness d, average grain size, and capacitor chip size A. Application examples are also discussed based on the proposed reliability model for Ni-BaTiO3 MLCCs.
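For the acceleration-function part of such a model, MLCC life testing commonly uses a Prokopowicz-Vaskas-style form: a voltage power law times an Arrhenius thermal term. The sketch below uses that common form with illustrative parameter values (the voltage exponent n and activation energy Ea here are typical textbook numbers, not the paper's fitted parameters):

```python
import math

K_B = 8.617e-5  # Boltzmann constant in eV/K

def life_acceleration(v_use, v_test, t_use, t_test, n=3.0, ea=1.2):
    """Acceleration factor between use and test conditions:
    (V_test/V_use)^n * exp(Ea/k * (1/T_use - 1/T_test)), temperatures in K."""
    voltage_term = (v_test / v_use) ** n
    thermal_term = math.exp(ea / K_B * (1.0 / t_use - 1.0 / t_test))
    return voltage_term * thermal_term

# e.g. rated 6.3 V at 85 C (358 K) vs. a 12.6 V, 125 C (398 K) life test
af = life_acceleration(v_use=6.3, v_test=12.6, t_use=358.0, t_test=398.0)
```

Doubling voltage alone gives a factor of 2^n = 8 here; the thermal term multiplies that further.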
Yang, P C; Zhang, S X; Sun, P P; Cai, Y L; Lin, Y; Zou, Y H
2017-07-10
Objective: To construct Markov models that reflect the reality of prevention and treatment interventions against hepatitis B virus (HBV) infection, simulate the natural history of HBV infection in different age groups, and provide evidence for economic evaluations of hepatitis B vaccination and population-based antiviral treatment in China. Methods: According to the theory and techniques of Markov chains, Markov models of the HBV epidemic in China were developed based on national data and related literature both at home and abroad, including the settings of Markov model states, allowable transitions, and initial and transition probabilities. The model construction, operation, and verification were conducted using the software TreeAge Pro 2015. Results: Several types of Markov models were constructed to describe the disease progression of HBV infection in the neonatal period, the perinatal period, or adulthood; the progression of chronic hepatitis B after antiviral therapy; hepatitis B prevention and control in adults; chronic hepatitis B antiviral treatment; and the natural progression of chronic hepatitis B in the general population. The model for the newborn was fundamental and included ten states, i.e., susceptibility to HBV, HBsAg clearance, immune tolerance, immune clearance, low replication, HBeAg-negative CHB, compensated cirrhosis, decompensated cirrhosis, hepatocellular carcinoma (HCC), and death. The HBV-susceptible state was excluded in the perinatal-period model, and the immune tolerance state was excluded in the adulthood model. The model for the general population included only two states, survival and death. Among the five types of models, there were 9 initial states assigned with initial probabilities and 27 states with transition probabilities. The results of model verification showed that the probability curves were basically consistent with the HBV epidemic situation in China.
Conclusion: The Markov models developed can be used in economic evaluations of hepatitis B vaccination and treatment for the elimination of HBV infection in China, though the model structures and parameters remain uncertain and dynamic in nature.
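The mechanics of such a state-transition model can be sketched compactly: a cohort's state distribution is multiplied through an annual transition matrix for a number of cycles. The states and probabilities below are a compressed, hypothetical illustration, not the paper's calibrated ten-state model:

```python
# Compressed, made-up illustration of a Markov cohort model
states = ["susceptible", "chronic", "cirrhosis", "hcc", "death"]
P = [  # hypothetical annual transition probabilities; each row sums to 1
    [0.95, 0.04, 0.00, 0.00, 0.01],
    [0.00, 0.90, 0.06, 0.02, 0.02],
    [0.00, 0.00, 0.85, 0.05, 0.10],
    [0.00, 0.00, 0.00, 0.60, 0.40],
    [0.00, 0.00, 0.00, 0.00, 1.00],  # death is absorbing
]

def run_cohort(dist, P, cycles):
    """Propagate a cohort's state distribution through Markov cycles."""
    for _ in range(cycles):
        dist = [sum(dist[i] * P[i][j] for i in range(len(P)))
                for j in range(len(P))]
    return dist

# start with everyone susceptible, simulate 30 annual cycles
dist = run_cohort([1.0, 0.0, 0.0, 0.0, 0.0], P, 30)
```

Tools such as TreeAge Pro perform essentially this propagation, with costs and utilities attached to each state for the economic evaluation.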
Evaluation of Automated Model Calibration Techniques for Residential Building Energy Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robertson, J.; Polly, B.; Collis, J.
2013-09-01
This simulation study adapts and applies the general framework described in BESTEST-EX (Judkoff et al 2010) for self-testing residential building energy model calibration methods. BEopt/DOE-2.2 is used to evaluate four mathematical calibration methods in the context of monthly, daily, and hourly synthetic utility data for a 1960's-era existing home in a cooling-dominated climate. The home's model inputs are assigned probability distributions representing uncertainty ranges, random selections are made from the uncertainty ranges to define 'explicit' input values, and synthetic utility billing data are generated using the explicit input values. The four calibration methods evaluated in this study are: an ASHRAE 1051-RP-based approach (Reddy and Maor 2006), a simplified simulated annealing optimization approach, a regression metamodeling optimization approach, and a simple output ratio calibration approach. The calibration methods are evaluated for monthly, daily, and hourly cases; various retrofit measures are applied to the calibrated models and the methods are evaluated based on the accuracy of predicted savings, computational cost, repeatability, automation, and ease of implementation.
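Of the four methods, the simple output ratio calibration is the easiest to sketch: scale the modeled energy use so its total matches the utility bills. A minimal sketch with made-up monthly values (the study's actual billing data are synthetic draws from the input distributions):

```python
def output_ratio_calibration(modeled, billed):
    """Scale modeled monthly energy use so its annual total matches bills."""
    ratio = sum(billed) / sum(modeled)
    return [m * ratio for m in modeled], ratio

# hypothetical monthly kWh: model underpredicts the bills slightly
modeled = [900, 850, 700, 500, 400, 600, 800, 820, 640, 480, 700, 880]
billed  = [950, 900, 760, 540, 430, 650, 860, 880, 700, 520, 760, 940]
calibrated, ratio = output_ratio_calibration(modeled, billed)
```

The optimization-based methods instead search the input uncertainty ranges; the output ratio approach leaves inputs untouched, which is exactly why its predicted retrofit savings can be less accurate.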
Research on Nonlinear Time Series Forecasting of Time-Delay NN Embedded with Bayesian Regularization
NASA Astrophysics Data System (ADS)
Jiang, Weijin; Xu, Yusheng; Xu, Yuhui; Wang, Jianmin
Based on the idea of nonlinear prediction via phase-space reconstruction, this paper presents a time-delay BP neural network model whose generalization capability is improved by Bayesian regularization. The model is applied to forecasting import and export trade in one industry. The results show that the improved model has excellent generalization capability: it not only learns the historical curve but also efficiently predicts the business trend. Compared with common forecast evaluation, we conclude that nonlinear forecasting can go beyond data combination and precision improvement to vividly reflect the nonlinear character of the forecasting system. In analyzing the forecasting precision of the model, we also assess it by calculating the nonlinear characteristic values of the combined and original series, showing that the forecasting model can reasonably capture the dynamic character of the nonlinear system that produced the original series.
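The phase-space reconstruction behind a time-delay network is delay embedding: each training input collects past values spaced by a fixed delay, and the target is the next value. A minimal sketch (the embedding dimension and delay here are arbitrary; in practice they would be chosen by methods such as false nearest neighbors):

```python
def delay_embed(series, dim, tau):
    """Build (input, target) training pairs by time-delay embedding:
    inputs are dim past values spaced tau steps apart, target is the
    next value of the series."""
    pairs = []
    start = (dim - 1) * tau
    for t in range(start, len(series) - 1):
        x = [series[t - k * tau] for k in range(dim)]
        pairs.append((x, series[t + 1]))
    return pairs

series = [0.1 * t for t in range(20)]  # toy series
pairs = delay_embed(series, dim=3, tau=2)
```

These pairs are what a time-delay BP network (with Bayesian regularization penalizing large weights) would be trained on.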
Uncertain programming models for portfolio selection with uncertain returns
NASA Astrophysics Data System (ADS)
Zhang, Bo; Peng, Jin; Li, Shengguo
2015-10-01
In an indeterminate economic environment, experts' knowledge about the returns of securities involves uncertainty rather than randomness. This paper discusses the portfolio selection problem in an uncertain environment in which security returns cannot be well reflected by historical data, but can be evaluated by experts. In the paper, returns of securities are assumed to be given by uncertain variables. According to various decision criteria, the portfolio selection problem in an uncertain environment is formulated as an expected-variance-chance model and a chance-expected-variance model using uncertain programming. Within the framework of uncertainty theory, for the convenience of solving the models, some crisp equivalents are discussed under different conditions. In addition, a hybrid intelligent algorithm is designed to provide a general method for solving the new models in general cases. Finally, two numerical examples are provided to show the performance and applications of the models and algorithm.
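The expected-variance idea can be sketched with a scenario-based stand-in (the paper itself models returns as uncertain variables in uncertainty theory, not as scenarios, and solves the models via crisp equivalents or a hybrid intelligent algorithm): maximize expected return subject to a variance cap.

```python
# Hypothetical two-asset example: (return of A, return of B) per scenario,
# equally weighted scenarios; grid search over the weight on asset A.
scenarios = [(0.08, 0.02), (0.12, 0.03), (-0.04, 0.01), (0.10, 0.02)]

def portfolio_stats(w):
    """Expected return and variance of the portfolio w*A + (1-w)*B."""
    rets = [w * a + (1 - w) * b for a, b in scenarios]
    mean = sum(rets) / len(rets)
    var = sum((r - mean) ** 2 for r in rets) / len(rets)
    return mean, var

best_w, best_mean = None, float("-inf")
for i in range(101):
    w = i / 100
    mean, var = portfolio_stats(w)
    if var <= 0.001 and mean > best_mean:  # variance cap, maximize mean
        best_w, best_mean = w, mean
```

The grid search is a deliberately crude stand-in for the optimization; the point is only the structure of the expected-variance trade-off.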
Moisen, Gretchen G.; Freeman, E.A.; Blackard, J.A.; Frescino, T.S.; Zimmermann, N.E.; Edwards, T.C.
2006-01-01
Many efforts are underway to produce broad-scale forest attribute maps by modelling forest class and structure variables collected in forest inventories as functions of satellite-based and biophysical information. Typically, variants of classification and regression trees implemented in RuleQuest's See5 and Cubist (for binary and continuous responses, respectively) are the tools of choice in many of these applications. These tools are widely used in large remote sensing applications, but are not easily interpretable, do not have ties with survey estimation methods, and use proprietary unpublished algorithms. Consequently, three alternative modelling techniques were compared for mapping presence and basal area of 13 species located in the mountain ranges of Utah, USA. The modelling techniques compared included the widely used See5/Cubist, generalized additive models (GAMs), and stochastic gradient boosting (SGB). Model performance was evaluated using independent test data sets. Evaluation criteria for mapping species presence included specificity, sensitivity, Kappa, and area under the curve (AUC). Evaluation criteria for the continuous basal area variables included correlation and relative mean squared error. For predicting species presence (setting thresholds to maximize Kappa), SGB had higher values for the majority of the species for specificity and Kappa, while GAMs had higher values for the majority of the species for sensitivity. In evaluating resultant AUC values, GAM and/or SGB models had significantly better results than the See5 models where significant differences could be detected between models. For nine out of 13 species, basal area prediction results for all modelling techniques were poor (correlations less than 0.5 and relative mean squared errors greater than 0.8), but SGB provided the most stable predictions in these instances.
SGB and Cubist performed equally well for modelling basal area for three species with moderate prediction success, while all three modelling tools produced comparably good predictions (correlation of 0.68 and relative mean squared error of 0.56) for one species. © 2006 Elsevier B.V. All rights reserved.
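The presence/absence evaluation criteria named above (sensitivity, specificity, Kappa, AUC) all follow from a confusion table plus score ranks. A self-contained sketch with toy labels and scores:

```python
def binary_metrics(y_true, y_score, threshold=0.5):
    """Sensitivity, specificity, Cohen's kappa, and rank-based AUC."""
    y_pred = [1 if s >= threshold else 0 for s in y_score]
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    n = tp + tn + fp + fn
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    po = (tp + tn) / n                      # observed agreement
    pe = ((tp + fp) * (tp + fn)             # chance agreement
          + (tn + fn) * (tn + fp)) / n ** 2
    kappa = (po - pe) / (1 - pe)
    # AUC as probability a positive outranks a negative (ties count half)
    pos = [s for t, s in zip(y_true, y_score) if t == 1]
    neg = [s for t, s in zip(y_true, y_score) if t == 0]
    wins = sum(1.0 if p > q else 0.5 if p == q else 0.0
               for p in pos for q in neg)
    auc = wins / (len(pos) * len(neg))
    return sens, spec, kappa, auc

y = [1, 1, 1, 0, 0, 0]
s = [0.9, 0.8, 0.4, 0.6, 0.3, 0.2]
sens, spec, kappa, auc = binary_metrics(y, s)
```

"Setting thresholds to maximize Kappa," as in the study, amounts to sweeping `threshold` over the observed scores and keeping the value with the highest kappa.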
Liu, Peigui; Elshall, Ahmed S.; Ye, Ming; ...
2016-02-05
Evaluating the marginal likelihood is the most critical and computationally expensive task when conducting Bayesian model averaging to quantify parametric and model uncertainties. The evaluation is commonly done by using Laplace approximations to evaluate semianalytical expressions of the marginal likelihood or by using Monte Carlo (MC) methods to evaluate the arithmetic or harmonic mean of a joint likelihood function. This study introduces a new MC method, i.e., thermodynamic integration, which has not been attempted in environmental modeling. Instead of using samples only from the prior parameter space (as in arithmetic mean evaluation) or the posterior parameter space (as in harmonic mean evaluation), the thermodynamic integration method uses samples generated gradually from the prior to the posterior parameter space. This is done through a path sampling that conducts Markov chain Monte Carlo simulation with different power coefficient values applied to the joint likelihood function. The thermodynamic integration method is evaluated using three analytical functions by comparing the method with two variants of the Laplace approximation method and three MC methods, including the nested sampling method that was recently introduced into environmental modeling. The thermodynamic integration method outperforms the other methods in terms of accuracy, convergence, and consistency. The thermodynamic integration method is also applied to a synthetic case of groundwater modeling with four alternative models. The application shows that model probabilities obtained using the thermodynamic integration method improve the predictive performance of Bayesian model averaging. In summary, the thermodynamic integration method is mathematically rigorous, and its MC implementation is computationally general for a wide range of environmental problems.
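Thermodynamic integration estimates the log marginal likelihood as the integral over beta in [0, 1] of the expected log-likelihood under power posteriors p_beta ∝ prior × likelihood^beta. The sketch below replaces the paper's MCMC path sampling with brute-force quadrature on a conjugate Gaussian toy problem (prior N(0,1), one datum y = 1 with unit noise), where the exact marginal likelihood N(y; 0, 2) is known:

```python
import math

def loglik(theta, y=1.0):
    """Log-likelihood of the single datum y under N(theta, 1)."""
    return -0.5 * math.log(2 * math.pi) - 0.5 * (y - theta) ** 2

def prior(theta):
    """Standard normal prior density."""
    return math.exp(-0.5 * theta ** 2) / math.sqrt(2 * math.pi)

thetas = [-8 + 0.01 * i for i in range(1601)]  # quadrature grid

def power_expectation(beta):
    """E[log L] under p_beta ∝ prior * L^beta (quadrature, not MCMC)."""
    w = [prior(t) * math.exp(beta * loglik(t)) for t in thetas]
    z = sum(w)
    return sum(wi * loglik(t) for wi, t in zip(w, thetas)) / z

# trapezoid rule over the beta path from prior (0) to posterior (1)
betas = [i / 50 for i in range(51)]
vals = [power_expectation(b) for b in betas]
log_z = sum((betas[i + 1] - betas[i]) * (vals[i] + vals[i + 1]) / 2
            for i in range(50))
exact = -0.5 * math.log(2 * math.pi * 2) - 1.0 / 4  # log N(1; 0, 2)
```

In real environmental models the inner expectation is what the MCMC chains at each power coefficient estimate; the outer beta integral is the same.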
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Jiali; Han, Yuefeng; Stein, Michael L.
2016-02-10
The Weather Research and Forecast (WRF) model downscaling skill in extreme maximum daily temperature is evaluated by using the generalized extreme value (GEV) distribution. While the GEV distribution has been used extensively in climatology and meteorology for estimating probabilities of extreme events, accurately estimating GEV parameters based on data from a single pixel can be difficult, even with fairly long data records. This work proposes a simple method assuming that the shape parameter, the most difficult of the three parameters to estimate, does not vary over a relatively large region. This approach is applied to evaluate 31-year WRF-downscaled extreme maximum temperature through comparison with North American Regional Reanalysis (NARR) data. Uncertainty in GEV parameter estimates and the statistical significance in the differences of estimates between WRF and NARR are accounted for by conducting bootstrap resampling. Despite certain biases over parts of the United States, overall, WRF shows good agreement with NARR in the spatial pattern and magnitudes of GEV parameter estimates. Both WRF and NARR show a significant increase in extreme maximum temperature over the southern Great Plains and southeastern United States in January and over the western United States in July. The GEV model shows clear benefits from the regionally constant shape parameter assumption, for example, leading to estimates of the location and scale parameters of the model that show coherent spatial patterns.
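The bootstrap resampling used to attach uncertainty to the GEV parameter estimates can be sketched generically; here it is applied to the mean of a made-up annual-maximum series, since a full GEV fit is beyond a few lines:

```python
import random

def bootstrap_ci(data, stat, n_boot=2000, alpha=0.05, seed=42):
    """Percentile bootstrap confidence interval for a statistic:
    resample the data with replacement, recompute the statistic,
    and read off the alpha/2 and 1 - alpha/2 quantiles."""
    rng = random.Random(seed)
    reps = sorted(stat([rng.choice(data) for _ in data])
                  for _ in range(n_boot))
    lo = reps[int(alpha / 2 * n_boot)]
    hi = reps[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

annual_max = [31.2, 33.5, 30.1, 35.8, 32.4, 34.0, 29.9, 33.1, 36.2, 31.7]
mean = lambda xs: sum(xs) / len(xs)
lo, hi = bootstrap_ci(annual_max, mean)
```

In the study the resampled statistic would be a GEV parameter estimate per pixel (with the shape parameter held regionally constant) rather than a simple mean.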
Boxwala, Aziz A; Kim, Jihoon; Grillo, Janice M; Ohno-Machado, Lucila
2011-01-01
To determine whether statistical and machine-learning methods, when applied to electronic health record (EHR) access data, could help identify suspicious (ie, potentially inappropriate) access to EHRs. From EHR access logs and other organizational data collected over a 2-month period, the authors extracted 26 features likely to be useful in detecting suspicious accesses. Selected events were marked as either suspicious or appropriate by privacy officers, and served as the gold standard set for model evaluation. The authors trained logistic regression (LR) and support vector machine (SVM) models on 10-fold cross-validation sets of 1291 labeled events. The authors evaluated the sensitivity of final models on an external set of 58 events that were identified as truly inappropriate and investigated independently from this study using standard operating procedures. The area under the receiver operating characteristic curve of the models on the whole data set of 1291 events was 0.91 for LR, and 0.95 for SVM. The sensitivity of the baseline model on this set was 0.8. When the final models were evaluated on the set of 58 investigated events, all of which were determined as truly inappropriate, the sensitivity was 0 for the baseline method, 0.76 for LR, and 0.79 for SVM. The LR and SVM models may not generalize because of interinstitutional differences in organizational structures, applications, and workflows. Nevertheless, our approach for constructing the models using statistical and machine-learning techniques can be generalized. An important limitation is the relatively small sample used for the training set due to the effort required for its construction. The results suggest that statistical and machine-learning methods can play an important role in helping privacy officers detect suspicious accesses to EHRs.
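A minimal stand-in for the LR detector described above: logistic regression trained by gradient descent on toy access-event features. The two features and all values below are hypothetical (the study used 26 features and standard ML tooling would be used in practice):

```python
import math

def train_logreg(X, y, lr=0.1, epochs=1000):
    """Minimal logistic regression via per-sample gradient descent."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1 / (1 + math.exp(-z))
            err = p - yi                       # gradient of log loss
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict(w, b, xi):
    """Predicted probability that an access event is suspicious."""
    return 1 / (1 + math.exp(-(sum(wj * xj for wj, xj in zip(w, xi)) + b)))

# hypothetical features: (after-hours flag, records touched / 10); 1 = suspicious
X = [(0, 0.1), (0, 0.2), (1, 0.1), (0, 0.3),
     (1, 2.5), (1, 3.0), (0, 2.8), (1, 2.2)]
y = [0, 0, 0, 0, 1, 1, 1, 1]
w, b = train_logreg(X, y)
scores = [predict(w, b, xi) for xi in X]
```

The AUC reported in the study is then the probability that a suspicious event's score outranks an appropriate event's score.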
Flexibility evaluation of multiechelon supply chains.
Almeida, João Flávio de Freitas; Conceição, Samuel Vieira; Pinto, Luiz Ricardo; de Camargo, Ricardo Saraiva; Júnior, Gilberto de Miranda
2018-01-01
Multiechelon supply chains are complex logistics systems that require flexibility and coordination at a tactical level to cope with environmental uncertainties in an efficient and effective manner. To cope with these challenges, mathematical programming models are developed to evaluate supply chain flexibility. However, under uncertainty, supply chain models become complex and the scope of flexibility analysis is generally reduced. This paper presents a unified approach that can evaluate the flexibility of a four-echelon supply chain via a robust stochastic programming model. The model simultaneously considers the plans of multiple business divisions such as marketing, logistics, manufacturing, and procurement, whose goals are often conflicting. A numerical example with deterministic parameters is presented to introduce the analysis, and then, the model stochastic parameters are considered to evaluate flexibility. The results of the analysis on supply, manufacturing, and distribution flexibility are presented. Tradeoff analysis of demand variability and service levels is also carried out. The proposed approach facilitates the adoption of different management styles, thus improving supply chain resilience. The model can be extended to contexts pertaining to supply chain disruptions; for example, the model can be used to explore operation strategies when subtle events disrupt supply, manufacturing, or distribution.
Evaluation of the Tropical Pacific Observing System from the Data Assimilation Perspective
2014-01-01
hereafter, SIDA systems) have the capacity to assimilate salinity profiles imposing a multivariate (mainly T-S) balance relationship (summarized in … Fujii et al., 2011). Current SIDA systems in operational centers generally use Ocean General Circulation Models (OGCM) with resolution typically 1 … long-term (typically 20-30 years) ocean DA runs are often performed with SIDA systems in operational centers for validation and calibration of SI …
ERIC Educational Resources Information Center
Deri, Melissa A.; Mills, Pamela; McGregor, Donna
2018-01-01
A flipped classroom is one where students are first introduced to content outside of the classroom. This frees up class time for more active learning strategies and has been shown to enhance student learning in high school and college classrooms. However, many studies in General Chemistry, a college gateway science course, were conducted in small…
ERIC Educational Resources Information Center
Wallace, Colin S.; Prather, Edward E.; Duncan, Douglas K.
2012-01-01
This is the third of five papers detailing our national study of general education astronomy students' conceptual and reasoning difficulties with cosmology. In this paper, we use item response theory to analyze students' responses to three out of the four conceptual cosmology surveys we developed. The specific item response theory model we use is…
On double shearing in frictional materials
NASA Astrophysics Data System (ADS)
Teunissen, J. A. M.
2007-01-01
This paper evaluates the mechanical behaviour of yielding frictional geomaterials, described by the general Double Shearing model. Non-coaxiality of stress and plastic strain increments under plane strain conditions forms an important part of this model. The model is based on a micro-mechanical and a macro-mechanical formulation; the stress-dilatancy theory in the model combines the mechanical behaviour on both scales. It is shown that the general Double Shearing formulation comprises other Double Shearing models, which differ in the relation between mobilized friction and dilatancy and in non-coaxiality. In order to describe reversible and irreversible deformations, the general Double Shearing model is extended with elasticity. The failure of soil masses is controlled by shear mechanisms, which are determined by the conditions along the shear band. The shear stress ratio of a shear band depends on the orientation of the stress in the shear band. There is a difference between the peak strength and the residual strength in the shear band: while the peak strength depends on strength properties only, the residual strength depends upon the yield conditions and the plastic deformation mechanisms and is generally considerably lower than the maximum strength. It is shown that non-coaxial models give non-unique solutions for the shear stress ratio on the shear band. The Double Shearing model is applied to various failure problems of soils, such as the direct simple shear test, the biaxial test, infinite slopes, interfaces, and the calculation of the undrained shear strength.
The use of vestibular models for design and evaluation of flight simulator motion
NASA Technical Reports Server (NTRS)
Bussolari, Steven R.; Young, Laurence R.; Lee, Alfred T.
1989-01-01
Quantitative models for the dynamics of the human vestibular system are applied to the design and evaluation of flight simulator platform motion. An optimal simulator motion control algorithm is generated to minimize the vector difference between perceived spatial orientation estimated in flight and in simulation. The motion controller has been implemented on the Vertical Motion Simulator at NASA Ames Research Center and evaluated experimentally through measurement of pilot performance and subjective rating during VTOL aircraft simulation. In general, pilot performance in a longitudinal tracking task (formation flight) did not appear to be sensitive to variations in platform motion condition as long as motion was present. However, pilot assessments of motion fidelity, made with a rating scale designed for this purpose, were sensitive to motion controller design. Platform motion generated with the optimal motion controller was found to be generally equivalent to that generated by conventional linear crossfeed washout. The vestibular models are used to evaluate the motion fidelity of transport category aircraft (Boeing 727) simulation in a pilot performance and simulator acceptability study at the Man-Vehicle Systems Research Facility at NASA Ames Research Center. Eighteen airline pilots, currently flying B-727, were given a series of flight scenarios in the simulator under various conditions of simulator motion. The scenarios were chosen to reflect the flight maneuvers that these pilots might expect to be given during a routine pilot proficiency check. Pilot performance and subjective ratings of simulator fidelity were relatively insensitive to the motion condition, despite large differences in the amplitude of motion provided. This lack of sensitivity may be explained by means of the vestibular models, which predict little difference in the modeled motion sensations of the pilots when different motion conditions are imposed.
Evaluating CMIP5 Simulations of Historical Continental Climate with Koeppen Bioclimatic Metrics
NASA Astrophysics Data System (ADS)
Phillips, T. J.; Bonfils, C.
2013-12-01
The classic Koeppen bioclimatic classification scheme associates generic vegetation types (e.g. grassland, tundra, broadleaf or evergreen forests, etc.) with regional climate zones defined by their annual cycles of continental temperature (T) and precipitation (P), considered together. The locations or areas of Koeppen vegetation types derived from observational data thus can provide concise metrical standards for simultaneously evaluating climate simulations of T and P in naturally defined regions. The CMIP5 models' collective ability to correctly represent two variables that are critically important for living organisms at regional scales is therefore central to this evaluation. For this study, 14 Koeppen vegetation types are derived from annual-cycle climatologies of T and P in some 3 dozen CMIP5 simulations of the 1980-1999 period. Metrics for evaluating the ability of the CMIP5 models to simulate the correct locations and areas of each vegetation type, as well as measures of overall model performance, also are developed. It is found that the CMIP5 models are generally most deficient in simulating: 1) climates of drier Koeppen zones (e.g. desert, savanna, grassland, steppe vegetation types) located in the southwestern U.S. and Mexico, eastern Europe, southern Africa, and central Australia; 2) climates of regions such as central Asia and western South America where topography plays a key role. Details of regional T or P biases in selected simulations that exemplify general model performance problems also will be presented. Acknowledgments: This work was funded by the U.S. Department of Energy Office of Science and was performed at the Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344. Map of Koeppen vegetation types derived from observed T and P.
Multi-objective optimization for generating a weighted multi-model ensemble
NASA Astrophysics Data System (ADS)
Lee, H.
2017-12-01
Many studies have demonstrated that multi-model ensembles generally show better skill than each ensemble member. When generating weighted multi-model ensembles, the first step is measuring the performance of individual model simulations using observations. There is a consensus on the assignment of weighting factors based on a single evaluation metric. When considering only one evaluation metric, the weighting factor for each model is proportional to a performance score or inversely proportional to an error for the model. While this conventional approach can provide appropriate combinations of multiple models, the approach confronts a big challenge when there are multiple metrics under consideration. When considering multiple evaluation metrics, it is obvious that a simple averaging of multiple performance scores or model ranks does not address the trade-off problem between conflicting metrics. So far, there seems to be no best method to generate weighted multi-model ensembles based on multiple performance metrics. The current study applies the multi-objective optimization, a mathematical process that provides a set of optimal trade-off solutions based on a range of evaluation metrics, to combining multiple performance metrics for the global climate models and their dynamically downscaled regional climate simulations over North America and generating a weighted multi-model ensemble. NASA satellite data and the Regional Climate Model Evaluation System (RCMES) software toolkit are used for assessment of the climate simulations. Overall, the performance of each model differs markedly with strong seasonal dependence. Because of the considerable variability across the climate simulations, it is important to evaluate models systematically and make future projections by assigning optimized weighting factors to the models with relatively good performance. 
Our results indicate that the optimally weighted multi-model ensemble always shows better performance than an arithmetic ensemble mean and may provide reliable future projections.
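The abstract does not give the optimization algorithm, but the core trade-off idea (keep only models that no other model beats on every metric) can be sketched as a Pareto, i.e. nondominated, filter. The inverse-error weighting below is an illustrative assumption, not the study's scheme:

```python
def dominates(a, b):
    # True if error vector a is no worse than b on every metric and strictly
    # better on at least one (all metrics are errors: lower is better).
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(errors):
    # Indices of models not dominated by any other model.
    return [i for i, a in enumerate(errors)
            if not any(dominates(b, a) for j, b in enumerate(errors) if j != i)]

def ensemble_weights(errors):
    # Toy weighting: zero weight off the front; on the front, weight
    # inversely proportional to the summed per-metric error, renormalized.
    front = set(pareto_front(errors))
    raw = [1.0 / sum(e) if i in front else 0.0 for i, e in enumerate(errors)]
    total = sum(raw)
    return [r / total for r in raw]

# Model 2 is dominated (worse on both metrics than model 3) and gets zero weight.
errs = [(1.0, 2.0), (2.0, 1.0), (3.0, 3.0), (1.5, 1.5)]
print(ensemble_weights(errs))
```

With two conflicting metrics, no simple average of scores is needed: the dominated model drops out, and the remaining weights express the trade-off directly.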
Geoengineering as a design problem
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kravitz, Ben; MacMartin, Douglas G.; Wang, Hailong
2016-01-01
Understanding the climate impacts of solar geoengineering is essential for evaluating its benefits and risks. Most previous simulations have prescribed a particular strategy and evaluated its modeled effects. Here we turn this approach around by first choosing example climate objectives and then designing a strategy to meet those objectives in climate models. There are four essential criteria for designing a strategy: (i) an explicit specification of the objectives, (ii) defining what climate forcing agents to modify so the objectives are met, (iii) a method for managing uncertainties, and (iv) independent verification of the strategy in an evaluation model. We demonstrate this design perspective through two multi-objective examples. First, changes in Arctic temperature and the position of tropical precipitation due to CO2 increases are offset by adjusting high-latitude insolation in each hemisphere independently. Second, three different latitude-dependent patterns of insolation are modified to offset CO2-induced changes in global mean temperature, interhemispheric temperature asymmetry, and the Equator-to-pole temperature gradient. In both examples, the "design" and "evaluation" models are state-of-the-art fully coupled atmosphere–ocean general circulation models.
NASA Astrophysics Data System (ADS)
Cuntz, Matthias; Mai, Juliane; Zink, Matthias; Thober, Stephan; Kumar, Rohini; Schäfer, David; Schrön, Martin; Craven, John; Rakovec, Oldrich; Spieler, Diana; Prykhodko, Vladyslav; Dalmasso, Giovanni; Musuuza, Jude; Langenberg, Ben; Attinger, Sabine; Samaniego, Luis
2015-08-01
Environmental models tend to require increasing computational time and resources as physical process descriptions are improved or new descriptions are incorporated. Many-query applications such as sensitivity analysis or model calibration usually require a large number of model evaluations leading to high computational demand. This often limits the feasibility of rigorous analyses. Here we present a fully automated sequential screening method that selects only informative parameters for a given model output. The method requires a number of model evaluations that is approximately 10 times the number of model parameters. It was tested using the mesoscale hydrologic model mHM in three hydrologically unique European river catchments. It identified around 20 informative parameters out of 52, with different informative parameters in each catchment. The screening method was evaluated with subsequent analyses using all 52 as well as only the informative parameters. Subsequent Sobol's global sensitivity analysis led to almost identical results yet required 40% fewer model evaluations after screening. mHM was calibrated with all and with only informative parameters in the three catchments. Model performances for daily discharge were equally high in both cases with Nash-Sutcliffe efficiencies above 0.82. Calibration using only the informative parameters needed just one third of the number of model evaluations. The universality of the sequential screening method was demonstrated using several general test functions from the literature. We therefore recommend the use of the computationally inexpensive sequential screening method prior to rigorous analyses on complex environmental models.
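The screening method itself is not detailed in the abstract; as a stand-in with the same evaluation budget (roughly 10 times the number of parameters), a Morris-style elementary-effects screen illustrates how informative parameters can be separated from inert ones. The toy model and threshold below are assumptions for demonstration only:

```python
import random

def elementary_effects(model, p, r=10, delta=0.1, seed=1):
    # Morris-style screening: for r random base points in [0,1]^p, perturb
    # each parameter once and record the absolute change in model output.
    # Cost: r * (p + 1) model evaluations, roughly 10x the parameter count.
    rng = random.Random(seed)
    mu_star = [0.0] * p
    for _ in range(r):
        x = [rng.uniform(0.0, 1.0 - delta) for _ in range(p)]
        y0 = model(x)
        for i in range(p):
            xi = list(x)
            xi[i] += delta
            mu_star[i] += abs(model(xi) - y0) / delta
    return [m / r for m in mu_star]  # mean absolute elementary effect

# Toy model: parameters 0 and 1 are informative, parameter 2 is nearly inert.
f = lambda x: 10.0 * x[0] + 5.0 * x[1] ** 2 + 0.01 * x[2]
effects = elementary_effects(f, p=3)
informative = [i for i, m in enumerate(effects) if m > 0.1 * max(effects)]
print(informative)  # [0, 1]
```

Subsequent calibration or Sobol' analysis would then vary only the parameters that survive the screen, which is the source of the evaluation savings reported above.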
Reliability models: the influence of model specification in generation expansion planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stremel, J.P.
1982-10-01
This paper is a critical evaluation of reliability methods used for generation expansion planning. It is shown that the methods for treating uncertainty are critical for determining the relative reliability value of expansion alternatives. It is also shown that the specification of the reliability model will not favor all expansion options equally. Consequently, the model is biased. In addition, reliability models should be augmented with an economic value of reliability (such as the cost of emergency procedures or energy not served). Generation expansion evaluations which ignore the economic value of excess reliability can be shown to be inconsistent. The conclusions are that, in general, a reliability model simplifies generation expansion planning evaluations. However, for a thorough analysis, the expansion options should be reviewed for candidates which may be unduly rejected because of the bias of the reliability model. And this implies that for a consistent formulation in an optimization framework, the reliability model should be replaced with a full economic optimization which includes the costs of emergency procedures and interruptions in the objective function.
Comparison of in silico models for prediction of mutagenicity.
Bakhtyari, Nazanin G; Raitano, Giuseppa; Benfenati, Emilio; Martin, Todd; Young, Douglas
2013-01-01
Using a dataset with more than 6000 compounds, the performance of eight quantitative structure-activity relationship (QSAR) models was evaluated: ACD/Tox Suite; Absorption, Distribution, Metabolism, Elimination, and Toxicity of chemical substances (ADMET) Predictor; Derek; Toxicity Estimation Software Tool (T.E.S.T.); TOxicity Prediction by Komputer Assisted Technology (TOPKAT); Toxtree; CAESAR; and SARpy (SAR in python). In general, the results showed a high level of performance. To obtain a realistic estimate of the predictive ability, the results for chemicals inside and outside the training set of each model were considered. The effect of applicability domain tools (when available) on the prediction accuracy was also evaluated. The predictive tools included QSAR models, knowledge-based systems, and a combination of both methods. Models based on statistical QSAR methods gave better results.
NASA Astrophysics Data System (ADS)
Dambreville, Frédéric
2013-10-01
While there is a variety of approaches and algorithms for optimizing the mission of an unmanned moving sensor, far fewer works deal with the implementation of several sensors within a human organization. In this case, the management of the sensors is done through at least one human decision layer, and sensor management as a whole arises as a bi-level optimization process. In this work, the following hypotheses are considered realistic: sensor handlers at the first level plan their sensors by means of elaborate algorithmic tools based on accurate modelling of the environment; the higher level plans the handled sensors according to a global observation mission and on the basis of an approximated model of the environment and of the first-level sub-processes. This problem is formalized very generally as the maximization of an unknown function, defined a priori by sampling a known random function (the law of model error). In this case, each actual evaluation of the function increases the knowledge about the function, and subsequently the efficiency of the maximization. The issue is to optimize the sequence of values to be evaluated, with regard to the evaluation costs. There is a fundamental link here with the domain of experimental design. Jones, Schonlau and Welch proposed a general method, Efficient Global Optimization (EGO), for solving this problem in the case of an additive functional Gaussian law. In our work, a generalization of EGO is proposed, based on a rare-event simulation approach. It is applied to the aforementioned bi-level sensor planning problem.
Evaluation of a black-footed ferret resource utilization function model
Eads, D.A.; Millspaugh, J.J.; Biggins, D.E.; Jachowski, D.S.; Livieri, T.M.
2011-01-01
Resource utilization function (RUF) models permit evaluation of potential habitat for endangered species; ideally such models should be evaluated before use in management decision-making. We evaluated the predictive capabilities of a previously developed black-footed ferret (Mustela nigripes) RUF. Using the population-level RUF, generated from ferret observations at an adjacent yet distinct colony, we predicted the distribution of ferrets within a black-tailed prairie dog (Cynomys ludovicianus) colony in the Conata Basin, South Dakota, USA. We evaluated model performance, using data collected during post-breeding spotlight surveys (2007-2008) by assessing model agreement via weighted compositional analysis and count-metrics. Compositional analysis of home range use and colony-level availability, and core area use and home range availability, demonstrated ferret selection of the predicted Very high and High occurrence categories in 2007 and 2008. Simple count-metrics corroborated these findings and suggested selection of the Very high category in 2007 and the Very high and High categories in 2008. Collectively, these results suggested that the RUF was useful in predicting occurrence and intensity of space use of ferrets at our study site, the 2 objectives of the RUF. Application of this validated RUF would increase the resolution of habitat evaluations, permitting prediction of the distribution of ferrets within distinct colonies. Additional model evaluation at other sites, on other black-tailed prairie dog colonies of varying resource configuration and size, would increase understanding of influences upon model performance and the general utility of the RUF. © 2011 The Wildlife Society.
ARM - Midlatitude Continental Convective Clouds
Jensen, Mike; Bartholomew, Mary Jane; Genio, Anthony Del; Giangrande, Scott; Kollias, Pavlos
2012-01-19
Convective processes play a critical role in the Earth's energy balance through the redistribution of heat and moisture in the atmosphere and their link to the hydrological cycle. Accurate representation of convective processes in numerical models is vital towards improving current and future simulations of Earth's climate system. Despite improvements in computing power, current operational weather and global climate models are unable to resolve the natural temporal and spatial scales important to convective processes and therefore must turn to parameterization schemes to represent these processes. In turn, parameterization schemes in cloud-resolving models need to be evaluated for their generality and application to a variety of atmospheric conditions. Data from field campaigns with appropriate forcing descriptors have been traditionally used by modelers for evaluating and improving parameterization schemes.
ARM - Midlatitude Continental Convective Clouds (comstock-hvps)
Jensen, Mike; Comstock, Jennifer; Genio, Anthony Del; Giangrande, Scott; Kollias, Pavlos
2012-01-06
Convective processes play a critical role in the Earth's energy balance through the redistribution of heat and moisture in the atmosphere and their link to the hydrological cycle. Accurate representation of convective processes in numerical models is vital towards improving current and future simulations of Earth's climate system. Despite improvements in computing power, current operational weather and global climate models are unable to resolve the natural temporal and spatial scales important to convective processes and therefore must turn to parameterization schemes to represent these processes. In turn, parameterization schemes in cloud-resolving models need to be evaluated for their generality and application to a variety of atmospheric conditions. Data from field campaigns with appropriate forcing descriptors have been traditionally used by modelers for evaluating and improving parameterization schemes.
Construction of road network vulnerability evaluation index based on general travel cost
NASA Astrophysics Data System (ADS)
Leng, Jun-qiang; Zhai, Jing; Li, Qian-wen; Zhao, Lin
2018-03-01
With the development of China's economy and the continuous improvement of its urban road networks, the vulnerability of the urban road network has attracted increasing attention. Based on general travel cost, this work constructs a vulnerability evaluation index for the urban road network and evaluates vulnerability from the perspective of the user's generalised travel cost. Firstly, the generalised travel cost model is constructed from vehicle cost, travel time, and traveller comfort. Then, the network efficiency index is selected as the evaluation index of vulnerability: it is composed of the traffic volume and the generalised travel cost, both obtained from the equilibrium state of the network. In addition, the research analyses the influence of reduced traffic capacity, road-section attribute values, and road-section location on vulnerability. Finally, the vulnerability index is used to analyse a local area network of Harbin and verify its applicability.
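The abstract names the ingredients of the generalised cost (vehicle cost, travel time, comfort) and of the efficiency index (volume and generalised cost) but not their exact forms. The linear cost and the demand-weighted inverse-cost index below are assumptions in that spirit, not the paper's formulas:

```python
def generalized_cost(vehicle_cost, travel_time, comfort_penalty,
                     value_of_time=1.0, comfort_weight=1.0):
    # Assumed linear form: money cost + monetized time + monetized discomfort.
    return vehicle_cost + value_of_time * travel_time + comfort_weight * comfort_penalty

def network_efficiency(flows_costs):
    # Assumed index: demand-weighted inverse generalized cost over OD pairs,
    # in the spirit of Latora-Marchiori efficiency. Higher is better.
    total_flow = sum(q for q, _ in flows_costs)
    return sum(q / c for q, c in flows_costs) / total_flow

# Vulnerability of a disruption = relative drop in network efficiency when one
# link's travel time and discomfort rise (hypothetical numbers).
before = [(100, generalized_cost(2.0, 10.0, 1.0)), (50, generalized_cost(1.0, 5.0, 0.5))]
after  = [(100, generalized_cost(2.0, 16.0, 2.0)), (50, generalized_cost(1.0, 5.0, 0.5))]
vulnerability = (network_efficiency(before) - network_efficiency(after)) / network_efficiency(before)
print(round(vulnerability, 3))  # 0.175
```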
Estimating Model Probabilities using Thermodynamic Markov Chain Monte Carlo Methods
NASA Astrophysics Data System (ADS)
Ye, M.; Liu, P.; Beerli, P.; Lu, D.; Hill, M. C.
2014-12-01
Markov chain Monte Carlo (MCMC) methods are widely used to evaluate model probability for quantifying model uncertainty. In a general procedure, MCMC simulations are first conducted for each individual model, and the MCMC parameter samples are then used to approximate the marginal likelihood of the model by calculating the geometric mean of the joint likelihood of the model and its parameters. It has been found that this geometric-mean estimator suffers from a low convergence rate. A simple test case shows that even millions of MCMC samples are insufficient to yield an accurate estimate of the marginal likelihood. To resolve this problem, a thermodynamic method is used that performs multiple MCMC runs with different values of a heating coefficient between zero and one. When the heating coefficient is zero, the MCMC run is equivalent to a random walk MC in the prior parameter space; when the heating coefficient is one, the MCMC run is the conventional one. For a simple case with an analytical form of the marginal likelihood, the thermodynamic method yields a more accurate estimate than the geometric-mean method. This is also demonstrated for a case of groundwater modeling in which four alternative models are postulated based on different conceptualizations of a confining layer. This groundwater example shows that model probabilities estimated using the thermodynamic method are more reasonable than those obtained using the geometric method. The thermodynamic method is general and can be used for a wide range of environmental problems for model uncertainty quantification.
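The thermodynamic (power-posterior) idea can be reproduced on a toy conjugate-Gaussian problem where the marginal likelihood is known in closed form, so the estimate can be checked. The Metropolis sampler, temperature grid, and trapezoid rule below are one minimal implementation, not the authors' groundwater setup:

```python
import math, random

# Toy model: one datum y ~ N(theta, 1) with prior theta ~ N(0, 1).
# The exact marginal likelihood is N(y; 0, 2), available for checking.
Y = 1.0
LOG_2PI = math.log(2.0 * math.pi)

def log_prior(t):
    return -0.5 * LOG_2PI - 0.5 * t * t

def log_lik(t):
    return -0.5 * LOG_2PI - 0.5 * (Y - t) ** 2

def mcmc_mean_loglik(beta, n=4000, burn=500, step=1.0, seed=0):
    # Metropolis sampling from the power posterior p(t) proportional to
    # prior(t) * lik(t)**beta, returning the chain average of log-likelihood.
    rng = random.Random(seed + int(1000 * beta))
    t = 0.0
    logp = log_prior(t) + beta * log_lik(t)
    total = 0.0
    for i in range(n + burn):
        prop = t + rng.gauss(0.0, step)
        logp_prop = log_prior(prop) + beta * log_lik(prop)
        if math.log(rng.random() + 1e-300) < logp_prop - logp:
            t, logp = prop, logp_prop
        if i >= burn:
            total += log_lik(t)
    return total / n

def thermodynamic_log_evidence(n_temps=21):
    # Thermodynamic integration: log Z = integral over beta in [0, 1] of
    # E_beta[log lik], approximated by the trapezoid rule over the grid.
    betas = [i / (n_temps - 1) for i in range(n_temps)]
    means = [mcmc_mean_loglik(b) for b in betas]
    return sum(0.5 * (means[i] + means[i + 1]) * (betas[i + 1] - betas[i])
               for i in range(n_temps - 1))

exact = -0.5 * math.log(2.0 * math.pi * 2.0) - Y * Y / 4.0
print(round(exact, 3), round(thermodynamic_log_evidence(), 3))
```

At beta = 0 the chain explores the prior; at beta = 1 it is the ordinary posterior chain, matching the abstract's description of the heating coefficient.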
The Associations of Biculturalism to Prosocial Tendencies and Positive Self Evaluations.
Carlo, Gustavo; Basilio, Camille D; Knight, George P
2016-11-01
Although some research exists on biculturalism and negative adjustment, few studies have examined the mechanisms that account for the positive correlates of biculturalism in U.S. Latino youth. Two competing reverse causal models were tested. Specifically, we examined how biculturalism among 574 U.S. Mexican adolescents (n = 296 girls; M = 17.84 years, SD = 0.46 years) was related to prosocial tendencies and positive self evaluation (i.e., self-esteem and general self-efficacy). The findings yielded supportive evidence for both reverse causal models suggesting that prosocial tendencies may mediate the relations between biculturalism and positive self evaluations, and that positive self evaluations may mediate the relations between biculturalism and prosocial tendencies. The implications of the role of biculturalism for understanding prosocial development and positive self evaluations in U.S. Mexican youth are discussed.
Evaluation of Planetary Boundary Layer Scheme Sensitivities for the Purpose of Parameter Estimation
Meteorological model errors caused by imperfect parameterizations generally cannot be overcome simply by optimizing initial and boundary conditions. However, advanced data assimilation methods are capable of extracting significant information about parameterization behavior from ...
NREL and General Motors Announce R&D Partnership to Reduce Cost of
modeling studies to a multiyear National Fuel Cell Electric Vehicle Learning Demonstration. Data from the Learning Demonstration were sent to the National Fuel Cell Technology Evaluation Center (NFCTEC), a data
Modeling of Turbulent Swirling Flows
NASA Technical Reports Server (NTRS)
Shih, Tsan-Hsing; Zhu, Jiang; Liou, William; Chen, Kuo-Huey; Liu, Nan-Suey; Lumley, John L.
1997-01-01
Aircraft engine combustors generally involve turbulent swirling flows in order to enhance fuel-air mixing and flame stabilization. It has long been recognized that eddy viscosity turbulence models are unable to model swirling flows appropriately. It has therefore been suggested that, for these flows, a second-order closure scheme be considered because of its ability to model rotational and curvature effects. However, such a scheme requires the solution of many complicated second-moment transport equations (six Reynolds stresses plus other scalar fluxes and variances), which is a difficult task for any CFD implementation, and it requires a large amount of computer resources for a general combustor swirling flow. This report is devoted to the development of a cubic Reynolds stress-strain model for turbulent swirling flows, inspired by the work of Launder's group at UMIST. With this type of model, one only needs to solve two turbulence equations, one for the turbulent kinetic energy k and the other for the dissipation rate epsilon. The cubic model developed in this report is based on a general Reynolds stress-strain relationship. Two flows have been chosen for model evaluation: one is a fully developed rotating pipe flow, and the other is a more complex flow with swirl and recirculation.
Parametrically Guided Generalized Additive Models with Application to Mergers and Acquisitions Data
Fan, Jianqing; Maity, Arnab; Wang, Yihui; Wu, Yichao
2012-01-01
Generalized nonparametric additive models present a flexible way to evaluate the effects of several covariates on a general outcome of interest via a link function. In this modeling framework, one assumes that the effect of each of the covariates is nonparametric and additive. However, in practice, there is often prior information available about the shape of the regression functions, possibly from pilot studies or exploratory analysis. In this paper, we consider such situations and propose an estimation procedure where the prior information is used as a parametric guide to fit the additive model. Specifically, we first posit a parametric family for each of the regression functions using the prior information (parametric guides). After removing these parametric trends, we then estimate the remainder of the nonparametric functions using a nonparametric generalized additive model, and form the final estimates by adding back the parametric trend. We investigate the asymptotic properties of the estimates and show that when a good guide is chosen, the asymptotic variance of the estimates can be reduced significantly while the asymptotic bias remains the same as that of the unguided estimator. We assess the performance of our method via a simulation study and demonstrate it by application to a real data set on mergers and acquisitions. PMID:23645976
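The guide-then-smooth procedure described above can be sketched for a single covariate. The quadratic guide, the Nadaraya-Watson kernel smoother, and all settings below are illustrative assumptions for a minimal demonstration, not the paper's actual estimator:

```python
import numpy as np

def kernel_smooth(x_train, y_train, x_eval, bandwidth):
    """Nadaraya-Watson kernel regression with a Gaussian kernel."""
    w = np.exp(-0.5 * ((x_eval[:, None] - x_train[None, :]) / bandwidth) ** 2)
    return (w @ y_train) / w.sum(axis=1)

def guided_fit(x, y, x_eval, bandwidth=0.3):
    """Fit a quadratic parametric guide, smooth the residuals
    nonparametrically, then add the parametric trend back."""
    coeffs = np.polyfit(x, y, deg=2)          # step 1: parametric guide
    guide = np.polyval(coeffs, x)
    resid_trend = kernel_smooth(x, y - guide, x_eval, bandwidth)  # step 2
    return np.polyval(coeffs, x_eval) + resid_trend               # step 3

rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, 400)
y = x**2 + 0.3 * np.sin(3 * x) + rng.normal(0, 0.2, x.size)  # quadratic + wiggle
x_grid = np.linspace(-1.5, 1.5, 50)
fit = guided_fit(x, y, x_grid)
truth = x_grid**2 + 0.3 * np.sin(3 * x_grid)
rmse_val = float(np.sqrt(np.mean((fit - truth) ** 2)))
```

Because the quadratic guide absorbs most of the signal, the kernel smoother only has to recover the small residual trend, which is the intuition behind the variance reduction claimed in the abstract.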
NASA Astrophysics Data System (ADS)
Baroni, G.; Gräff, T.; Reinstorf, F.; Oswald, S. E.
2012-04-01
Nowadays, uncertainty and sensitivity analysis are considered basic tools for the assessment of hydrological models and the evaluation of the most important sources of uncertainty. In this context, several methods have been developed and applied in different hydrological conditions over recent decades. In most cases, however, studies have investigated mainly the influence of parameter uncertainty on the simulated outputs, and few approaches have also considered other sources of uncertainty, i.e., input and model structure. Moreover, several constraints arise when spatially distributed parameters are involved. To overcome these limitations, a general probabilistic framework based on Monte Carlo simulations and the Sobol method has been proposed. In this study, the general probabilistic framework was applied at field scale using a 1D physically based hydrological model (SWAP). Furthermore, the framework was extended to catchment scale in combination with a spatially distributed hydrological model (SHETRAN). The models are applied at two experimental sites in Germany: a relatively flat cropped field close to Potsdam (Brandenburg) and a small mountainous catchment with agricultural land use (Schaefertal, Harz Mountains). In both cases, input and parameters are considered the major sources of uncertainty. Evaluation of the models was based on soil moisture measured at plot scale at different depths and, for the catchment site, also on daily discharge values. The study shows how the framework can take into account the various sources of uncertainty, i.e., input data, parameters (either in scalar or spatially distributed form) and model structures. The framework can be used in a loop to optimize further monitoring activities aimed at improving the performance of the model. In these particular applications, the results show how the sources of uncertainty are specific to each process considered. The influence of the input data, as well as the presence of compensating errors, becomes clear from the different processes simulated.
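The variance-based (Sobol) core of such a framework can be sketched compactly with plain Monte Carlo. The two-input model below is a hypothetical toy, chosen so the first-order indices have known analytic values (0.2 and 0.8); the estimator is the standard Saltelli-type formula, not the study's actual setup:

```python
import numpy as np

def model(x):
    # toy response, linear in two uncertain inputs: Y = X1 + 2*X2
    return x[:, 0] + 2.0 * x[:, 1]

rng = np.random.default_rng(42)
n, d = 200_000, 2
A = rng.uniform(0, 1, (n, d))          # two independent sample matrices
B = rng.uniform(0, 1, (n, d))
fA, fB = model(A), model(B)
var = np.var(np.concatenate([fA, fB]))

S = np.empty(d)
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                # A with column i taken from B
    # Saltelli-type first-order estimator: V_i = E[f(B) * (f(A_B^i) - f(A))]
    S[i] = np.mean(fB * (model(ABi) - fA)) / var
```

For Y = X1 + 2*X2 with iid U(0,1) inputs, Var(Y) = 5/12 and the first-order indices are S1 = 1/5 and S2 = 4/5, so the estimates can be checked against closed-form values.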
Ichinokawa, Momoko; Okamura, Hiroshi; Watanabe, Chikako; Kawabata, Atsushi; Oozeki, Yoshioki
2015-09-01
Restricting human access to a specific wildlife species, community, or ecosystem, i.e., input control, is one of the most popular tools to control human impacts for natural resource management and wildlife conservation. However, quantitative evaluations of input control are generally difficult, because it is unclear how much human impacts can actually be reduced by the control. We present a model framework to quantify the effectiveness of input control using day closures to reduce actual fishing impact by considering the observed fishery dynamics. The model framework was applied to the management of the Pacific stock of the chub mackerel (Scomber japonicus) fishery, in which fishing was suspended for one day following any day when the total mackerel catch exceeded a threshold level. We evaluated the management measure according to the following steps: (1) we fitted the daily observed catch and fishing effort data to a generalized linear model (GLM) or generalized autoregressive state-space model (GASSM), (2) we conducted population dynamics simulations based on annual catches randomly generated from the parameters estimated in the first step, (3) we quantified the effectiveness of day closures by comparing the results of two simulation scenarios with and without day closures, and (4) we conducted additional simulations based on different sets of explanatory variables and statistical models (sensitivity analysis). In the first step, we found that the GASSM explained the observed data far better than the simple GLM. The model parameterized with the estimates from the GASSM demonstrated that the day closures implemented from 2004 to 2009 would have decreased exploitation fractions by ~10% every year and increased the 2009 stock biomass by 37-46% (median), relative to the values without day closures. 
The sensitivity analysis revealed that the effectiveness of day closures was particularly influenced by autoregressive processes in the fishery data and by positive relationships between fishing effort and total biomass. Those results indicated the importance of human behavioral dynamics under input control in quantifying the conservation benefit of natural resource management and the applicability of our model framework to the evaluation of the input controls that are actually implemented.
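The closure rule evaluated above (suspend fishing for one day after any day whose catch exceeds a threshold) can be illustrated with a deliberately simplified simulation. The catch distribution, threshold, and season length below are invented for illustration; this is not the paper's GLM/GASSM machinery:

```python
import numpy as np

rng = np.random.default_rng(7)
n_days, threshold = 240, 1.5          # season length; closure trigger (arbitrary units)
daily_catch = rng.lognormal(mean=0.0, sigma=0.6, size=n_days)

# Scenario 1: no input control -- every potential daily catch is landed.
total_open = daily_catch.sum()

# Scenario 2: day closures -- a day is skipped whenever the previous
# day's landed catch exceeded the threshold.
total_closed, closed_days, prev = 0.0, 0, 0.0
for c in daily_catch:
    if prev > threshold:
        closed_days += 1              # fleet stays in port; nothing landed
        prev = 0.0
    else:
        total_closed += c
        prev = c

reduction = 1.0 - total_closed / total_open   # fraction of catch forgone
```

Comparing the two scenarios on the same random catch sequence isolates the effect of the rule itself, which is the same with/without comparison the paper makes at the level of population dynamics.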
Xue, Jianping; Zartarian, Valerie; Tornero-Velez, Rogelio; Tulve, Nicolle S
2014-12-01
The U.S. EPA's SHEDS-Multimedia model was applied to enhance the understanding of children's exposures and doses to multiple pyrethroid pesticides, including major contributing chemicals and pathways. This paper presents combined dietary and residential exposure estimates and cumulative doses for seven commonly used pyrethroids, and comparisons of model evaluation results with NHANES biomarker data for the 3-PBA and DCCA metabolites. Model input distributions were fit to publicly available pesticide usage survey data, NHANES, and other studies; SHEDS-Multimedia was then applied to estimate total pyrethroid exposures and doses for 3- to 5-year-olds in one-year variability simulations. For dose estimation we used a pharmacokinetic model and two approaches for simulating dermal absorption. SHEDS-Multimedia predictions compared well with NHANES biomarker data: ratios of 3-PBA observed data to SHEDS-Multimedia modeled results were 0.88, 0.51, 0.54 and 1.02 for the mean, median, 95th, and 99th percentiles, respectively; for DCCA, the ratios were 0.82, 0.53, 0.56, and 0.94. The modeled time-averaged cumulative absorbed dose of the seven pyrethroids was 3.1 nmol/day (versus 8.4 nmol/day for adults) in the general population (residential pyrethroid use and non-use homes) and 6.7 nmol/day (versus 10.5 nmol/day for adults) in the simulated residential pyrethroid use population. For the general population, contributions to modeled cumulative dose by chemical were permethrin (60%), cypermethrin (22%), and cyfluthrin (16%); for residential use homes, contributions were cypermethrin (49%), permethrin (29%), and cyfluthrin (17%). The primary exposure route for 3- to 5-year-olds in the simulated residential use population was non-dietary ingestion, whereas for the simulated general population dietary exposure was the primary exposure route.
Below the 95th percentile, the major exposure pathway was dietary for the general population; non-dietary ingestion was the major pathway starting below the 70th percentile for the residential use population. The new dermal absorption methodology considering surface loading had some impact, but did not change the order of key pathways. Published by Elsevier Ltd.
Emami, Hassan; Radfar, Reza
2017-01-01
The current situation in Iran suggests an appropriate basis for developing biotechnology industries, because the patents for the majority of hi-tech medicines registered in developed countries are ending. Biosimilar and technology-oriented companies which do not have patents will have the opportunity to enter the biosimilar market and move toward innovative initiatives. The present research proposed a model by which one can evaluate commercialization of achievements obtained from research, with a focus on the pharmaceutical biotechnology industry. This is a descriptive-analytic study in which mixed methodology is followed by a heuristic approach. The statistical population was pharmaceutical biotechnology experts at universities and research centers in Iran. Structural equations were employed in this research. The results indicate that there are three effective layers within commercialization in the proposed model: a general layer (factors associated with management, human capital, legal infrastructure, communication infrastructure, technical and executive infrastructure, and financial factors), an industrial layer (internal industrial factors and pharmaceutical industry factors), and a third layer that includes national and international aspects. These layers comprise 6 domains, 21 indices, 41 dimensions, and 126 components. Together, these layers (general, industrial, and national and international aspects) can serve as an effective evaluation package for the commercialization of research and development. PMID:29201110
Olson, Jonathan R; McCarthy, Kimberly J; Perkins, Daniel F; Borden, Lynne M
2018-04-01
The Children, Youth, and Families At-Risk (CYFAR) initiative provides funding and technical support for local community-based programs designed to promote positive outcomes among vulnerable populations. In 2013, CYFAR implemented significant changes in the way it provides technical assistance (TA) to grantees. These changes included introducing a new TA model in which trained coaches provide proactive support that is tailored to individual CYFAR projects. The purpose of this paper is to describe the evolution of this TA model and present preliminary findings from a formative evaluation. CYFAR Principal Investigators (PIs) were invited to respond to online surveys in 2015 and 2016. The surveys were designed to assess PI attitudes towards the nature and quality of support that they receive from their coaches. CYFAR PIs reported that their coaches have incorporated a range of coaching skills and techniques into their work. PIs have generally positive attitudes towards their coaches, and these attitudes have become more positive over time. Results suggest that CYFAR PIs have been generally supportive of the new TA system. Factors that may have facilitated support include a strong emphasis on team-building and the provision of specific resources that support program design, implementation, and evaluation. Copyright © 2017 Elsevier Ltd. All rights reserved.
Propagation of flat-topped multi-Gaussian beams through a double-lens system with apertures.
Gao, Yanqi; Zhu, Baoqiang; Liu, Daizhong; Lin, Zunqi
2009-07-20
A general model for different apertures and flat-topped laser beams based on the multi-Gaussian function is developed. The general analytical expression for the propagation of a flat-topped beam through a general double-lens system with apertures is derived using the above model. Then, the propagation characteristics of the flat-topped beam through a spatial filter are investigated by using a simplified analytical expression. Based on the fluence beam contrast and the fill factor, the influence of pinhole size on the propagation of the flat-topped multi-Gaussian beam (FMGB) through the spatial filter is illustrated. An analytical expression for the propagation of the FMGB through the spatial filter with a misaligned pinhole is presented, and the influence of the pinhole offset is evaluated.
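One common way to build a flat-topped profile from a multi-Gaussian function is a sum of equally spaced Gaussians. The construction and all parameter values below are illustrative only, not the paper's exact model:

```python
import numpy as np

def multi_gaussian(x, n_terms=5, spacing=0.5, w=0.8):
    """Flat-topped profile as a sum of n_terms equally spaced Gaussians
    of 1/e half-width w (illustrative parameters)."""
    centers = spacing * (np.arange(n_terms) - (n_terms - 1) / 2)
    return sum(np.exp(-((x - c) / w) ** 2) for c in centers)

x = np.linspace(-4.0, 4.0, 801)
profile = multi_gaussian(x)
peak = profile.max()

# The superposition is nearly constant across the center and decays
# rapidly in the wings, which is the "flat-topped" property.
central = profile[np.abs(x) <= 0.5]
flatness = central.min() / central.max()   # close to 1 over the plateau
wing_level = profile[-1] / peak            # tiny at x = 4
```

Increasing `n_terms` (with proportionally smaller spacing) sharpens the edges and flattens the plateau, which is why the multi-Gaussian family is convenient for modeling apertured flat-topped beams.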
Heddam, Salim
2014-11-01
The prediction of colored dissolved organic matter (CDOM) using artificial neural network approaches has received little attention in the past few decades. In this study, CDOM was modeled using generalized regression neural network (GRNN) and multiple linear regression (MLR) models as a function of water temperature (TE), pH, specific conductance (SC), and turbidity (TU). Evaluation of the prediction accuracy of the models is based on the root mean square error (RMSE), mean absolute error (MAE), coefficient of correlation (CC), and Willmott's index of agreement (d). The results indicated that GRNN can be applied successfully for the prediction of CDOM.
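The four evaluation criteria named above have standard closed forms. A minimal NumPy sketch, using toy observed/predicted values rather than the study's data:

```python
import numpy as np

def rmse(o, p):
    """Root mean square error."""
    return float(np.sqrt(np.mean((o - p) ** 2)))

def mae(o, p):
    """Mean absolute error."""
    return float(np.mean(np.abs(o - p)))

def cc(o, p):
    """Pearson coefficient of correlation."""
    return float(np.corrcoef(o, p)[0, 1])

def willmott_d(o, p):
    """Willmott's index of agreement: 1 = perfect, 0 = no agreement."""
    om = o.mean()
    return float(1 - np.sum((o - p) ** 2)
                   / np.sum((np.abs(p - om) + np.abs(o - om)) ** 2))

obs  = np.array([2.1, 3.4, 2.8, 4.0, 3.1])   # toy data
pred = np.array([2.0, 3.6, 2.7, 3.8, 3.3])
```

Unlike CC, Willmott's d penalizes additive and proportional offsets, which is why the two are usually reported together.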
Contraction Options and Optimal Multiple-Stopping in Spectrally Negative Lévy Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yamazaki, Kazutoshi, E-mail: kyamazak@kansai-u.ac.jp
This paper studies the optimal multiple-stopping problem arising in the context of the timing option to withdraw from a project in stages. The profits are driven by a general spectrally negative Lévy process. This allows the model to incorporate sudden declines of the project values, greatly generalizing the classical geometric Brownian motion model. We solve the one-stage case as well as the extension to the multiple-stage case. The optimal stopping times are of threshold type and the value function admits an expression in terms of the scale function. A series of numerical experiments are conducted to verify the optimality and to evaluate the efficiency of the algorithm.
ERIC Educational Resources Information Center
Truckenmiller, James L.
The former HEW National Strategy for Youth Development model was a community-based planning and procedural tool designed to enhance positive youth development and to prevent delinquency through a process of youth needs assessments, needs-targeted programs, and program impact evaluation. The program's 12 Impact Scales have been found to have acceptable reliabilities, substantial…
ERIC Educational Resources Information Center
Gupta, Saurabh; Tracey, Terence J. G.; Gore, Paul A., Jr.
2008-01-01
The structural validity of Holland's model of vocational interests across racial/ethnic groups was examined in the population of high school juniors in two states. The fit of the circumplex model to Holland's RIASEC types as assessed by the UNIACT-R was evaluated for the general sample and five subgroups: Caucasian/Euro-Americans, African…
A re-evaluation of a case-control model with contaminated controls for resource selection studies
Christopher T. Rota; Joshua J. Millspaugh; Dylan C. Kesler; Chad P. Lehman; Mark A. Rumble; Catherine M. B. Jachowski
2013-01-01
A common sampling design in resource selection studies involves measuring resource attributes at sample units used by an animal and at sample units considered available for use. Few models can estimate the absolute probability of using a sample unit from such data, but such approaches are generally preferred over statistical methods that estimate a relative probability...
A bioenergetic model for zebrafish Danio rerio (Hamilton)
Chizinski, C.J.; Sharma, Bibek; Pope, K.L.; Patino, R.
2008-01-01
A bioenergetics model was developed from observed consumption, respiration and growth rates for zebrafish Danio rerio across a range (18-32° C) of water temperatures, and evaluated with a 50 day laboratory trial at 28° C. No significant bias in variable estimates was found during the validation trial; namely, predicted zebrafish mass generally agreed with observed mass. © 2008 The Authors.
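The balance underlying such a model (growth as temperature-dependent consumption minus respiration) can be sketched with a toy Wisconsin-style formulation. All parameter values, the thermal performance curve, and the allometric exponents below are invented for illustration, not the fitted zebrafish values:

```python
import numpy as np

def simulate_growth(m0, days, temp, c_max=0.05, r_max=0.02, t_opt=28.0):
    """Euler integration of a toy energy balance:
    dm/dt = consumption - respiration, with temperature-dependent
    consumption and allometric mass scaling (illustrative parameters)."""
    def f_temp(t):
        # simple Gaussian thermal performance curve peaking at t_opt
        return np.exp(-((t - t_opt) / 6.0) ** 2)
    m = m0
    masses = [m]
    for _ in range(days):
        consumption = c_max * f_temp(temp) * m ** 0.7
        respiration = r_max * m ** 0.8
        m += consumption - respiration
        masses.append(m)
    return np.array(masses)

traj_28 = simulate_growth(m0=0.1, days=50, temp=28.0)  # near thermal optimum
traj_18 = simulate_growth(m0=0.1, days=50, temp=18.0)  # cold: consumption drops
```

Even this toy version reproduces the qualitative pattern the model family is built to capture: growth is fastest near the thermal optimum and is suppressed at the cold end of the tested range.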
ERIC Educational Resources Information Center
Slisko, Josip; Cruz, Adrian Corona
2013-01-01
There is a general agreement that critical thinking is an important element of 21st century skills. Although critical thinking is a very complex and controversial conception, many would accept that recognition and evaluation of assumptions is a basic critical-thinking process. When students use a simple mathematical model to reason quantitatively…
Correspondence between spanning trees and the Ising model on a square lattice
NASA Astrophysics Data System (ADS)
Viswanathan, G. M.
2017-06-01
An important problem in statistical physics concerns the fascinating connections between partition functions of lattice models studied in equilibrium statistical mechanics on the one hand and graph theoretical enumeration problems on the other hand. We investigate the nature of the relationship between the number of spanning trees and the partition function of the Ising model on the square lattice. The spanning tree generating function T(z) gives the spanning tree constant when evaluated at z = 1, while giving the lattice Green function when differentiated. It is known that for the infinite square lattice the partition function Z(K) of the Ising model evaluated at the critical temperature K = K_c is related to T(1). Here we show that this idea in fact generalizes to all real temperatures. We prove that [Z(K) sech 2K]^2 = k exp[T(k)], where k = 2 tanh(2K) sech(2K). The identical Mahler measure connects the two seemingly disparate quantities T(z) and Z(K). In turn, the Mahler measure is determined by the random walk structure function. Finally, we show that the above correspondence does not generalize in a straightforward manner to nonplanar lattices.
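A quick numerical sanity check of the correspondence: at the square-lattice critical coupling K_c = ln(1 + √2)/2, the modulus k = 2 tanh(2K) sech(2K) appearing in the identity evaluates to exactly 1, which is the point z = 1 where the spanning-tree generating function T(z) enters the known critical-temperature relation:

```python
import math

# Critical coupling of the square-lattice Ising model
K_c = 0.5 * math.log(1.0 + math.sqrt(2.0))

# Modulus from the identity [Z(K) sech 2K]^2 = k exp[T(k)]
k = 2.0 * math.tanh(2.0 * K_c) / math.cosh(2.0 * K_c)
print(k)  # -> 1.0 up to floating-point rounding
```

Analytically, cosh(2K_c) = √2 and tanh(2K_c) = 1/√2, so k = 2 · (1/√2) · (1/√2) = 1.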
Evaluation of computing systems using functionals of a stochastic process
NASA Technical Reports Server (NTRS)
Meyer, J. F.; Wu, L. T.
1980-01-01
An intermediate model was used to represent the probabilistic nature of a total system at a level which is higher than the base model and thus closer to the performance variable. A class of intermediate models, generally referred to as functionals of a Markov process, was considered. A closed-form solution of performability for the case where performance is identified with the minimum value of a functional was developed.
NASA Technical Reports Server (NTRS)
Sevart, F. D.; Patel, S. M.; Wattman, W. J.
1972-01-01
Testing and evaluation of stability augmentation systems for aircraft flight control were conducted. The flutter suppression system analysis of a scale supersonic transport wing model is described. Mechanization of the flutter suppression system is reported. The ride control synthesis for the B-52 aeroelastic model is discussed. Model analyses were conducted using equations of motion generated from generalized mass and stiffness data.
GEE-Smoothing Spline in Semiparametric Model with Correlated Nominal Data
NASA Astrophysics Data System (ADS)
Ibrahim, Noor Akma; Suliadi
2010-11-01
In this paper we propose a GEE-smoothing spline for the estimation of semiparametric models with correlated nominal data. The method can be seen as an extension of the parametric generalized estimating equation to semiparametric models. The nonparametric component is estimated using a smoothing spline, specifically the natural cubic spline. We use a profile algorithm to estimate both the parametric and nonparametric components. The properties of the estimators are evaluated using simulation studies.
Pilots Rate Augmented Generalized Predictive Control for Reconfiguration
NASA Technical Reports Server (NTRS)
Soloway, Don; Haley, Pam
2004-01-01
The objective of this paper is to report the results from the research being conducted in reconfigurable flight controls at NASA Ames. A study was conducted with three NASA Dryden test pilots to evaluate two approaches to reconfiguring an aircraft's control system when failures occur in the control surfaces and engine. NASA Ames is investigating both a Neural Generalized Predictive Control scheme and a Neural Network based Dynamic Inverse controller. This paper highlights the Predictive Control scheme, where a simple augmentation that drives the steady-state error to zero made the neural network predictor model redundant for the task. Instead of a neural network predictor model, a nominal single-point linear model was used and then augmented with an error corrector. This paper shows that the Generalized Predictive Controller and the Dynamic Inverse Neural Network controller perform equally well at reconfiguration, but with lower rate requirements from the actuators. Also presented are the pilot ratings for each controller for various failure scenarios and two samples of the required control actuation during reconfiguration. Finally, the paper concludes by stepping through the Generalized Predictive Control's reconfiguration process for an elevator failure.
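The effect of such an error-correcting augmentation can be illustrated on a toy problem. The first-order plant, the one-step predictive law, and all gains below are invented for illustration; the point is only that a control-weighted predictive law alone leaves a steady-state offset, while integrating the tracking error removes it:

```python
# Toy plant: y[k+1] = a*y[k] + b*u[k], standing in for a nominal
# single-point linear model (all values illustrative).
a, b, lam, Ki = 0.9, 0.1, 0.01, 0.05
r = 1.0                      # reference to track

def one_step_gpc(y, target):
    """One-step-ahead predictive control: closed-form minimizer of
    (target - (a*y + b*u))**2 + lam*u**2 over u."""
    return b * (target - a * y) / (b * b + lam)

# Without augmentation: the control penalty lam leaves a steady-state offset.
y = 0.0
for _ in range(2000):
    y = a * y + b * one_step_gpc(y, r)
offset_plain = abs(r - y)

# With augmentation: integrate the tracking error and shift the target,
# which drives the steady-state error to zero.
y, e_int = 0.0, 0.0
for _ in range(2000):
    e_int += r - y
    y = a * y + b * one_step_gpc(y, r + Ki * e_int)
offset_augmented = abs(r - y)
```

For these values the unaugmented closed loop settles at y ≈ 0.91·r (a ~9% offset), while the integral-shifted target converges to the reference, mirroring the role of the error corrector described in the abstract.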
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tessum, C. W.; Hill, J. D.; Marshall, J. D.
We present results from and evaluate the performance of a 12-month, 12 km horizontal resolution year 2005 air pollution simulation for the contiguous United States using the WRF-Chem (Weather Research and Forecasting with Chemistry) meteorology and chemical transport model (CTM). We employ the 2005 US National Emissions Inventory, the Regional Atmospheric Chemistry Mechanism (RACM), and the Modal Aerosol Dynamics Model for Europe (MADE) with a volatility basis set (VBS) secondary aerosol module. Overall, model performance is comparable to contemporary modeling efforts used for regulatory and health-effects analysis, with an annual average daytime ozone (O3) mean fractional bias (MFB) of 12% and an annual average fine particulate matter (PM2.5) MFB of −1%. WRF-Chem, as configured here, tends to overpredict total PM2.5 at some high concentration locations and generally overpredicts average 24 h O3 concentrations. Performance is better at predicting daytime-average and daily peak O3 concentrations, which are more relevant for regulatory and health effects analyses relative to annual average values. Predictive performance for PM2.5 subspecies is mixed: the model overpredicts particulate sulfate (MFB = 36%), underpredicts particulate nitrate (MFB = −110%) and organic carbon (MFB = −29%), and relatively accurately predicts particulate ammonium (MFB = 3%) and elemental carbon (MFB = 3%), so that the accuracy in total PM2.5 predictions is to some extent a function of offsetting over- and underpredictions of PM2.5 subspecies. Model predictive performance for PM2.5 and its subspecies is in general worse in winter and in the western US than in other seasons and regions, suggesting spatial and temporal opportunities for future WRF-Chem model development and evaluation.
2015-04-07
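The mean fractional bias used throughout the evaluation above has a standard definition: MFB = (2/N) Σ (M_i − O_i)/(M_i + O_i), symmetric and bounded in ±200%. A small sketch with toy numbers, not the study's data:

```python
import numpy as np

def mean_fractional_bias(modeled, observed):
    """MFB = (2/N) * sum((M_i - O_i) / (M_i + O_i)), in percent.
    Bounded in [-200%, +200%]; 0% means no bias on average."""
    m = np.asarray(modeled, dtype=float)
    o = np.asarray(observed, dtype=float)
    return 100.0 * 2.0 * np.mean((m - o) / (m + o))

obs = np.array([10.0, 20.0, 30.0])                 # toy observations
mfb_zero = mean_fractional_bias(obs, obs)          # unbiased -> 0.0
mfb_over = mean_fractional_bias(3.0 * obs, obs)    # uniform 3x overprediction -> 100.0
```

The symmetric denominator is why MFB is preferred over simple percent bias for species like nitrate that the model can underpredict severely: a large underprediction saturates at −200% rather than growing without bound.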
Vaidya, Anil; Joore, Manuela A; ten Cate-Hoek, Arina J; Kleinegris, Marie-Claire; ten Cate, Hugo; Severens, Johan L
2014-01-01
Lower extremity artery disease (LEAD) is a sign of widespread atherosclerosis also affecting the coronary, cerebral and renal arteries and is associated with increased risk of cardiovascular events. Many economic evaluations have been published for LEAD due to its clinical, social and economic importance. The aim of this systematic review was to assess modelling methods used in published economic evaluations in the field of LEAD. Our review appraised and compared the general characteristics, model structure and methodological quality of published models. The electronic databases MEDLINE and EMBASE were searched until February 2013 via the OVID interface. The Cochrane Database of Systematic Reviews, the Health Technology Assessment database hosted by the National Institute for Health Research, and the National Health Service Economic Evaluation Database (NHS EED) were also searched. The methodological quality of the included studies was assessed using the Philips checklist. Sixteen model-based economic evaluations were identified and included. Eleven models compared therapeutic health technologies; three models compared diagnostic tests and two models compared a combination of diagnostic and therapeutic options for LEAD. Results of this systematic review revealed an acceptable to low methodological quality of the included studies. Methodological diversity and insufficient information posed a challenge for valid comparison of the included studies. In conclusion, there is a need for transparent, methodologically comparable and scientifically credible model-based economic evaluations in the field of LEAD. Future modelling studies should include clinically and economically important cardiovascular outcomes to reflect the wider impact of LEAD on individual patients and on society.
Uncertainty vs. Information (Invited)
NASA Astrophysics Data System (ADS)
Nearing, Grey
2017-04-01
Information theory is the branch of logic that describes how rational epistemic states evolve in the presence of empirical data (Knuth, 2005), and any logic of science is incomplete without such a theory. Developing a formal philosophy of science that recognizes this fact yields essentially trivial solutions to several longstanding problems that are generally considered intractable, including: • Alleviating the need for any likelihood function or error model. • Derivation of purely logical falsification criteria for hypothesis testing. • Specification of a general quantitative method for process-level model diagnostics. More generally, I make the following arguments: 1. Model evaluation should not proceed by quantifying and/or reducing error or uncertainty, and instead should be approached as a problem of ensuring that our models contain as much information as our experimental data. I propose that the latter is the only question a scientist actually has the ability to ask. 2. Instead of building geophysical models as solutions to differential equations that represent conservation laws, we should build models as maximum entropy distributions constrained by conservation symmetries. This will allow us to derive predictive probabilities directly from first principles. Knuth, K. H. (2005) 'Lattice duality: The origin of probability and entropy', Neurocomputing, 67, pp. 245-274.
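The claim that a model can contain at most as much information as the observations has a concrete discrete counterpart: the plug-in mutual information between model output and data never exceeds the entropy of the data. A toy illustration (the noisy-channel setup below is entirely hypothetical):

```python
import numpy as np

def entropy(counts):
    """Shannon entropy (bits) of an empirical count vector."""
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

rng = np.random.default_rng(1)
obs = rng.integers(0, 4, 5000)                      # discretized observations
# A toy "model" that reproduces the observation 80% of the time
# and outputs an independent random symbol otherwise.
model_out = np.where(rng.random(5000) < 0.8, obs,
                     rng.integers(0, 4, 5000))

joint = np.zeros((4, 4))
for m, o in zip(model_out, obs):
    joint[m, o] += 1

H_obs = entropy(joint.sum(axis=0))                  # entropy of the data
H_model = entropy(joint.sum(axis=1))                # entropy of the model output
mutual_info = H_model + H_obs - entropy(joint.ravel())  # I(model; obs)
```

Here the mutual information is large (the model is informative about the data) yet provably bounded by H(obs), which is the quantitative sense in which "our models should contain as much information as our experimental data" is the most a scientist can ask for.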
NASA Technical Reports Server (NTRS)
Krueger, Ronald; Paris, Isabelle L.; O'Brien, T. Kevin; Minguet, Pierre J.
2004-01-01
The influence of two-dimensional finite element modeling assumptions on the debonding prediction for skin-stiffener specimens was investigated. Geometrically nonlinear finite element analyses using two-dimensional plane-stress and plane-strain elements as well as three different generalized plane strain type approaches were performed. The computed skin and flange strains, transverse tensile stresses and energy release rates were compared to results obtained from three-dimensional simulations. The study showed that for strains and energy release rate computations the generalized plane strain assumptions yielded results closest to the full three-dimensional analysis. For computed transverse tensile stresses the plane stress assumption gave the best agreement. Based on this study it is recommended that results from plane stress and plane strain models be used as upper and lower bounds. The results from generalized plane strain models fall between the results obtained from plane stress and plane strain models. Two-dimensional models may also be used to qualitatively evaluate the stress distribution in a ply and the variation of energy release rates and mixed mode ratios with delamination length. For more accurate predictions, however, a three-dimensional analysis is required.
NASA Technical Reports Server (NTRS)
Krueger, Ronald; Minguet, Pierre J.; Bushnell, Dennis M. (Technical Monitor)
2002-01-01
The influence of two-dimensional finite element modeling assumptions on the debonding prediction for skin-stiffener specimens was investigated. Geometrically nonlinear finite element analyses using two-dimensional plane-stress and plane-strain elements as well as three different generalized plane strain type approaches were performed. The computed deflections, skin and flange strains, transverse tensile stresses and energy release rates were compared to results obtained from three-dimensional simulations. The study showed that for strains and energy release rate computations the generalized plane strain assumptions yielded results closest to the full three-dimensional analysis. For computed transverse tensile stresses the plane stress assumption gave the best agreement. Based on this study it is recommended that results from plane stress and plane strain models be used as upper and lower bounds. The results from generalized plane strain models fall between the results obtained from plane stress and plane strain models. Two-dimensional models may also be used to qualitatively evaluate the stress distribution in a ply and the variation of energy release rates and mixed mode ratios with delamination length. For more accurate predictions, however, a three-dimensional analysis is required.
Wu, Taotao; Kim, Taewung; Bollapragada, Varun; Poulard, David; Chen, Huipeng; Panzer, Matthew B; Forman, Jason L; Crandall, Jeff R; Pipkorn, Bengt
2017-05-29
The goal of this study was to evaluate the biofidelity of the Total Human Model for Safety (THUMS; Ver. 4.01) pedestrian finite element models (PFEM) in a whole-body pedestrian impact condition using a well-characterized generic pedestrian buck model. The biofidelity of THUMS PFEM was evaluated with respect to data from 3 full-scale postmortem human subject (PMHS) pedestrian impact tests, in which a pedestrian buck laterally struck the subjects at 40 km/h. The pedestrian model was scaled to match the anthropometry of the target subjects and then positioned to match the pre-impact postures of the target subjects based on the 3-dimensional motion tracking data obtained during the experiments. An objective rating method was employed to quantitatively evaluate the correlation between the responses of the models and the PMHS. Injuries in the models were predicted both probabilistically and deterministically using empirical injury risk functions and strain measures, respectively, and compared with those of the target PMHS. In general, the model exhibited biofidelic kinematic responses (in the Y-Z plane) regarding trajectories (International Organization for Standardization [ISO] ratings: Y = 0.90 ± 0.11, Z = 0.89 ± 0.09), linear resultant velocities (ISO ratings: 0.83 ± 0.07), accelerations (ISO ratings: Y = 0.58 ± 0.11, Z = 0.52 ± 0.12), and angular velocities (ISO ratings: X = 0.48 ± 0.13) but exhibited stiffer leg responses and delayed head responses compared to those of the PMHS. This indicates potential biofidelity issues with the PFEM for regions below the knee and in the neck. The model also demonstrated comparable reaction forces at the buck front-end regions to those from the PMHS tests. The PFEM generally predicted the injuries that the PMHS sustained but overestimated injuries in the ankle and leg regions. Based on the data considered, the THUMS PFEM was considered to be biofidelic for this pedestrian impact condition and vehicle.
Given the capability of the model to reproduce biomechanical responses, it shows potential as a valuable tool for developing novel pedestrian safety systems.
The implications of rebasing global mean temperature timeseries for GCM based climate projections
NASA Astrophysics Data System (ADS)
Stainforth, David; Chapman, Sandra; Watkins, Nicholas
2017-04-01
Global climate and earth system models are assessed by comparison with observations through a number of metrics. The Intergovernmental Panel on Climate Change (IPCC) highlights in particular their ability to reproduce "general features of the global and annual mean surface temperature changes over the historical period" [1,2] and to simulate "a trend in global-mean surface temperature from 1951 to 2012 that agrees with the observed trend" [3]. This focus on annual mean global mean temperature (hereafter GMT) change is presented as an important element in demonstrating the relevance of these models for climate projections. Any new model or new model version whose historic simulations fail to reproduce the "general features" and 20th century trends is therefore likely to undergo further tuning, so this focus could have implications for model development. Here we consider a formal interpretation of "general features" and discuss the implications of this approach to model assessment and intercomparison for the interpretation of GCM projections. Following the IPCC, we interpret a major element of "general features" as being the slow-timescale response to external forcings. (Shorter-timescale behaviour, such as the response to volcanic eruptions, is also an element of "general features" but is not considered here.) Also following the IPCC, we consider only GMT anomalies, i.e., changes with respect to some reference period. Since the models have absolute temperatures which range over about 3 K (roughly observed GMT +/- 1.5 K), this means their timeseries (and the observations) are rebased. We present timeseries of the slow-timescale response of the CMIP5 models rebased to late-20th century temperatures and to mid-19th century temperatures. We provide a mathematical interpretation of this approach to model assessment and discuss two consequences. First is a separation of scales which limits the degree to which sub-global behaviour can feed back on the global response.
Second is an implication of linearity in the GMT response (to the extent that the slow-timescale response of the historic simulations is consistent with observations, and given their uncertainties). For each individual model these consequences apply only over the range of absolute temperatures simulated by the model in historic simulations. Taken together, however, they imply consequences over a much wider range of GMTs. The analysis suggests that this aspect of model evaluation risks providing a model development pressure which acts against a wide exploration of physically plausible responses; in particular, against an exploration of potentially globally significant nonlinear responses and feedbacks. [1] IPCC, Fifth Assessment Report, Working Group 1, Technical Summary: Stocker et al., 2013. [2] IPCC, Fifth Assessment Report, Working Group 1, Chapter 9, "Evaluation of Climate Models": Flato et al., 2013. [3] IPCC, Fifth Assessment Report, Working Group 1, Summary for Policymakers: IPCC, 2013.
Ren, Anna N; Neher, Robert E; Bell, Tyler; Grimm, James
2018-06-01
Preoperative planning is important to achieve successful implantation in primary total knee arthroplasty (TKA). However, traditional TKA templating techniques are not accurate enough to predict the component size to within a very close range. With the goal of developing a general predictive statistical model using patient demographic information, ordinal logistic regression was applied to build a proportional odds model to predict the tibia component size. The study retrospectively collected the data of 1992 primary Persona Knee System TKA procedures. Of them, 199 procedures were randomly selected as testing data and the rest of the data were randomly partitioned into model training and model validation data with a ratio of 7:3. Different models were trained and evaluated on the training and validation data sets after data exploration. The final model had patient gender, age, weight, and height as independent variables and predicted the tibia size within 1 size difference 96% of the time on the validation data, 94% of the time on the testing data, and 92% of the time on a prospective cadaver data set. The study results indicated that the statistical model built by ordinal logistic regression can increase the accuracy of tibia sizing information for Persona Knee preoperative templating. This research shows statistical modeling may be used with radiographs to dramatically enhance templating accuracy, efficiency, and quality. In general, this methodology can be applied to other TKA products when the data are applicable. Copyright © 2018 Elsevier Inc. All rights reserved.
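The proportional odds (cumulative logit) prediction described above can be sketched as follows. The coefficients, cut-points and patient values in this example are purely illustrative assumptions, not the study's fitted parameters:

```python
import numpy as np

def cumulative_logit_probs(x, beta, thresholds):
    """Proportional-odds model: P(Y <= j) = sigmoid(theta_j - x . beta).

    x          : feature vector (e.g., encoded gender, age, weight, height)
    beta       : slope coefficients shared across all categories
    thresholds : increasing cut-points theta_1 < ... < theta_{K-1}
    Returns the probability of each of the K ordered size categories.
    """
    eta = np.dot(x, beta)
    cum = 1.0 / (1.0 + np.exp(-(np.asarray(thresholds) - eta)))  # P(Y <= j)
    cum = np.concatenate(([0.0], cum, [1.0]))
    return np.diff(cum)  # P(Y = j) = P(Y <= j) - P(Y <= j-1)

# Illustrative (not fitted) coefficients for a 4-size component:
beta = np.array([0.8, 0.01, 0.02, 0.05])   # gender, age, weight, height
thresholds = [2.0, 4.0, 6.0]               # 3 cut-points -> 4 sizes
x = np.array([1.0, 65.0, 80.0, 1.70])      # one hypothetical patient
p = cumulative_logit_probs(x, beta, thresholds)
predicted_size = int(np.argmax(p)) + 1
```

The "within 1 size" accuracy reported above then simply checks whether the predicted category is at most one step away from the implanted component size.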
Evaluation of a habitat suitability index model
Farmer, A.H.; Cade, B.S.; Stauffer, D.F.
2002-01-01
We assisted with development of a model for maternity habitat of the Indiana bat (Myotis sodalis), for use in conducting assessments of projects potentially impacting this endangered species. We started with an existing model, modified that model in a workshop, and evaluated the revised model using data previously collected by others. Our analyses showed that higher indices of habitat suitability were associated with sites where Indiana bats were present and, thus, the model may be useful for identifying suitable habitat. Utility of the model, however, was based on a single component: density of suitable roost trees. Percentage of landscape in forest did not allow differentiation between sites occupied and not occupied by Indiana bats. Moreover, in spite of a general opinion by participants in the workshop that bodies of water were highly productive feeding areas and that a diversity of feeding habitats was optimal, we found no evidence to support either hypothesis.
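The aggregation rule of the evaluated model is not given in the abstract; as a generic illustration only, habitat suitability index models commonly combine component suitability indices (each scaled to [0, 1]) by a geometric mean, so that a zero in any single component zeroes the overall index:

```python
import math

def habitat_suitability_index(component_indices):
    """Combine component suitability indices (each in [0, 1]) into an
    overall HSI via the geometric mean, a common aggregation rule in
    habitat suitability index models (illustrative, not this study's)."""
    n = len(component_indices)
    return math.prod(component_indices) ** (1.0 / n)

# Hypothetical components: roost-tree density and percent forest cover
hsi = habitat_suitability_index([0.9, 0.4])
```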
Pajoutan, Mojdeh; Cavuoto, Lora A; Mehta, Ranjana K
2017-10-01
This study evaluates whether the existing force-endurance relationship models are predictive of endurance time for overweight and obese individuals and, if not, provides revised models that can be applied in ergonomics practice. Data were collected from 141 participants (49 normal weight, 50 overweight, 42 obese) who each performed isometric endurance tasks of hand grip, shoulder flexion, and trunk extension at four levels of relative workload. Subject-specific fatigue rates and a general model of the force-endurance relationship were determined and compared to two fatigue models from the literature. There was a lack of fit between the previous models and the current data for grip (ICC = 0.8), with a shift toward lower endurance times for the new data. Application of the revised models can facilitate improved workplace design and job evaluation to accommodate the capacities of the current workforce.
The structure of evaporating and combusting sprays: Measurements and predictions
NASA Technical Reports Server (NTRS)
Shuen, J. S.; Solomon, A. S. P.; Faeth, G. M.
1982-01-01
An apparatus was constructed to provide measurements in open sprays with no zones of recirculation, in order to provide well-defined conditions for use in evaluating spray models. Measurements were completed in a gas jet, in order to test experimental methods, and are currently in progress for nonevaporating sprays. A locally homogeneous flow (LHF) model, where interphase transport rates are assumed to be infinitely fast; a separated flow (SF) model, which allows for finite interphase transport rates but neglects effects of turbulent fluctuations on drop motion; and a stochastic SF model, which considers effects of turbulent fluctuations on drop motion, were evaluated using existing data on particle-laden jets. The LHF model generally overestimates rates of particle dispersion while the SF model underestimates dispersion rates. The stochastic SF model yields satisfactory predictions except at high particle mass loadings, where effects of turbulence modulation may have caused the model to overestimate turbulence levels.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Serell, D.C.; Kaplan, S.
1980-09-01
Purpose of this evaluation is to estimate the magnitude and effects of irradiation and creep induced fuel bundle deformations in the developmental plant. This report focuses on the trends of the results and the ability of present models to evaluate the assembly temperatures in the presence of bundle deformation. Although this analysis focuses on the developmental plant, the conclusions are applicable to LMFBR fuel assemblies in general if they have wire spacers.
Mota, L F M; Martins, P G M A; Littiere, T O; Abreu, L R A; Silva, M A; Bonafé, C M
2018-04-01
The objective was to estimate (co)variance functions using random regression models (RRM) with Legendre polynomials, B-spline functions and multi-trait models aimed at evaluating genetic parameters of growth traits in meat-type quail. A database containing the complete pedigree information of 7000 meat-type quail was utilized. The models included the fixed effects of contemporary group and generation. Direct additive genetic and permanent environmental effects, considered as random, were modeled using B-spline functions considering quadratic and cubic polynomials for each individual segment, and Legendre polynomials for age. Residual variances were grouped in four age classes. Direct additive genetic and permanent environmental effects were modeled using 2 to 4 segments for the B-spline functions and Legendre polynomials with orders of fit ranging from 2 to 4. The model with quadratic B-spline adjustment, using four segments for direct additive genetic and permanent environmental effects, was the most appropriate and parsimonious to describe the covariance structure of the data. The RRM using Legendre polynomials presented an underestimation of the residual variance. Lesser heritability estimates were observed for multi-trait models in comparison with RRM for the evaluated ages. In general, the genetic correlations between measures of BW from hatching to 35 days of age decreased as the range between the evaluated ages increased. Genetic trend for BW was positive and significant along the selection generations. The genetic response to selection for BW in the evaluated ages presented greater values for RRM compared with multi-trait models. In summary, RRM using B-spline functions with four segments and four residual variance classes provided the best fit for genetic evaluation of growth traits in meat-type quail. In conclusion, RRM should be considered in the genetic evaluation of breeding programs.
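As a minimal illustration of the Legendre polynomial covariates used in such random regression models (the ages and order of fit below are assumptions for demonstration), each age is standardized to [-1, 1] and the design matrix holds P_0..P_k evaluated at the standardized ages:

```python
import numpy as np

def legendre_design(ages, order):
    """Design matrix of Legendre polynomials P_0..P_order evaluated at
    ages standardized to [-1, 1], as used in random regression models."""
    a = np.asarray(ages, dtype=float)
    t = -1.0 + 2.0 * (a - a.min()) / (a.max() - a.min())  # standardized age
    cols = []
    for k in range(order + 1):
        coef = np.zeros(k + 1)
        coef[k] = 1.0                                # select P_k
        cols.append(np.polynomial.legendre.legval(t, coef))
    return np.column_stack(cols)

ages = [0, 7, 14, 21, 28, 35]          # days from hatching to 35 d of age
Phi = legendre_design(ages, order=2)   # quadratic order of fit
```

The additive genetic (co)variances at any pair of ages are then recovered as Phi @ K @ Phi.T, where K is the estimated coefficient covariance matrix.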
A generalized estimating equations approach for resting-state functional MRI group analysis.
D'Angelo, Gina M; Lazar, Nicole A; Eddy, William F; Morris, John C; Sheline, Yvette I
2011-01-01
An Alzheimer's fMRI study motivated us to evaluate inter-regional correlations between groups. The overall objective is to assess inter-regional correlations in the resting state, with no stimulus or task. We propose using a generalized estimating equation (GEE) transition model and a GEE marginal model to model the within-subject correlation for each region. Residuals calculated from the GEE models are used to correlate brain regions and assess between-group differences. The standard approach for comparing group correlations is to pool group averages of Fisher-z-transformed correlations, assuming temporal independence. The GEE approaches and the standard Fisher-z pooling approach are demonstrated with an Alzheimer's disease (AD) connectivity study in a population of AD subjects and healthy control subjects. We also compare these methods using simulation studies and show that the transition model may have better statistical properties.
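The standard Fisher-z pooling baseline mentioned above can be sketched as follows; the correlation values are hypothetical:

```python
import numpy as np

def pooled_correlation(rs):
    """Standard pooling: Fisher-z transform each subject's inter-regional
    correlation (z = arctanh(r)), average the z values across subjects,
    and back-transform to the r scale."""
    z = np.arctanh(np.asarray(rs, dtype=float))
    return np.tanh(z.mean())

# Hypothetical inter-regional correlations for two groups of subjects:
ad_group = [0.30, 0.45, 0.25]
control_group = [0.55, 0.60, 0.50]
r_ad = pooled_correlation(ad_group)
r_ctrl = pooled_correlation(control_group)
```

Note that this pooling assumes temporal independence of the fMRI time series, which is exactly the assumption the GEE approaches relax.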
Integrable time-dependent Hamiltonians, solvable Landau-Zener models and Gaudin magnets
NASA Astrophysics Data System (ADS)
Yuzbashyan, Emil A.
2018-05-01
We solve the non-stationary Schrödinger equation for several time-dependent Hamiltonians, such as the BCS Hamiltonian with an interaction strength inversely proportional to time, periodically driven BCS and linearly driven inhomogeneous Dicke models as well as various multi-level Landau-Zener tunneling models. The latter are Demkov-Osherov, bow-tie, and generalized bow-tie models. We show that these Landau-Zener problems and their certain interacting many-body generalizations map to Gaudin magnets in a magnetic field. Moreover, we demonstrate that the time-dependent Schrödinger equation for the above models has a similar structure and is integrable with a similar technique as Knizhnik-Zamolodchikov equations. We also discuss applications of our results to the problem of molecular production in an atomic Fermi gas swept through a Feshbach resonance and to the evaluation of the Landau-Zener transition probabilities.
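For context, the classic two-level Landau-Zener formula, which the multi-level models above generalize, gives the probability of a diabatic transition as P = exp(-2*pi*g^2/(hbar*v)) for coupling g and sweep rate v; the numerical values below are illustrative:

```python
import math

def landau_zener_probability(coupling, sweep_rate, hbar=1.0):
    """Two-level Landau-Zener diabatic transition probability:
    P = exp(-2*pi*coupling**2 / (hbar*sweep_rate))."""
    return math.exp(-2.0 * math.pi * coupling**2 / (hbar * sweep_rate))

# Chosen so the exponent is exactly -1, giving P = 1/e:
p = landau_zener_probability(coupling=1.0, sweep_rate=2.0 * math.pi)
```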
Li, Yuelin; Baser, Ray
2013-01-01
The US Food and Drug Administration recently announced the final guidelines on the development and validation of Patient-Reported Outcomes (PROs) assessments in drug labeling and clinical trials. This guidance paper may boost the demand for new PRO survey questionnaires. Henceforth biostatisticians may encounter psychometric methods more frequently, particularly Item Response Theory (IRT) models to guide the shortening of a PRO assessment instrument. This article aims to provide an introduction on the theory and practical analytic skills in fitting a Generalized Partial Credit Model in IRT (GPCM). GPCM theory is explained first, with special attention to a clearer exposition of the formal mathematics than what is typically available in the psychometric literature. Then a worked example is presented, using self-reported responses taken from the International Personality Item Pool. The worked example contains step-by-step guides on using the statistical languages R and WinBUGS in fitting the GPCM. Finally, the Fisher information function of the GPCM model is derived and used to evaluate, as an illustrative example, the usefulness of assessment items by their information contents. This article aims to encourage biostatisticians to apply IRT models in the re-analysis of existing data and in future research. PMID:22362655
Li, Yuelin; Baser, Ray
2012-08-15
The US Food and Drug Administration recently announced the final guidelines on the development and validation of patient-reported outcomes (PROs) assessments in drug labeling and clinical trials. This guidance paper may boost the demand for new PRO survey questionnaires. Henceforth, biostatisticians may encounter psychometric methods more frequently, particularly item response theory (IRT) models to guide the shortening of a PRO assessment instrument. This article aims to provide an introduction on the theory and practical analytic skills in fitting a generalized partial credit model (GPCM) in IRT. GPCM theory is explained first, with special attention to a clearer exposition of the formal mathematics than what is typically available in the psychometric literature. Then, a worked example is presented, using self-reported responses taken from the International Personality Item Pool. The worked example contains step-by-step guides on using the statistical languages R and WinBUGS in fitting the GPCM. Finally, the Fisher information function of the GPCM model is derived and used to evaluate, as an illustrative example, the usefulness of assessment items by their information contents. This article aims to encourage biostatisticians to apply IRT models in the re-analysis of existing data and in future research. Copyright © 2012 John Wiley & Sons, Ltd.
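A minimal sketch of the GPCM category probabilities described above; the item parameters are hypothetical, and this is not the article's R/WinBUGS code:

```python
import numpy as np

def gpcm_probs(theta, a, b):
    """Generalized partial credit model: probability of each response
    category k = 0..m for ability theta, discrimination a, and step
    parameters b[1..m].  P_k is proportional to
    exp(sum_{v=1}^{k} a*(theta - b_v)), the empty sum (k = 0) being 0."""
    steps = a * (theta - np.asarray(b, dtype=float))
    logits = np.concatenate(([0.0], np.cumsum(steps)))
    logits -= logits.max()          # shift for numerical stability
    p = np.exp(logits)
    return p / p.sum()

# One hypothetical 4-category item:
p = gpcm_probs(theta=0.5, a=1.2, b=[-1.0, 0.0, 1.0])
```

Summing the squared-score dispersion of these probabilities over items is the basis of the Fisher information function used to rank item usefulness.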
Applying the scientific method to small catchment studies: A review of the Panola Mountain experience
Hooper, R.P.
2001-01-01
A hallmark of the scientific method is its iterative application to a problem to increase and refine the understanding of the underlying processes controlling it. A successful iterative application of the scientific method to catchment science (including the fields of hillslope hydrology and biogeochemistry) has been hindered by two factors. First, the scale at which controlled experiments can be performed is much smaller than the scale of the phenomenon of interest. Second, computer simulation models generally have not been used as hypothesis-testing tools as rigorously as they might have been. Model evaluation often has gone only so far as evaluation of goodness of fit, rather than a full structural analysis, which is more useful when treating the model as a hypothesis. An iterative application of a simple mixing model to the Panola Mountain Research Watershed is reviewed to illustrate the increase in understanding gained by this approach and to discern general principles that may be applicable to other studies. The lessons learned include the need for an explicitly stated conceptual model of the catchment, the definition of objective measures of its applicability, and a clear linkage between the scale of observations and the scale of predictions. Published in 2001 by John Wiley & Sons, Ltd.
NASA Technical Reports Server (NTRS)
Gregg, Watson W.; Busalacchi, Antonio (Technical Monitor)
2000-01-01
A coupled ocean general circulation, biogeochemical, and radiative model was constructed to evaluate and understand the nature of seasonal variability of chlorophyll and nutrients in the global oceans. Biogeochemical processes in the model are determined from the influences of circulation and turbulence dynamics, irradiance availability, and the interactions among three functional phytoplankton groups (diatoms, chlorophytes, and picoplankton) and three nutrients (nitrate, ammonium, and silicate). Basin-scale (greater than 1000 km) model chlorophyll results are in overall agreement with CZCS pigments in many global regions. Seasonal variability observed in the CZCS is also represented in the model. Synoptic-scale (100-1000 km) comparisons of imagery are generally in conformance, although occasional departures are apparent. Model nitrate distributions agree with in situ data, including seasonal dynamics, except for the equatorial Atlantic. The overall agreement of the model with satellite and in situ data sources indicates that the model dynamics offer a reasonably realistic simulation of phytoplankton and nutrient dynamics on synoptic scales. This is especially true given that initial conditions are homogeneous chlorophyll fields. The success of the model in producing a reasonable representation of chlorophyll and nutrient distributions and seasonal variability in the global oceans is attributed to the application of a generalized, process-driven approach as opposed to regional parameterization, and to the existence of multiple phytoplankton groups with different physiological and physical properties. These factors enable the model to simultaneously represent many aspects of the great diversity of physical, biological, chemical, and radiative environments encountered in the global oceans.
Lee, SoYean; Burns, G Leonard; Beauchaine, Theodore P; Becker, Stephen P
2016-08-01
The objective was to determine if the latent structure of attention-deficit/hyperactivity disorder (ADHD) and oppositional defiant disorder (ODD) symptoms is best explained by a general disruptive behavior factor along with specific inattention (IN), hyperactivity/impulsivity (HI), and ODD factors (a bifactor model) whereas the latent structure of sluggish cognitive tempo (SCT) symptoms is best explained by a first-order factor independent of the bifactor model of ADHD/ODD. Parents' (n = 703) and teachers' (n = 366) ratings of SCT, ADHD-IN, ADHD-HI, and ODD symptoms on the Child and Adolescent Disruptive Behavior Inventory (CADBI) in a community sample of children (ages 5-13; 55% girls) were used to evaluate 4 models of symptom organization. Results indicated that a bifactor model of ADHD/ODD symptoms, in conjunction with a separate first-order SCT factor, was the best model for both parent and teacher ratings. The first-order SCT factor showed discriminant validity with the general disruptive behavior and specific IN factors in the bifactor model. In addition, higher scores on the SCT factor predicted greater academic and social impairment, even after controlling for the general disruptive behavior and 3 specific factors. Consistent with predictions from the trait-impulsivity etiological model of externalizing liability, a single, general disruptive behavior factor accounted for nearly all common variance in ADHD/ODD symptoms, whereas SCT symptoms represented a factor different from the general disruptive behavior and specific IN factor. These results provide additional support for distinguishing between SCT and ADHD-IN. The study also demonstrates how etiological models can be used to predict specific latent structures of symptom organization. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Forecasting volatility with neural regression: a contribution to model adequacy.
Refenes, A N; Holt, W T
2001-01-01
Neural nets' usefulness for forecasting is limited by problems of overfitting and the lack of rigorous procedures for model identification, selection and adequacy testing. This paper describes a methodology for neural model misspecification testing. We introduce a generalization of the Durbin-Watson statistic for neural regression and discuss the general issues of misspecification testing using residual analysis. We derive a generalized influence matrix for neural estimators which enables us to evaluate the distribution of the statistic. We deploy Monte Carlo simulation to compare the power of the test for neural and linear regressors. While residual testing is not a sufficient condition for model adequacy, it is nevertheless a necessary condition to demonstrate that the model is a good approximation to the data generating process, particularly as neural-network estimation procedures are susceptible to partial convergence. The work is also an important step toward developing rigorous procedures for neural model identification, selection and adequacy testing which have started to appear in the literature. We demonstrate its applicability in the nontrivial problem of forecasting implied volatility innovations using high-frequency stock index options. Each step of the model building process is validated using statistical tests to verify variable significance and model adequacy with the results confirming the presence of nonlinear relationships in implied volatility innovations.
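For reference, the classical Durbin-Watson statistic that the paper generalizes to neural regression is computed from a residual series as follows; the generalized version, which uses a generalized influence matrix to obtain the statistic's distribution under a neural estimator, is not reproduced here:

```python
import numpy as np

def durbin_watson(residuals):
    """Durbin-Watson statistic on a residual series: values near 2
    suggest no first-order autocorrelation, values near 0 positive
    autocorrelation, and values near 4 negative autocorrelation."""
    e = np.asarray(residuals, dtype=float)
    return np.sum(np.diff(e) ** 2) / np.sum(e ** 2)

dw = durbin_watson([1.0, -1.0, 1.0, -1.0])  # perfectly alternating residuals
```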
Immunotoxicant screening and prioritization in the 21st century
Current immunotoxicity testing guidance for drugs, high production volume chemicals and pesticides specifies the use of animal models that measure immune function or evaluation of general indicators of immune system health generated in routine toxicity testing. The assays are ...
Immunotoxicant screening and prioritization in the 21st century*
Current immunotoxicity testing guidance for drugs, high production volume chemicals and pesticides specifies the use of animal models that measure immune function or evaluation of general indicators of immune system health generated in routine toxicity testing. The assays are r...
NASA Astrophysics Data System (ADS)
He, Hong-di; Lu, Wei-Zhen; Xue, Yu
2009-12-01
At urban traffic intersections, vehicles frequently stop with idling engines during the red-light period and speed up rapidly during the green-light period. The changes of driving patterns (i.e., idle, acceleration, deceleration and cruising patterns) generally produce uncertain emissions. Additionally, the movement of pedestrians and the influence of wind further result in the random dispersion of pollutants. It is, therefore, too complex to simulate the effects of such dynamics on the resulting emissions using conventional deterministic causal models. For this reason, a modified semi-empirical box model for predicting PM10 concentrations on roadsides is proposed in this paper. The model comprises three parts, i.e., traffic, emission and dispersion components. The traffic component is developed using a generalized force traffic model to obtain the instantaneous velocity and acceleration when vehicles move through intersections. Hence the distribution of vehicle emissions in the street canyon during the green-light period is calculated. The dispersion component is then investigated using a semi-empirical box model combining average wind speed, box height and background concentrations. With these considerations, the proposed model is applied and evaluated using measured data at a busy traffic intersection in Mong Kok, Hong Kong. To test the performance of the model, two situations, i.e., the data sets within a sunny day and between two sunny days, were selected. The predicted values generally coincide well with the observed data during different time slots, except for several values that are overestimated or underestimated. Moreover, two types of vehicles, i.e., buses and petrol cars, are separately taken into account in the study.
Buses are verified to contribute most to the emission in street canyons, which may be useful in evaluating the impact of vehicle emissions on the ambient air quality when there is a significant change in a specific vehicular population.
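A minimal steady-state sketch of the box-model dispersion idea (all parameter names, values and units below are illustrative assumptions, not the paper's calibrated semi-empirical model): the roadside concentration is the background plus the in-canyon emission rate diluted by the ventilation flux through the box cross-section:

```python
def box_model_concentration(emission_rate, wind_speed, box_height,
                            box_width, background):
    """Steady-state box model: concentration = background +
    emission_rate / (wind_speed * box_height * box_width).
    All inputs are assumed to be in mutually consistent units."""
    return background + emission_rate / (wind_speed * box_height * box_width)

# Hypothetical illustrative values:
c = box_model_concentration(emission_rate=0.02, wind_speed=2.0,
                            box_height=20.0, box_width=10.0,
                            background=40e-6)
```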
NASA Technical Reports Server (NTRS)
Pavel, M.
1993-01-01
This presentation outlines in viewgraph format a general approach to the evaluation of display system quality for aviation applications. This approach is based on the assumption that it is possible to develop a model of the display which captures most of the significant properties of the display. The display characteristics should include spatial and temporal resolution, intensity quantizing effects, spatial sampling, delays, etc. The model must be sufficiently well specified to permit generation of stimuli that simulate the output of the display system. The first step in the evaluation of display quality is an analysis of the tasks to be performed using the display. Thus, for example, if a display is used by a pilot during a final approach, the aesthetic aspects of the display may be less relevant than its dynamic characteristics. The opposite task requirements may apply to imaging systems used for displaying navigation charts. Thus, display quality is defined with regard to one or more tasks. Given a set of relevant tasks, there are many ways to approach display evaluation. The range of evaluation approaches includes visual inspection, rapid evaluation, part-task simulation, and full mission simulation. The work described is focused on two complementary approaches to rapid evaluation. The first approach is based on a model of the human visual system. A model of the human visual system is used to predict the performance of the selected tasks. The model-based evaluation approach permits very rapid and inexpensive evaluation of various design decisions. The second rapid evaluation approach employs specifically designed critical tests that embody many important characteristics of actual tasks. These are used in situations where a validated model is not available. These rapid evaluation tests are being implemented in a workstation environment.
Hentschel, Annett G; Livesley, W John
2013-01-01
Recent developments in the classification of personality disorder, especially moves toward more dimensional systems, create the need to assess general personality disorder apart from individual differences in personality pathology. The General Assessment of Personality Disorder (GAPD) is a self-report questionnaire designed to evaluate general personality disorder. The measure evaluates 2 major components of disordered personality: self or identity problems and interpersonal dysfunction. This study explores whether there is a single factor reflecting general personality pathology as proposed by the Diagnostic and Statistical Manual of Mental Disorders (5th ed.), whether self-pathology has incremental validity over interpersonal pathology as measured by GAPD, and whether GAPD scales relate significantly to Diagnostic and Statistical Manual of Mental Disorders (4th ed. [DSM-IV]) personality disorders. Based on responses from a German psychiatric sample of 149 participants, parallel analysis yielded a 1-factor model. Self Pathology scales of the GAPD increased the predictive validity of the Interpersonal Pathology scales of the GAPD. The GAPD scales showed a moderate to high correlation for 9 of 12 DSM-IV personality disorders.
Further evaluation of a brief, intensive teacher-training model.
Lerman, Dorothea C; Tetreault, Allison; Hovanetz, Alyson; Strobel, Margaret; Garro, Joanie
2008-01-01
The purpose of this study was to further evaluate the outcomes of a model program that was designed to train current teachers of children with autism. Nine certified special education teachers participating in an intensive 5-day summer training program were taught a relatively large number of specific skills in two areas (preference assessment and direct teaching). The teachers met the mastery criteria for all of the skills during the summer training. Follow-up observations up to 6 months after training suggested that the skills generalized to their classrooms and were maintained for most teachers with brief feedback only.
Camargo, Ana Luiza Lourenço Simões; Maluf Neto, Alfredo; Colman, Fátima Tahira; Citero, Vanessa de Albuquerque
2015-01-01
There is high prevalence of mental and behavioral disorders in general hospitals, thus triggering psychiatric risk situations. This study aimed to develop a psychiatric risk assessment checklist and routine for nurses, the Psychiatric Risk Evaluation Check-List (PRE-CL), as an alternative model for early identification and management of these situations in general hospitals. Ethnographic qualitative study in a tertiary-level private hospital. Three hundred general-unit nurses participated in the study. Reports were gathered through open groups conducted by a trained nurse, at shift changes for two months. The questions used were: "Would you consider it helpful to discuss daily practice situations with a psychiatrist? Which situations?" The data were qualitatively analyzed through an ethnographic approach. The nurses considered it useful to discuss daily practice situations relating to mental and behavioral disorders with a psychiatrist. Their reports were used to develop PRE-CL, within the patient overall risk assessment routine for all inpatients within 24 hours after admission and every 48 hours thereafter. Whenever one item was present, the psychosomatic medicine team was notified. They went to the unit, gathered data from the nurses, patient files and, if necessary, attending doctors, and decided on the risk management: guidance, safety measures or mental health consultation. It is possible to develop a model for detecting and intervening in psychiatric and behavioral disorders at general hospitals based on nursing team observations, through a checklist that takes these observations into account and a routine inserted into daily practice.
a Model Study of Small-Scale World Map Generalization
NASA Astrophysics Data System (ADS)
Cheng, Y.; Yin, Y.; Li, C. M.; Wu, W.; Guo, P. P.; Ma, X. L.; Hu, F. M.
2018-04-01
With globalization and rapid development, every field is taking an increasing interest in physical geography and human economics, and there is a surging demand worldwide for small-scale world maps in large formats. Further study of automated mapping technology, especially the realization of small-scale production of a large-format global map, is a key problem the cartographic field needs to solve. In light of this, this paper adopts an improved model (with map and data separated) for map generalization, which separates geographic data from mapping data and mainly comprises cross-platform symbols and an automatic map-making knowledge engine. With respect to the cross-platform symbol library, the symbol and the physical symbol in the geographic information are configured at all scale levels. The automatic map-making knowledge engine consists of 97 types, 1086 subtypes, 21845 basic algorithms, and over 2500 relevant functional modules. In order to evaluate the accuracy and visual effect of our model for topographic and thematic maps, we take world map generalization at small scale as an example. After the map generalization process, combining and simplifying the scattered islands makes the map more explicit at 1 : 2.1 billion scale, and the map features more complete and accurate. Not only does the model enhance map generalization at various scales significantly, it also achieves integration among map-making at various scales, suggesting that this model provides a reference for cartographic generalization across scales.
Toxicity hazard of organophosphate insecticide malathion identified by in vitro methods.
Jira, David; Janousek, Stanislav; Pikula, Jiri; Vitula, Frantisek; Kejlova, Kristina
2012-01-01
Malathion is generally not classified as toxic. However, its toxicity seems to be species-dependent. Local and systemic toxicity data for birds are rare, but a decrease in wild bird densities in areas where malathion was applied has been reported. The aim of the study was to extend knowledge of malathion toxicity at the cellular and organ levels and to evaluate embryotoxicity and genotoxicity for birds using the chick embryo model (HET-CAM). Skin and eye irritation was determined using reconstructed skin and eye cornea tissues and the chorioallantoic membrane of the chick embryo to simulate conjunctiva. Cytotoxicity in a 3T3 Balb/c fibroblast culture was determined to estimate acute systemic toxicity. The chick embryo model was further employed to evaluate acute embryotoxicity for birds (mortality and genotoxicity). Data were analysed by means of general linear models. Malathion is not a skin or eye irritant. The in vitro cytotoxicity test provided an LD50 value of 616 mg/kg, suggesting a higher toxic potential than is generally published based on in vivo tests on laboratory rodents. Embryotoxicity studies revealed dose- and age-dependent mortality of chick embryos. Genotoxicity was identified by means of a micronucleus test in erythroid cells isolated from the chorioallantois vascular system of chick embryos. Using in vitro alternative toxicological methods, a higher toxic potential of malathion was demonstrated than is generally declared. An increased health and environmental hazard may occur in areas with intensive agricultural production. The environmental consequences of delayed effects and embryotoxicity for bird populations in areas exposed to organophosphate insecticides, such as malathion, are evident.
Evaluation of Pharmacokinetic Assumptions Using a 443 ...
With the increasing availability of high-throughput and in vitro data for untested chemicals, there is a need for pharmacokinetic (PK) models for in vitro to in vivo extrapolation (IVIVE). Though some PBPK models have been created for individual compounds using in vivo data, we are now able to rapidly parameterize generic PBPK models using in vitro data to allow IVIVE for chemicals tested for bioactivity via high-throughput screening. However, these new models are expected to have limited accuracy due to their simplicity and generalization of assumptions. We evaluated the assumptions and performance of a generic PBPK model (R package “httk”) parameterized by a library of in vitro PK data for 443 chemicals. We evaluate and calibrate Schmitt’s method by comparing the predicted volume of distribution (Vd) and tissue partition coefficients to in vivo measurements. The partition coefficients are initially overpredicted, likely due to overestimation of partitioning into phospholipids in tissues and the lack of lipid partitioning in the in vitro measurements of the fraction unbound in plasma. Correcting for phospholipids and plasma binding improved the predictive ability (R2 of 0.52 for partition coefficients and 0.32 for Vd). We lacked enough data to evaluate the accuracy of changing the model structure to include tissue blood volumes and/or separate compartments for richly/poorly perfused tissues; therefore, we evaluated the impact of these changes on model
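As a hedged illustration of the kind of calculation the abstract describes, the steady-state volume of distribution in generic PBPK models is commonly assembled from tissue:plasma partition coefficients (Kp) and tissue volumes. The sketch below uses this standard textbook relation; the tissue volumes and Kp values are illustrative placeholders, not values from the httk library or the study.

```python
# Sketch: Vd,ss (L/kg) = V_plasma + sum over tissues of V_t * Kp_t.
# All numbers below are illustrative assumptions, not measured data.

def volume_of_distribution(v_plasma, tissue_volumes, partition_coeffs):
    """Steady-state volume of distribution from tissue Kp values."""
    return v_plasma + sum(
        tissue_volumes[t] * partition_coeffs[t] for t in tissue_volumes
    )

tissue_volumes = {"liver": 0.02, "muscle": 0.40, "adipose": 0.20}   # L/kg, illustrative
partition_coeffs = {"liver": 5.0, "muscle": 1.5, "adipose": 10.0}   # unitless, illustrative

vd = volume_of_distribution(0.04, tissue_volumes, partition_coeffs)
print(round(vd, 2))  # 0.04 + 0.1 + 0.6 + 2.0 = 2.74 L/kg
```

A calibration of the kind the abstract reports would adjust the Kp inputs (e.g., for phospholipid partitioning and plasma binding) before this summation.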
Alpine, Lucy M; Caldas, Francieli Tanji; Barrett, Emer M
2018-04-02
The objective of the study was to investigate student and practice educator evaluations of practice placements using a structured 2 to 1 supervision and implementation model. Cross-sectional pilot study set in clinical sites providing placements for physiotherapy students in Ireland. Students and practice educators completing a 2 to 1 peer placement between 2013 and 2015 participated. A self-reported questionnaire which measured indicators linked to quality-assured placements was used. Three open-ended questions captured comments on the benefits and challenges associated with the 2 to 1 model. Ten students (10/20; 50% response rate) and 10 practice educators (10/10; 100% response rate) responded to the questionnaire. Student responses included four pairs of students and one student from a further two pairs. There was generally positive agreement with the questionnaire, indicating that placements using the 2 to 1 model were positively evaluated by participants. There were no significant differences between students and practice educators. The main benefits of the 2 to 1 model were shared learning experiences, a peer-supported environment, and the development of peer evaluation and feedback skills by students. A key component of the model was the peer scripting process, which provided time for reflection, self-evaluation, and peer review. 2 to 1 placements were positively evaluated by students and educators when supported by a structured supervision model. Clear guidance to students on the provision of peer feedback, and support for educators providing feedback to two different students, is recommended.
Evaluating Air-Quality Models: Review and Outlook.
NASA Astrophysics Data System (ADS)
Weil, J. C.; Sykes, R. I.; Venkatram, A.
1992-10-01
Over the past decade, much attention has been devoted to the evaluation of air-quality models, with emphasis on model performance in predicting the high concentrations that are important in air-quality regulations. This paper stems from our belief that this practice needs to be expanded to 1) evaluate model physics and 2) deal with the large natural or stochastic variability in concentration. The variability is represented by the root-mean-square fluctuating concentration (σc) about the mean concentration (C) over an ensemble, i.e., a given set of meteorological, source, etc. conditions. Most air-quality models used in applications predict C, whereas observations are individual realizations drawn from an ensemble. For σc comparable to or larger than C, large residuals exist between predicted and observed concentrations, which confuse model evaluations. This paper addresses ways of evaluating model physics in light of the large σc; the focus is on elevated point-source models. Evaluation of model physics requires the separation of the mean model error, the difference between the predicted and observed C, from the natural variability. A residual analysis is shown to be an effective way of doing this. Several examples demonstrate the usefulness of residuals as well as correlation analyses and laboratory data in judging model physics. In general, σc models and predictions of the probability distribution of the fluctuating concentration (c) are in the developmental stage, with laboratory data playing an important role. Laboratory data from point-source plumes in a convection tank show that the distribution of c approximates a self-similar form along the plume center plane, a useful result in a residual analysis. At present, there is one model, ARAP, that predicts C, σc, and the distribution of c for point-source plumes. This model is more computationally demanding than other dispersion models (for C only) and must be demonstrated as a practical tool.
However, it predicts an important quantity for applications: the uncertainty in the very high and infrequent concentrations. The uncertainty is large and is needed in evaluating operational performance and in predicting the attainment of air-quality standards.
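The separation of mean model error from stochastic variability that the abstract describes can be sketched numerically. In the toy residual analysis below, the "true" ensemble mean, the model bias, and the fluctuation magnitude are all synthetic assumptions chosen for demonstration; the point is only that the mean residual estimates the model bias while the residual spread reflects the natural variability.

```python
# Toy residual analysis: observations are single realizations drawn
# from an ensemble with mean C_true and rms fluctuation sigma_c, while
# the model predicts an ensemble mean C_pred. All values are synthetic.
import random
random.seed(1)

C_pred = 100.0   # model-predicted ensemble-mean concentration
C_true = 110.0   # assumed "true" ensemble mean (model bias = -10)
sigma_c = 30.0   # assumed rms fluctuation about the ensemble mean

observations = [random.gauss(C_true, sigma_c) for _ in range(10000)]
residuals = [obs - C_pred for obs in observations]

mean_error = sum(residuals) / len(residuals)             # estimates model bias
var = sum((r - mean_error) ** 2 for r in residuals) / len(residuals)
print(round(mean_error, 1), round(var ** 0.5, 1))        # bias near 10, spread near sigma_c
```

With many realizations, the mean residual converges to the model bias and the residual standard deviation to σc, which is why residuals can judge model physics despite large scatter in individual observations.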
Askew, Deborah A; Jackson, Claire L; Ware, Robert S; Russell, Anthony
2010-05-24
Type 2 Diabetes Mellitus is one of the most disabling chronic conditions worldwide, resulting in significant human, social and economic costs and placing huge demands on health care systems. The Inala Chronic Disease Management Service aims to improve the efficiency and effectiveness of care for patients with type 2 diabetes who have been referred by their general practitioner to a specialist diabetes outpatient clinic. Care is provided by a multidisciplinary, integrated team consisting of an endocrinologist, diabetes nurse educators, General Practitioner Clinical Fellows (general practitioners who have undertaken focussed post-graduate training in complex diabetes care), and allied health personnel (a dietitian, podiatrist and psychologist). Using a geographical control, this evaluation study tests the impact of this model of diabetes care provided by the service on patient outcomes compared to usual care provided at the specialist diabetes outpatient clinic. Data collection at baseline, 6 and 12-months will compare the primary outcome (glycaemic control) and secondary outcomes (serum lipid profile, blood pressure, physical activity, smoking status, quality of life, diabetes self-efficacy and cost-effectiveness). This model of diabetes care combines the patient focus and holistic care valued by the primary care sector with the specialised knowledge and skills of hospital diabetes care. Our study will provide empirical evidence about the clinical effectiveness of this model of care. Australian New Zealand Clinical Trials Registry ACTRN12608000010392.
Treweek, Shaun; Bonetti, Debbie; Maclennan, Graeme; Barnett, Karen; Eccles, Martin P; Jones, Claire; Pitts, Nigel B; Ricketts, Ian W; Sullivan, Frank; Weal, Mark; Francis, Jill J
2014-03-01
To evaluate the robustness of the intervention modeling experiment (IME) methodology as a way of developing and testing behavioral change interventions before a full-scale trial by replicating an earlier paper-based IME. Web-based questionnaire and clinical scenario study. General practitioners across Scotland were invited to complete the questionnaire and scenarios, which were then used to identify predictors of antibiotic-prescribing behavior. These predictors were compared with the predictors identified in an earlier paper-based IME and used to develop a new intervention. Two hundred seventy general practitioners completed the questionnaires and scenarios. The constructs that predicted simulated behavior and intention were attitude, perceived behavioral control, risk perception/anticipated consequences, and self-efficacy, which match the targets identified in the earlier paper-based IME. The choice of persuasive communication as an intervention in the earlier IME was also confirmed. Additionally, a new intervention, an action plan, was developed. A web-based IME replicated the findings of an earlier paper-based IME, which provides confidence in the IME methodology. The interventions will now be evaluated in the next stage of the IME, a web-based randomized controlled trial. Copyright © 2014 Elsevier Inc. All rights reserved.
A psychoanalyst-liaison psychiatrist's overview of DSM III.
Grossman, S
1982-12-01
There has been a groundswell of reaction, mostly favorable, to the most recent edition of the Diagnostic and Statistical Manual. In this paper the author attempts to evaluate the Manual, which purports to provide an atheoretical and descriptive diagnostic model on its own ground, as well as from a psychodynamic and ego-psychological point of view. More specifically, the Manual is evaluated in terms of its usefulness in the diagnosis and management of the Somatoform Disorders, as well as other typical general hospital problems. This assessment raises many questions about the basic tenets of DSM III in general. Concomitantly, suggestions are made to enhance the reliability, validity, and clinical relevance of the Manual.
The role of decision analytic modeling in the health economic assessment of spinal intervention.
Edwards, Natalie C; Skelly, Andrea C; Ziewacz, John E; Cahill, Kevin; McGirt, Matthew J
2014-10-15
Narrative review. To review the common tenets, strengths, and weaknesses of decision modeling for health economic assessment and to review the use of decision modeling in the spine literature to date. For the majority of spinal interventions, well-designed prospective, randomized, pragmatic cost-effectiveness studies that address the specific decision-in-need are lacking. Decision analytic modeling allows for the estimation of cost-effectiveness based on data available to date. Given the rising demands for proven value in spine care, the use of decision analytic modeling is rapidly increasing by clinicians and policy makers. This narrative review discusses the general components of decision analytic models, how decision analytic models are populated and the trade-offs entailed, makes recommendations for how users of spine intervention decision models might go about appraising the models, and presents an overview of published spine economic models. A proper, integrated, clinical, and economic critical appraisal is necessary in the evaluation of the strength of evidence provided by a modeling evaluation. As is the case with clinical research, all options for collecting health economic or value data are not without their limitations and flaws. There is substantial heterogeneity across the 20 spine intervention health economic modeling studies summarized with respect to study design, models used, reporting, and general quality. There is sparse evidence for populating spine intervention models. Results mostly showed that interventions were cost-effective based on $100,000/quality-adjusted life-year threshold. Spine care providers, as partners with their health economic colleagues, have unique clinical expertise and perspectives that are critical to interpret the strengths and weaknesses of health economic models. 
Health economic models must be critically appraised for both clinical validity and economic quality before altering health care policy, payment strategies, or patient care decisions. Level of Evidence: 4.
A collision model for safety evaluation of autonomous intelligent cruise control.
Touran, A; Brackstone, M A; McDonald, M
1999-09-01
This paper describes a general framework for safety evaluation of autonomous intelligent cruise control in rear-end collisions. Using data and specifications from prototype devices, two collision models are developed. One model considers a train of four cars, one of which is equipped with autonomous intelligent cruise control. This model considers the car in front and two cars following the equipped car. In the second model, none of the cars is equipped with the device. Each model can predict the possibility of rear-end collision between cars under various conditions by calculating the remaining distance between cars after the front car brakes. Comparing the two collision models allows one to evaluate the effectiveness of autonomous intelligent cruise control in preventing collisions. The models are then subjected to Monte Carlo simulation to calculate the probability of collision. Based on crash probabilities, an expected value is calculated for the number of cars involved in any collision. It is found that given the model assumptions, while equipping a car with autonomous intelligent cruise control can significantly reduce the probability of the collision with the car ahead, it may adversely affect the situation for the following cars.
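The Monte Carlo step the abstract describes can be illustrated with a minimal rear-end-collision sketch: sample driver reaction time and braking decelerations, compute stopping distances, and count the trials in which the following car overruns the available gap. The kinematics are standard; every distribution and parameter value below is an illustrative assumption, not taken from the paper's prototype-device specifications.

```python
# Minimal Monte Carlo sketch of rear-end collision probability.
# Parameter ranges are illustrative assumptions for demonstration only.
import random
random.seed(42)

def collision_probability(n_trials=100000, gap=30.0, speed=25.0):
    """Fraction of trials in which the follower overruns the gap.

    gap: initial spacing (m); speed: both cars' speed (m/s).
    """
    collisions = 0
    for _ in range(n_trials):
        t_react = random.uniform(0.5, 1.5)    # s, follower reaction time
        a_lead = random.uniform(4.0, 8.0)     # m/s^2, lead-car braking
        a_follow = random.uniform(4.0, 8.0)   # m/s^2, follower braking
        d_lead = speed ** 2 / (2 * a_lead)                      # lead stopping distance
        d_follow = speed * t_react + speed ** 2 / (2 * a_follow)  # reaction + braking
        if d_follow > d_lead + gap:           # follower needs more road than it has
            collisions += 1
    return collisions / n_trials

p = collision_probability()
print(round(p, 3))
```

A device such as autonomous intelligent cruise control would enter a model like this by shortening the effective reaction time (or triggering braking earlier), which directly lowers d_follow and hence the collision count.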
UH-60A Black Hawk engineering simulation program. Volume 1: Mathematical model
NASA Technical Reports Server (NTRS)
Howlett, J. J.
1981-01-01
A nonlinear mathematical model of the UH-60A Black Hawk helicopter was developed. This mathematical model, which was based on the Sikorsky General Helicopter (Gen Hel) Flight Dynamics Simulation, provides NASA with an engineering simulation for performance and handling qualities evaluations. The mathematical model is a total systems definition of the Black Hawk helicopter represented at a uniform level of sophistication considered necessary for handling qualities evaluations. The model is a total force, large angle representation in six rigid body degrees of freedom. Rotor blade flapping, lagging, and hub rotational degrees of freedom are also represented. In addition to the basic helicopter modules, supportive modules were defined for the landing interface, power unit, ground effects, and gust penetration. Information defining the cockpit environment relevant to pilot-in-the-loop simulation is presented.
Johnson, Lenora; Ousley, Anita; Swarz, Jeffrey; Bingham, Raymond J; Erickson, J Bianca; Ellis, Steven; Moody, Terra
2011-03-01
Cancer education is a constantly evolving field, as science continues to advance both our understanding of cancer and its effects on patients, families, and communities. Moving discoveries to practice expeditiously is paramount to impacting cancer outcomes. The continuing education of cancer care professionals throughout their practice life is vital to facilitating the adoption of therapeutic innovations. Meanwhile, more general educational programs serve to keep cancer patients, their families, and the public informed of the latest findings in cancer research. The National Cancer Institute conducted an assessment of the current knowledge base for cancer education which involved two literature reviews, one of the general literature of the evaluation of medical and health education efforts, and the other of the preceding 5 years of the Journal of Cancer Education (JCE). These reviews explored a wide range of educational models and methodologies. In general, those that were most effective used multiple methodologies, interactive techniques, and multiple exposures over time. Less than one third of the articles in the JCE reported on a cancer education or communication product, and of these, only 70% had been evaluated for effectiveness. Recommendations to improve the evaluation of cancer education and the educational focus of the JCE are provided.
An application of a two-equation model of turbulence to three-dimensional chemically reacting flows
NASA Technical Reports Server (NTRS)
Lee, J.
1994-01-01
A numerical study of three dimensional chemically reacting and non-reacting flowfields is conducted using a two-equation model of turbulence. A generalized flow solver using an implicit Lower-Upper (LU) diagonal decomposition numerical technique and finite-rate chemistry has been coupled with a low-Reynolds number two-equation model of turbulence. This flow solver is then used to study chemically reacting turbulent supersonic flows inside combustors with synergetic fuel injectors. The reacting and non-reacting turbulent combustor solutions obtained are compared with zero-equation turbulence model solutions and with available experimental data. The hydrogen-air chemistry is modeled using a nine-species/eighteen reaction model. A low-Reynolds number k-epsilon model was used to model the effect of turbulence because, in general, the low-Reynolds number k-epsilon models are easier to implement numerically and are far more general than algebraic models. However, low-Reynolds number k-epsilon models require a much finer near-wall grid resolution than high-Reynolds number models to resolve accurately the near-wall physics. This is especially true in complex flowfields, where the stiff nature of the near-wall turbulence must be resolved. Therefore, the limitations imposed by the near-wall characteristics and compressible model corrections need to be evaluated further. The gradient-diffusion hypothesis is used to model the effects of turbulence on the mass diffusion process. The influence of this low-Reynolds number turbulence model on the reacting flowfield predictions was studied parametrically.
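For context on the two-equation closure discussed above, a generic textbook form of the k-epsilon transport equations (not necessarily the exact low-Reynolds-number variant used in this work) is:

```latex
\frac{\partial(\rho k)}{\partial t}
  + \frac{\partial(\rho u_j k)}{\partial x_j}
  = \frac{\partial}{\partial x_j}\!\left[\left(\mu + \frac{\mu_t}{\sigma_k}\right)
      \frac{\partial k}{\partial x_j}\right]
  + P_k - \rho\varepsilon,
\qquad
\frac{\partial(\rho\varepsilon)}{\partial t}
  + \frac{\partial(\rho u_j \varepsilon)}{\partial x_j}
  = \frac{\partial}{\partial x_j}\!\left[\left(\mu + \frac{\mu_t}{\sigma_\varepsilon}\right)
      \frac{\partial \varepsilon}{\partial x_j}\right]
  + C_{\varepsilon 1}\, f_1 \frac{\varepsilon}{k} P_k
  - C_{\varepsilon 2}\, f_2\, \rho \frac{\varepsilon^2}{k},
\qquad
\mu_t = C_\mu f_\mu \rho \frac{k^2}{\varepsilon}.
```

Low-Reynolds-number variants differ chiefly in the damping functions f_mu, f_1, f_2, which enforce the correct near-wall limiting behavior and are the source of the fine near-wall grid requirement noted in the abstract.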
NASA Technical Reports Server (NTRS)
Burk, S. M., Jr.; Wilson, C. F., Jr.
1975-01-01
A relatively inexpensive radio-controlled model stall/spin test technique was developed. Operational experiences using the technique are presented. A discussion of model construction techniques, spin-recovery parachute system, data recording system, and movie camera tracking system is included. Also discussed are a method of measuring moments of inertia, scaling of engine thrust, cost and time required to conduct a program, and examples of the results obtained from the flight tests.
da Silva Fiorin, Fernando; do Espírito Santo, Caroline Cunha; Santos, Adair Roberto Soares; Fighera, Michele Rechia; Royes, Luiz Fernando Freire
2018-06-13
This study demonstrated the effects of traumatic brain injury (TBI) and each step of the surgical procedure for a fluid percussion injury (FPI) model on periorbital allodynia. Adult male Wistar rats were divided into naive, incision, scraping, sham-TBI and TBI groups. Periorbital allodynia was evaluated using von Frey filaments, and heat hyperalgesia of the hindpaws was evaluated by a plantar test apparatus. The statistical analyses revealed that the surgical procedure decreased von Frey filament thresholds twenty-four hours after the surgery in all groups when compared to the naive group (p < 0.0001). The scraping, sham-TBI and TBI groups showed a decrease in the periorbital mechanical threshold for 35 days compared with the naive and incision groups (p < 0.0001). Only the TBI group demonstrated a significant difference in periorbital allodynia at 45 and 60 days after the injury (p < 0.01). A significant decrease in the thermal withdrawal latency of the hindpaw contralateral to the lesion was observed in the TBI group compared with the naive group at 7 days and 28 days after the lesion (p < 0.05). This study presented in detail the effects of each stage of the surgical procedure for a FPI model on periorbital allodynia over time and characterized the TBI model for this evaluation. The FPI model is relevant for the study of headache and generalized pain in both acute and chronic phases after an injury. Copyright © 2018 Elsevier B.V. All rights reserved.
Afshin Pourmokhtarian; Charles T. Driscoll; John L. Campbell; Katharine Hayhoe; Anne M. K. Stoner; Mary Beth Adams; Douglas Burns; Ivan Fernandez; Myron J. Mitchell; James B. Shanley
2016-01-01
A cross-site analysis was conducted on seven diverse, forested watersheds in the northeastern United States to evaluate hydrological responses (evapotranspiration, soil moisture, seasonal and annual streamflow, and water stress) to projections of future climate. We used output from four atmosphere-ocean general circulation models (AOGCMs; CCSM4, HadGEM2-CC, MIROC5, and...
ERIC Educational Resources Information Center
Sideridis, Georgios D.
2016-01-01
The purpose of the present studies was to test the hypothesis that the psychometric characteristics of ability scales may be significantly distorted if one accounts for emotional factors during test taking. Specifically, the present studies evaluate the effects of anxiety and motivation on the item difficulties of the Rasch model. In Study 1, the…
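As a brief aside on the model the abstract references: in the Rasch model, the probability that a person of ability theta answers an item of difficulty b correctly is exp(theta - b) / (1 + exp(theta - b)). The sketch below illustrates this standard formula; the ability and difficulty values are illustrative, not estimates from the study.

```python
# The Rasch model: P(correct) = exp(theta - b) / (1 + exp(theta - b)),
# where theta is person ability and b is item difficulty (logits).
# Values below are illustrative, not from the cited studies.
import math

def rasch_probability(theta, b):
    """Probability of a correct response under the Rasch model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

print(round(rasch_probability(0.0, 0.0), 2))   # 0.5: ability equals difficulty
print(round(rasch_probability(1.0, -1.0), 2))  # easier item, higher ability
```

Distortions of the kind the studies investigate would appear as shifts in the estimated b values when emotional covariates such as anxiety are added to the measurement model.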
2013-04-01
Forces can be computed at specific angular positions, and geometrical parameters can be evaluated. Much higher resolution models are required, along... composition engines (C#, C++, Python, Java). Desert operates on the CyPhy model, converting from a design space alternative structure to a set of design... consists of scripts to execute Dymola, post-processing of results to create metrics, and general management of the job sequence. An earlier version created
ERIC Educational Resources Information Center
Gulick, Thomas G.; Merkle, Melanie L.
An evaluation of the instructional materials used by high school and college students who participated in the Model United Nations Program showed that the program is uncritical of the United Nations (U.N.) and biased against the United States and the West in general. These materials are strongly promoted by many prominent educational professional…
ERIC Educational Resources Information Center
Truckenmiller, James L.
The former HEW National Strategy for Youth Development Model was a community-based planning and procedural tool designed to enhance positive youth development and prevent delinquency through a process of youth needs assessment, development of targeted programs, and program impact evaluation. A series of 12 Impact Scales most directly reflect the…
2010-12-18
Integrated for 20 years after initialization from rest and January temperature (T) and salinity (S) from the Generalized Digital Environmental Model (GDEM)... coordinates, Ocean Modell., 37, 55-88. Carnes, M. R. (2009), Description and evaluation of GDEM-V 3.0, Tech. Rep. 724/NRL/MR/7300-09-9165, Nav. Res. Lab
Evaluation of a Mineral Dust Simulation in the Atmospheric-Chemistry General Circulation Model-EMAC
NASA Astrophysics Data System (ADS)
Abdel Kader, M.; Astitha, M.; Lelieveld, J.
2012-04-01
This study presents an evaluation of the atmospheric mineral dust cycle in the Atmospheric Chemistry General Circulation Model (AC-GCM) using a newly developed dust emissions scheme. The dust cycle, as an integral part of the Earth system, plays an important role in the Earth's energy balance in both direct and indirect ways. As an aerosol, dust significantly impacts the absorption and scattering of radiation in the atmosphere and can modify the optical properties of clouds and snow/ice surfaces. In addition, dust contributes to a range of physical, chemical and bio-geological processes that interact with the cycles of carbon and water. While our knowledge of the dust cycle, its impacts and its interactions with the other global-scale bio-geochemical cycles has greatly advanced in recent decades, large uncertainties and knowledge gaps still exist. Improving the dust simulation in global models is essential to minimize the uncertainties in model results related to dust. In this study, the results are based on ECHAM5 Modular Earth Submodel System (MESSy) AC-GCM simulations using T106L31 spectral resolution (about 120 km) with 31 vertical levels. The GMXe aerosol submodel is used to simulate the phase changes of the dust particles between soluble and insoluble modes. Dust emission, transport and deposition (wet and dry) are calculated on-line along with the meteorological parameters in every model time step. A preliminary evaluation of the dust concentration and deposition is presented based on ground observations from various campaigns, as well as an evaluation of the optical properties of dust using AERONET and satellite (MODIS and MISR) observations. Preliminary results show good agreement with observations for dust deposition and optical properties. In addition, the global dust emissions, load, deposition and lifetime are in good agreement with published results.
Also, the uncertainties in the dust cycle that contribute to the overall model performance will be briefly discussed, as they are a subject of future work.
Teaching Computation/Shopping Skills to Mentally Retarded Adults.
ERIC Educational Resources Information Center
Matson, Johnny L.; Long, Sue
1986-01-01
Three moderately/mildly retarded adults were trained in adaptive community skills. Treatment involved instructions, performance feedback, social reinforcement, in-vivo modeling, self-evaluation, and social and tangible reinforcement. Rapid and dramatic improvements occurred soon after treatment began. Skills generalized to other shopping…
NASA Technical Reports Server (NTRS)
Shipman, D. L.
1972-01-01
The development of a model to simulate the information system of a program management type of organization is reported. The model statistically determines the following parameters: types of messages, destinations, delivery durations, types of processing, processing durations, communication channels, outgoing messages, and priorities. The total management information system of the program management organization is considered, including formal and informal information flows and both facilities and equipment. The model is written in General Purpose System Simulation 2 computer programming language for use on the Univac 1108, Executive 8 computer. The model is simulated on a daily basis and collects queue and resource utilization statistics for each decision point. The statistics are then used by management to evaluate proposed resource allocations, to evaluate proposed changes to the system, and to identify potential problem areas. The model employs both empirical and theoretical distributions which are adjusted to simulate the information flow being studied.
[Multi-center study of the Jenaer model of the temporal bone].
Schneider, G; Müller, A
2004-06-01
Dissection exercises on the temporal bone are a prerequisite for learning the special anatomical features of this region and the fundamentals of tympanic cavity surgery. However, because fewer human temporal bones have become available, substitute models have been sought in recent years. Based on experience with the handling and visualization of CT data for 3D implant construction in the ENT department in Jena, a temporal bone model was developed. The model was sent to surgeons at different levels of training, who evaluated it by identifying anatomical structures and rating general parameters with a point system. The Jenaer temporal bone model is suitable as an introduction to dissection exercises: the anatomical structures are easy for beginners to identify, and handling of the drill and chisel can be learned.
Overview of the DAEDALOS project
NASA Astrophysics Data System (ADS)
Bisagni, Chiara
2015-10-01
The "Dynamics in Aircraft Engineering Design and Analysis for Light Optimized Structures" (DAEDALOS) project aimed to develop methods and procedures to determine dynamic loads by considering the effects of dynamic buckling, material damping and mechanical hysteresis during aircraft service. Advanced analysis and design principles were assessed with the aim of partly removing the uncertainty and the conservatism of today's design and certification procedures. To reach these objectives a DAEDALOS aircraft model representing a mid-size business jet was developed. Analysis and in-depth investigation of the dynamic response were carried out on full finite element models and on hybrid models. Material damping was experimentally evaluated, and different methods for damping evaluation were developed, implemented in finite element codes and experimentally validated. They include a strain energy method, a quasi-linear viscoelastic material model, and a generalized Maxwell viscous material damping. Panels and shells representative of typical components of the DAEDALOS aircraft model were experimentally tested under static as well as dynamic loads. Composite and metallic components of the aircraft model were investigated to evaluate the benefit in terms of weight saving.
Structural Equation Modeling: A Framework for Ocular and Other Medical Sciences Research
Christ, Sharon L.; Lee, David J.; Lam, Byron L.; Zheng, D. Diane
2017-01-01
Structural equation modeling (SEM) is a modeling framework that encompasses many types of statistical models and can accommodate a variety of estimation and testing methods. SEM has been used primarily in social sciences but is increasingly used in epidemiology, public health, and the medical sciences. SEM provides many advantages for the analysis of survey and clinical data, including the ability to model latent constructs that may not be directly observable. Another major feature is simultaneous estimation of parameters in systems of equations that may include mediated relationships, correlated dependent variables, and in some instances feedback relationships. SEM allows for the specification of theoretically holistic models because multiple and varied relationships may be estimated together in the same model. SEM has recently expanded by adding generalized linear modeling capabilities that include the simultaneous estimation of parameters of different functional form for outcomes with different distributions in the same model. Therefore, mortality modeling and other relevant health outcomes may be evaluated. Random effects estimation using latent variables has been advanced in the SEM literature and software. In addition, SEM software has increased estimation options. Therefore, modern SEM is quite general and includes model types frequently used by health researchers, including generalized linear modeling, mixed effects linear modeling, and population average modeling. This article does not present any new information. It is meant as an introduction to SEM and its uses in ocular and other health research. PMID:24467557
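The mediated relationships that the abstract highlights can be illustrated outside any SEM package: for observed variables, the indirect effect of x on y through a mediator m is the product of the x→m and m→y path coefficients, each obtainable by least squares. The sketch below is a minimal hand-rolled illustration with made-up data, not the full latent-variable machinery of SEM.

```python
def ols2(y, x1, x2):
    """Least squares of y on x1 and x2 plus an intercept, via 3x3 normal equations."""
    n = len(y)
    cols = [[1.0] * n, x1, x2]
    A = [[sum(a * b for a, b in zip(ci, cj)) for cj in cols] for ci in cols]
    c = [sum(a * b for a, b in zip(ci, y)) for ci in cols]
    for i in range(3):                      # forward elimination
        for j in range(i + 1, 3):
            f = A[j][i] / A[i][i]
            A[j] = [aj - f * ai for aj, ai in zip(A[j], A[i])]
            c[j] -= f * c[i]
    b = [0.0] * 3                           # back substitution
    for i in (2, 1, 0):
        b[i] = (c[i] - sum(A[i][j] * b[j] for j in range(i + 1, 3))) / A[i][i]
    return b  # [intercept, coef_x1, coef_x2]

def ols1(y, x):
    n = len(y)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((a - mx) * (c - my) for a, c in zip(x, y)) / sum((a - mx) ** 2 for a in x)
    return my - b * mx, b

# Toy mediation data: x -> m -> y, plus a direct x -> y path
x = [float(i) for i in range(10)]
e = [1.0 if i % 2 == 0 else -1.0 for i in range(10)]   # deterministic "noise"
m = [2.0 * xi + ei for xi, ei in zip(x, e)]            # a-path approx. 2
y = [1.0 * xi + 3.0 * mi for xi, mi in zip(x, m)]      # direct = 1, b-path = 3

_, a_path = ols1(m, x)                 # x -> m
_, direct, b_path = ols2(y, x, m)      # y on x and m jointly
indirect = a_path * b_path             # mediated effect of x on y via m
```

In a full SEM, both equations would be estimated simultaneously and the variables could be latent; the product-of-paths logic is the same.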
Hagopian, Louis P.; Rooker, Griffin W.; Zarcone, Jennifer R.; Bonner, Andrew C.; Arevalo, Alexander R.
2017-01-01
Hagopian, Rooker, and Zarcone (2015) evaluated a model for subtyping automatically reinforced self-injurious behavior (SIB) based on its sensitivity to changes in functional analysis conditions and the presence of self-restraint. The current study tested the generality of the model by applying it to all datasets of automatically reinforced SIB published from 1982 to 2015. We identified 49 datasets that included sufficient data to permit subtyping. Similar to the original study, Subtype-1 SIB was generally amenable to treatment using reinforcement alone, whereas Subtype-2 SIB was not. Conclusions could not be drawn about Subtype-3 SIB due to the small number of datasets. Nevertheless, the findings support the generality of the model and suggest that sensitivity of SIB to disruption by alternative reinforcement is an important dimension of automatically reinforced SIB. Findings also suggest that automatically reinforced SIB should no longer be considered a single category and that additional research is needed to better understand and treat Subtype-2 SIB. PMID:28032344
Barton, Alan J; Valdés, Julio J; Orchard, Robert
2009-01-01
Classical neural networks are composed of neurons whose nature is determined by a certain function (the neuron model), usually pre-specified. In this paper, a type of neural network (NN-GP) is presented in which: (i) each neuron may have its own neuron model in the form of a general function, (ii) any layout (i.e., network interconnection) is possible, and (iii) no bias nodes or weights are associated with the connections, neurons or layers. The general functions associated with a neuron are learned by searching a function space. They are not provided a priori, but are rather built as part of an Evolutionary Computation process based on Genetic Programming. The resulting network solutions are evaluated based on a fitness measure, which may, for example, be based on classification or regression errors. Two real-world examples are presented to illustrate the promising behaviour on classification problems via construction of a low-dimensional representation of a high-dimensional parameter space associated with the set of all network solutions.
Automatic specification of reliability models for fault-tolerant computers
NASA Technical Reports Server (NTRS)
Liceaga, Carlos A.; Siewiorek, Daniel P.
1993-01-01
The calculation of reliability measures using Markov models is required for life-critical processor-memory-switch structures that have standby redundancy or that are subject to transient or intermittent faults or repair. The task of specifying these models is tedious and prone to human error because of the large number of states and transitions required in any reasonable system. Therefore, model specification is a major analysis bottleneck, and model verification is a major validation problem. The general unfamiliarity of computer architects with Markov modeling techniques further increases the necessity of automating the model specification. Automation requires a general system description language (SDL). For practicality, this SDL should also provide a high level of abstraction and be easy to learn and use. The first attempt to define and implement an SDL with those characteristics is presented. A program named Automated Reliability Modeling (ARM) was constructed as a research vehicle. The ARM program uses a graphical interface as its SDL, and it outputs a Markov reliability model specification formulated for direct use by programs that generate and evaluate the model.
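A flavor of the Markov reliability models such a tool generates: below is a hand-built (not ARM-generated) three-state chain for a duplex processor with a standby spare, evaluated by repeated matrix-vector multiplication. The per-step failure probability and coverage factor are illustrative assumptions, not values from the paper.

```python
# States: 0 = both units good, 1 = running on the spare, 2 = system failed.
P_FAIL = 1e-4   # illustrative probability the active unit fails in one step
COVER = 0.99    # illustrative probability a failure is detected and covered

trans = [
    [1 - P_FAIL, P_FAIL * COVER, P_FAIL * (1 - COVER)],
    [0.0,        1 - P_FAIL,     P_FAIL],
    [0.0,        0.0,            1.0],    # the failed state is absorbing
]

def step(p, t):
    """One step of the chain: new distribution p' with p'[j] = sum_i p[i] t[i][j]."""
    return [sum(p[i] * t[i][j] for i in range(len(p))) for j in range(len(p))]

def reliability(steps):
    """Probability the system has not reached the failed state after `steps` steps."""
    p = [1.0, 0.0, 0.0]
    for _ in range(steps):
        p = step(p, trans)
    return 1.0 - p[2]
```

Real tools handle thousands of states and transient/intermittent fault transitions, which is exactly why automatic specification from a system description language matters.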
Jin, H; Wu, S; Vidyanti, I; Di Capua, P; Wu, B
2015-01-01
This article is part of the Focus Theme of Methods of Information in Medicine on "Big Data and Analytics in Healthcare". Depression is a common and often undiagnosed condition among patients with diabetes. It is also a condition that significantly impacts healthcare outcomes, use, and cost, and elevates suicide risk. Therefore, a model to predict depression among diabetes patients is a promising and valuable tool for providers to proactively assess depressive symptoms and identify those with depression. This study seeks to develop a generalized multilevel regression model, using a longitudinal data set from a recent large-scale clinical trial, to predict depression severity and the presence of major depression among patients with diabetes. Severity of depression was measured by the Patient Health Questionnaire PHQ-9 score. Predictors were selected from 29 candidate factors to develop a two-level Poisson regression model that can make population-average predictions for all patients and subject-specific predictions for individual patients with historical records. Newly obtained patient records can be incorporated with historical records to update the prediction model. Root-mean-square error (RMSE) was used to evaluate predictive accuracy of PHQ-9 scores. The study also evaluated the ability of the predicted PHQ-9 scores to classify patients as having major depression. Two time-invariant and 10 time-varying predictors were selected for the model. Incorporating historical records and using them to update the model may improve both the predictive accuracy of PHQ-9 scores and the classification ability of the predicted scores. Subject-specific predictions (for individual patients with historical records) achieved RMSE of about 4 and areas under the receiver operating characteristic (ROC) curve of about 0.9, and were better than population-average predictions.
The study developed a generalized multilevel regression model to predict depression and demonstrated that using generalized multilevel regression based on longitudinal patient records can achieve high predictive ability.
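The population-average part of such a model reduces to a Poisson regression, E[score] = exp(b0 + b1·x1 + …). A minimal single-predictor version fit by gradient ascent on the log-likelihood is sketched below; the synthetic data and learning-rate settings are illustrative assumptions, not the study's 12-predictor, two-level model.

```python
import math

def fit_poisson(xs, ys, lr=0.01, iters=30000):
    """Single-predictor Poisson regression: E[y|x] = exp(b0 + b1*x)."""
    b0 = b1 = 0.0
    for _ in range(iters):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            resid = y - math.exp(b0 + b1 * x)   # score (gradient) contribution
            g0 += resid
            g1 += resid * x
        b0 += lr * g0 / len(xs)                 # ascend the log-likelihood
        b1 += lr * g1 / len(xs)
    return b0, b1

# Synthetic "severity scores" rising with a hypothetical risk factor x
xs = [0.0, 0.25, 0.5, 0.75, 1.0, 1.25, 1.5]
ys = [3, 3, 4, 6, 7, 10, 13]
b0, b1 = fit_poisson(xs, ys)
predict = lambda x: math.exp(b0 + b1 * x)
```

A two-level model would add a patient-specific random intercept to b0, which is what lets historical records sharpen subject-specific predictions.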
A habitat suitability model for Chinese sturgeon determined using the generalized additive method
NASA Astrophysics Data System (ADS)
Yi, Yujun; Sun, Jie; Zhang, Shanghong
2016-03-01
The Chinese sturgeon is a type of large anadromous fish that migrates between the ocean and rivers. Because of the construction of dams, this sturgeon's migration path has been cut off, and this species currently is on the verge of extinction. Simulating suitable environmental conditions for spawning followed by repairing or rebuilding its spawning grounds are effective ways to protect this species. Various habitat suitability models based on expert knowledge have been used to evaluate the suitability of spawning habitat. In this study, a two-dimensional hydraulic simulation is used to inform a habitat suitability model based on the generalized additive method (GAM), which is fitted to observed data rather than expert judgment. Water depth and velocity are first calculated by the hydrodynamic model and then used as inputs to the GAM. The final habitat suitability model is validated using the catch per unit effort (CPUE) data of 1999 and 2003. The model results show that a velocity of 1.06-1.56 m/s and a depth of 13.33-20.33 m are highly suitable ranges for the Chinese sturgeon to spawn. The hydraulic habitat suitability indexes (HHSI) for seven discharges (4000; 9000; 12,000; 16,000; 20,000; 30,000; and 40,000 m3/s) are calculated to evaluate integrated habitat suitability. The results show that the integrated habitat suitability reaches its highest value at a discharge of 16,000 m3/s. This study is the first to apply a GAM to evaluate the suitability of spawning grounds for the Chinese sturgeon. The study provides a reference for the identification of potential spawning grounds in the entire basin.
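A hydraulic habitat suitability index of this kind can be sketched directly from the ranges reported above: score each grid cell's velocity and depth with a membership function that is 1 inside the preferred range and tapers outside it, then combine the scores. The taper widths and the geometric-mean combination below are illustrative assumptions, not the paper's fitted GAM.

```python
import math

def membership(x, lo, hi, taper):
    """1.0 inside [lo, hi], falling linearly to 0.0 over `taper` units outside it."""
    if lo <= x <= hi:
        return 1.0
    d = (lo - x) if x < lo else (x - hi)
    return max(0.0, 1.0 - d / taper)

def hhsi(cells):
    """Mean combined suitability over (velocity m/s, depth m) grid cells."""
    scores = [
        math.sqrt(membership(v, 1.06, 1.56, 0.5) * membership(d, 13.33, 20.33, 5.0))
        for v, d in cells
    ]
    return sum(scores) / len(scores)

# Example: one deep, fast cell (ideal) and one shallow, slow cell (unsuitable)
cells = [(1.2, 16.0), (0.4, 5.0)]
score = hhsi(cells)
```

Repeating this over the hydrodynamic model's output for each candidate discharge is how a discharge-vs-suitability curve like the paper's (peaking at 16,000 m3/s) is built up.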
Properties of added variable plots in Cox's regression model.
Lindkvist, M
2000-03-01
The added variable plot is useful for examining the effect of a covariate in regression models. The plot provides information regarding the inclusion of a covariate, and is useful in identifying influential observations on the parameter estimates. Hall et al. (1996) proposed a plot for Cox's proportional hazards model derived by regarding the Cox model as a generalized linear model. This paper proves and discusses properties of this plot. These properties make the plot a valuable tool in model evaluation. Quantities considered include parameter estimates, residuals, leverage, case influence measures and correspondence to previously proposed residuals and diagnostics.
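For an ordinary linear model, the construction behind an added variable plot is two residual vectors: residuals of the response on the covariates already in the model, plotted against residuals of the candidate covariate on those same covariates. The numeric sketch below uses one existing covariate and made-up data; the Cox-model version discussed in the paper replaces these quantities with their counterparts from the generalized-linear-model view of the Cox model.

```python
def residuals(y, x):
    """Residuals from simple least squares of y on x, with an intercept."""
    n = len(y)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) \
        / sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return [yi - (a + b * xi) for xi, yi in zip(x, y)]

# x: covariate already in the model; z: candidate covariate; y: response
x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
z = [2.0, 1.0, 4.0, 3.0, 6.0, 5.0]
y = [2.1, 2.9, 5.2, 5.8, 8.1, 8.9]

e_y = residuals(y, x)   # response adjusted for x
e_z = residuals(z, x)   # candidate adjusted for x
# The added variable plot is e_y against e_z; the least-squares slope of
# that plot equals the coefficient z would receive in the full model y ~ x + z.
slope = sum(a * b for a, b in zip(e_y, e_z)) / sum(b * b for b in e_z)
```

Points far from the fitted line in (e_z, e_y) space are exactly the observations whose influence on the candidate coefficient the plot is designed to reveal.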
Evaluation of two models for predicting elemental accumulation by arthropods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Webster, J.R.; Crossley, D.A. Jr.
1978-06-15
Two different models have been proposed for predicting elemental accumulation by arthropods. Parameters of both models can be quantified from radioisotope elimination experiments. Our analysis of the two models shows that both predict identical elemental accumulation for a whole organism, though they differ in the accumulation in body and gut. We quantified both models with experimental data from ¹³⁴Cs and ⁸⁵Sr elimination by crickets. Computer simulations of radioisotope accumulation were then compared with actual accumulation experiments. Neither model showed an exact fit to the experimental data, though both showed the general pattern of elemental accumulation.
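The body/gut distinction can be made concrete with a two-compartment intake model: tracer enters the gut, a fraction of the gut throughput is absorbed into the body, and each compartment eliminates at its own rate. All rate constants below are illustrative placeholders, not the measured cricket parameters.

```python
def simulate(intake, k_gut, k_body, absorbed_frac, steps, dt=0.1):
    """Forward-Euler accumulation in a gut + body compartment model.

    intake: tracer input rate to the gut; k_gut, k_body: first-order
    elimination rate constants; absorbed_frac: fraction of the gut
    throughput absorbed into the body.
    """
    gut = body = 0.0
    for _ in range(steps):
        flow_out = k_gut * gut                              # leaves the gut
        gut += (intake - flow_out) * dt
        body += (absorbed_frac * flow_out - k_body * body) * dt
    return gut, body

gut, body = simulate(intake=1.0, k_gut=0.5, k_body=0.05,
                     absorbed_frac=0.3, steps=5000)
whole = gut + body   # the whole-organism burden both models agree on
```

Two models that partition `whole` differently between `gut` and `body` can still produce identical whole-organism curves, which is the identity the abstract points out.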
Design and test of aircraft engine isolators for reduced interior noise
NASA Technical Reports Server (NTRS)
Unruh, J. F.; Scheidt, D. C.
1982-01-01
Improved engine vibration isolation was proposed to be the most weight- and cost-efficient retrofit structure-borne noise control measure for single-engine general aviation aircraft. A study was carried out with the following objectives: (1) develop an engine isolator design specification for reduced interior noise transmission, (2) select/design candidate isolators to meet a 15 dB noise reduction design goal, and (3) carry out a proof-of-concept evaluation test. Analytical models of the engine, vibration isolators, and engine mount structure were coupled to an empirical model of the fuselage for noise transmission evaluation. The model was used to develop an engine isolator dynamic properties design specification for reduced noise transmission. Candidate isolators were chosen from available product literature and retrofitted to a test aircraft. A laboratory-based test procedure was then developed to simulate engine-induced noise transmission in the aircraft for a proof-of-concept evaluation test. Three candidate isolator configurations were evaluated for reduced structure-borne noise transmission relative to the original equipment isolators.
Djukanovic, Ingrid; Carlsson, Jörg; Årestedt, Kristofer
2017-10-04
The HADS (Hospital Anxiety and Depression Scale) aims to measure symptoms of anxiety (HADS Anxiety) and depression (HADS Depression). The HADS is widely used but has shown ambiguous results both regarding the factor structure and sex differences in the prevalence of depressive symptoms. There is also a lack of psychometric evaluations of the HADS in non-clinical samples of older people. The aim of the study was to evaluate the factor structure of the HADS in a general population 65-80 years old and to examine the possible presence of differential item functioning (DIF) with respect to sex. This study was based on data from a Swedish sample, randomized from the total population in the age group 65-80 years (n = 6659). Confirmatory factor analyses (CFA) were performed to examine the factor structure. Ordinal regression analyses were conducted to detect DIF for sex. Reliability was examined by both ordinal and traditional Cronbach's alpha. The CFA showed that a two-factor model with cross-loadings for two items (7 and 8) had excellent model fit. Internal consistency was good in both subscales, measured with both ordinal and traditional alpha. Floor effects were present for all items. No indication of meaningful DIF regarding sex was found for either of the subscales. HADS Anxiety and HADS Depression are unidimensional measures with acceptable internal consistency and are invariant with regard to sex. Despite pronounced floor effects and cross-loadings for items 7 and 8, the hypothesized two-factor model of the HADS can be recommended to assess psychological distress in a general population 65-80 years old.
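The traditional Cronbach's alpha reported above is straightforward to compute from raw item scores; a minimal version is sketched below with invented Likert-type data (the ordinal alpha variant additionally requires a polychoric correlation matrix, which is omitted here).

```python
def variance(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """items: one list of scores per item, respondents aligned by index."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    return k / (k - 1) * (1 - sum(variance(i) for i in items) / variance(totals))

# Invented data: 4 items, 5 respondents, 0-3 response scale
items = [
    [0, 1, 2, 3, 3],
    [0, 1, 1, 3, 2],
    [1, 1, 2, 2, 3],
    [0, 0, 2, 3, 3],
]
alpha = cronbach_alpha(items)
```

Alpha approaches 1 as items covary strongly relative to their individual variances, which is why it serves as the internal-consistency summary for each HADS subscale.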
Innovation evaluation model for macro-construction sector companies: A study in Spain.
Zubizarreta, Mikel; Cuadrado, Jesús; Iradi, Jon; García, Harkaitz; Orbe, Aimar
2017-04-01
The innovativeness of the traditional construction sector, composed of construction companies or contractors, is not one of its strong points. Likewise, its poor productivity in comparison with other sectors, such as manufacturing, has historically been criticized. Similar features are found in the Spanish traditional construction sector, which has been described as not very innovative. However, certain characteristics of the sector may explain this behavior: its companies invest less in R+D than those in other sectors and register fewer patents, so traditional innovation evaluation indicators do not reflect the true extent of its innovative activity. While previous research has focused on general innovation evaluation models, limited research has been done on innovation evaluation in the macro-construction sector, which includes, apart from the traditional construction companies or contractors, all companies related to the infrastructure life-cycle. Therefore, in this research an innovation evaluation model has been developed for macro-construction sector companies and is applied to the Spanish case. The model may be applied to macro-construction sector companies in other countries, requiring adaptation of the model to the specific characteristics of the sector in that country, in consultation with a panel of experts at the national level. Copyright © 2016 Elsevier Ltd. All rights reserved.
On the Formulation of Anisotropic-Polyaxial Failure Criteria: A Comparative Study
NASA Astrophysics Data System (ADS)
Parisio, Francesco; Laloui, Lyesse
2018-02-01
The correct representation of the failure of geomaterials that feature strength anisotropy and polyaxiality is crucial for many applications. In this contribution, we propose and evaluate through a comparative study a generalized framework that covers both features. Polyaxiality of strength is modeled with a modified Van Eekelen approach, while the anisotropy is modeled using a fabric tensor approach of the Pietruszczak and Mroz type. Both approaches share the same philosophy as they can be applied to simpler failure surfaces, allowing great flexibility in model formulation. The new failure surface is tested against experimental data and its performance compared against classical failure criteria commonly used in geomechanics. Our study finds that the global error between predictions and data is generally smaller for the proposed framework compared to other classical approaches.
Hare, Jonathan A.; Wuenschel, Mark J.; Kimball, Matthew E.
2012-01-01
We couple a species range limit hypothesis with the output of an ensemble of general circulation models to project the poleward range limit of gray snapper. Using laboratory-derived thermal limits and statistical downscaling from IPCC AR4 general circulation models, we project that gray snapper will shift northwards; the magnitude of this shift is dependent on the magnitude of climate change. We also evaluate the uncertainty in our projection and find that statistical uncertainty associated with the experimentally-derived thermal limits is the largest contributor (∼ 65%) to overall quantified uncertainty. This finding argues for more experimental work aimed at understanding and parameterizing the effects of climate change and variability on marine species. PMID:23284974
A Complex Systems Model Approach to Quantified Mineral Resource Appraisal
Gettings, M.E.; Bultman, M.W.; Fisher, F.S.
2004-01-01
For federal and state land management agencies, mineral resource appraisal has evolved from value-based to outcome-based procedures wherein the consequences of resource development are compared with those of other management options. Complex systems modeling is proposed as a general framework in which to build models that can evaluate outcomes. Three frequently used methods of mineral resource appraisal (subjective probabilistic estimates, weights of evidence modeling, and fuzzy logic modeling) are discussed to obtain insight into methods of incorporating complexity into mineral resource appraisal models. Fuzzy logic and weights of evidence are most easily utilized in complex systems models. A fundamental product of new appraisals is the production of reusable, accessible databases and methodologies so that appraisals can easily be repeated with new or refined data. The data are representations of complex systems and must be so regarded if all of their information content is to be utilized. The proposed generalized model framework is applicable to mineral assessment and other geoscience problems. We begin with a (fuzzy) cognitive map using (+1,0,-1) values for the links and evaluate the map for various scenarios to obtain a ranking of the importance of various links. Fieldwork and modeling studies identify important links and help identify unanticipated links. Next, the links are given membership functions in accordance with the data. Finally, processes are associated with the links; ideally, the controlling physical and chemical events and equations are found for each link. After calibration and testing, this complex systems model is used for predictions under various scenarios.
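The first step described above, a cognitive map with (+1, 0, -1) links evaluated under scenarios, can be sketched in a few lines. The concepts and signed links below are invented placeholders for illustration, not the authors' actual map.

```python
import math

# Invented concepts; links[i][j] = signed influence of concept i on concept j
concepts = ["exploration", "deposit_evidence", "land_restrictions", "development"]
links = [
    # expl  evid  restr  devel
    [  0,   +1,    0,    +1],   # exploration
    [  0,    0,    0,    +1],   # deposit evidence
    [ -1,    0,    0,    -1],   # land restrictions
    [  0,    0,   +1,     0],   # development
]

def run_map(state, clamped, iters=50):
    """Iterate a_j <- tanh(sum_i a_i * w_ij), holding scenario drivers fixed."""
    n = len(state)
    for _ in range(iters):
        nxt = [math.tanh(sum(state[i] * links[i][j] for i in range(n)))
               for j in range(n)]
        for k, v in clamped.items():    # clamp the scenario inputs
            nxt[k] = v
        state = nxt
    return state

# Scenario comparison: strong exploration, with and without land restrictions
baseline = run_map([0.0] * 4, clamped={0: 1.0})
restricted = run_map([0.0] * 4, clamped={0: 1.0, 2: 1.0})
```

Comparing equilibrium activations across scenarios is what ranks link importance; the later steps in the abstract then replace the crisp (+1, 0, -1) weights with membership functions and, ultimately, process equations.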
NASA Astrophysics Data System (ADS)
Demuzere, Matthias; Harshan, Suraj; Järvi, Leena; Roth, Matthias; Betham Grimmond, Christine Susan; Masson, Valéry; Oleson, Keith; Velasco Saldana, Hector Erik; Wouters, Hendrik
2017-04-01
This paper provides the first comparative evaluation of four urban land surface models for a tropical residential neighbourhood in Singapore. The simulations are performed offline, for an 11-month period, using the bulk scheme TERRA_URB and three models of intermediate complexity (CLM, SURFEX and SUEWS). In addition, information from three different parameter lists is added to quantify the impact of, and interaction between, external parameter settings and model formulations on the modelled urban energy balance components. Overall, the models' performance using the reference parameters aligns well with previous findings for the mid- and high-latitude sites against which the models are generally optimised and evaluated. The various combinations of models and parameter values suggest that error statistics tend to be dominated more by the choice of parameters than by the choice of model. Stratifying the observation period into dry/wet periods and hours since selected precipitation events reveals that the models' skill generally deteriorates during dry periods, while, for example, CLM/SURFEX show a positive bias in the latent heat flux directly after a precipitation event. It is shown that the latter is due to the simple representation of water intercepted on impervious surfaces. In addition, the positive bias in modelled outgoing longwave radiation is attributed to neglecting the interaction of water vapor with radiation between the surface and the tower sensor. These findings suggest that future developments in urban climate research should continue the integration of more physically based processes in urban canopy models, ensure consistency between the observed and modelled atmospheric properties, and focus on the correct representation of urban morphology and thermal and radiative characteristics.
Hill, Mary C.; L. Foglia,; S. W. Mehl,; P. Burlando,
2013-01-01
Model adequacy is evaluated with alternative models rated using model selection criteria (AICc, BIC, and KIC) and three other statistics. Model selection criteria are tested with cross-validation experiments and insights for using alternative models to evaluate model structural adequacy are provided. The study is conducted using the computer codes UCODE_2005 and MMA (MultiModel Analysis). One recharge alternative is simulated using the TOPKAPI hydrological model. The predictions evaluated include eight heads and three flows located where ecological consequences and model precision are of concern. Cross-validation is used to obtain measures of prediction accuracy. Sixty-four models were designed deterministically and differ in representation of river, recharge, bedrock topography, and hydraulic conductivity. Results include: (1) What may seem like inconsequential choices in model construction may be important to predictions. Analysis of predictions from alternative models is advised. (2) None of the model selection criteria consistently identified models with more accurate predictions. This is a disturbing result that suggests reconsidering the utility of model selection criteria, and/or the cross-validation measures used in this work to measure model accuracy. (3) KIC displayed poor performance for the present regression problems; theoretical considerations suggest that difficulties are associated with wide variations in the sensitivity term of KIC resulting from the models being nonlinear and the problems being ill-posed due to parameter correlations and insensitivity. The other criteria performed somewhat better, and similarly to each other. (4) Quantities with high leverage are more difficult to predict. The results are expected to be generally applicable to models of environmental systems.
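For reference, two of the criteria compared in the study are simple functions of a model's maximized log-likelihood L, parameter count k, and observation count n (KIC additionally involves the Fisher information term the study found problematic, omitted here). The model log-likelihoods below are invented numbers for illustration.

```python
import math

def aic(log_lik, k):
    """Akaike Information Criterion: -2L + 2k (lower is better)."""
    return -2.0 * log_lik + 2.0 * k

def aicc(log_lik, k, n):
    """Small-sample corrected AIC; requires n > k + 1."""
    return aic(log_lik, k) + 2.0 * k * (k + 1) / (n - k - 1)

def bic(log_lik, k, n):
    """Bayesian Information Criterion: -2L + k ln(n)."""
    return -2.0 * log_lik + k * math.log(n)

# A richer model must buy its extra parameters with improved fit:
n = 40
simple_aicc = aicc(-120.0, 4, n)   # 4-parameter model
rich_aicc = aicc(-115.0, 9, n)     # 9-parameter model, better log-likelihood
```

Here the richer model's 5-unit log-likelihood gain does not offset its penalty, so AICc prefers the simpler model; the study's point is that such rankings did not reliably track cross-validated prediction accuracy.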
Beatty, William; Jay, Chadwick V.; Fischbach, Anthony S.
2016-01-01
State-space models offer researchers an objective approach to modeling complex animal location data sets, and state-space model behavior classifications are often assumed to have a link to animal behavior. In this study, we evaluated the behavioral classification accuracy of a Bayesian state-space model in Pacific walruses using Argos satellite tags with sensors to detect animal behavior in real time. We fit a two-state discrete-time continuous-space Bayesian state-space model to data from 306 Pacific walruses tagged in the Chukchi Sea. We matched predicted locations and behaviors from the state-space model (resident, transient behavior) to true animal behavior (foraging, swimming, hauled out) and evaluated classification accuracy with kappa statistics (κ) and root mean square error (RMSE). In addition, we compared biased random bridge utilization distributions generated with resident behavior locations to true foraging behavior locations to evaluate differences in space use patterns. Results indicated that the two-state model fairly classified true animal behavior (0.06 ≤ κ ≤ 0.26, 0.49 ≤ RMSE ≤ 0.59). Kernel overlap metrics indicated utilization distributions generated with resident behavior locations were generally smaller than utilization distributions generated with true foraging behavior locations. Consequently, we encourage researchers to carefully examine parameters and priors associated with behaviors in state-space models, and reconcile these parameters with the study species and its expected behaviors.
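The two agreement measures used here are quick to compute from paired predicted/true labels; a minimal sketch with invented walrus behaviour labels follows. Values in the paper's range (kappa up to about 0.26) indicate only slight-to-fair agreement beyond chance.

```python
import math
from collections import Counter

def cohens_kappa(truth, pred):
    """Chance-corrected agreement between two label sequences."""
    n = len(truth)
    observed = sum(t == p for t, p in zip(truth, pred)) / n
    t_counts, p_counts = Counter(truth), Counter(pred)
    expected = sum(t_counts[c] * p_counts.get(c, 0) for c in t_counts) / n ** 2
    return (observed - expected) / (1.0 - expected)

def rmse(truth, pred):
    """Root mean square error between numeric sequences."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(truth, pred)) / len(truth))

# Invented example: state-space behaviour class vs. sensor-derived truth
truth = ["forage", "forage", "swim", "haul", "forage", "swim"]
pred  = ["forage", "swim",   "swim", "haul", "swim",   "forage"]
kappa = cohens_kappa(truth, pred)
```

Raw percent agreement here is 50%, but kappa is much lower because a classifier guessing from the marginal label frequencies would already agree about 36% of the time.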
Metadynamics for training neural network model chemistries: A competitive assessment
NASA Astrophysics Data System (ADS)
Herr, John E.; Yao, Kun; McIntyre, Ryker; Toth, David W.; Parkhill, John
2018-06-01
Neural network model chemistries (NNMCs) promise to facilitate the accurate exploration of chemical space and simulation of large reactive systems. One important path to improving these models is to add layers of physical detail, especially long-range forces. At short range, however, these models are data driven and data limited. Little is systematically known about how data should be sampled, and "test data" chosen randomly from some sampling techniques can provide poor information about generality. If the sampling method is narrow, "test error" can appear encouragingly tiny while the model fails catastrophically elsewhere. In this manuscript, we competitively evaluate two common sampling methods, molecular dynamics (MD) and normal-mode sampling, and one uncommon alternative, Metadynamics (MetaMD), for preparing training geometries. We show that MD is an inefficient sampling method in the sense that additional samples do not improve generality. We also show that MetaMD is easily implemented in any NNMC software package with cost that scales linearly with the number of atoms in a sample molecule. MetaMD is a black-box way to ensure samples always reach out to new regions of chemical space, while remaining relevant to chemistry near k_BT. It is a cheap tool to address the issue of generalization.
Static and Vibration Analyses of General Wing Structures Using Equivalent Plate Models
NASA Technical Reports Server (NTRS)
Kapania, Rakesh K.; Liu, Youhua
1999-01-01
An efficient method, using equivalent plate model, is developed for studying the static and vibration analyses of general built-up wing structures composed of skins, spars, and ribs. The model includes the transverse shear effects by treating the built-up wing as a plate following the Reissner-Mindlin theory, the so-called First-order Shear Deformation Theory (FSDT). The Ritz method is used with the Legendre polynomials being employed as the trial functions. This is in contrast to previous equivalent plate model methods which have used simple polynomials, known to be prone to numerical ill-conditioning, as the trial functions. The present developments are evaluated by comparing the results with those obtained using MSC/NASTRAN, for a set of examples. These examples are: (i) free-vibration analysis of a clamped trapezoidal plate with (a) uniform thickness, and (b) non-uniform thickness varying as an airfoil, (ii) free-vibration and static analyses (including skin stress distribution) of a general built-up wing, and (iii) free-vibration and static analyses of a swept-back box wing. The results obtained by the present equivalent plate model are in good agreement with those obtained by the finite element method.
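The numerical motivation for Legendre trial functions can be seen in the Gram matrices of the two bases: for simple powers, the Gram matrix on [0, 1] is the notoriously ill-conditioned Hilbert matrix, whereas the Legendre Gram matrix on [-1, 1] is diagonal with entries 2/(2k+1). A small exact-arithmetic check, independent of the paper's wing models:

```python
from fractions import Fraction

def gram_monomials(n):
    """Gram matrix of 1, x, ..., x^(n-1) on [0, 1]: the Hilbert matrix."""
    return [[Fraction(1, i + j + 1) for j in range(n)] for i in range(n)]

def det(m):
    """Determinant via exact (fraction) Gaussian elimination, no pivoting."""
    m = [row[:] for row in m]
    d = Fraction(1)
    for i in range(len(m)):
        d *= m[i][i]
        for j in range(i + 1, len(m)):
            f = m[j][i] / m[i][i]
            m[j] = [a - f * b for a, b in zip(m[j], m[i])]
    return d

# Monomial basis: the 8x8 Hilbert determinant is astronomically small, so
# solving the Ritz system in floating point amplifies rounding error badly.
hilbert_det = det(gram_monomials(8))

# Legendre basis: the diagonal Gram entries are 2/(2k+1), so the condition
# number of the same-size system grows only linearly, as 2n - 1.
legendre_cond = 2 * 8 - 1
```

This near-singularity of the monomial Gram matrix is the "numerical ill-conditioning" the paragraph attributes to earlier equivalent plate formulations.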
Ye, Yu; Kerr, William C
2011-01-01
To explore various model specifications in estimating relationships between liver cirrhosis mortality rates and per capita alcohol consumption in aggregate-level cross-section time-series data. Using a series of liver cirrhosis mortality rates from 1950 to 2002 for 47 U.S. states, the effects of alcohol consumption were estimated from pooled autoregressive integrated moving average (ARIMA) models and four types of panel data models: generalized estimating equation, generalized least square, fixed effect, and multilevel models. Various specifications of error term structure under each type of model were also examined. Different approaches controlling for time trends and for using concurrent or accumulated consumption as predictors were also evaluated. When cirrhosis mortality was predicted by total alcohol, highly consistent estimates were found between ARIMA and panel data analyses, with an average overall effect of 0.07 to 0.09. Less consistent estimates were derived using spirits, beer, and wine consumption as predictors. When multiple geographic time series are combined as panel data, none of the existing models can accommodate all sources of heterogeneity, such that any type of panel model must employ some form of generalization. Different types of panel data models should thus be estimated to examine the robustness of findings. We also suggest cautious interpretation when beverage-specific volumes are used as predictors. Copyright © 2010 by the Research Society on Alcoholism.
Halcomb, Elizabeth J; Furler, John S; Hermiz, Oshana S; Blackberry, Irene D; Smith, Julie P; Richmond, Robyn L; Zwar, Nicholas A
2015-08-01
Support in primary care can assist smokers to quit successfully, but there are barriers to general practitioners (GPs) providing this support routinely. Practice nurses (PNs) may be able to effectively take on this role. The aim of this study was to perform a process evaluation of a PN-led smoking cessation intervention being tested in a randomized controlled trial in Australian general practice. Process evaluation was conducted by means of semi-structured telephone interviews with GPs and PNs allocated to the intervention arm (Quit with PN) of the Quit in General Practice trial. Interviews focussed on nurse training and the content and implementation of the intervention. Twenty-two PNs and 15 GPs participated in the interviews. The Quit with PN intervention was viewed positively. Most PNs were satisfied with the training and the materials provided. Some challenges in managing patient data and follow-up were identified. The Quit with PN intervention was acceptable to participating PNs and GPs. Issues to be addressed in the planning and wider implementation of future trials of nurse-led interventions in general practice include providing ongoing mentoring support, integration into practice management systems, and strategies to promote greater collaboration between GP and PN teams. The ongoing feasibility of the intervention was impacted by the funding model supporting PN employment and the competing demands on the PNs' time. © The Author 2015. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Peng, Jiaxi; Li, Dongdong; Zhang, Zhenjiang; Tian, Yu; Miao, Danmin; Xiao, Wei; Zhang, Jiaxi
2016-01-01
This study aimed to explore how core self-evaluations influence job burnout, focusing on confirming the mediating roles of organizational commitment and job satisfaction. A total of 583 female nurses completed the Core Self-Evaluation Scale, Organizational Commitment Scale, Minnesota Satisfaction Questionnaire, and Maslach Burnout Inventory-General Survey. The results revealed that core self-evaluations, organizational commitment, job satisfaction, and job burnout were significantly correlated with each other. Structural equation modeling indicated that core self-evaluations significantly influence job burnout and that this effect is completely mediated by organizational commitment and job satisfaction. © The Author(s) 2014.
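The mediation logic tested by the structural equation model can be illustrated with a product-of-coefficients sketch on simulated data. The path values below (a = 0.5, b = -0.6) are hypothetical and only the sample size echoes the study; the actual analysis fit a full SEM, not two regressions:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 583                                         # sample size as in the study; data simulated
cse = rng.normal(0, 1, n)                       # core self-evaluations (standardized)
satisfaction = 0.5 * cse + rng.normal(0, 1, n)  # hypothetical a-path = 0.5
burnout = -0.6 * satisfaction + rng.normal(0, 1, n)  # hypothetical b-path = -0.6

a = np.polyfit(cse, satisfaction, 1)[0]         # predictor -> mediator
# b-path: regress burnout on mediator and predictor jointly (partial coefficient).
X = np.column_stack([satisfaction, cse, np.ones(n)])
b = np.linalg.lstsq(X, burnout, rcond=None)[0][0]

indirect = a * b                                # mediated effect, about 0.5 * -0.6 = -0.3
total = np.polyfit(cse, burnout, 1)[0]          # under full mediation, total ~ indirect
```

When the direct path is zero, as simulated here, the total effect coincides with the indirect effect, which is the "completely mediated" pattern the study reports.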
Henry, Julie D; Crawford, John R
2005-06-01
To test the construct validity of the short-form version of the Depression Anxiety Stress Scales (DASS-21), and in particular, to assess whether stress as indexed by this measure is synonymous with negative affectivity (NA) or whether it represents a related, but distinct, construct; and to provide normative data for the general adult population. Cross-sectional, correlational and confirmatory factor analysis (CFA). The DASS-21 was administered to a non-clinical sample, broadly representative of the general adult UK population (N = 1,794). Competing models of the latent structure of the DASS-21 were evaluated using CFA. The model with optimal fit (RCFI = 0.94) had a quadripartite structure, consisting of a general factor of psychological distress plus orthogonal specific factors of depression, anxiety, and stress. This model was a significantly better fit than a competing model that tested the possibility that the Stress scale simply measures NA. The DASS-21 subscales can validly be used to measure the dimensions of depression, anxiety, and stress. However, each of these subscales also taps a more general dimension of psychological distress or NA. The utility of the measure is enhanced by the provision of normative data based on a large sample.
Hurricane Intensity Forecasts with a Global Mesoscale Model on the NASA Columbia Supercomputer
NASA Technical Reports Server (NTRS)
Shen, Bo-Wen; Tao, Wei-Kuo; Atlas, Robert
2006-01-01
It is known that General Circulation Models (GCMs) have insufficient resolution to accurately simulate hurricane near-eye structure and intensity. The increasing capabilities of high-end computers (e.g., the NASA Columbia Supercomputer) have changed this. In 2004, the finite-volume General Circulation Model at a 1/4-degree resolution, double the resolution used by most operational NWP centers at that time, was implemented and run to obtain promising landfall predictions for major hurricanes (e.g., Charley, Frances, Ivan, and Jeanne). In 2005, we successfully implemented the 1/8-degree version and demonstrated its performance on intensity forecasts with hurricane Katrina (2005). It is found that the 1/8-degree model is capable of simulating the radius of maximum wind and near-eye wind structure, thereby producing promising intensity forecasts. In this study, we further evaluate the model's performance on intensity forecasts of hurricanes Ivan, Jeanne, and Karl in 2004. Suggestions for further model development are made at the end.
NASA Astrophysics Data System (ADS)
Wang, Yu; Fan, Jie; Xu, Ye; Sun, Wei; Chen, Dong
2018-05-01
In this study, an inexact log-normal-based stochastic chance-constrained programming model was developed for solving the non-point source pollution issues caused by agricultural activities. Compared to the general stochastic chance-constrained programming model, the main advantage of the proposed model is that it allows random variables to be expressed as a log-normal distribution, rather than a general normal distribution. Possible deviations in the solutions caused by unrealistic distributional assumptions are thus avoided. The agricultural system management in the Erhai Lake watershed was used as a case study, where critical system factors, including rainfall and runoff amounts, show characteristics of a log-normal distribution. Several interval solutions were obtained under different constraint-satisfaction levels, which were useful in evaluating the trade-off between system economy and reliability. The applied results show that the proposed model could help decision makers to design optimal production patterns under complex uncertainties. The successful application of this model is expected to provide a good example for agricultural management in many other watersheds.
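The core of a chance constraint with a log-normal random right-hand side is its deterministic equivalent: the constraint holds with probability alpha exactly when the decision variable stays below the (1 - alpha) quantile of the log-normal capacity. A stdlib-only sketch, with mu, sigma, and alpha chosen purely for illustration (the paper's actual parameter values are not given here):

```python
import math
from statistics import NormalDist

def lognormal_chance_bound(mu, sigma, alpha):
    """Deterministic equivalent of P(x <= B) >= alpha when B ~ LogNormal(mu, sigma):
    x may be at most exp(mu + sigma * Phi^-1(1 - alpha))."""
    z = NormalDist().inv_cdf(1.0 - alpha)
    return math.exp(mu + sigma * z)

# Example: allowable pollutant load when runoff capacity B is log-normal.
# Raising the constraint-satisfaction level alpha shrinks the feasible load,
# which is the system-economy versus reliability trade-off described above.
b90 = lognormal_chance_bound(mu=2.0, sigma=0.4, alpha=0.90)
b99 = lognormal_chance_bound(mu=2.0, sigma=0.4, alpha=0.99)
```

Forcing a normal distribution onto a log-normal capacity would misplace this quantile, which is the kind of deviation the log-normal formulation avoids.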
Identification of aerodynamic models for maneuvering aircraft
NASA Technical Reports Server (NTRS)
Lan, C. Edward; Hu, C. C.
1992-01-01
The method based on Fourier functional analysis and indicial formulation for aerodynamic modeling, as proposed by Chin and Lan, is extensively examined and improved for general application to realistic airplane configurations. Improvements are made to automate the calculation of model coefficients and to evaluate the indicial integral more accurately. Test data over large angle-of-attack ranges for two different models, a 70-deg delta wing and an F-18 model, are used to further verify the applicability of the Fourier functional analysis and validate the indicial formulation. The results show that the general expression for harmonic motions throughout a range of reduced frequency k is capable of accurately modeling the nonlinear responses with large phase lag, except in the region where hysteresis behavior is inconsistent from one frequency to the next. The results from the indicial formulation indicate that more accurate results can be obtained when the motion starts from a low angle of attack, where the hysteresis effect is not important.
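An indicial formulation evaluates a Duhamel superposition integral over the motion history, which is where the phase lag comes from. A numpy sketch with a hypothetical exponential indicial function (the indicial responses identified for real configurations are more elaborate, and the time constant here is invented):

```python
import numpy as np

tau = 0.5                            # assumed indicial time constant (illustrative)
def indicial(u):
    """Hypothetical indicial response to a unit step in angle of attack."""
    return 1.0 - np.exp(-np.maximum(u, 0.0) / tau)

def trapezoid(y, x):
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

k = 1.0                              # reduced frequency of the harmonic motion
t = np.linspace(0.0, 20.0, 4001)

# Duhamel superposition: r(t) = integral_0^t A(t - s) * d(alpha)/ds ds,
# for the pitching motion alpha(s) = sin(k s).
def duhamel(ti):
    s = t[t <= ti]
    return trapezoid(indicial(ti - s) * k * np.cos(k * s), s)

r = np.array([duhamel(ti) for ti in t[::40]])
# Steady state: amplitude 1/sqrt(1 + (k*tau)^2) ~ 0.894 with a phase lag of
# arctan(k*tau) ~ 27 deg -- the lag a quasi-steady model would miss entirely.
```

The attenuated, lagged response relative to the instantaneous angle of attack is the hysteresis behavior that the harmonic-motion expressions must capture across the range of k.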
Competing regression models for longitudinal data.
Alencar, Airlane P; Singer, Julio M; Rocha, Francisco Marcelo M
2012-03-01
The choice of an appropriate family of linear models for the analysis of longitudinal data is often a matter of concern for practitioners. To attenuate such difficulties, we discuss some issues that emerge when analyzing this type of data via a practical example involving pretest-posttest longitudinal data. In particular, we consider log-normal linear mixed models (LNLMM), generalized linear mixed models (GLMM), and models based on generalized estimating equations (GEE). We show how some special features of the data, like a nonconstant coefficient of variation, may be handled in the three approaches and evaluate their performance with respect to the magnitude of standard errors of interpretable and comparable parameters. We also show how different diagnostic tools may be employed to identify outliers and comment on available software. We conclude by noting that the results are similar, but that GEE-based models may be preferable when the goal is to compare the marginal expected responses. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
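One diagnostic behind the choice among LNLMM, GLMM, and GEE is how the variance scales with the mean: a log-normal model implies a constant coefficient of variation, so a log transform stabilizes the variance. A simulated check with illustrative values only (group means and CV are invented, not the example data of the paper):

```python
import numpy as np

rng = np.random.default_rng(7)
means = np.array([1.0, 5.0, 25.0])     # three groups with increasing mean response
cv = 0.3                               # constant coefficient of variation (assumed)

# Log-normal samples with the requested mean and constant CV: for mean m,
# use mu = log(m) - sigma^2/2 with sigma^2 = log(1 + cv^2).
sigma = np.sqrt(np.log(1 + cv**2))
samples = [rng.lognormal(np.log(m) - sigma**2 / 2, sigma, 20_000) for m in means]

raw_sds = np.array([s.std() for s in samples])        # grows with the group mean
log_sds = np.array([np.log(s).std() for s in samples])  # roughly constant = sigma
```

When the CV is instead nonconstant, as in the pretest-posttest example discussed above, no single transform fixes the variance, and the three model families accommodate it in structurally different ways, which motivates comparing them.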
Uncertainty estimation of simulated water levels for the Mitch flood event in Tegucigalpa
NASA Astrophysics Data System (ADS)
Fuentes Andino, Diana Carolina; Halldin, Sven; Beven, Keith; Xu, Chong-Yu
2013-04-01
Hurricane Mitch in 1998 left a devastating flood in Tegucigalpa, the capital city of Honduras. Due to the extremely large magnitude of the Mitch flood, hydrometric measurements were not taken during the event. However, post-event indirect measurements of the discharge were obtained by the U.S. Geological Survey (USGS), and post-event surveyed high-water marks were obtained by the Japan International Cooperation Agency (JICA). This work proposes a methodology to simulate the water level during the Mitch event when the available data are associated with large uncertainty. The results of the two-dimensional hydrodynamic model LISFLOOD-FP will be evaluated within the Generalized Likelihood Uncertainty Estimation (GLUE) framework. The main challenge in the proposed methodology is to formulate an approach to evaluate the model results when there are large uncertainties in both the model parameters and the evaluation data.
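The GLUE procedure itself is straightforward to sketch: sample parameter sets, score each run with an informal likelihood against the uncertain observations, keep the "behavioral" runs above a threshold, and form likelihood-weighted prediction bounds. The toy one-parameter rating function below merely stands in for a LISFLOOD-FP run, and all numbers are invented:

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_level(roughness, discharge=1.0):
    """Toy stage model: stand-in for a LISFLOOD-FP simulation (illustrative only)."""
    return (discharge * roughness) ** 0.6

observed = 0.75        # hypothetical surveyed high-water mark
obs_error = 0.10       # assumed standard error of the post-event survey

# Sample the uncertain parameter and score each run with an informal
# Gaussian-kernel likelihood around the uncertain observation.
roughness = rng.uniform(0.1, 2.0, 10_000)
levels = simulate_level(roughness)
likelihood = np.exp(-0.5 * ((levels - observed) / obs_error) ** 2)

# Keep behavioral runs and renormalize their likelihoods into weights.
behavioral = likelihood > 0.05
weights = likelihood[behavioral] / likelihood[behavioral].sum()

# Likelihood-weighted 5-95% prediction bounds on the simulated water level.
order = np.argsort(levels[behavioral])
cdf = np.cumsum(weights[order])
lower = levels[behavioral][order][np.searchsorted(cdf, 0.05)]
upper = levels[behavioral][order][np.searchsorted(cdf, 0.95)]
```

Because the likelihood is conditioned on uncertain evaluation data, the resulting bounds reflect both parameter uncertainty and observation uncertainty, which is precisely the challenge the methodology must address.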
Evaluation of physical activity web sites for use of behavior change theories.
Doshi, Amol; Patrick, Kevin; Sallis, James F; Calfas, Karen
2003-01-01
Physical activity (PA) Web sites were assessed for their use of behavior change theories, including constructs of the health belief model, Transtheoretical Model, social cognitive theory, and the theory of reasoned action and planned behavior. An evaluation template for assessing PA Web sites was developed, and content validity and interrater reliability were demonstrated. Two independent raters evaluated 24 PA Web sites. Web sites varied widely in application of theory-based constructs, ranging from 5 to 48 on a 100-point scale. The most common intervention strategies were general information, social support, and realistic goal areas. Coverage of theory-based strategies was low, varying from 26% for social cognitive theory to 39% for health belief model. Overall, PA Web sites provided little assessment, feedback, or individually tailored assistance for users. They were unable to substantially tailor the on-line experience for users at different stages of change or different demographic characteristics.
Slater, Michael D
2006-01-01
While increasingly widespread use of behavior change theory is an advance for communication campaigns and their evaluation, such theories provide a necessary but not sufficient condition for theory-based communication interventions. Such interventions and their evaluations need to incorporate theoretical thinking about plausible mechanisms of message effect on health-related attitudes and behavior. Otherwise, strategic errors in message design and dissemination, and misspecified campaign logic models, insensitive to campaign effects, are likely to result. Implications of the elaboration likelihood model, attitude accessibility, attitude to the ad theory, exemplification, and framing are explored, and implications for campaign strategy and evaluation designs are briefly discussed. Initial propositions are advanced regarding a theory of campaign affect generalization derived from attitude to the ad theory, and regarding a theory of reframing targeted health behaviors in those difficult contexts in which intended audiences are resistant to the advocated behavior or message.