ERIC Educational Resources Information Center
Çaliskan, Ilke
2014-01-01
The aim of this study was to identify the needs of third-grade classroom teaching students regarding the science teaching course, in terms of Parlett's illuminative program evaluation model. A phenomenographic research design was used in this study. The illuminative program evaluation model was chosen for this study for its eclectic and process-based…
Source term model evaluations for the low-level waste facility performance assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yim, M.S.; Su, S.I.
1995-12-31
The estimation of release of radionuclides from various waste forms to the bottom boundary of the waste disposal facility (source term) is one of the most important aspects of LLW facility performance assessment. In this work, several currently used source term models are comparatively evaluated for the release of carbon-14 based on a test case problem. The models compared include PRESTO-EPA-CPG, IMPACTS, DUST and NEFTRAN-II. Major differences in assumptions and approaches between the models are described and key parameters are identified through sensitivity analysis. The source term results from different models are compared and other concerns or suggestions are discussed.
Modelling seagrass growth and development to evaluate transplanting strategies for restoration.
Renton, Michael; Airey, Michael; Cambridge, Marion L; Kendrick, Gary A
2011-10-01
Seagrasses are important marine plants that are under threat globally. Restoration by transplanting vegetative fragments or seedlings into areas where seagrasses have been lost is possible, but long-term trial data are limited. The goal of this study is to use available short-term data to predict long-term outcomes of transplanting seagrass. A functional-structural plant model of seagrass growth that integrates data collected from short-term trials and experiments is presented. The model was parameterized for the species Posidonia australis; a limited validation of the model against independent data and a sensitivity analysis were conducted; and the model was used to conduct a preliminary evaluation of different transplanting strategies. The limited validation was successful, and reasonable long-term outcomes could be predicted based only on short-term data. This approach to modelling seagrass growth and development enables long-term predictions of the outcomes of different transplanting strategies, even when empirical long-term data are difficult or impossible to collect. More validation is required to improve confidence in the model's predictions, and the inclusion of more mechanistic detail will extend the model's usefulness. Marine restoration represents a novel application of functional-structural plant modelling.
Evaluation of a proposed method for representing drug terminology.
Cimino, J. J.; McNamara, T. J.; Meredith, T.; Broverman, C. A.; Eckert, K. C.; Moore, M.; Tyree, D. J.
1999-01-01
In the absence of a single, standard, multipurpose terminology for representing medications, the HL7 Vocabulary Technical Committee has sought to develop a model for such terms in a way that will provide a unified method for representing them and supporting interoperability among various terminology systems. We evaluated the preliminary model by obtaining terms, represented in our model, from three leading vendors of pharmacy system knowledge bases. A total of 2303 terms were obtained, and 3982 pair-wise comparisons were possible. We found that the components of the term descriptions matched 68-87% of the time and that the overall descriptions matched 53% of the time. The evaluation has identified a number of areas in the model where more rigorous definitions will be needed in order to improve the matching rate. This paper discusses the implications of these results. PMID:10566318
Durrett, Christine; Trull, Timothy J
2005-09-01
Two personality models are compared regarding their relationship with personality disorder (PD) symptom counts and with lifetime Axis I diagnoses. These models share 5 similar domains, and the Big 7 model also includes 2 domains assessing self-evaluation: positive and negative valence. The Big 7 model accounted for more variance in PDs than the 5-factor model, primarily because of the association of negative valence with most PDs. Although low positive valence was associated with most Axis I diagnoses, the 5-factor model generally accounted for more variance in Axis I diagnoses than the Big 7 model. Some predicted associations between self-evaluation and psychopathology were not found, and unanticipated associations emerged. These findings are discussed regarding the utility of evaluative terms in clinical assessment.
A routinely applied atmospheric dispersion model was modified to evaluate alternative modeling techniques which allowed for more detailed source data, onsite meteorological data, and several dispersion methodologies. These were evaluated with hourly SO2 concentrations measured at...
Models and Mechanisms for Evaluating Government-Funded Research: An International Comparison
ERIC Educational Resources Information Center
Coryn, Chris L. S.; Hattie, John A.; Scriven, Michael; Hartmann, David J.
2007-01-01
This research describes, classifies, and comparatively evaluates national models and mechanisms used to evaluate research and allocate research funding in 16 countries. Although these models and mechanisms vary widely in terms of how research is evaluated and financed, nearly all share the common characteristic of relating funding to some measure…
Evaluating the Value of High Spatial Resolution in National Capacity Expansion Models using ReEDS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krishnan, Venkat; Cole, Wesley
2016-11-14
Power sector capacity expansion models (CEMs) have a broad range of spatial resolutions. This paper uses the Regional Energy Deployment System (ReEDS) model, a long-term national scale electric sector CEM, to evaluate the value of high spatial resolution for CEMs. ReEDS models the United States with 134 load balancing areas (BAs) and captures the variability in existing generation parameters, future technology costs, performance, and resource availability using very high spatial resolution data, especially for wind and solar modeled at 356 resource regions. In this paper we perform planning studies at three different spatial resolutions--native resolution (134 BAs), state-level, and NERC region level--and evaluate how results change under different levels of spatial aggregation in terms of renewable capacity deployment and location, associated transmission builds, and system costs. The results are used to ascertain the value of high geographically resolved models in terms of their impact on relative competitiveness among renewable energy resources.
Evaluation of Regression Models of Balance Calibration Data Using an Empirical Criterion
NASA Technical Reports Server (NTRS)
Ulbrich, Norbert; Volden, Thomas R.
2012-01-01
An empirical criterion for assessing the significance of individual terms of regression models of wind tunnel strain-gage balance outputs is evaluated. The criterion is based on the percent contribution of a regression model term: a term is considered significant if its percent contribution exceeds the empirical threshold of 0.05%. The criterion has the advantage that it can easily be computed using the regression coefficients of the gage outputs and the load capacities of the balance. First, a definition of the empirical criterion is provided. Then, it is compared with an alternate statistical criterion that is widely used in regression analysis. Finally, calibration data sets from a variety of balances are used to illustrate the connection between the empirical and the statistical criterion. A review of these results indicated that the empirical criterion is suitable only for a crude assessment of the significance of a regression model term, because the boundary between a significant and an insignificant term cannot be defined precisely. Therefore, regression model term reduction should only be performed using the more universally applicable statistical criterion.
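To make the arithmetic of the criterion concrete, here is a minimal sketch of a percent-contribution screen; the normalization, coefficients, and term values are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def percent_contributions(coeffs, term_values_at_capacity):
    """Percent contribution of each regression term (illustrative formulation):
    each coefficient times its term evaluated at the balance load capacities,
    normalized by the sum of absolute term contributions."""
    contrib = np.abs(np.asarray(coeffs) * np.asarray(term_values_at_capacity))
    return 100.0 * contrib / contrib.sum()

# Hypothetical regression model of one gage output:
# terms F1, F2, F1*F2, F1**2 evaluated at normalized load capacities.
coeffs = np.array([250.0, 180.0, 0.02, 0.003])
terms_at_capacity = np.array([1.0, 1.0, 1.0, 1.0])

pc = percent_contributions(coeffs, terms_at_capacity)
significant = pc > 0.05   # empirical 0.05 % threshold from the abstract
print(np.round(pc, 4), significant)
```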
Friction-term response to boundary-condition type in flow models
Schaffranek, R.W.; Lai, C.
1996-01-01
The friction-slope term in the unsteady open-channel flow equations is examined using two numerical models based on different formulations of the governing equations and employing different solution methods. The purposes of the study are to analyze, evaluate, and demonstrate the behavior of the term in a set of controlled numerical experiments using varied types and combinations of boundary conditions. Results of numerical experiments illustrate that a given model can respond inconsistently for the identical resistance-coefficient value under different types and combinations of boundary conditions. Findings also demonstrate that two models employing different dependent variables and solution methods can respond similarly for the identical resistance-coefficient value under similar types and combinations of boundary conditions. Discussion of qualitative considerations and quantitative experimental results provides insight into the proper treatment, evaluation, and significance of the friction-slope term, thereby offering practical guidelines for model implementation and calibration.
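The friction-slope term itself is commonly closed with Manning's relation; as a concrete reference point (the abstract does not specify which resistance formulation each model used), a minimal sketch:

```python
import numpy as np

def manning_friction_slope(Q, A, R, n):
    """Friction slope S_f from Manning's closure (SI units):
    S_f = n^2 * Q * |Q| / (A^2 * R^(4/3)).
    Manning's relation is one common closure; other resistance
    formulations are possible and the abstract does not name one."""
    return n**2 * Q * np.abs(Q) / (A**2 * R**(4.0 / 3.0))

# Hypothetical channel state: discharge, flow area, hydraulic radius, roughness.
print(manning_friction_slope(Q=150.0, A=60.0, R=2.5, n=0.030))
```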
Dynamic Evaluation of Long-Term Air Quality Model Simulations Over the Northeastern U.S.
Dynamic model evaluation assesses a modeling system's ability to reproduce changes in air quality induced by changes in meteorology and/or emissions. In this paper, we illustrate various approaches to dynamic model evaluation utilizing 18 years of air quality simulations perform...
Johansson, Michael A; Reich, Nicholas G; Hota, Aditi; Brownstein, John S; Santillana, Mauricio
2016-09-26
Dengue viruses, which infect millions of people per year worldwide, cause large epidemics that strain healthcare systems. Despite diverse efforts to develop forecasting tools including autoregressive time series, climate-driven statistical, and mechanistic biological models, little work has been done to understand the contribution of different components to improved prediction. We developed a framework to assess and compare dengue forecasts produced from different types of models and evaluated the performance of seasonal autoregressive models with and without climate variables for forecasting dengue incidence in Mexico. Climate data did not significantly improve the predictive power of seasonal autoregressive models. Short-term and seasonal autocorrelation were key to improving short-term and long-term forecasts, respectively. Seasonal autoregressive models captured a substantial amount of dengue variability, but better models are needed to improve dengue forecasting. This framework contributes to the sparse literature of infectious disease prediction model evaluation, using state-of-the-art validation techniques such as out-of-sample testing and comparison to an appropriate reference model.
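A hedged sketch of the kind of comparison described, fitting a seasonal autoregressive model with and without an exogenous climate covariate and scoring it out of sample; the synthetic data, model orders, and variable names are assumptions, not the authors' specification:

```python
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(0)
n = 120  # ten years of monthly incidence (synthetic stand-in data)
t = np.arange(n)
climate = 2.0 + np.sin(2 * np.pi * t / 12) + 0.3 * rng.standard_normal(n)
cases = 100 + 40 * np.sin(2 * np.pi * t / 12) + 5 * climate + 10 * rng.standard_normal(n)

train, test = slice(0, 96), slice(96, n)

def forecast_mae(exog):
    """Fit a seasonal AR model on the training window, forecast the
    held-out 24 months, and return the mean absolute error."""
    model = SARIMAX(cases[train],
                    exog=None if exog is None else exog[train, None],
                    order=(1, 0, 0), seasonal_order=(1, 0, 0, 12))
    fit = model.fit(disp=False)
    pred = fit.forecast(steps=24,
                        exog=None if exog is None else exog[test, None])
    return np.mean(np.abs(pred - cases[test]))

print("MAE without climate:", forecast_mae(None))
print("MAE with climate:   ", forecast_mae(climate))
```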
Evaluation and prediction of long-term environmental effects on nonmetallic materials
NASA Technical Reports Server (NTRS)
1982-01-01
Changes in functional properties of a broad spectrum of nonmetallic materials as a function of environment and exposure time were evaluated. Models for predicting long-term material performance are discussed. A literature search on specific materials in the space and simulated space environment was carried out and evaluated.
Evaluation of infiltration models in contaminated landscape.
Sadegh Zadeh, Kouroush; Shirmohammadi, Adel; Montas, Hubert J; Felton, Gary
2007-06-01
The infiltration models of Kostiakov, Green-Ampt, and Philip (two- and three-term equations) were used, calibrated, and evaluated to simulate in-situ infiltration in nine different soil types. The Osborne-Moré modified version of the Levenberg-Marquardt optimization algorithm was coupled with the experimental data obtained by the double-ring infiltrometers and the infiltration equations to estimate the model parameters. Comparison of the model outputs with the experimental data indicates that the models can successfully describe cumulative infiltration in different soil types. However, since Kostiakov's equation fails to accurately simulate the infiltration rate as time approaches infinity, since Philip's two-term equation in some cases produces negative values for the saturated hydraulic conductivity of soils, and since the Green-Ampt model relies on piston-flow assumptions, we suggest using Philip's three-term equation to simulate infiltration and to estimate the saturated hydraulic conductivity of soils.
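As an illustration of the recommended approach, the following sketch fits a three-term truncation of Philip's cumulative infiltration series to synthetic double-ring data with a Levenberg-Marquardt solver; the truncation I(t) = S*t^0.5 + A*t + B*t^1.5 and all data values are assumptions for illustration, not the authors' code:

```python
import numpy as np
from scipy.optimize import least_squares

def philip3(params, t):
    """Three-term truncation of Philip's series:
    I(t) = S*t**0.5 + A*t + B*t**1.5  (S: sorptivity; A relates to Ks)."""
    S, A, B = params
    return S * np.sqrt(t) + A * t + B * t ** 1.5

def residuals(params, t, I_obs):
    return philip3(params, t) - I_obs

# Synthetic double-ring infiltrometer data (stand-in for field observations).
t = np.linspace(0.1, 4.0, 40)   # hours
I_obs = philip3([2.0, 0.8, -0.05], t) \
        + 0.02 * np.random.default_rng(1).standard_normal(40)

# method="lm" selects the Levenberg-Marquardt algorithm, as in the abstract.
fit = least_squares(residuals, x0=[1.0, 0.5, 0.0], args=(t, I_obs), method="lm")
S, A, B = fit.x
print(f"S = {S:.3f}, A = {A:.3f} (related to Ks), B = {B:.4f}")
```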
Stein, Karen
2016-01-01
This commentary discusses the need to evaluate the impact of World Elder Abuse Awareness Day activities, the elder abuse field's most sustained public awareness initiative. A logic model is proposed with measures for short-term, medium-term, and long-term outcomes for community-based programs.
Kim, Jae-Hong; Kim, Ki-Baek; Kim, Woong-Chul; Kim, Ji-Hwan; Kim, Hae-Young
2014-03-01
This study aimed to evaluate the accuracy and precision of polyurethane (PUT) dental arch models fabricated using a three-dimensional (3D) subtractive rapid prototyping (RP) method with an intraoral scanning technique, by comparing linear measurements obtained from PUT models and conventional plaster models. Ten plaster models were duplicated using a selected standard master model and conventional impression, and 10 PUT models were duplicated using the 3D subtractive RP technique with an oral scanner. Six linear measurements were evaluated in terms of the x, y, and z-axes using a non-contact white light scanner. Accuracy was assessed using mean differences between the two measurements, and precision was examined using four quantitative methods and the Bland-Altman graphical method. Repeatability was evaluated in terms of intra-examiner variability, and reproducibility was assessed in terms of inter-examiner and inter-method variability. The mean difference between plaster models and PUT models ranged from 0.07 mm to 0.33 mm. Relative measurement errors ranged from 2.2% to 7.6%, and intraclass correlation coefficients ranged from 0.93 to 0.96 when comparing plaster models and PUT models. The Bland-Altman plot showed good agreement. The accuracy and precision of PUT dental models for evaluating the performance of the oral scanner and subtractive RP technology were acceptable. Because of recent improvements in block materials and computerized numeric control milling machines, the subtractive RP method may be a good choice for dental arch models.
Reed, Shelby D.; Neilson, Matthew P.; Gardner, Matthew; Li, Yanhong; Briggs, Andrew H.; Polsky, Daniel E.; Graham, Felicia L.; Bowers, Margaret T.; Paul, Sara C.; Granger, Bradi B.; Schulman, Kevin A.; Whellan, David J.; Riegel, Barbara; Levy, Wayne C.
2015-01-01
Background Heart failure disease management programs can influence medical resource use and quality-adjusted survival. Because projecting long-term costs and survival is challenging, a consistent and valid approach to extrapolating short-term outcomes would be valuable. Methods We developed the Tools for Economic Analysis of Patient Management Interventions in Heart Failure (TEAM-HF) Cost-Effectiveness Model, a Web-based simulation tool designed to integrate data on demographic, clinical, and laboratory characteristics, use of evidence-based medications, and costs to generate predicted outcomes. Survival projections are based on a modified Seattle Heart Failure Model (SHFM). Projections of resource use and quality of life are modeled using relationships with time-varying SHFM scores. The model can be used to evaluate parallel-group and single-cohort designs and hypothetical programs. Simulations consist of 10,000 pairs of virtual cohorts used to generate estimates of resource use, costs, survival, and incremental cost-effectiveness ratios from user inputs. Results The model demonstrated acceptable internal and external validity in replicating resource use, costs, and survival estimates from 3 clinical trials. Simulations to evaluate the cost-effectiveness of heart failure disease management programs across 3 scenarios demonstrate how the model can be used to design a program in which short-term improvements in functioning and use of evidence-based treatments are sufficient to demonstrate good long-term value to the health care system. Conclusion The TEAM-HF Cost-Effectiveness Model provides researchers and providers with a tool for conducting long-term cost-effectiveness analyses of disease management programs in heart failure. PMID:26542504
Evaluating bacterial gene-finding HMM structures as probabilistic logic programs.
Mørk, Søren; Holmes, Ian
2012-03-01
Probabilistic logic programming offers a powerful way to describe and evaluate structured statistical models. To investigate the practicality of probabilistic logic programming for structure learning in bioinformatics, we undertook a simplified bacterial gene-finding benchmark in PRISM, a probabilistic dialect of Prolog. We evaluate Hidden Markov Model structures for bacterial protein-coding gene potential, including a simple null model structure, three structures based on existing bacterial gene finders, and two novel model structures. We test standard versions as well as ADPH length-modeling and three-state versions of the five model structures. The models are all represented as probabilistic logic programs and evaluated using the PRISM machine learning system in terms of statistical information criteria and gene-finding prediction accuracy, in two bacterial genomes. Neither of our implementations of the two currently most used model structures is the best performer in terms of statistical information criteria or prediction performance, suggesting that better-fitting models might be achievable. The source code of all PRISM models, data and additional scripts are freely available for download at: http://github.com/somork/codonhmm. Supplementary data are available at Bioinformatics online.
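To make the evaluation criterion concrete, here is a self-contained sketch of scoring a toy two-state gene-potential HMM by log-likelihood and BIC; the states, probabilities, and parameter count are invented for illustration and are not the PRISM models:

```python
import numpy as np

def forward_loglik(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the scaled forward algorithm."""
    alpha = pi * B[:, obs[0]]
    loglik = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        loglik += np.log(alpha.sum())
        alpha /= alpha.sum()
    return loglik

# Hypothetical two-state model (coding / non-coding) over a 4-letter alphabet.
pi = np.array([0.5, 0.5])
A = np.array([[0.95, 0.05],
              [0.10, 0.90]])
B = np.array([[0.30, 0.20, 0.20, 0.30],
              [0.25, 0.25, 0.25, 0.25]])

obs = np.random.default_rng(5).integers(0, 4, 500)
ll = forward_loglik(obs, pi, A, B)
# Free parameters: (|pi|-1) + rows(A)*(cols(A)-1) + rows(B)*(cols(B)-1) = 9.
k = pi.size - 1 + A.size - A.shape[0] + B.size - B.shape[0]
bic = -2 * ll + k * np.log(len(obs))
print(f"log-likelihood = {ll:.1f}, BIC = {bic:.1f}")
```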
A framework for evaluating forest landscape model predictions using empirical data and knowledge
Wen J. Wang; Hong S. He; Martin A. Spetich; Stephen R. Shifley; Frank R. Thompson; William D. Dijak; Qia Wang
2014-01-01
Evaluation of forest landscape model (FLM) predictions is indispensable to establish the credibility of predictions. We present a framework that evaluates short- and long-term FLM predictions at site and landscape scales. Site-scale evaluation is conducted through comparing raster cell-level predictions with inventory plot data whereas landscape-scale evaluation is...
Differences in Student Evaluations of Limited-Term Lecturers and Full-Time Faculty
ERIC Educational Resources Information Center
Cho, Jeong-Il; Otani, Koichiro; Kim, B. Joon
2014-01-01
This study compared student evaluations of teaching (SET) for limited-term lecturers (LTLs) and full-time faculty (FTF) using a Likert-scaled survey administered to students (N = 1,410) at the end of university courses. Data were analyzed using a general linear regression model to investigate the influence of multi-dimensional evaluation items on…
Estimating Model Prediction Error: Should You Treat Predictions as Fixed or Random?
NASA Technical Reports Server (NTRS)
Wallach, Daniel; Thorburn, Peter; Asseng, Senthold; Challinor, Andrew J.; Ewert, Frank; Jones, James W.; Rotter, Reimund; Ruane, Alexander
2016-01-01
Crop models are important tools for impact assessment of climate change, as well as for exploring management options under current climate. It is essential to evaluate the uncertainty associated with predictions of these models. We compare two criteria of prediction error: MSEP_fixed, which evaluates the mean squared error of prediction for a model with fixed structure, parameters and inputs, and MSEP_uncertain(X), which evaluates the mean squared error averaged over the distributions of model structure, inputs and parameters. Comparison of model outputs with data can be used to estimate the former. The latter has a squared bias term, which can be estimated using hindcasts, and a model variance term, which can be estimated from a simulation experiment. The separate contributions to MSEP_uncertain(X) can be estimated using a random effects ANOVA. It is argued that MSEP_uncertain(X) is the more informative uncertainty criterion, because it is specific to each prediction situation.
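The distinction can be written compactly; the following is a sketch of the decomposition implied by the abstract, in notation assumed here rather than taken from the paper:

```latex
% Sketch (assumed notation): prediction error for a fixed model versus one
% averaged over uncertain model structure m, parameters \theta, and inputs x.
\mathrm{MSEP}_{\mathrm{fixed}} \;=\; E\!\left[(y-\hat{y})^{2}\right],
\qquad
\mathrm{MSEP}_{\mathrm{uncertain}}(X) \;=\;
\underbrace{\bigl(E[y]-E_{m,\theta,x}[\hat{y}]\bigr)^{2}}_{\text{squared bias (hindcasts)}}
\;+\;
\underbrace{\mathrm{Var}_{m,\theta,x}\!\bigl(\hat{y}\bigr)}_{\text{model variance (simulation)}}
```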
Goals and Characteristics of Long-Term Care Programs: An Analytic Model.
ERIC Educational Resources Information Center
Braun, Kathryn L.; Rose, Charles L.
1989-01-01
Used medico-social analytic model to compare five long-term care programs: Skilled Nursing Facility-Intermediate Care Facility (SNF-ICF) homes, ICF homes, foster homes, day hospitals, and home care. Identified similarities and differences among programs. Preliminary findings suggest that model is useful in the evaluation and design of long-term…
An efficient soil water balance model based on hybrid numerical and statistical methods
NASA Astrophysics Data System (ADS)
Mao, Wei; Yang, Jinzhong; Zhu, Yan; Ye, Ming; Liu, Zhao; Wu, Jingwei
2018-04-01
Most soil water balance models only consider downward soil water movement driven by gravitational potential, and thus cannot simulate upward soil water movement driven by evapotranspiration, especially in agricultural areas. In addition, the models cannot be used for simulating soil water movement in heterogeneous soils, and usually require many empirical parameters. To resolve these problems, this study derives a new one-dimensional water balance model for simulating both downward and upward soil water movement in heterogeneous unsaturated zones. The new model is based on a hybrid of numerical and statistical methods, and only requires four physical parameters. The model uses three governing equations to consider three terms that impact soil water movement: the advective term driven by gravitational potential, the source/sink term driven by external forces (e.g., evapotranspiration), and the diffusive term driven by matric potential. The three governing equations are solved separately by using the hybrid numerical and statistical methods (e.g., the linear regression method) that consider soil heterogeneity. The four soil hydraulic parameters required by the new model are: saturated hydraulic conductivity, saturated water content, field capacity, and residual water content. The strengths and weaknesses of the new model are evaluated using two published studies, three hypothetical examples and a real-world application. The evaluation is performed by comparing the simulation results of the new model with corresponding results presented in the published studies, obtained using HYDRUS-1D and observation data. The evaluation indicates that the new model is accurate and efficient for simulating upward soil water flow in heterogeneous soils with complex boundary conditions. The new model is used for evaluating different drainage functions, and the square drainage function and the power drainage function are recommended. The computational efficiency of the new model makes it particularly suitable for large-scale simulation of soil water movement, because it can be used with coarse discretization in space and time.
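A minimal sketch of the sequential operator-splitting idea described above, stepping the three terms one after another on a layered column; the drainage and diffusion closures, parameter values, and layering are simplified assumptions, not the authors' scheme:

```python
import numpy as np

def step(theta, dz, dt, Ks, theta_s, theta_fc, theta_r, et_sink, D=1e-3):
    """One time step of a sketch water-balance update using sequential
    operator splitting: gravity drainage, then the ET sink, then diffusion."""
    theta = theta.copy()
    # 1) Advective term: gravity drains water above field capacity downward.
    for i in range(len(theta) - 1):
        excess = max(theta[i] - theta_fc[i], 0.0)
        drain = min(excess, Ks[i] * dt / dz)       # rate-limited drainage
        theta[i] -= drain
        theta[i + 1] += drain
    # 2) Source/sink term: evapotranspiration, floored at residual content.
    theta = np.maximum(theta - et_sink * dt, theta_r)
    # 3) Diffusive term: explicit finite difference on matric-driven flow.
    flux = -D * np.diff(theta) / dz
    theta[:-1] -= flux * dt / dz
    theta[1:] += flux * dt / dz
    return np.minimum(theta, theta_s)              # cap at saturation

# Hypothetical 10-layer column with uniform properties.
n = 10
theta = np.full(n, 0.30)
theta = step(theta, dz=0.1, dt=0.01,
             Ks=np.full(n, 0.05), theta_s=np.full(n, 0.45),
             theta_fc=np.full(n, 0.25), theta_r=np.full(n, 0.05),
             et_sink=np.full(n, 0.02))
print(theta.round(4))
```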
Hoang, Van Phuong; Shanahan, Marian; Shukla, Nagesh; Perez, Pascal; Farrell, Michael; Ritter, Alison
2016-04-13
The overarching goal of health policies is to maximize health and societal benefits. Economic evaluations can play a vital role in assessing whether or not such benefits occur. This paper reviews the application of modelling techniques in economic evaluations of drug and alcohol interventions with regard to (i) the modelling paradigms themselves; (ii) the perspectives of costs and benefits; and (iii) the time frame. Papers that use modelling approaches for economic evaluations of drug and alcohol interventions were identified by carrying out searches of major databases. Thirty-eight papers met the inclusion criteria. Overall, cohort Markov models remain the most popular approach, followed by decision trees, individual-based models and system dynamics (SD) models. Most of the papers adopted a long-term time frame to reflect the long-term costs and benefits of health interventions. However, it was fairly common among the reviewed papers to adopt a narrow perspective that only takes into account costs and benefits borne by the health care sector. This review informs policy makers about the availability of modelling techniques that can be used to enhance the quality of economic evaluations for drug and alcohol treatment interventions.
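As a concrete illustration of the most popular approach identified by the review, here is a minimal cohort Markov sketch; the states, transition probabilities, costs, and utility weights are invented for illustration:

```python
import numpy as np

# Hypothetical states: 0 = in treatment, 1 = abstinent, 2 = relapsed/using.
P = np.array([[0.70, 0.20, 0.10],     # annual transition probabilities
              [0.05, 0.85, 0.10],
              [0.30, 0.05, 0.65]])
cost = np.array([4000.0, 500.0, 6000.0])   # annual cost per state
qaly = np.array([0.75, 0.90, 0.60])        # annual utility per state
discount = 0.03

cohort = np.array([1.0, 0.0, 0.0])         # everyone starts in treatment
total_cost = total_qaly = 0.0
for year in range(20):                     # long-term horizon, as the review favors
    df = 1.0 / (1.0 + discount) ** year
    total_cost += df * cohort @ cost
    total_qaly += df * cohort @ qaly
    cohort = cohort @ P                    # advance the cohort one cycle

print(f"Discounted cost per person:  {total_cost:,.0f}")
print(f"Discounted QALYs per person: {total_qaly:.2f}")
```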
Inanlouganji, Alireza; Reddy, T. Agami; Katipamula, Srinivas
2018-04-13
Forecasting solar irradiation has acquired immense importance in view of the exponential increase in the number of solar photovoltaic (PV) system installations. In this article, results of analyses involving statistical and machine-learning techniques to predict solar irradiation for different forecasting horizons are reported. Yearlong typical meteorological year 3 (TMY3) datasets from three cities in the United States with different climatic conditions have been used in this analysis. A simple forecast approach that assumes consecutive days to be identical serves as a baseline model to compare forecasting alternatives. To account for seasonal variability and to capture short-term fluctuations, different variants of the lagged moving average (LMX) model with cloud cover as the input variable are evaluated. Finally, the proposed LMX model is evaluated against an artificial neural network (ANN) model. How the one-hour and 24-hour models can be used in conjunction to predict different short-term rolling horizons is discussed, and this joint application is illustrated for a four-hour rolling horizon forecast scheme. Lastly, the effect of using predicted cloud cover values, instead of measured ones, on the accuracy of the models is assessed. Results show that LMX models do not degrade in forecast accuracy if models are trained with the forecast cloud cover data.
Performability evaluation of the SIFT computer
NASA Technical Reports Server (NTRS)
Meyer, J. F.; Furchtgott, D. G.; Wu, L. T.
1979-01-01
Performability modeling and evaluation techniques are applied to the SIFT computer as it might operate in the computational environment of an air transport mission. User-visible performance of the total system (SIFT plus its environment) is modeled as a random variable taking values in a set of levels of accomplishment. These levels are defined in terms of four attributes of total system behavior: safety, no change in mission profile, no operational penalties, and no economic penalties. The base model is a stochastic process whose states describe the internal structure of SIFT as well as relevant conditions of the environment. Base model state trajectories are related to accomplishment levels via a capability function, which is formulated in terms of a 3-level model hierarchy. Performability evaluation algorithms are then applied to determine the performability of the total system for various choices of computer and environment parameter values. Numerical results of those evaluations are presented and, in conclusion, some implications of this effort are discussed.
77 FR 75855 - Spirotetramat; Pesticide Tolerance for Emergency Exemption
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-26
..., grape and tomato juice, applesauce, and dried apples, and Dietary Exposure Evaluation Model (DEEM (Ver... into account short-term and intermediate- term residential exposure plus chronic exposure to food and... based on short-term or intermediate-term residential exposure plus chronic dietary exposure. Because...
NASA Astrophysics Data System (ADS)
Hu, W.; Si, B. C.
2013-10-01
Soil water content (SWC) varies in space and time. The objective of this study was to evaluate soil water content distribution using a statistical model. The model divides spatial SWC series into time-invariant spatial patterns, space-invariant temporal changes, and space- and time-dependent redistribution terms. The redistribution term is responsible for the temporal changes in spatial patterns of SWC. An empirical orthogonal function analysis was used to separate the total variations of the redistribution terms into the sum of the products of spatial structures (EOFs) and temporally varying coefficients (ECs). Model performance was evaluated using SWC data for the near-surface (0-0.2 m) and root-zone (0-1.0 m) layers from a Canadian Prairie landscape. Three significant EOFs were identified for the redistribution term for both soil layers. EOF1 dominated the variations of the redistribution terms, and it resulted in more changes (recharge or discharge) in SWC at wetter locations. Depth to the CaCO3 layer and organic carbon were the two most important controlling factors of EOF1 and together explained over 80% of the variations in EOF1. Weak correlation existed between either EOF2 or EOF3 and the observed factors. A reasonable prediction of SWC distribution was obtained with this model using cross validation. The model performed better in the root zone than in the near surface, and it outperformed the conventional EOF method when soil moisture deviated from the average conditions.
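A minimal sketch of the EOF separation applied in the study, using a singular value decomposition of the space-time anomaly matrix; the data are synthetic stand-ins, not the Prairie observations:

```python
import numpy as np

rng = np.random.default_rng(2)
n_loc, n_time = 60, 24
# Synthetic redistribution term: spatial pattern times temporal coefficient + noise.
pattern = rng.standard_normal(n_loc)
coeff = np.sin(np.linspace(0, 4 * np.pi, n_time))
R = np.outer(pattern, coeff) + 0.2 * rng.standard_normal((n_loc, n_time))

R_anom = R - R.mean(axis=1, keepdims=True)      # remove the time-invariant part
U, s, Vt = np.linalg.svd(R_anom, full_matrices=False)

eofs = U                                        # spatial structures (EOFs)
ecs = np.diag(s) @ Vt                           # temporally varying coefficients (ECs)
explained = s**2 / np.sum(s**2)
print("Variance explained by first three EOFs:", explained[:3].round(3))
# R_anom is recovered as the sum over k of eofs[:, k, None] * ecs[k].
```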
MUSE--Model for University Strategic Evaluation. AIR 2002 Forum Paper.
ERIC Educational Resources Information Center
Kutina, Kenneth L.; Zullig, Craig M.; Starkman, Glenn D.; Tanski, Laura E.
A model for simulating college and university operations, finances, program investments, and market response in terms of applicants, acceptances, and retention has been developed and implemented using the system dynamics approach. The Model for University Strategic Evaluation (MUSE) is a simulation of the total operations of the university,…
USDA-ARS?s Scientific Manuscript database
Soil carbon (C) models are important tools for examining complex interactions between climate, crop and soil management practices, and to evaluate the long-term effects of management practices on C-storage potential in soils. CQESTR is a process-based carbon balance model that relates crop residue a...
ERIC Educational Resources Information Center
Elbaz-Haddad, Merav; Savaya, Riki
2011-01-01
The article describes a psychosocial model of intervention with psychiatric patients in long-term hospitalization in a psychiatric ward in Israel and reports the findings of the evaluation conducted of its effectiveness. The model was aimed at maintaining or improving the patients' functioning in four main areas: personal hygiene, environmental…
Catalog of Wargaming and Military Simulation Models
1989-09-01
and newly developed software models. This system currently (and will in the near term) supports battle force architecture design and evaluation...aborted air refuelings, or replacement aircraft. PLANNED IMPROVEMENTS AND MODIFICATIONS: Completion of model. INPUT: Input fields are required to...vehicle mobility evaluation model). PROPONENT: Mobility Systems Division, Geotechnical Laboratory, U.S. Army Engineer Waterways Experiment Station
2017-10-01
Through analysis of data obtained in the Molecular Signatures of Chronic Pain Subtypes study, termed Veterans Integrated Pain Evaluation Research (VIPER), ...immune cells (macrophages) to chronic pain while also evaluating novel analgesics in relevant animal models. The current proposal also attempts to...
A Universal Model for Evaluating Basic Electronic Courses in Terms of Field Utilization of Training.
ERIC Educational Resources Information Center
Air Force Occupational Measurement Center, Lackland AFB, TX.
The main purpose of the Air Force project was to develop a universal model to evaluate usage of basic electronic principles training. The criterion used by the model to evaluate electronic theory training is a determination of the usefulness of the training vis-a-vis the performance of assigned tasks in the various electronic career fields. Data…
NASA Technical Reports Server (NTRS)
De Boer, G.; Shupe, M.D.; Caldwell, P.M.; Bauer, Susanne E.; Persson, O.; Boyle, J.S.; Kelley, M.; Klein, S.A.; Tjernstrom, M.
2014-01-01
Atmospheric measurements from the Arctic Summer Cloud Ocean Study (ASCOS) are used to evaluate the performance of three atmospheric reanalyses (the European Centre for Medium-Range Weather Forecasts (ECMWF) Interim reanalysis, the National Centers for Environmental Prediction (NCEP)-National Center for Atmospheric Research (NCAR) reanalysis, and the NCEP-DOE (Department of Energy) reanalysis) and two global climate models (CAM5 (Community Atmosphere Model 5) and NASA GISS (Goddard Institute for Space Studies) ModelE2) in simulating the high Arctic environment. Quantities analyzed include near-surface meteorological variables such as temperature, pressure, humidity and winds; surface-based estimates of cloud and precipitation properties; the surface energy budget; and lower atmospheric temperature structure. In general, the models perform well in simulating large-scale dynamical quantities such as pressure and winds. Near-surface temperature and lower atmospheric stability, along with surface energy budget terms, are not as well represented, due largely to errors in the simulation of cloud occurrence, phase and altitude. Additionally, a development version of CAM5, which features improved handling of cloud macrophysics, was demonstrated to improve the simulation of cloud properties and liquid water amount. The ASCOS period additionally provides an excellent example of the benefits gained by evaluating individual budget terms, rather than simply evaluating the net end product, with large compensating errors between individual surface energy budget terms that result in the best net energy budget.
ERIC Educational Resources Information Center
Patton, Michael Quinn
2016-01-01
Fidelity concerns the extent to which a specific evaluation sufficiently incorporates the core characteristics of the overall approach to justify labeling that evaluation by its designated name. Fidelity has traditionally meant implementing a model in exactly the same way each time following the prescribed steps and procedures. The essential…
ERIC Educational Resources Information Center
Smith, Calvin
2008-01-01
This paper describes the development of a model for integrating student evaluation of teaching results with academic development opportunities, in new ways that take into account theoretical and practical developments in both fields. The model is described in terms of five phases or components: (1) the basic student evaluation system; (2) an…
Assessing modelled spatial distributions of ice water path using satellite data
NASA Astrophysics Data System (ADS)
Eliasson, S.; Buehler, S. A.; Milz, M.; Eriksson, P.; John, V. O.
2010-05-01
The climate models used in the IPCC AR4 show large differences in monthly mean cloud ice. The most valuable source of information that can be used to potentially constrain the models is global satellite data. For this, the data sets must be long enough to capture the inter-annual variability of Ice Water Path (IWP). PATMOS-x was used together with ISCCP for the annual cycle evaluation, while ECHAM-5 was used for the correlation with other models. A clear distinction between ice categories in satellite retrievals, as desired from a model point of view, is currently impossible. However, long-term satellite data sets may still be used to indicate the climatology of IWP spatial distribution. We evaluated satellite data sets from CloudSat, PATMOS-x, ISCCP, MODIS and MSPPS in terms of monthly mean IWP, to determine which data sets can be used to evaluate the climate models. IWP data from the CloudSat cloud profiling radar provide the most advanced data set on clouds. As the CloudSat record is too short to evaluate the model data directly, it was mainly used here to evaluate IWP from the other satellite data sets. ISCCP and MSPPS were shown to have comparatively low IWP values. ISCCP shows particularly low values in the tropics, while MSPPS has particularly low values outside the tropics. MODIS and PATMOS-x were in closest agreement with CloudSat in terms of magnitude and spatial distribution, with MODIS being the better of the two. As PATMOS-x extends over more than 25 years and is in fairly close agreement with CloudSat, it was chosen as the reference data set for the model evaluation. In general there are large discrepancies between the individual climate models, and all of the models show problems in reproducing the observed spatial distribution of cloud ice. Comparisons consistently showed that ECHAM-5 is the GCM from IPCC AR4 closest to satellite observations.
The changing face of long-term care: looking at the past decade.
Ragsdale, Vickie; McDougall, Graham J
2008-09-01
Baby boomers on the verge of retirement who are considering future long-term care needs are searching for options that will promote comfort and quality of life in an environment comparable to the home left behind. Culture change is taking on different faces throughout long-term care, moving from a traditional medical model towards a holistic approach. New models of care address individual needs of the aging population. This article has three aims: (1) to evaluate the current state of culture change throughout long-term care, (2) to describe models of change seen among the long-term care industry, and (3) to report on existing work comparing the Green House Model of Care to two traditional nursing homes in Tupelo, Mississippi.
Self Evaluation of Organizations.
ERIC Educational Resources Information Center
Pooley, Richard C.
Evaluation within human service organizations is defined in terms of accepted evaluation criteria, with reasonable expectations shown and structured into a model of systematic evaluation practice. The evaluation criteria of program effort, performance, adequacy, efficiency and process mechanisms are discussed, along with measurement information…
DEVELOPMENT AND EVALUATION OF A MODEL FOR ESTIMATING LONG-TERM AVERAGE OZONE EXPOSURES TO CHILDREN
Long-term average exposures of school-age children can be modelled using longitudinal measurements collected during the Harvard Southern California Chronic Ozone Exposure Study over a 12-month period: June, 1995-May, 1996. The data base contains over 200 young children with perso...
The Potential Consequence of Using Value-Added Models to Evaluate Teachers
ERIC Educational Resources Information Center
Shen, Zuchao; Simon, Carlee Escue; Kelcey, Ben
2016-01-01
Value-added models try to separate the contribution of individual teachers or schools to students' learning growth measured by standardized test scores. There is a policy trend to use value-added modeling to evaluate teachers because of its face validity and superficial objectiveness. This article investigates the potential long term consequences…
Evaluation of a Consistent LES/PDF Method Using a Series of Experimental Spray Flames
NASA Astrophysics Data System (ADS)
Heye, Colin; Raman, Venkat
2012-11-01
A consistent method for the evolution of the joint-scalar probability density function (PDF) transport equation is proposed for application to large eddy simulation (LES) of turbulent reacting flows containing evaporating spray droplets. PDF transport equations provide the benefit of including the chemical source term in closed form; however, additional terms describing LES subfilter mixing must be modeled. The recent availability of detailed experimental measurements provides model validation data for a wide range of evaporation rates and combustion regimes, as is well known to occur in spray flames. In this work, the experimental data will be used to investigate the impact of droplet mass loading and evaporation rates on the subfilter scalar PDF shape in comparison with conventional flamelet models. In addition, existing model term closures in the PDF transport equations are evaluated with a focus on their validity in the presence of regime changes.
Evaluation of effects of long term exposure on lethal toxicity with mammals.
Verma, Vibha; Yu, Qiming J; Connell, Des W
2014-02-01
The relationship between exposure time (LT50) and lethal exposure concentration (LC50) has been evaluated over relatively long exposure times using a novel parameter, Normal Life Expectancy (NLT), as a long-term toxicity point. The model equation ln(LT50) = a*LC50^v + b, where a, b and v are constants, was evaluated by plotting ln(LT50) against LC50 using available toxicity data, based on inhalation exposure, from 7 species of mammals. With each specific toxicant a single consistent relationship was observed for all mammals, with v always <1. Use of NLT as a long-term toxicity point provided a valuable limiting point for long exposure times. With organic compounds, the Kow can be used to calculate the model constants a and v where these are unknown. The model can be used to characterise toxicity to specific mammals and then be extended to estimate toxicity at any exposure time with other mammals.
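A sketch of fitting the stated model equation to paired (LC50, LT50) data by nonlinear least squares; the data values and starting guesses are invented for illustration:

```python
import numpy as np
from scipy.optimize import curve_fit

def log_lt50(lc50, a, b, v):
    """Model from the abstract: ln(LT50) = a * LC50**v + b."""
    return a * lc50 ** v + b

# Hypothetical (LC50, LT50) pairs for one toxicant across exposure durations.
lc50 = np.array([10.0, 25.0, 60.0, 150.0, 400.0])
lt50 = np.array([9000.0, 4000.0, 1500.0, 500.0, 120.0])   # hours

params, _ = curve_fit(log_lt50, lc50, np.log(lt50),
                      p0=[-0.5, 10.0, 0.5], maxfev=10000)
a, b, v = params
print(f"a = {a:.4f}, b = {b:.3f}, v = {v:.3f}  (v < 1 expected per the abstract)")
```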
Program Evaluation: An Overview.
ERIC Educational Resources Information Center
McCluskey, Lawrence
1973-01-01
Various models of educational evaluation are presented. These include: (1) the classical type model, which contains the following guidelines: formulate objectives, classify objectives, define objectives in behavioral terms, suggest situations in which achievement of objectives will be shown, develop or select appraisal techniques, and gather and…
Cogswell, Rebecca; Kobashigawa, Erin; McGlothlin, Dana; Shaw, Robin; De Marco, Teresa
2012-11-01
The Registry to Evaluate Early and Long-Term Pulmonary Arterial Hypertension (PAH) Disease Management (REVEAL) model was designed to predict 1-year survival in patients with PAH. Multivariate prediction models need to be evaluated in cohorts distinct from the derivation set to determine external validity. In addition, limited data exist on the utility of this model in the prediction of long-term survival. REVEAL model performance was assessed to predict 1-year and 5-year outcomes, defined as survival or composite survival or freedom from lung transplant, in 140 patients with PAH. The validation cohort had a higher proportion of human immunodeficiency virus infection (7.9% vs 1.9%, p < 0.0001), methamphetamine use (19.3% vs 4.9%, p < 0.0001), and portal hypertension PAH (16.4% vs 5.1%, p < 0.0001) compared with the development cohort. The C-index of the model to predict survival was 0.765 at 1 year and 0.712 at 5 years of follow-up. The C-index of the model to predict composite survival or freedom from lung transplant was 0.805 and 0.724 at 1 and 5 years of follow-up, respectively. Prediction by the model, however, was weakest among patients with intermediate-risk predicted survival. The REVEAL model had adequate discrimination to predict 1-year survival in this small but clinically distinct validation cohort. Although the model also had predictive ability out to 5 years, prediction was limited among patients of intermediate risk, suggesting our prediction methods can still be improved.
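For reference, the discrimination statistic reported above can be computed as Harrell's concordance index; a self-contained sketch on a hypothetical validation cohort (simplified tie handling, synthetic data):

```python
import numpy as np

def concordance_index(time, score, event):
    """Harrell's C: fraction of usable pairs in which the subject with the
    higher predicted-survival score actually survives longer."""
    num = den = 0.0
    n = len(time)
    for i in range(n):
        if not event[i]:
            continue                     # subject i must have an observed event
        for j in range(n):
            if time[j] > time[i]:        # usable pair: j outlived i
                den += 1
                if score[j] > score[i]:
                    num += 1
                elif score[j] == score[i]:
                    num += 0.5
    return num / den

rng = np.random.default_rng(3)
n = 140
score = rng.uniform(0, 1, n)                      # hypothetical predicted survival
time = rng.exponential(scale=1.0 + 4.0 * score)   # better score -> longer survival
event = time < 5.0                                # administrative censoring at 5 years
time = np.minimum(time, 5.0)

print(f"C-index: {concordance_index(time, score, event):.3f}")
```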
NASA Astrophysics Data System (ADS)
Kniffka, Anke; Benedetti, Angela; Knippertz, Peter; Stanelle, Tanja; Brooks, Malcolm; Deetz, Konrad; Maranan, Marlon; Rosenberg, Philip; Pante, Gregor; Allan, Richard; Hill, Peter; Adler, Bianca; Fink, Andreas; Kalthoff, Norbert; Chiu, Christine; Vogel, Bernhard; Field, Paul; Marsham, John
2017-04-01
DACCIWA (Dynamics-Aerosol-Chemistry-Cloud Interactions in West Africa) is an EU-funded project that aims to determine the influence of anthropogenic and natural emissions on the atmospheric composition, air quality, weather and climate over southern West Africa. DACCIWA organised a major international field campaign in June-July 2016 and involves a wide range of modelling activities. Here we report on the coordinated model evaluation performed in the framework of DACCIWA, focusing on meteorological fields. This activity consists of two elements: (a) assessing the quality of numerical weather prediction during the field campaign, and (b) assessing the ability of seasonal and climate models to represent the mean state and its variability. For the first element, the extensive observations from the main field campaign in West Africa in June-July 2016 (ground supersites, radiosondes, aircraft measurements) will be combined with conventional data (synoptic stations, satellite data from various sensors) to evaluate models against. The forecasts include operational products from centres such as ECMWF, the UK Met Office and the German Weather Service, as well as runs specifically conducted for the planning and post-analysis of the field campaign using higher resolutions (e.g., WRF, COSMO). The forecasts and the observations are analysed in a concerted way to assess the ability of the models to represent southern West African weather systems and to provide a comprehensive synoptic overview of the state of the atmosphere. In a second step the process will be extended to long-term modelling periods, covering both seasonal and climate models. In this case, the observational dataset contains long-term satellite observations and station data, some of which were digitised from written records in the framework of DACCIWA. Parameter choice and spatial averaging will build directly on the weather forecasting evaluation to allow an assessment of the impact of short-term errors on long-term simulations.
Applying Model Analysis to a Resource-Based Analysis of the Force and Motion Conceptual Evaluation
ERIC Educational Resources Information Center
Smith, Trevor I.; Wittmann, Michael C.; Carter, Tom
2014-01-01
Previously, we analyzed the Force and Motion Conceptual Evaluation in terms of a resources-based model that allows for clustering of questions so as to provide useful information on how students correctly or incorrectly reason about physics. In this paper, we apply model analysis to show that the associated model plots provide more information…
NASA Technical Reports Server (NTRS)
Alexandrov, N. M.; Nielsen, E. J.; Lewis, R. M.; Anderson, W. K.
2000-01-01
First-order approximation and model management is a methodology for the systematic use of variable-fidelity models or approximations in optimization. The intent of model management is to attain convergence to high-fidelity solutions with minimal expense in high-fidelity computations. The savings in terms of computationally intensive evaluations depend on the ability of the available lower-fidelity model, or a suite of models, to predict the improvement trends for the high-fidelity problem. Variable-fidelity models can be represented by data-fitting approximations, variable-resolution models, variable-convergence models, or variable-physical-fidelity models. The present work considers the use of variable-fidelity physics models. We demonstrate the performance of model management on an aerodynamic optimization of a multi-element airfoil designed to operate in the transonic regime. Reynolds-averaged Navier-Stokes equations represent the high-fidelity model, while the Euler equations represent the low-fidelity model. An unstructured mesh-based analysis code, FUN2D, evaluates functions and sensitivity derivatives for both models. Model management for the present demonstration problem yields fivefold savings in terms of high-fidelity evaluations compared to optimization done with high-fidelity computations alone.
A Model-Based Approach to Trial-By-Trial P300 Amplitude Fluctuations
Kolossa, Antonio; Fingscheidt, Tim; Wessel, Karl; Kopp, Bruno
2013-01-01
It has long been recognized that the amplitude of the P300 component of event-related brain potentials is sensitive to the degree to which eliciting stimuli are surprising to the observers (Donchin, 1981). While Squires et al. (1976) showed and modeled the dependence of P300 amplitudes on observed stimuli across various time scales, Mars et al. (2008) proposed a computational model keeping track of stimulus probabilities on a long-term time scale. We suggest here a computational model which integrates prior information with short-term, long-term, and alternation-based experiential influences on P300 amplitude fluctuations. To evaluate the new model, we measured trial-by-trial P300 amplitude fluctuations in a simple two-choice response time task, and tested the computational models of trial-by-trial P300 amplitudes using Bayesian model evaluation. The results reveal that the new digital filtering (DIF) model provides a superior account of the trial-by-trial P300 amplitudes when compared to both Squires et al.'s (1976) model and Mars et al.'s (2008) model. We show that the P300-generating system can be described as two parallel first-order infinite impulse response (IIR) low-pass filters and an additional fourth-order finite impulse response (FIR) high-pass filter. Implications of the acquired data are discussed with regard to the neurobiological distinction between short-term, long-term, and working memory, as well as from the point of view of predictive coding models and Bayesian learning theories of cortical function. PMID:23404628
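To make the filtering idea concrete: a first-order IIR low-pass filter applied to a binary stimulus sequence behaves as an exponentially forgetting probability estimate, and trial-by-trial surprise can be read off from the filtered value. The sketch below illustrates only that mechanism; the filter coefficients and the surprise read-out are invented for illustration, not taken from the DIF model.

```python
import numpy as np

rng = np.random.default_rng(0)
stimuli = (rng.random(200) < 0.3).astype(float)  # rare stimulus ~30%

def iir_lowpass(x, alpha):
    """First-order IIR low-pass: y[n] = alpha*x[n] + (1-alpha)*y[n-1]."""
    y, state = np.empty_like(x), 0.5  # start from maximal uncertainty
    for n, xn in enumerate(x):
        state = alpha * xn + (1 - alpha) * state
        y[n] = state
    return y

# Two parallel low-pass filters: fast (short-term) and slow (long-term)
# probability trackers; coefficients are assumptions for illustration.
p_short = iir_lowpass(stimuli, alpha=0.5)
p_long = iir_lowpass(stimuli, alpha=0.05)

# Predicted probability of each stimulus just before it occurs, and the
# resulting surprise (negative log-likelihood), a proxy for P300 amplitude.
p_pred = 0.5 * (np.roll(p_short, 1) + np.roll(p_long, 1))
p_pred[0] = 0.5
surprise = -(stimuli * np.log(p_pred) + (1 - stimuli) * np.log(1 - p_pred))
print("mean surprise, rare vs frequent trials:",
      surprise[stimuli == 1].mean(), surprise[stimuli == 0].mean())
```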
NASA Astrophysics Data System (ADS)
LI, Y.; Castelletti, A.; Giuliani, M.
2014-12-01
Over recent years, long-term climate forecasts from general circulation models (GCMs) have been shown to offer increasing skill over climatology, thanks to advances in the modelling of coupled ocean-atmosphere dynamics. Improved information from long-term forecasts is expected to be a valuable support to farmers in optimizing farming operations (e.g., crop choice, cropping time) and in coping more effectively with the adverse impacts of climate variability. Yet evaluating how valuable this information can be is not straightforward, and farmers' responses must be taken into consideration. Indeed, while long-range forecasts are traditionally evaluated in terms of accuracy by comparing hindcasts with observed values, in the context of agricultural systems potentially useful forecast information should alter stakeholders' expectations, modify their decisions and ultimately affect their annual benefit. It is therefore more desirable to assess the value of long-term forecasts via decision-making models, so as to extract direct indications of probable decision outcomes from farmers, i.e., from an end-to-end perspective. In this work, we evaluate the operational value of thirteen state-of-the-art long-range forecast ensembles against climatology forecasts and subjective predictions (i.e., past year's climate and the historical average) within an integrated agronomic modelling framework embedding an implicit model of farmers' behavior. Collected ensemble datasets are bias-corrected and downscaled using a stochastic weather generator, in order to address the mismatch in spatio-temporal scale between forecast data from GCMs and the distributed crop simulation model. The agronomic model is first simulated using the forecast information (ex ante), followed by a second run with the actual climate (ex post). Multi-year simulations are performed to account for climate variability, and the value of the different climate forecasts is evaluated against the perfect-foresight scenario based on expected crop productivity as well as land-use decisions. Our results show that not all the products generate beneficial effects for farmers and that forecast errors may be amplified by farmers' decisions.
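The ex-ante/ex-post comparison reduces to a simple accounting identity: the value of a forecast is the benefit realised when decisions are made on the forecast but evaluated under the weather that actually occurred, relative to climatology and to perfect foresight. A toy sketch, in which the one-dimensional "planting decision", the profit rule and all numbers are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
actual = rng.normal(500, 120, size=30)            # seasonal rainfall, mm

def best_decision(expected_rain):
    # Hypothetical rule: crop-mix intensity suited to expected rainfall.
    return np.clip(expected_rain / 1000.0, 0.0, 1.0)

def benefit(decision, rain):
    # Hypothetical profit: the decision pays off only if rainfall supports it.
    return 100 * decision * min(rain / 500.0, 1.0) - 30 * decision

def mean_value(forecasts):
    # Ex-ante decision on the forecast, ex-post evaluation on actual weather.
    return np.mean([benefit(best_decision(f), a)
                    for f, a in zip(forecasts, actual)])

climatology = np.full_like(actual, actual.mean())
forecast = actual + rng.normal(0, 60, size=actual.size)  # imperfect skill

for name, f in [("climatology", climatology),
                ("forecast", forecast),
                ("perfect foresight", actual)]:
    print(f"{name:18s} mean benefit: {mean_value(f):6.2f}")
```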
A multi-model assessment of the co-benefits of climate mitigation for global air quality
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rao, Shilpa; Klimont, Zbigniew; Leitao, Joana
The recent Intergovernmental Panel on Climate Change (IPCC) report identifies significant co-benefits from climate policies on near-term ambient air pollution and related human health outcomes [1]. This is increasingly relevant for policy making, as the health impacts of air pollution are a major global concern: the Global Burden of Disease (GBD) study identifies outdoor air pollution as the sixth major cause of death globally [2]. Integrated assessment models (IAMs) are an effective tool to evaluate future air pollution outcomes across a wide range of assumptions on socio-economic development and policy regimes. The Representative Concentration Pathways (RCPs) [3] were the first set of long-term global scenarios developed across multiple integrated assessment models that provided detailed estimates of a number of air pollutants until 2100. However, these scenarios were primarily designed to cover a defined range of radiative forcing outcomes and thus did not specifically focus on the interactions of long-term climate goals with near-term air pollution impacts. More recently, [4] used the RCP4.5 scenario to evaluate the co-benefits of global GHG reductions on air quality and human health in 2030, and [5-7] have further examined the interactions of more diverse pollution control regimes with climate policies. This paper extends the listed studies in a number of ways. First, it uses multiple IAMs to examine the co-benefits of a global climate policy for ambient air pollution under harmonized assumptions on near-term air pollution control. Multi-model frameworks have been used extensively in the analysis of climate change mitigation pathways and of the structural uncertainties regarding the underlying mechanisms (see for example [8-10]). This is, to our knowledge, the first time that a multi-model evaluation has been specifically designed and applied to analyze the co-benefits of climate change policy on ambient air quality, thus enabling a better understanding at a detailed sector and region level. A second methodological advancement is a quantification of the co-benefits in terms of the associated atmospheric concentrations of fine particulate matter (PM2.5) and consequent mortality-related outcomes across different models. This is made possible by the use of a state-of-the-art simplified atmospheric model that allows, for the first time, a computationally feasible multi-model evaluation of such outcomes.
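The mortality quantification step typically combines a concentration-response function with a baseline mortality rate and the exposed population. A minimal sketch of the standard log-linear form follows; the coefficient and all inputs are placeholder values, not the paper's:

```python
import numpy as np

def avoided_deaths(pm_base, pm_policy, beta, y0, population):
    """Log-linear concentration-response: relative risk RR = exp(beta * dPM).

    Avoided deaths = y0 * population * (1 - exp(-beta * (pm_base - pm_policy))).
    """
    delta = np.asarray(pm_base) - np.asarray(pm_policy)
    return y0 * population * (1.0 - np.exp(-beta * delta))

# Placeholder inputs for two regions (annual-mean PM2.5 in ug/m3).
print(avoided_deaths(pm_base=[35.0, 18.0], pm_policy=[22.0, 14.0],
                     beta=0.006,          # assumed CRF coefficient per ug/m3
                     y0=0.008,            # assumed baseline mortality rate
                     population=np.array([50e6, 20e6])))
```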
Seismic hazard assessment over time: Modelling earthquakes in Taiwan
NASA Astrophysics Data System (ADS)
Chan, Chung-Han; Wang, Yu; Wang, Yu-Ju; Lee, Ya-Ting
2017-04-01
To assess the seismic hazard with temporal change in Taiwan, we develop a new approach combining the Brownian Passage Time (BPT) model and the Coulomb stress change, and implement the seismogenic source parameters of the Taiwan Earthquake Model (TEM). The BPT model was adopted to describe the rupture recurrence intervals of the specific fault sources, together with the time elapsed since the last fault rupture, to derive their long-term rupture probability. We also evaluate the short-term seismicity rate change based on the static Coulomb stress interaction between seismogenic sources. By considering the above time-dependent factors, our new combined model suggests an increased long-term seismic hazard in the vicinity of active faults along the western Coastal Plain and the Longitudinal Valley, where active faults have short recurrence intervals and long elapsed times since their last ruptures, and/or short-term elevated hazard levels right after the occurrence of large earthquakes due to the stress-triggering effect. The stress enhanced by the February 6th, 2016, Meinong ML 6.6 earthquake also significantly increased the rupture probabilities of several neighbouring seismogenic sources in southwestern Taiwan and raised the hazard level for the near future. Our approach draws on the advantage of incorporating long- and short-term models to provide time-dependent earthquake probability constraints. Our time-dependent model considers more detailed information than other published models, and thus offers decision-makers and public officials an adequate basis for rapid evaluation of, and response to, future emergency scenarios such as victim relocation and sheltering.
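For reference, the BPT (inverse Gaussian) renewal model gives the probability of rupture within the next time window conditioned on the elapsed time since the last event. A short sketch with illustrative parameter values (not TEM's actual fault parameters):

```python
from scipy.stats import invgauss

def bpt_conditional_probability(mean_ri, alpha, elapsed, horizon):
    """P(rupture within `horizon` years | quiet for `elapsed` years).

    BPT recurrence is an inverse Gaussian with mean `mean_ri` and
    aperiodicity `alpha`; in scipy's parameterisation the shape is
    mu = mean/scale with scale = mean / alpha**2.
    """
    scale = mean_ri / alpha**2
    mu = mean_ri / scale
    cdf = lambda t: invgauss.cdf(t, mu, scale=scale)
    return (cdf(elapsed + horizon) - cdf(elapsed)) / (1.0 - cdf(elapsed))

# Illustrative values: 150-year mean recurrence, aperiodicity 0.5,
# 120 years since the last rupture, 50-year forecast window.
print(bpt_conditional_probability(150.0, 0.5, 120.0, 50.0))
```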
Logic Models: A Tool for Designing and Monitoring Program Evaluations. REL 2014-007
ERIC Educational Resources Information Center
Lawton, Brian; Brandon, Paul R.; Cicchinelli, Louis; Kekahio, Wendy
2014-01-01
This introduction to logic models as a tool for designing program evaluations defines the major components of education programs--resources, activities, outputs, and short-, mid-, and long-term outcomes--and uses an example to demonstrate the relationships among them. This quick…
Field evaluations of a forestry version of DRAINMOD-NII model
S. Tian; M. A. Youssef; R.W. Skaggs; D.M. Amatya; G.M. Chescheir
2010-01-01
This study evaluated the performance of the newly developed forestry version of DRAINMOD-NII model using a long term (21-year) data set collected from an artificially drained loblolly pine (Pinus taeda L.) plantation in eastern North Carolina, U.S.A. The model simulates the main hydrological and biogeochemical processes in drained forested lands. The...
Evaluation of coral reef carbonate production models at a global scale
NASA Astrophysics Data System (ADS)
Jones, N. S.; Ridgwell, A.; Hendy, E. J.
2014-09-01
Calcification by coral reef communities is estimated to account for half of all carbonate produced in shallow-water environments and more than 25% of the total carbonate buried in marine sediments globally. Production of calcium carbonate by coral reefs is therefore an important component of the global carbon cycle; it is also threatened by future global warming and other global change pressures. Numerical models of reefal carbonate production are essential for understanding how carbonate deposition responds to environmental conditions, including future atmospheric CO2 concentrations, but these models must first be evaluated in terms of their skill in recreating present-day calcification rates. Here we evaluate four published model descriptions of reef carbonate production in terms of their predictive power, at both local and global scales, by comparing carbonate budget outputs with independent estimates. We also compile available global data on reef calcification to produce an observation-based dataset for the model evaluation. The four calcification models are based on functions sensitive to combinations of light availability, aragonite saturation (Ωa) and temperature, and were implemented within a specifically developed global framework, the Global Reef Accretion Model (GRAM). None of the four models correlated with independent rate estimates of whole-reef calcification; the temperature-only approach was the only model output to correlate significantly with coral-calcification rate observations. The absence of any predictive power for whole reef systems, even when consistent at the scale of individual corals, points to the overriding importance of coral cover estimates in the calculations. Our work highlights the need for an ecosystem modeling approach, accounting for population dynamics in terms of mortality and recruitment and hence coral cover, in estimating global reef carbonate budgets. In addition, validation of reef carbonate budgets is severely hampered by limited and inconsistent methodology in reef-scale observations.
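The rate-law ingredients such models combine can be illustrated with the commonly used saturation-state dependence G ∝ (Ωa - 1)^n, optionally modulated by light and temperature terms. The sketch below is a generic composite with placeholder parameters, not any one of the four published formulations:

```python
import numpy as np

def calcification_rate(omega_a, light, sst, k=10.0, n=1.7,
                       i_k=300.0, t_opt=27.0, t_width=4.0):
    """Generic reef calcification rate (g CaCO3 m-2 d-1, illustrative).

    Combines a saturation-state law (omega_a - 1)**n, a saturating light
    response tanh(I/Ik), and a Gaussian thermal envelope around t_opt.
    All parameter values are placeholders.
    """
    sat = np.maximum(omega_a - 1.0, 0.0) ** n
    light_f = np.tanh(light / i_k)
    temp_f = np.exp(-((sst - t_opt) / t_width) ** 2)
    return k * sat * light_f * temp_f

# Example: healthy tropical conditions vs a low-saturation scenario.
print(calcification_rate(omega_a=3.8, light=400.0, sst=27.5))
print(calcification_rate(omega_a=2.4, light=400.0, sst=27.5))
```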
Evaluating Modeled Impact Metrics for Human Health, Agriculture Growth, and Near-Term Climate
NASA Astrophysics Data System (ADS)
Seltzer, K. M.; Shindell, D. T.; Faluvegi, G.; Murray, L. T.
2017-12-01
Simulated metrics that assess impacts on human health, agriculture growth, and near-term climate were evaluated using ground-based and satellite observations. The NASA GISS ModelE2 and GEOS-Chem models were used to simulate the near-present chemistry of the atmosphere. A suite of simulations that varied by model, meteorology, horizontal resolution, emissions inventory, and emissions year were performed, enabling an analysis of metric sensitivities to the various model components. All simulations utilized consistent anthropogenic global emissions inventories (ECLIPSE V5a or CEDS), and an evaluation of simulated results was carried out for 2004-2006 and 2009-2011 over the United States and 2014-2015 over China. Results for O3- and PM2.5-based metrics featured minor differences due to the model resolutions considered here (2.0° × 2.5° and 0.5° × 0.666°), while model, meteorology, and emissions inventory each played a larger role in the variance. Surface metrics related to O3 were consistently high biased, though to varying degrees, demonstrating the need to evaluate particular modeling frameworks before O3 impacts are quantified. Surface metrics related to PM2.5 were diverse, indicating that a multi-model mean with robust results is a valuable tool for predicting PM2.5-related impacts. Oftentimes, the configuration that best captured the change of a metric over time differed from the configuration that best captured the magnitude of the same metric, demonstrating the challenge in skillfully simulating impacts. These results highlight the strengths and weaknesses of these models in simulating impact metrics related to air quality and near-term climate. With such information, the reliability of historical and future simulations can be better understood.
On modelling the pressure-strain correlations in wall bounded flows
NASA Technical Reports Server (NTRS)
Peltier, L. J.; Biringen, S.
1990-01-01
Turbulence models for the pressure-strain term of the Reynolds-stress equations in the vicinity of a moving wall are evaluated for a high-Reynolds-number flow using decaying grid turbulence as a model problem. The data of Thomas and Hancock are used as a basis for evaluating the different turbulence models. In particular, the Rotta model for return-to-isotropy is evaluated both in its inclusion in the Reynolds-stress equation model and in comparison to a nonlinear model advanced by Sarkar and Speziale. Further, models for the wall correction to the transfer term advanced by Launder et al., Shir, and Shih and Lumley are compared. Initial results using the decaying grid turbulence experiment as a basis suggest that the coefficients proposed for these models are high, perhaps by as much as an order of magnitude. The Shih and Lumley model, which satisfies realizability constraints, in particular seems to hold promise in adequately modeling the Reynolds stress components of this flow. Extensions of this work include testing the homogeneous transfer model of Shih and Lumley, and testing the wall transfer models using their proposed coefficients and the coefficients chosen from this work in a flow with a mean shear component.
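For orientation, the Rotta return-to-isotropy model referred to above is the linear relaxation of the Reynolds-stress anisotropy; in the usual notation (k the turbulent kinetic energy, ε its dissipation rate, C₁ the Rotta constant) it is the standard statement of the model rather than anything specific to this paper:

```latex
\Pi_{ij}^{(\mathrm{slow})} \;=\; -\,C_1\,\frac{\varepsilon}{k}
\left( \overline{u_i u_j} - \tfrac{2}{3}\,k\,\delta_{ij} \right),
\qquad
b_{ij} \equiv \frac{\overline{u_i u_j}}{2k} - \tfrac{1}{3}\,\delta_{ij},
\quad
\Pi_{ij}^{(\mathrm{slow})} = -\,2\,C_1\,\varepsilon\, b_{ij}.
```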
NASA Astrophysics Data System (ADS)
Perez-Poch, Antoni
Computer simulations are becoming a promising line of research, as physiological models become more sophisticated and reliable. Technological advances in state-of-the-art hardware and software nowadays allow better and more accurate simulations of complex phenomena, such as the response of the human cardiovascular system to long-term exposure to microgravity. Experimental data for long-term missions are difficult to obtain and reproduce; the predictions of computer simulations are therefore of major importance in this field. Our approach is based on a model previously developed and implemented in our laboratory (NELME: Numerical Evaluation of Long-term Microgravity Effects). The software simulates the behaviour of the cardiovascular system and different human organs, has a modular architecture, and allows perturbations such as physical exercise or countermeasures to be introduced. The implementation is based on a complex electrical-like model of this control system, using inexpensive development frameworks, and has been tested and validated with the available experimental data. The objective of this work is to analyse and simulate long-term effects and gender differences when individuals are exposed to long-term microgravity. The risk probability of a health impairment that could jeopardize a long-term mission is also evaluated. Gender differences have been implemented for this specific work as an adjustment of a number of parameters included in the model. Physiological differences between women and men have therefore been taken into account, based upon estimates from the physiology literature. A number of simulations have been carried out for long-term exposure to microgravity, with gravity varying continuously from Earth-based to zero and exposure time as the two main variables involved in the construction of results, including responses to patterns of physical aerobic exercise and thermal stress simulating an extra-vehicular activity. Results show that significant differences appear between men's and women's physiological responses after long-term exposure (more than three months) to microgravity. Risk evaluations for each gender, and specific risk thresholds, are provided. Different scenarios, such as a long-term mission to the Moon or Mars, are evaluated, including countermeasures such as aerobic exercise. Initial results are compatible with the existing data and provide useful insights regarding different patterns of microgravity exposure. We conclude that computer-based models such as NELME are a promising line of work to predict health risks in long-term missions.
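Electrical-like cardiovascular models of this kind are typically built from resistance-compliance analogues of the circulation. A minimal two-element Windkessel sketch (the generic textbook building block, not NELME's actual implementation; all parameter values are assumptions) illustrates the idea:

```python
import numpy as np

# Two-element Windkessel: C * dP/dt = Q_in(t) - P / R, the electrical
# analogue of arterial compliance (capacitor) and peripheral resistance.
R = 1.0      # mmHg s / mL, peripheral resistance (assumed value)
C = 1.5      # mL / mmHg, arterial compliance (assumed value)

def q_in(t, hr=1.2, sv=70.0):
    """Pulsatile inflow: half-sine systolic ejection, zero in diastole."""
    period = 1.0 / hr
    phase = t % period
    t_sys = 0.35 * period
    if phase < t_sys:
        return sv * np.pi / (2 * t_sys) * np.sin(np.pi * phase / t_sys)
    return 0.0

dt, t_end = 1e-3, 10.0
t = np.arange(0.0, t_end, dt)
p = np.empty_like(t)
p[0] = 80.0  # initial arterial pressure, mmHg
for i in range(1, t.size):
    dpdt = (q_in(t[i - 1]) - p[i - 1] / R) / C
    p[i] = p[i - 1] + dt * dpdt  # forward Euler integration

print("steady-state pressure range: %.1f-%.1f mmHg" %
      (p[t > 5].min(), p[t > 5].max()))
```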
Toward Standardizing a Lexicon of Infectious Disease Modeling Terms.
Milwid, Rachael; Steriu, Andreea; Arino, Julien; Heffernan, Jane; Hyder, Ayaz; Schanzer, Dena; Gardner, Emma; Haworth-Brockman, Margaret; Isfeld-Kiely, Harpa; Langley, Joanne M; Moghadas, Seyed M
2016-01-01
Disease modeling is increasingly being used to evaluate the effect of health intervention strategies, particularly for infectious diseases. However, the utility and application of such models are hampered by the inconsistent use of infectious disease modeling terms between and within disciplines. We sought to standardize the lexicon of infectious disease modeling terms and develop a glossary of terms commonly used in describing models' assumptions, parameters, variables, and outcomes. We combined a comprehensive literature review of relevant terms with an online forum discussion in a virtual community of practice, mod4PH (Modeling for Public Health). Using a convergent discussion process and consensus amongst the members of mod4PH, a glossary of terms was developed as an online resource. We anticipate that the glossary will improve inter- and intradisciplinary communication and will result in a greater uptake and understanding of disease modeling outcomes in health policy decision-making. We highlight the role of the mod4PH community of practice and the methodologies used in this endeavor to link theory, policy, and practice in the public health domain.
Evaluating Uncertainty in Integrated Environmental Models: A Review of Concepts and Tools
This paper reviews concepts for evaluating integrated environmental models and discusses a list of relevant software-based tools. A simplified taxonomy for sources of uncertainty and a glossary of key terms with standard definitions are provided in the context of integrated appro...
Illa, Miriam; Eixarch, Elisenda; Batalle, Dafnis; Arbat-Plana, Ariadna; Muñoz-Moreno, Emma; Figueras, Francesc; Gratacos, Eduard
2013-01-01
Background: Intrauterine growth restriction (IUGR) affects 5-10% of all newborns and is associated with increased risk of memory, attention and anxiety problems in late childhood and adolescence. The neurostructural correlates of long-term abnormal neurodevelopment associated with IUGR are unknown. Thus, the aim of this study was to provide a comprehensive description of the long-term functional and neurostructural correlates of abnormal neurodevelopment associated with IUGR in a near-term rabbit model (delivered at 30 days of gestation) and to evaluate the development of quantitative imaging biomarkers of abnormal neurodevelopment based on diffusion magnetic resonance imaging (MRI) parameters and connectivity. Methodology: At +70 postnatal days, 10 cases and 11 controls were functionally evaluated with the Open Field Behavioral Test, which evaluates anxiety and attention, and the Object Recognition Task, which evaluates short-term memory and attention. Subsequently, brains were collected, fixed and a high-resolution MRI was performed. Differences in diffusion parameters were analyzed by means of voxel-based and connectivity analyses measuring the number of fibers reconstructed within anxiety, attention and short-term memory networks over the total fibers. Principal Findings: The results of the neurobehavioral and cognitive assessment showed a significantly higher degree of anxiety, attention and memory problems in cases compared to controls in most of the variables explored. Voxel-based analysis (VBA) revealed significant differences between groups in multiple brain regions, mainly in grey matter structures, whereas connectivity analysis demonstrated lower ratios of fibers within the networks in cases, reaching statistical significance only in the left hemisphere for both networks. Finally, VBA and connectivity results were also correlated with functional outcome. Conclusions: The rabbit model used reproduced long-term functional impairments and their neurostructural correlates of abnormal neurodevelopment associated with IUGR. The description of the pattern of microstructural changes underlying functional defects may help to develop biomarkers based on diffusion MRI and connectivity analysis. PMID:24143189
Partnering with parents in a pediatric ambulatory care setting: a new model.
Tourigny, Jocelyne; Chartrand, Julie
2015-06-01
Pediatric care has greatly evolved during the past 30 years, moving from a traditional, medically oriented approach to a more consultative, interactive model. In the literature, the concept of partnership has been explored and presented in various terms, including presence, collaboration, involvement, and participation. The models of partnership that have been proposed have rarely been evaluated, and do not take the unique environment of ambulatory care into account. Based on a literature review, strong clinical experience with families, and previous research with parents and health professionals, both the conceptual and empirical phases of a new model are described. This model can be adapted to other pediatric health care contexts in either primary or tertiary care and should be evaluated in terms of efficacy and usefulness.
NASA Astrophysics Data System (ADS)
Simpson, Mike; Ives, Matthew; Hall, Jim
2016-04-01
There is an increasing body of evidence in support of the use of nature-based solutions as a strategy to mitigate drought. Restored or constructed wetlands, grasslands and, in some cases, forests have been used with success in numerous case studies. Such solutions remain underused in the UK, where they are not considered as part of water companies' long-term plans for supply. An important step is the translation of knowledge on the benefits of nature-based solutions at the upland/catchment scale into a model of the impact of these solutions on national water resource planning in terms of financial costs, carbon benefits and robustness to drought. Our project, 'A National Scale Model of Green Infrastructure for Water Resources', addresses this issue through the development of a model that can show the costs and benefits associated with a broad roll-out of nature-based solutions for water supply. We have developed generalised models of both the hydrological effects of various classes and implementations of nature-based approaches and their economic impacts in terms of construction costs, running costs, time to maturity, land use and carbon benefits. Our next step will be to compare this work with our recent evaluation of conventional water infrastructure, allowing a case to be made in financial terms and in terms of security of water supply. By demonstrating the benefits of nature-based solutions under multiple possible climate and population scenarios, we aim to demonstrate the potential value of using nature-based solutions as a component of future long-term water resource plans. Strategies for decision making regarding the selection of nature-based and conventional approaches, developed through discussion with government and industry, will be applied to the final model. Our focus is on keeping our work relevant to the requirements of decision-makers involved in conventional water planning. We propose to present the outcomes of our model for the evaluation of nature-based solutions at the catchment scale and ongoing results of our national-scale model.
NASA Astrophysics Data System (ADS)
Snyder, Abigail C.; Link, Robert P.; Calvin, Katherine V.
2017-11-01
Hindcasting experiments (conducting a model forecast for a time period in which observational data are available) are being undertaken increasingly often by the integrated assessment model (IAM) community, across many scales of models. When they are undertaken, the results are often evaluated using global aggregates or otherwise highly aggregated skill scores that mask deficiencies. We select a set of deviation-based measures that can be applied on different spatial scales (regional versus global) to make evaluating the large number of variable-region combinations in IAMs more tractable. We also identify performance benchmarks for these measures, based on the statistics of the observational dataset, that allow a model to be evaluated in absolute terms rather than relative to the performance of other models at similar tasks. An ideal evaluation method for hindcast experiments in IAMs would feature both absolute measures for evaluation of a single experiment for a single model and relative measures to compare the results of multiple experiments for a single model or the same experiment repeated across multiple models, such as in community intercomparison studies. The performance benchmarks highlight the use of this scheme for model evaluation in absolute terms, providing information about the reasons a model may perform poorly on a given measure and therefore identifying opportunities for improvement. To demonstrate the use of and types of results possible with the evaluation method, the measures are applied to the results of a past hindcast experiment focusing on land allocation in the Global Change Assessment Model (GCAM) version 3.0. The question of how to more holistically evaluate models as complex as IAMs is an area for future research. We find quantitative evidence that global aggregates alone are not sufficient for evaluating IAMs that require global supply to equal global demand at each time period, such as GCAM. The results of this work indicate it is unlikely that a single evaluation measure for all variables in an IAM exists, and therefore sector-by-sector evaluation may be necessary.
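The flavour of "deviation measure plus observational benchmark" can be captured in a few lines: compare a model-to-observation deviation against the deviation a trivial predictor (e.g., the observed mean) would achieve, so a model can pass or fail in absolute terms. A hypothetical sketch, not the paper's exact measures:

```python
import numpy as np

def rms_deviation(model, obs):
    model, obs = np.asarray(model, float), np.asarray(obs, float)
    return np.sqrt(np.mean((model - obs) ** 2))

def evaluate(model, obs):
    """Score a single variable-region series against an absolute benchmark.

    Benchmark: the RMS deviation of a constant predictor equal to the
    observed mean (i.e., the observations' standard deviation). A model
    beating it has skill in absolute terms, no other models required.
    """
    obs = np.asarray(obs, float)
    measure = rms_deviation(model, obs)
    benchmark = np.std(obs)
    return {"measure": measure, "benchmark": benchmark,
            "passes": measure < benchmark}

# Hypothetical regional land-allocation series (e.g., cropland, Mha).
obs = [102, 105, 111, 118, 121]
model = [100, 108, 115, 117, 125]
print(evaluate(model, obs))
```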
Short-term energy outlook. Volume 2. Methodology
NASA Astrophysics Data System (ADS)
1983-05-01
Recent changes in forecasting methodology for nonutility distillate fuel oil demand and for the near-term petroleum forecasts are discussed. The accuracy of previous short-term forecasts of most of the major energy sources published in the last 13 issues of the Outlook is evaluated. Macroeconomic and weather assumptions are included in this evaluation. Energy forecasts for 1983 are compared. Structural change in US petroleum consumption, the use of appropriate weather data in energy demand modeling, and petroleum inventories, imports, and refinery runs are discussed.
ERIC Educational Resources Information Center
Avci, Ahmet
2016-01-01
The aim of this study is to investigate teachers' perceptions of organizational citizenship behaviors and to evaluate them in terms of educational administration. Descriptive survey model was used in the research. The data of the research were obtained from 1,613 teachers working in public and private schools subjected to Ministry of National…
Evaluating Internal Model Strength and Performance of Myoelectric Prosthesis Control Strategies.
Shehata, Ahmed W; Scheme, Erik J; Sensinger, Jonathon W
2018-05-01
On-going developments in myoelectric prosthesis control have provided prosthesis users with an assortment of control strategies that vary in reliability and performance. Many studies have focused on improving performance by providing feedback to the user but have overlooked the effect of this feedback on internal model development, which is key to improving long-term performance. In this paper, the strength of the internal models developed for two commonly used myoelectric control strategies, raw control with raw feedback (using a regression-based approach) and filtered control with filtered feedback (using a classifier-based approach), was evaluated using two psychometric measures: trial-by-trial adaptation and just-noticeable difference. The performance of both strategies was also evaluated using a Schmidt-style target acquisition task. Results obtained from 24 able-bodied subjects showed that although filtered control with filtered feedback gave better short-term performance in path efficiency, raw control with raw feedback resulted in stronger internal model development, which may lead to better long-term performance. Despite the inherent noise in the control signals of the regression controller, these findings suggest that the rich feedback associated with regression control may be used to improve human understanding of the myoelectric control system.
Sarrafzadegan, Nizal; Kelishad, Roya; Rabiei, Katayoun; Abedi, Heidarali; Mohaseli, Khadijeh Fereydoun; Masooleh, Hasan Azaripour; Alavi, Mousa; Heidari, Gholamreza; Ghaffari, Mostafa; O’Loughlin, Jennifer
2012-01-01
Background: Iran is one of the countries that have ratified the World Health Organization Framework Convention on Tobacco Control (WHO-FCTC) and has implemented a series of tobacco control interventions, including the Comprehensive Tobacco Control Law. Enforcement of this legislation and assessment of its outcomes require a dedicated evaluation system. This study aimed to develop a generic model, based on the WHO-FCTC articles, to evaluate the implementation of the Comprehensive Tobacco Control Law in Iran. Materials and Methods: Using a grounded theory approach, qualitative data were collected from 265 subjects in individual interviews and focus group discussions with the policymakers who designed the legislation, key stakeholders, and members of the target community. In addition, field observation data were collected in supermarkets/shops, restaurants, teahouses and coffee shops. Data were analyzed in two stages through conceptual theoretical coding. Findings: Overall, 617 open codes were extracted from the data into tables; 72 level-3 codes were retained from the level-2 code series. Using a paradigm model, the relationships between the components of each paradigm were depicted graphically. The evaluation model entailed three levels, namely short-term results, process evaluation and long-term results. Conclusions: The central concept of the evaluation process is that enforcing the law influences a variety of internal and environmental factors, including legislative changes. These factors are examined during the process evaluation and context evaluation. The current model can be applied to provide FCTC evaluation tools across other jurisdictions. PMID:23833621
Evaluation of a Linear Cumulative Damage Failure Model for Epoxy Adhesive
NASA Technical Reports Server (NTRS)
Richardson, David E.; Batista-Rodriquez, Alicia; Macon, David; Totman, Peter; McCool, Alex (Technical Monitor)
2001-01-01
Recently a significant amount of work has been conducted to provide more complex and accurate material models for use in the evaluation of adhesive bondlines. Some of this has been prompted by recent studies into the effects of residual stresses on the integrity of bondlines. Several techniques have been developed for the analysis of bondline residual stresses. Key to these analyses is the criterion used for predicting failure. Residual stress loading of an adhesive bondline can occur over the life of the component; for many bonded systems, this can be several years. It is impractical to directly characterize failure of adhesive bondlines under a constant load for several years, so alternative approaches for predicting bondline failures are required. In the past, cumulative damage failure models have been developed, ranging from very simple to very complex. This paper documents the generation and evaluation of some of the simplest linear damage accumulation tensile failure models for an epoxy adhesive. It shows how several variations on the failure model were generated and presents an evaluation of the accuracy of these failure models in predicting creep failure of the adhesive. The paper shows that a simple failure model can be generated from short-term failure data for accurate predictions of long-term adhesive performance.
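Linear damage accumulation in this sense is usually a Miner-style sum: damage accrues at each stress level at a rate inversely proportional to the time-to-failure at that level, and failure is predicted when the sum reaches one. A generic sketch (the power-law time-to-failure curve and its constants are placeholders, not the paper's fitted model):

```python
import numpy as np

def time_to_failure(stress_mpa, a=5e6, n=4.0):
    """Assumed power-law creep-rupture curve: t_f = a * stress**-n (hours)."""
    return a * stress_mpa ** -n

def accumulated_damage(history):
    """Linear (Miner-type) damage: D = sum(dt_i / t_f(sigma_i)).

    `history` is a list of (stress_mpa, duration_hours) segments;
    failure is predicted when D reaches 1.
    """
    return sum(dt / time_to_failure(s) for s, dt in history)

# Example load history: long dwell at low residual stress plus a
# short excursion at higher stress.
history = [(5.0, 20000.0), (12.0, 500.0)]
d = accumulated_damage(history)
print(f"damage index D = {d:.3f} -> {'failure' if d >= 1 else 'survives'}")
```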
Integrating Human Factors into Crew Exploration Vehicle (CEV) Design
NASA Technical Reports Server (NTRS)
Whitmore, Mihriban; Holden, Kritina; Baggerman, Susan; Campbell, Paul
2007-01-01
The purpose of this design process is to apply Human Engineering (HE) requirements and guidelines to hardware/software and to provide HE design, analysis and evaluation of crew interfaces. The topics include: 1) Background/Purpose; 2) HE Activities; 3) CASE STUDY: Net Habitable Volume (NHV) Study; 4) CASE STUDY: Human Modeling Approach; 5) CASE STUDY: Human Modeling Results; 6) CASE STUDY: Human Modeling Conclusions; 7) CASE STUDY: Human-in-the-Loop Evaluation Approach; 8) CASE STUDY: Unsuited Evaluation Results; 9) CASE STUDY: Suited Evaluation Results; 10) CASE STUDY: Human-in-the-Loop Evaluation Conclusions; 11) Near-Term Plan; and 12) In Conclusion
NASA Astrophysics Data System (ADS)
Dang Chien, Nguyen; Shih, Chun-Hsing; Hoa, Phu Chi; Minh, Nguyen Hong; Thi Thanh Hien, Duong; Nhung, Le Hong
2016-06-01
The two-band Kane model has been popularly used to calculate the band-to-band tunneling (BTBT) current in the tunnel field-effect transistor (TFET), which is currently considered a promising candidate for low-power applications. This study theoretically clarifies the maximum electric field approximation (MEFA) of the direct BTBT Kane model and evaluates its appropriateness for low-bandgap semiconductors. By analysing the physical origin of each electric field term in the Kane model, it is elucidated that in the MEFA the local electric field term must be retained while the nonlocal electric field terms are assigned the maximum value of the electric field at the tunnel junction. Mathematical investigations have shown that the MEFA is more appropriate for low-bandgap semiconductors than for high-bandgap materials because of the enhanced tunneling probability in low-field regions. The appropriateness of the MEFA is very useful in practice for quickly estimating the direct BTBT current in low-bandgap TFET devices.
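For context, the direct-BTBT Kane generation rate has the form G = A * F_loc * F_nl * exp(-B/F_nl); reading the abstract's prescription literally, the MEFA keeps the local field F_loc(x) and replaces the nonlocal terms F_nl by the junction maximum F_max. The sketch below, with generic placeholder constants A and B and an invented field profile, compares the MEFA rate against the fully local evaluation:

```python
import numpy as np

# Generic direct-BTBT Kane prefactors (placeholder magnitudes only;
# F in V/cm).
A = 4e14   # prefactor, generic units
B = 1.9e6  # V/cm, exponent constant proportional to Eg**1.5

def kane_local(f):
    """Fully local Kane rate: G = A * F(x)**2 * exp(-B / F(x))."""
    return A * f**2 * np.exp(-B / f)

def kane_mefa(f, f_max):
    """MEFA: keep the local field once, use F_max in the nonlocal terms."""
    return A * f * f_max * np.exp(-B / f_max)

# Toy field profile across a tunnel junction (V/cm), peaked at x = 0.
x = np.linspace(-20e-7, 20e-7, 401)              # +/- 20 nm, in cm
f = 1.5e6 * np.exp(-(x / 8e-7) ** 2) + 1e4
f_max = f.max()

g_local = np.trapz(kane_local(f), x)
g_mefa = np.trapz(kane_mefa(f, f_max), x)
print("integrated rate, local vs MEFA:", g_local, g_mefa)
```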
NASA Astrophysics Data System (ADS)
Civerolo, Kevin; Hogrefe, Christian; Zalewsky, Eric; Hao, Winston; Sistla, Gopal; Lynn, Barry; Rosenzweig, Cynthia; Kinney, Patrick L.
2010-10-01
This paper compares spatial and seasonal variations and temporal trends in modeled and measured concentrations of sulfur and nitrogen compounds in wet and dry deposition over an 18-year period (1988-2005) over a portion of the northeastern United States. Substantial emissions reduction programs took effect over this time period, including Title IV of the Clean Air Act Amendments of 1990, which primarily resulted in large decreases in sulfur dioxide (SO2) emissions by 1995, and nitrogen oxide (NOx) trading programs, which resulted in large decreases in warm-season NOx emissions by 2004. Additionally, NOx emissions from mobile sources declined more gradually over this period. The results presented here illustrate the use of both operational and dynamic model evaluation and suggest that the modeling system largely captures the seasonal and long-term changes in sulfur compounds. The modeling system generally captures the long-term trends in nitrogen compounds, but does not reproduce the average seasonal variation or spatial patterns in nitrate.
Long-Term Post-CABG Survival: Performance of Clinical Risk Models Versus Actuarial Predictions.
Carr, Brendan M; Romeiser, Jamie; Ruan, Joyce; Gupta, Sandeep; Seifert, Frank C; Zhu, Wei; Shroyer, A Laurie
2016-01-01
Clinical risk models are commonly used to predict short-term coronary artery bypass grafting (CABG) mortality but are less commonly used to predict long-term mortality. The added value of long-term mortality clinical risk models over traditional actuarial models has not been evaluated. To address this, the predictive performance of a long-term clinical risk model was compared with that of an actuarial model to identify the clinical variable(s) most responsible for any differences observed. Long-term mortality for 1028 CABG patients was estimated using the Hannan New York State clinical risk model and an actuarial model (based on age, gender, and race/ethnicity). Vital status was assessed using the Social Security Death Index. Observed/expected (O/E) ratios were calculated, and the models' predictive performances were compared using a nested c-index approach. Linear regression analyses identified the subgroup of risk factors driving the differences observed. Mortality rates were 3%, 9%, and 17% at one-, three-, and five years, respectively (median follow-up: five years). The clinical risk model provided more accurate predictions. Greater divergence between model estimates occurred with increasing long-term mortality risk, with baseline renal dysfunction identified as a particularly important driver of these differences. Long-term mortality clinical risk models provide enhanced predictive power compared to actuarial models. Using the Hannan risk model, a patient's long-term mortality risk can be accurately assessed and subgroups of higher-risk patients can be identified for enhanced follow-up care. More research appears warranted to refine long-term CABG clinical risk models. © 2015 The Authors. Journal of Cardiac Surgery Published by Wiley Periodicals, Inc.
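The actuarial-vs-clinical comparison rests on observed-to-expected ratios: expected deaths are the sum of each patient's model-predicted mortality probability, and an O/E near 1 indicates good calibration. A compact sketch on synthetic data (neither the Hannan model nor real patients):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 1028

# Synthetic per-patient 5-year mortality probabilities from two models.
p_clinical = np.clip(rng.beta(2, 9, n), 0.01, 0.9)   # stand-in risk model
p_actuarial = np.full(n, 0.17)                       # age/sex/race table value

# Simulate "observed" deaths from the clinical probabilities.
died = rng.random(n) < p_clinical

for name, p in [("clinical", p_clinical), ("actuarial", p_actuarial)]:
    oe = died.sum() / p.sum()   # observed / expected ratio
    print(f"{name:9s} O/E = {oe:.2f}")
```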
Wang, Kai; Mao, Jiafu; Dickinson, Robert; ...
2013-06-05
This paper examines a land surface solar radiation partitioning scheme, i.e., that of the Community Land Model version 4 (CLM4) with coupled carbon and nitrogen cycles. Taking advantage of a unique 30-year fraction of absorbed photosynthetically active radiation (FPAR) dataset derived from the Global Inventory Modeling and Mapping Studies (GIMMS) normalized difference vegetation index (NDVI) dataset, multiple other remote sensing datasets, and site-level observations, we evaluated the CLM4 FPAR's seasonal cycle, diurnal cycle, long-term trends and spatial patterns. These findings show that the model generally agrees with observations in the seasonal cycle, long-term trends, and spatial patterns, but does not reproduce the diurnal cycle. Discrepancies also exist in seasonality magnitudes, peak value months, and spatial heterogeneity. Here, we identify the discrepancy in the diurnal cycle as due to the absence of dependence on sun angle in the model. Implementation of sun angle dependence in a one-dimensional (1-D) model is proposed. The need to better relate vegetation to climate in the model, indicated by the long-term trends, is also noted. Evaluation of the CLM4 land surface solar radiation partitioning scheme using remote sensing and site-level FPAR datasets provides targets for future development in its representation of this naturally complicated process.
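The proposed sun-angle dependence can be illustrated with a Beer-Lambert canopy absorption law in which the extinction path length grows with solar zenith angle, producing exactly the kind of diurnal FPAR cycle the evaluation looks for. A schematic sketch, not CLM4's actual two-stream code; the extinction coefficient and leaf absorptance are assumptions:

```python
import numpy as np

def fpar_direct(lai, sza_deg, k=0.5, leaf_abs=0.85):
    """Schematic FPAR for the direct beam: Beer-Lambert with path 1/cos(SZA).

    fpar = leaf_abs * (1 - exp(-k * LAI / cos(SZA))); absorption grows
    toward sunrise/sunset as the beam path through the canopy lengthens.
    """
    mu = np.cos(np.radians(sza_deg))
    mu = np.maximum(mu, 1e-3)          # avoid blow-up at the horizon
    return leaf_abs * (1.0 - np.exp(-k * lai / mu))

# Diurnal cycle for a canopy of LAI = 3 at zenith angles from noon to dusk.
for sza in (0, 30, 60, 80):
    print(f"SZA {sza:2d} deg -> FPAR {fpar_direct(3.0, sza):.3f}")
```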
Hippert, Henrique S; Taylor, James W
2010-04-01
Artificial neural networks have frequently been proposed for electricity load forecasting because of their capabilities for the nonlinear modelling of large multivariate data sets. Modelling with neural networks is not an easy task though; two of the main challenges are defining the appropriate level of model complexity, and choosing the input variables. This paper evaluates techniques for automatic neural network modelling within a Bayesian framework, as applied to six samples containing daily load and weather data for four different countries. We analyse input selection as carried out by the Bayesian 'automatic relevance determination', and the usefulness of the Bayesian 'evidence' for the selection of the best structure (in terms of number of neurones), as compared to methods based on cross-validation. Copyright 2009 Elsevier Ltd. All rights reserved.
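Automatic relevance determination can be tried directly with off-the-shelf tools: scikit-learn's ARDRegression fits a Bayesian linear model with a separate precision hyperparameter per input, so irrelevant inputs have their weights shrunk toward zero. A small sketch on synthetic "load" data (a linear stand-in for the paper's neural networks, purely illustrative):

```python
import numpy as np
from sklearn.linear_model import ARDRegression

rng = np.random.default_rng(42)
n = 500

# Synthetic daily predictors: temperature and lagged load matter,
# the last two inputs are irrelevant by construction.
temp = rng.normal(15, 8, n)
lag_load = rng.normal(100, 10, n)
humidity = rng.normal(60, 15, n)
day_noise = rng.normal(0, 1, n)
X = np.column_stack([temp, lag_load, humidity, day_noise])
y = 120 - 1.5 * temp + 0.8 * lag_load + rng.normal(0, 3, n)

ard = ARDRegression().fit(X, y)
for name, w in zip(["temp", "lag_load", "humidity", "day_noise"], ard.coef_):
    print(f"{name:9s} weight = {w:+.3f}")   # irrelevant inputs shrink to ~0
```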
A critical evaluation of two-equation models for near wall turbulence
NASA Technical Reports Server (NTRS)
Speziale, Charles G.; Anderson, E. Clay; Abid, Ridha
1990-01-01
A basic theoretical and computational study of two-equation models for near-wall turbulent flows was conducted. Two major problems established for the K-epsilon model are discussed: the lack of natural boundary conditions for the dissipation rate, and the appearance of higher-order correlations in the balance of terms for the dissipation rate at the wall. The K-omega equation is shown to have two problems as well: an exact viscous term is missing, and the destruction-of-dissipation term is not properly damped near the wall. A new K-tau model (where tau = 1/omega is the turbulent time scale) was developed by including the exact viscous term and by introducing new wall damping functions with improved asymptotic behavior. A preliminary test of the new model yields improved predictions for the flat-plate turbulent boundary layer.
ERIC Educational Resources Information Center
Avci, Ahmet
2015-01-01
The aim of this study is to investigate the transformational and transactional leadership styles of school principals, and to evaluate them in terms of educational administration. Descriptive survey model was used in the research. The data of the research were obtained from a total of 1,117 teachers working in public and private schools subjected…
ERIC Educational Resources Information Center
Ernst, Kelly; Hiebert, Bryan
2002-01-01
Presents a model of comprehensive guidance and counseling integrated within a business context. Concludes that using program evaluation to position counseling as a business with effective service products may enhance the long-term viability of comprehensive guidance and counseling programs. (Contains 48 references.) (GCP)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krishnan, Venkat; Cole, Wesley
This poster is based on the paper of the same name, presented at the IEEE Power & Energy Society General Meeting, July 18, 2016. Power sector capacity expansion models (CEMs) have a broad range of spatial resolutions. This paper uses the Regional Energy Deployment System (ReEDS) model, a long-term national-scale electric sector CEM, to evaluate the value of high spatial resolution for CEMs. ReEDS models the United States with 134 load balancing areas (BAs) and captures the variability in existing generation parameters, future technology costs, performance, and resource availability using very high spatial resolution data, especially for wind and solar, which are modeled at 356 resource regions. In this paper we perform planning studies at three different spatial resolutions - native resolution (134 BAs), state level, and NERC region level - and evaluate how results change under different levels of spatial aggregation in terms of renewable capacity deployment and location, associated transmission builds, and system costs. The results are used to ascertain the value of highly geographically resolved models in terms of their impact on the relative competitiveness among renewable energy resources.
NASA Astrophysics Data System (ADS)
Oblow, E. M.
1982-10-01
An evaluation was made of the mathematical and economic basis for conversion processes in the Long-term Energy Analysis Program (LEAP) energy economy model. Conversion processes are the main modeling subunit in LEAP used to represent energy conversion industries and are supposedly based on the classical economic theory of the firm. Questions about the uniqueness and existence of LEAP solutions and their relation to classical equilibrium economic theory prompted the study. An analysis of classical theory and the LEAP model equations was made to determine their exact relationship. The conclusion drawn from this analysis was that LEAP theory is not consistent with the classical theory of the firm. Specifically, the capacity factor formalism used by LEAP does not support a classical interpretation in terms of a technological production function for energy conversion processes. The economic implications of this inconsistency are suboptimal process operation and short-term negative profits in years in which plant operation should be terminated. A new capacity factor formalism, which retains the behavioral features of the original model, is proposed to resolve these discrepancies.
Habilomatis, George; Chaloulakou, Archontoula
2013-10-01
Recently, a branch of particulate matter research has focused on ultrafine particles found in the urban environment, which originate, to a significant extent, from traffic sources. In urban street canyons, the dispersion of ultrafine particles affects pedestrians' short-term exposure as well as residents' long-term exposure. The aim of the present work is the development and evaluation of a composite lattice Boltzmann model to study the dispersion of ultrafine particles in the urban street canyon microenvironment. The proposed model has the potential to penetrate the physics of this complex system. In order to evaluate the model performance against suitable experimental data, ultrafine particle levels were monitored on an hourly basis for a period of 35 days in a street canyon in the Athens area. The results of the comparative analysis are quite satisfactory, and our modeled results are in good agreement with the results of other computational and experimental studies. This work is a first attempt to study the dispersion of an air pollutant by application of the lattice Boltzmann method. Copyright © 2013 Elsevier B.V. All rights reserved.
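The lattice Boltzmann method's collide-and-stream structure can be shown in miniature with a one-dimensional advection-diffusion scheme for a passive scalar; the sketch below is a textbook D1Q3 solver, not the paper's composite street-canyon model, and all parameters are illustrative:

```python
import numpy as np

# Minimal D1Q3 lattice Boltzmann solver for advection-diffusion of a
# passive scalar (pollutant concentration) on a periodic domain.
nx, steps = 200, 400
u, tau = 0.1, 0.8                  # advection velocity, relaxation time
w = np.array([2/3, 1/6, 1/6])      # D1Q3 weights
c = np.array([0, 1, -1])           # lattice velocities
cs2 = 1/3                          # lattice sound speed squared
# Effective diffusivity: D = cs2 * (tau - 0.5) = 0.1 here.

C = np.zeros(nx)
C[90:110] = 1.0                    # initial pollutant puff
f = w[:, None] * C[None, :] * (1 + c[:, None] * u / cs2)  # equilibrium init

for _ in range(steps):
    C = f.sum(axis=0)                              # scalar = zeroth moment
    feq = w[:, None] * C[None, :] * (1 + c[:, None] * u / cs2)
    f += (feq - f) / tau                           # BGK collision
    for i in (1, 2):
        f[i] = np.roll(f[i], c[i])                 # streaming step

print("mass conserved:", np.isclose(C.sum(), 20.0))
print("peak moved to index:", C.argmax())          # ~ initial + u*steps
```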
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruskauff, Greg; Marutzky, Sam
Model evaluation focused solely on the PIN STRIPE and MILK SHAKE underground nuclear tests’ contaminant boundaries (CBs) because they had the largest extent, uncertainty, and potential consequences. The CAMBRIC radionuclide migration experiment also had a relatively large CB, but because it was constrained by transport data (notably Well UE-5n), there was little uncertainty, and radioactive decay reduced concentrations before much migration could occur. Each evaluation target and the associated data-collection activity were assessed in turn to determine whether the new data support, or demonstrate conservatism of, the CB forecasts. The modeling team—in this case, the same team that developed the Frenchman Flat geologic, source term, and groundwater flow and transport models—analyzed the new data and presented the results to a PER committee. Existing site understanding and its representation in numerical groundwater flow and transport models were evaluated in light of the new data and the ability to proceed to the CR stage of long-term monitoring and institutional control.
Evaluative methodology for comprehensive water quality management planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dyer, H. L.
Computer-based evaluative methodologies have been developed to provide for the analysis of coupled phenomena associated with natural resource comprehensive planning requirements. Provisions for planner/computer interaction have been included. Each of the simulation models developed is described in terms of its coded procedures. An application of the models for water quality management planning is presented; and the data requirements for each of the models are noted.
A randomised approach for NARX model identification based on a multivariate Bernoulli distribution
NASA Astrophysics Data System (ADS)
Bianchi, F.; Falsone, A.; Prandini, M.; Piroddi, L.
2017-04-01
The identification of polynomial NARX models is typically performed by incremental model building techniques. These methods assess the importance of each regressor based on the evaluation of partial individual models, which may ultimately lead to erroneous model selections. A more robust assessment of the significance of a specific model term can be obtained by considering ensembles of models, as done by the RaMSS algorithm. In that context, the identification task is formulated in a probabilistic fashion and a Bernoulli distribution is employed to represent the probability that a regressor belongs to the target model. Then, samples of the model distribution are collected to gather reliable information to update it, until convergence to a specific model. The basic RaMSS algorithm employs multiple independent univariate Bernoulli distributions associated with the different candidate model terms, thus overlooking the correlations between different terms, which are typically important in the selection process. Here, a multivariate Bernoulli distribution is employed, in which the sampling of a given term is conditioned by the sampling of the others. The added complexity inherent in considering the regressor correlation properties is more than compensated for by the achievable improvements in terms of accuracy of the model selection process.
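As a rough illustration of the idea, the sketch below draws correlated regressor-inclusion vectors from a multivariate Bernoulli built with a Gaussian copula. This is not the RaMSS conditional-sampling scheme itself (which the abstract does not specify); all probabilities and correlations are invented.

```python
import numpy as np
from scipy.stats import norm

def sample_regressor_subsets(p, corr, n_samples, seed=None):
    """Draw inclusion vectors from a correlated (multivariate) Bernoulli via a
    Gaussian copula: regressor i is included when Phi(z_i) < p_i, with z
    drawn from N(0, corr)."""
    rng = np.random.default_rng(seed)
    z = rng.multivariate_normal(np.zeros(len(p)), corr, size=n_samples)
    return (norm.cdf(z) < np.asarray(p)).astype(int)

# Three candidate terms; terms 0 and 1 tend to enter the model together.
p = [0.6, 0.5, 0.1]
corr = np.array([[1.0, 0.8, 0.0],
                 [0.8, 1.0, 0.0],
                 [0.0, 0.0, 1.0]])
subsets = sample_regressor_subsets(p, corr, n_samples=1000, seed=0)
print(subsets.mean(axis=0))  # empirical inclusion probabilities
```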
NASA Astrophysics Data System (ADS)
Marques, G.; Fraga, C. C. S.; Medellin-Azuara, J.
2016-12-01
The expansion and operation of urban water supply systems under growing demands, hydrologic uncertainty and water scarcity requires a strategic combination of supply sources for reliability, reduced costs and improved operational flexibility. The design and operation of such a portfolio of water supply sources involves integrating long- and short-term planning to determine what and when to expand, and how much to use of each supply source, accounting for interest rates, economies of scale and hydrologic variability. This research presents an integrated methodology coupling dynamic programming optimization with quadratic programming to optimize the expansion (long term) and operations (short term) of multiple water supply alternatives. Lagrange multipliers produced by the short-term model provide a signal about the marginal opportunity cost of expansion to the long-term model, in an iterative procedure. A simulation model hosts the water supply infrastructure and hydrologic conditions. Results allow (a) identification of trade-offs between cost and reliability of different expansion paths and water use decisions; (b) evaluation of water transfers between urban supply systems; and (c) evaluation of potential gains from reducing water system losses as a portfolio component. The latter is critical in several developing countries where water supply system losses are high and often neglected in favor of more system expansion.
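A minimal sketch of the coupling signal described above, using the cvxpy package on an invented two-source allocation problem (not the authors' model): the dual value of a binding capacity constraint is the shadow price the long-term model would read as the marginal value of expanding that source.

```python
import numpy as np
import cvxpy as cp

# Hypothetical short-term problem: meet demand at minimum quadratic cost
# subject to existing capacities (all numbers invented).
demand = 100.0
x = cp.Variable(2, nonneg=True)
cost = 0.5 * cp.sum_squares(x) + np.array([1.0, 2.0]) @ x
capacity = [x[0] <= 40.0, x[1] <= 70.0]      # source 1 capacity will bind
prob = cp.Problem(cp.Minimize(cost), capacity + [x[0] + x[1] >= demand])
prob.solve()

# A positive dual on a binding capacity is the marginal cost reduction per
# unit of expansion -- the signal passed to the long-term expansion model.
print(x.value, [float(c.dual_value) for c in capacity])
```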
Numerical simulation of gender differences in a long-term microgravity exposure
NASA Astrophysics Data System (ADS)
Perez-Poch, Antoni
The objective of this work is to analyse and simulate gender differences when individuals are exposed to long-term microgravity. The risk probability of a health impairment which may put a long-term mission in jeopardy is also evaluated. Computer simulations are becoming a promising line of research, as physiological models become more and more sophisticated and reliable. Technological advances in state-of-the-art hardware and software nowadays allow better and more accurate simulations of complex phenomena, such as the response of the human cardiovascular system to long-term exposure to microgravity. Experimental data for long-term missions are difficult to obtain and reproduce, therefore the predictions of computer simulations are of major importance in this field. Our approach is based on a model previously developed and implemented in our laboratory (NELME: Numerical Evaluation of Long-term Microgravity Effects). The software simulates the behaviour of the cardiovascular system and different human organs, has a modular architecture, and allows perturbations such as physical exercise or countermeasures to be introduced. The implementation is based on a complex electrical-like model of this control system, built using inexpensive software development frameworks, and has been tested and validated with the available experimental data. Gender differences have been implemented for this specific work as an adjustment of a number of parameters included in the model. Physiological differences between women and men have therefore been taken into account, based upon estimates from the physiology literature. A number of simulations have been carried out for long-term exposure to microgravity. Gravity, varying from Earth level to zero, and exposure time are the two main variables involved in the construction of results, including responses to patterns of aerobic physical exercise and to thermal stress simulating an extra-vehicular activity. Results show that significant differences appear between men's and women's physiological responses after long-term exposure (more than three months) to microgravity. Risk evaluations for each gender, and specific risk thresholds, are provided. Initial results are compatible with the existing data and provide unique information regarding different patterns of microgravity exposure. We conclude that computer-based models such as NELME are a promising line of work to predict health risks in long-term missions. More experimental work is needed to adjust some parameters of the model. This work may be seen as another contribution to a better understanding of the underlying processes involved in both women's and men's adaptation to long-term microgravity.
Regime-based evaluation of cloudiness in CMIP5 models
NASA Astrophysics Data System (ADS)
Jin, Daeho; Oreopoulos, Lazaros; Lee, Dongmin
2017-01-01
The concept of cloud regimes (CRs) is used to develop a framework for evaluating the cloudiness of 12 models from phase 5 of the Coupled Model Intercomparison Project (CMIP5). Reference CRs come from existing global International Satellite Cloud Climatology Project (ISCCP) weather states. The evaluation is made possible by the implementation in several CMIP5 models of the ISCCP simulator, which generates in each grid cell daily joint histograms of cloud optical thickness and cloud top pressure. Model performance is assessed with several metrics such as CR global cloud fraction (CF), CR relative frequency of occurrence (RFO), their product [long-term average total cloud amount (TCA)], cross-correlations of CR RFO maps, and a metric of resemblance between model and ISCCP CRs. In terms of CR global RFO, arguably the most fundamental metric, the models perform unsatisfactorily overall, except for CRs representing thick storm clouds. Because model CR CF is internally constrained by our method, RFO discrepancies also yield substantial TCA errors. Our results support previous findings that CMIP5 models underestimate cloudiness. The multi-model mean performs well in matching observed RFO maps for many CRs, but is still not the best for this or other metrics. When overall performance across all CRs is assessed, some models, despite shortcomings, apparently outperform Moderate Resolution Imaging Spectroradiometer cloud observations when the latter are evaluated against ISCCP as if they were another model's output. Lastly, contrasting cloud simulation performance against each model's equilibrium climate sensitivity, in order to gain insight into whether good cloud simulation pairs with particular values of this parameter, yields no clear conclusions.
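For concreteness, a small sketch of how the RFO/CF/TCA bookkeeping fits together, on synthetic arrays (the regime definitions and data here are invented):

```python
import numpy as np

# Per-grid-cell relative frequency of occurrence (RFO) and mean cloud
# fraction (CF) for K cloud regimes on an N-cell grid (synthetic).
rng = np.random.default_rng(1)
K, N = 8, 500
rfo = rng.dirichlet(np.ones(K), size=N)      # regime frequencies sum to 1 per cell
cf = rng.uniform(0.1, 0.9, size=(N, K))      # mean cloud fraction of each regime

# Long-term average total cloud amount per cell: sum_k RFO_k * CF_k
tca = (rfo * cf).sum(axis=1)

# Cross-correlation of "model" vs "reference" RFO maps for one regime
rfo_ref = np.clip(rfo + rng.normal(0, 0.05, rfo.shape), 0, 1)
r = np.corrcoef(rfo[:, 0], rfo_ref[:, 0])[0, 1]
print(tca.mean(), r)
```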
Short-term molecular profiles are a central component of strategies to model health effects of environmental chemicals. In this study, a 7 day mouse assay was used to evaluate transcriptomic and proliferative responses in the liver for a hepatocarcinogenic phthalate, di (2-ethylh...
Using Optimization to Improve Test Planning
2017-09-01
With modifications to make the input more user-friendly and to display the output differently, the test and evaluation test schedule optimization model would be a good tool for test and evaluation schedulers.
Devriendt, Floris; Moldovan, Darie; Verbeke, Wouter
2018-03-01
Prescriptive analytics extends predictive analytics by estimating an outcome as a function of control variables, thus allowing the level of control variables required to realize a desired outcome to be established. Uplift modeling is at the heart of prescriptive analytics and aims at estimating the net difference in an outcome resulting from a specific action or treatment that is applied. In this article, a structured and detailed literature survey on uplift modeling is provided by identifying and contrasting various groups of approaches. In addition, evaluation metrics for assessing the performance of uplift models are reviewed. An experimental evaluation on four real-world data sets provides further insight into their use. Uplift random forests are found to be consistently among the best performing techniques in terms of the Qini and Gini measures, although considerable variability in performance across the various data sets of the experiments is observed. In addition, uplift models are frequently observed to be unstable and display strong variability in performance across different folds in the cross-validation experimental setup. This potentially threatens their actual use for business applications. Moreover, it is found that the available evaluation metrics do not provide an intuitively understandable indication of the actual use and performance of a model. Specifically, existing evaluation metrics do not facilitate a comparison of uplift models and predictive models, and evaluate performance either at an arbitrary cutoff or over the full spectrum of potential cutoffs. In conclusion, we highlight the instability of uplift models and the need for an application-oriented approach to assess uplift models as prime topics for further research.
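A sketch of one common formulation of the Qini curve mentioned above, on synthetic data (formulations vary across the surveyed literature):

```python
import numpy as np

def qini_curve(uplift_score, treated, outcome):
    """Incremental-gains (Qini) curve: cumulative treated responders minus
    control responders rescaled by the treated/control exposure ratio, with
    the population ranked by predicted uplift (one common formulation)."""
    order = np.argsort(-uplift_score)
    t, y = treated[order], outcome[order]
    cum_t = np.cumsum(t)                    # treated units seen so far
    cum_c = np.cumsum(1 - t)                # control units seen so far
    resp_t = np.cumsum(y * t)               # treated responders
    resp_c = np.cumsum(y * (1 - t))         # control responders
    with np.errstate(divide="ignore", invalid="ignore"):
        scale = np.where(cum_c > 0, cum_t / cum_c, 0.0)
    return resp_t - resp_c * scale

rng = np.random.default_rng(0)
n = 1000
treated = rng.integers(0, 2, n)
score = rng.normal(size=n)
outcome = rng.binomial(1, 0.3 + 0.1 * treated * (score > 0))
print(qini_curve(score, treated, outcome)[-1])  # net incremental responders
```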
Filardo, Giuseppe; Perdisa, Francesco; Gelinsky, Michael; Despang, Florian; Fini, Milena; Marcacci, Maurilio; Parrilli, Anna Paola; Roffi, Alice; Salamanna, Francesca; Sartori, Maria; Schütz, Kathleen; Kon, Elizaveta
2018-05-26
Current therapeutic strategies for osteochondral restoration show limited regenerative potential. In fact, promoting the growth of both articular cartilage and subchondral bone is a real challenge, due to their different functional and anatomical properties. To this purpose, alginate is a promising biomaterial for a scaffold-based approach, claiming optimal biocompatibility and good chondrogenic potential. A previously developed mineralized alginate scaffold was investigated in terms of its ability to support osteochondral regeneration in both a large and a medium-size animal model. The results were evaluated macroscopically and by microtomography, histology, histomorphometry, and immunohistochemical analysis. No evidence of adverse or inflammatory reactions was observed in either model, but limited subchondral bone formation was present, together with a slow scaffold resorption time. The implantation of this biphasic alginate scaffold provided partial osteochondral regeneration in the animal model. Further studies are needed to evaluate possible improvements in terms of osteochondral tissue regeneration for this biomaterial.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karagiannis, Georgios; Lin, Guang
2014-02-15
Generalized polynomial chaos (gPC) expansions allow the representation of the solution of a stochastic system as a series of polynomial terms. The number of gPC terms increases dramatically with the dimension of the random input variables. When the number of gPC terms is larger than that of the available samples, a scenario that often occurs if evaluations of the system are expensive, the evaluation of the gPC expansion can be inaccurate due to over-fitting. We propose a fully Bayesian approach that allows for global recovery of the stochastic solution, in both the spatial and random domains, by coupling Bayesian model uncertainty and regularization regression methods. It allows the evaluation of the PC coefficients on a grid of spatial points via (1) Bayesian model averaging or (2) the median probability model, and their construction as functions on the spatial domain via spline interpolation. The former accounts for model uncertainty and provides Bayes-optimal predictions, while the latter additionally provides a sparse representation of the solution by evaluating the expansion on a subset of dominating gPC bases. Moreover, the method quantifies the importance of the gPC bases through inclusion probabilities. We design an MCMC sampler that evaluates all the unknown quantities without the need for ad hoc techniques. The proposed method is suitable for, but not restricted to, problems whose stochastic solution is sparse at the stochastic level with respect to the gPC bases while the deterministic solver involved is expensive. We demonstrate the good performance of the proposed method and make comparisons with others on elliptic stochastic partial differential equations in 1D, 14D and 40D random spaces.
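The growth in the number of gPC terms is easy to make concrete: a total-degree basis of order p in d random dimensions has binom(d+p, p) terms. Assuming order 3 purely for illustration:

```python
from math import comb

# Number of gPC basis terms of total degree <= p in d random dimensions.
def n_gpc_terms(d, p):
    return comb(d + p, p)

print(n_gpc_terms(14, 3))   # 680
print(n_gpc_terms(40, 3))   # 12341 -- easily exceeding the affordable solver runs
```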
A respiratory alert model for the Shenandoah Valley, Virginia, USA
NASA Astrophysics Data System (ADS)
Hondula, David M.; Davis, Robert E.; Knight, David B.; Sitka, Luke J.; Enfield, Kyle; Gawtry, Stephen B.; Stenger, Phillip J.; Deaton, Michael L.; Normile, Caroline P.; Lee, Temple R.
2013-01-01
Respiratory morbidity (particularly COPD and asthma) can be influenced by short-term weather fluctuations that affect air quality and lung function. We developed a model to evaluate meteorological conditions associated with respiratory hospital admissions in the Shenandoah Valley of Virginia, USA. We generated ensembles of classification trees based on six years of respiratory-related hospital admissions (64,620 cases) and a suite of 83 potential environmental predictor variables. As our goal was to identify short-term weather linkages to high admission periods, the dependent variable was formulated as a binary classification of five-day moving average respiratory admission departures from the seasonal mean value. Accounting for seasonality removed the long-term apparent inverse relationship between temperature and admissions. We generated eight total models specific to the northern and southern portions of the valley for each season. All eight models demonstrate predictive skill (mean odds ratio = 3.635) when evaluated using a randomization procedure. The predictor variables selected by the ensembling algorithm vary across models, and both meteorological and air quality variables are included. In general, the models indicate complex linkages between respiratory health and environmental conditions that may be difficult to identify using more traditional approaches.
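An illustrative analogue of this setup with scikit-learn, using a random forest of classification trees on synthetic predictors (the study's actual ensembling algorithm and data are not specified beyond what the abstract states):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Binary target: 5-day moving-average admissions above the seasonal mean,
# predicted from a matrix of environmental variables (synthetic stand-ins).
rng = np.random.default_rng(42)
X = rng.normal(size=(2000, 83))              # 83 candidate predictors
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=2000)) > 0

clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X[:1500], y[:1500])
pred = clf.predict(X[1500:])

# Odds ratio of the binary forecast on held-out data
tp = np.sum(pred & y[1500:]); tn = np.sum(~pred & ~y[1500:])
fp = np.sum(pred & ~y[1500:]); fn = np.sum(~pred & y[1500:])
print("odds ratio:", (tp * tn) / (fp * fn))
```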
Alpha1 LASSO data bundles Lamont, OK
Gustafson, William Jr; Vogelmann, Andrew; Endo, Satoshi; Toto, Tami; Xiao, Heng; Li, Zhijin; Cheng, Xiaoping; Krishna, Bhargavi (ORCID: 0000-0001-8828-528X)
2016-08-03
A data bundle is a unified package consisting of LASSO LES input and output, observations, evaluation diagnostics, and model skill scores. LES input includes model configuration information and forcing data. LES output includes profile statistics and full domain fields of cloud and environmental variables. Model evaluation data consists of LES output and ARM observations co-registered on the same grid and sampling frequency. Model performance is quantified by skill scores and diagnostics in terms of cloud and environmental variables.
Evaluating synoptic systems in the CMIP5 climate models over the Australian region
NASA Astrophysics Data System (ADS)
Gibson, Peter B.; Uotila, Petteri; Perkins-Kirkpatrick, Sarah E.; Alexander, Lisa V.; Pitman, Andrew J.
2016-10-01
Climate models are our principal tool for generating the projections used to inform climate change policy. Our confidence in projections depends, in part, on how realistically they simulate present day climate and associated variability over a range of time scales. Traditionally, climate models are less commonly assessed at time scales relevant to daily weather systems. Here we explore the utility of a self-organizing maps (SOMs) procedure for evaluating the frequency, persistence and transitions of daily synoptic systems in the Australian region simulated by state-of-the-art global climate models. In terms of skill in simulating the climatological frequency of synoptic systems, large spread was observed between models. A positive association between all metrics was found, implying that relative skill in simulating the persistence and transitions of systems is related to skill in simulating the climatological frequency. Considering all models and metrics collectively, model performance was found to be related to model horizontal resolution but unrelated to vertical resolution or representation of the stratosphere. In terms of the SOM procedure, the timespan over which evaluation was performed had some influence on model performance skill measures, as did the number of circulation types examined. These findings have implications for selecting models most useful for future projections over the Australian region, particularly for projections related to synoptic scale processes and phenomena. More broadly, this study has demonstrated the utility of the SOMs procedure in providing a process-based evaluation of climate models.
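A minimal sketch of the SOM classification step with the third-party minisom package, on synthetic pressure fields (the grid size, SOM dimensions and data are invented):

```python
import numpy as np
from minisom import MiniSom  # third-party: pip install minisom

# Classify daily circulation fields into synoptic types with a SOM, then
# compare type frequencies between a reanalysis and a model (synthetic data).
rng = np.random.default_rng(0)
days, ncell = 3650, 400                      # ten years of daily 20x20 fields
reanalysis = rng.normal(size=(days, ncell))
model = rng.normal(size=(days, ncell))

som = MiniSom(4, 3, ncell, sigma=1.0, learning_rate=0.5, random_seed=0)
som.train_random(reanalysis, 5000)

def type_frequencies(fields):
    nodes = [som.winner(f) for f in fields]  # best-matching SOM node per day
    idx = [r * 3 + c for r, c in nodes]
    return np.bincount(idx, minlength=12) / len(fields)

# Summed absolute frequency bias across the 12 circulation types
print(np.abs(type_frequencies(model) - type_frequencies(reanalysis)).sum())
```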
An Adapted Porter Diamond Model for the Evaluation of Transnational Education Host Countries
ERIC Educational Resources Information Center
Tsiligiris, Vangelis
2018-01-01
Purpose: The purpose of this paper is to propose an adapted Porter Diamond Model (PDM) that can be used by transnational education (TNE) countries and institutions as an analytical framework for the strategic evaluation of TNE host countries in terms of attractiveness for exporting higher education. Design/methodology/approach: The study uses a…
Ackermann, Günter; Kirschner, Michael; Guggenbühl, Lisa; Abel, Bettina; Klohn, Axel; Mattig, Thomas
2015-01-01
Aims: Since 2007, Health Promotion Switzerland has implemented a national priority program for a healthy body weight. This article provides insight into the methodological challenges and results of the program evaluation. Methods: Evaluation of the long-term program required targeted monitoring and evaluation projects addressing different outcome levels. The evaluation was carried out according to the Swiss Model for Outcome Classification (SMOC), a model designed to classify the effects of health promotion and prevention efforts. Results: The results presented in this article emphasize both content and methods. The national program successfully achieved outcomes on many different levels within complex societal structures. The evaluation system built around the SMOC enabled assessment of program progress and the development of key indicators. However, it is not possible to determine definitively to what extent the national program helped stabilize the prevalence of obesity in Switzerland. Conclusion: The model has shown its utility in providing a basis for evaluation and monitoring of the national program. Continuous analysis of data from evaluation and monitoring has made it possible to check the plausibility of suspected causal relationships as well as to establish an overall perspective and assessment of effectiveness supported by a growing body of evidence. PMID:25765161
NASA Astrophysics Data System (ADS)
Wang, Wen-Chuan; Chau, Kwok-Wing; Cheng, Chun-Tian; Qiu, Lin
2009-08-01
Summary: Developing a hydrological forecasting model based on past records is crucial to effective hydropower reservoir management and scheduling. Traditionally, time series analysis and modeling is used for building mathematical models to generate hydrologic records in hydrology and water resources. Artificial intelligence (AI), as a branch of computer science, is capable of analyzing long-series and large-scale hydrological data. In recent years, applying AI technology to hydrological forecasting modeling has become one of the foremost research issues. In this paper, autoregressive moving-average (ARMA) models, artificial neural network (ANN) approaches, adaptive neural-based fuzzy inference system (ANFIS) techniques, genetic programming (GP) models and the support vector machine (SVM) method are examined using long-term observations of monthly river flow discharges. Four quantitative standard statistical performance evaluation measures, the coefficient of correlation (R), the Nash-Sutcliffe efficiency coefficient (E), the root mean squared error (RMSE), and the mean absolute percentage error (MAPE), are employed to evaluate the performances of the various models developed. Two case study river sites are also provided to illustrate their respective performances. The results indicate that the best performance can be obtained by ANFIS, GP and SVM, in terms of different evaluation criteria, during the training and validation phases.
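For reference, the four measures in their standard forms (the sample values below are invented):

```python
import numpy as np

def r_coef(obs, sim):                        # coefficient of correlation R
    return np.corrcoef(obs, sim)[0, 1]

def nse(obs, sim):                           # Nash-Sutcliffe efficiency E
    return 1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def rmse(obs, sim):                          # root mean squared error
    return np.sqrt(np.mean((obs - sim) ** 2))

def mape(obs, sim):                          # mean absolute percentage error
    return 100 * np.mean(np.abs((obs - sim) / obs))

obs = np.array([120.0, 95.0, 60.0, 180.0, 140.0])  # e.g. monthly discharges
sim = np.array([110.0, 100.0, 55.0, 170.0, 150.0])
print(r_coef(obs, sim), nse(obs, sim), rmse(obs, sim), mape(obs, sim))
```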
WWTP dynamic disturbance modelling--an essential module for long-term benchmarking development.
Gernaey, K V; Rosen, C; Jeppsson, U
2006-01-01
Intensive use of the benchmark simulation model No. 1 (BSM1), a protocol for objective comparison of the effectiveness of control strategies in biological nitrogen removal activated sludge plants, has also revealed a number of limitations. Preliminary definitions of the long-term benchmark simulation model No. 1 (BSM1_LT) and the benchmark simulation model No. 2 (BSM2) have been made to extend BSM1 for evaluation of process monitoring methods and plant-wide control strategies, respectively. Influent-related disturbances for BSM1_LT/BSM2 are to be generated with a model, and this paper provides a general overview of the modelling methods used. Typical influent dynamic phenomena generated with the BSM1_LT/BSM2 influent disturbance model, including diurnal, weekend, seasonal and holiday effects, as well as rainfall, are illustrated with simulation results. As a result of the work described in this paper, a proposed influent model/file has been released to the benchmark developers for evaluation purposes. Pending this evaluation, a final BSM1_LT/BSM2 influent disturbance model definition is foreseen. Preliminary simulations with dynamic influent data generated by the influent disturbance model indicate that default BSM1 activated sludge plant control strategies will need extensions for BSM1_LT/BSM2 to efficiently handle 1 year of influent dynamics.
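An illustrative generator of the kinds of influent phenomena listed above, with invented parameter values (the actual BSM1_LT/BSM2 disturbance model is considerably more detailed):

```python
import numpy as np

# Synthetic influent flow: diurnal, weekend and seasonal harmonics plus
# randomly placed rain events with exponential decay.
rng = np.random.default_rng(7)
t = np.arange(0, 365 * 24) / 24.0                     # one year, hourly, in days
base = 20000.0                                         # mean dry-weather flow, m3/d
diurnal = 0.25 * np.sin(2 * np.pi * (t % 1) - np.pi / 2)
weekend = np.where((t.astype(int) % 7) >= 5, -0.10, 0.0)  # lower weekend loading
seasonal = 0.05 * np.sin(2 * np.pi * t / 365)

rain = np.zeros_like(t)
for s in rng.choice(len(t), size=30, replace=False):   # ~30 storms per year
    dur = min(12, len(t) - s)                          # 12 h exponential decay
    rain[s:s + dur] += 0.8 * np.exp(-np.arange(dur) / 4.0)

flow = base * (1 + diurnal + weekend + seasonal + rain)
print(flow.min(), flow.mean(), flow.max())
```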
NASA Astrophysics Data System (ADS)
Maslova, I.; Ticlavilca, A. M.; McKee, M.
2012-12-01
There has been an increased interest in wavelet-based streamflow forecasting models in recent years. Often overlooked in this approach are the circularity assumptions of the wavelet transform. We propose a novel technique for minimizing the wavelet decomposition boundary condition effect to produce long-term, up to 12 months ahead, forecasts of streamflow. A simulation study is performed to evaluate the effects of different wavelet boundary rules using synthetic and real streamflow data. A hybrid wavelet-multivariate relevance vector machine model is developed for forecasting the streamflow in real time for the Yellowstone River, Uinta Basin, Utah, USA. The inputs of the model utilize only the past monthly streamflow records, decomposed into components formulated in terms of wavelet multiresolution analysis. It is shown that the model accuracy can be increased by using the wavelet boundary rule introduced in this study. This long-term streamflow modeling and forecasting methodology would enable better decision-making and management of water availability risk.
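A short demonstration of the boundary-rule sensitivity with PyWavelets: the coefficients nearest the end of the record, which a forecasting model uses as its most recent inputs, depend on the chosen signal-extension mode (data synthetic; wavelet and level chosen only for illustration):

```python
import numpy as np
import pywt  # PyWavelets

rng = np.random.default_rng(3)
x = np.cumsum(rng.normal(size=240))          # 20 years of monthly flows (synthetic)

for mode in ("zero", "symmetric", "periodization", "smooth"):
    cA = pywt.wavedec(x, "db4", mode=mode, level=3)[0]  # level-3 approximation
    # The trailing coefficients differ by extension rule, even though the
    # underlying signal is identical.
    print(f"{mode:14s} last coefficients: {np.round(cA[-3:], 2)}")
```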
Long short-term memory for speaker generalization in supervised speech separation
Chen, Jitong; Wang, DeLiang
2017-01-01
Speech separation can be formulated as learning to estimate a time-frequency mask from acoustic features extracted from noisy speech. For supervised speech separation, generalization to unseen noises and unseen speakers is a critical issue. Although deep neural networks (DNNs) have been successful in noise-independent speech separation, DNNs are limited in modeling a large number of speakers. To improve speaker generalization, a separation model based on long short-term memory (LSTM) is proposed, which naturally accounts for the temporal dynamics of speech. Systematic evaluation shows that the proposed model substantially outperforms a DNN-based model on unseen speakers and unseen noises in terms of objective speech intelligibility. Analyzing LSTM internal representations reveals that LSTM captures long-term speech contexts. It is also found that the LSTM model is more advantageous for low-latency speech separation: without future frames, it performs better than the DNN model with future frames. The proposed model represents an effective approach for speaker- and noise-independent speech separation. PMID:28679261
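A minimal PyTorch sketch of LSTM-based mask estimation; the layer sizes and feature dimensions below are illustrative, not the paper's configuration:

```python
import torch
import torch.nn as nn

class MaskEstimator(nn.Module):
    """Map a sequence of acoustic feature frames to a time-frequency mask."""
    def __init__(self, n_features=64, n_bins=161, hidden=512, layers=2):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, layers, batch_first=True)
        self.out = nn.Linear(hidden, n_bins)

    def forward(self, feats):                # feats: (batch, frames, features)
        h, _ = self.lstm(feats)
        return torch.sigmoid(self.out(h))    # ratio-mask estimate in [0, 1]

model = MaskEstimator()
noisy_feats = torch.randn(8, 100, 64)        # a batch of 100-frame utterances
mask = model(noisy_feats)                    # (8, 100, 161); applied to the noisy spectrogram
loss = nn.functional.mse_loss(mask, torch.rand(8, 100, 161))  # vs. target mask
print(mask.shape, loss.item())
```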
Alternative Modes of Evaluation and Their Application to Rural Development.
ERIC Educational Resources Information Center
Wetherill, G. Richard; Buttram, Joan L.
In order to "cut through the jargon of the multifaceted field of evaluative research", 21 evaluation models representing a range of possibilities were identified (via literature review) and compared in terms of purpose and five basic phases applicable to rural development. Evaluation was defined as "the systematic examination of a…
Prospective testing of Coulomb short-term earthquake forecasts
NASA Astrophysics Data System (ADS)
Jackson, D. D.; Kagan, Y. Y.; Schorlemmer, D.; Zechar, J. D.; Wang, Q.; Wong, K.
2009-12-01
Earthquake induced Coulomb stresses, whether static or dynamic, suddenly change the probability of future earthquakes. Models to estimate stress and the resulting seismicity changes could help to illuminate earthquake physics and guide appropriate precautionary response. But do these models have improved forecasting power compared to empirical statistical models? The best answer lies in prospective testing, in which a fully specified model, with no subsequent parameter adjustments, is evaluated against future earthquakes. The Collaboratory for the Study of Earthquake Predictability (CSEP) facilitates such prospective testing of earthquake forecasts, including several short-term forecasts. Formulating Coulomb stress models for formal testing involves several practical problems, mostly shared with other short-term models. First, earthquake probabilities must be calculated after each “perpetrator” earthquake but before the triggered earthquakes, or “victims”. The time interval between a perpetrator and its victims may be very short, as characterized by the Omori law for aftershocks. CSEP evaluates short-term models daily, and allows daily updates of the models. However, lots can happen in a day. An alternative is to test and update models on the occurrence of each earthquake over a certain magnitude. To make such updates rapidly enough and to qualify as prospective, earthquake focal mechanisms, slip distributions, stress patterns, and earthquake probabilities would have to be produced by computer without human intervention. This scheme would be more appropriate for evaluating scientific ideas, but it may be less useful for practical applications than daily updates. Second, triggered earthquakes are imperfectly recorded following larger events because their seismic waves are buried in the coda of the earlier event. To solve this problem, testing methods need to allow for “censoring” of early aftershock data, and a quantitative model for detection threshold as a function of distance, time, and magnitude is needed. Third, earthquake catalogs contain errors in location and magnitude that may be corrected in later editions. One solution is to test models in “pseudo-prospective” mode (after catalog revision but without model adjustment). Again, appropriate for science but not for response. Hopefully, demonstrations of modeling success will stimulate improvements in earthquake detection.
To what extent can theory account for the findings of road safety evaluation studies?
Elvik, Rune
2004-09-01
This paper proposes a conceptual framework that can be used to assess to what extent the findings of road safety evaluation research make sense from a theoretical point of view. The effects of road safety measures are modelled as passing through two causal chains. One of these, termed the engineering effect, refers to the intended effects of a road safety measure on a set of risk factors related to accident occurrence or injury severity. The engineering effect of road safety measures is modelled in terms of nine basic risk factors, one or more of which any road safety measure needs to influence in order to have the intended effect on accidents or injuries. The other causal chain producing the effects of road safety measures is termed the behavioural effect, and refers to road user behavioural adaptations to road safety measures. The behavioural effect is related to the engineering effect, in the sense that certain properties of the engineering effect of a road safety measure influence the likelihood that behavioural adaptation will occur. The behavioural effect of a road safety measure is modelled in terms of six factors that influence the likelihood that behavioural adaptation will occur. The nine basic risk factors representing the engineering effect of a road safety measure, and the six factors influencing the likelihood of behavioural adaptation can be used as checklists in assessing whether or not the findings of road safety evaluation studies make sense from a theoretical point of view. At the current state of knowledge, a more stringent evaluation of the extent to which theory can explain the findings of road safety evaluation studies is, in most cases, not possible. Copyright 2003 Elsevier Ltd.
Estimating solar radiation for plant simulation models
NASA Technical Reports Server (NTRS)
Hodges, T.; French, V.; Leduc, S.
1985-01-01
Five algorithms producing daily solar radiation surrogates from daily temperatures and rainfall were evaluated using measured solar radiation data for seven U.S. locations. The algorithms were compared both in terms of the accuracy of daily solar radiation estimates and in terms of response when used in a plant growth simulation model (CERES-wheat). Requirements for the accuracy of solar radiation for plant growth simulation models are discussed. One algorithm is recommended as best suited for use in these models when neither measured nor satellite-estimated solar radiation values are available.
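The abstract does not name the five algorithms, but a widely used member of this temperature-based family is the Bristow-Campbell (1984) form, sketched here with generic coefficients that would need local calibration:

```python
import numpy as np

def bristow_campbell(tmax, tmin, ra, a=0.7, b=0.005, c=2.4):
    """Daily solar radiation surrogate from the diurnal temperature range
    dT = tmax - tmin and extraterrestrial radiation ra:
        Rs = ra * a * (1 - exp(-b * dT**c))
    Coefficients a, b, c are site-specific and shown here as generic values."""
    dt = np.asarray(tmax) - np.asarray(tmin)
    return ra * a * (1.0 - np.exp(-b * dt ** c))

# A clear warm day: large diurnal range implies high transmissivity.
print(bristow_campbell(tmax=28.0, tmin=14.0, ra=35.0))  # MJ m-2 d-1
```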
Equifinality and process-based modelling
NASA Astrophysics Data System (ADS)
Khatami, S.; Peel, M. C.; Peterson, T. J.; Western, A. W.
2017-12-01
Equifinality is understood as one of the fundamental difficulties in the study of open complex systems, including catchment hydrology. A review of the hydrologic literature reveals that the term equifinality has been widely used, but in many cases inconsistently and without coherent recognition of the various facets of equifinality, which can lead to ambiguity and methodological fallacies. Therefore, in this study we first characterise the term equifinality within the context of hydrological modelling by reviewing the genesis of the concept and then presenting a theoretical framework. During past decades, equifinality has mainly been studied as a subset of aleatory uncertainty (arising due to randomness) and for the assessment of model parameter uncertainty. Although the connection between parameter uncertainty and equifinality is undeniable, we argue there is more to equifinality than just aleatory parameter uncertainty. That is, the importance of equifinality and epistemic uncertainty (arising due to lack of knowledge) and their implications is overlooked in our current practice of model evaluation. Equifinality and epistemic uncertainty in studying, modelling, and evaluating hydrologic processes are treated as if they can simply be discussed in (or often reduced to) probabilistic terms (as for aleatory uncertainty). The deficiencies of this approach to conceptual rainfall-runoff modelling are demonstrated for selected Australian catchments by examination of parameter and internal flux distributions and interactions within SIMHYD. On this basis, we present a new approach that expands the equifinality concept beyond model parameters to inform epistemic uncertainty. The new approach potentially facilitates the identification and development of more physically plausible models and model evaluation schemes, particularly within the multiple working hypotheses framework, and is generalisable to other fields of environmental modelling as well.
Wang, Fugui; Mladenoff, David J; Forrester, Jodi A; Blanco, Juan A; Schelle, Robert M; Peckham, Scott D; Keough, Cindy; Lucash, Melissa S; Gower, Stith T
The effects of forest management on soil carbon (C) and nitrogen (N) dynamics vary by harvest type and species. We simulated the long-term effects of bole-only harvesting of aspen (Populus tremuloides) on stand productivity and the interaction of C and N cycles using a multiple-model approach. Five models, Biome-BGC, CENTURY, FORECAST, LANDIS-II with Century-based soil dynamics, and PnET-CN, were run for 350 yr with seven harvesting events on nutrient-poor, sandy soils representing northwestern Wisconsin, United States. Twenty C and N state and flux variables were summarized from the models' outputs and statistically analyzed using ordination and variance analysis methods. The multiple models' averages suggest that bole-only harvest would not significantly affect the long-term site productivity of aspen, though declines in soil organic matter and soil N were significant. Along with direct N removal by harvesting, extensive leaching after harvesting before canopy closure was another major cause of N depletion. The five models differed notably in output values of the 20 variables examined, although there were some similarities for certain variables. PnET-CN produced unique results for every variable, and CENTURY showed fewer outliers and temporal patterns similar to the mean of all models. In general, we demonstrated that when there are no site-specific data for fine-scale calibration and evaluation of a single model, the multiple-model approach may be more robust for long-term simulations. In addition, multimodeling may also improve the calibration and evaluation of an individual model.
Boyd, Kathleen Anne; Minnis, Helen; Donaldson, Julia; Brown, Kevin; Boyer, Nicole R S; McIntosh, Emma
2018-01-01
Introduction: Children who have experienced abuse and neglect are at increased risk of mental and physical health problems throughout life. This places an enormous burden on individuals, families and society in terms of health services, education, social care and judiciary sectors. Evidence suggests that early intervention can mitigate the negative consequences of child maltreatment, exerting long-term positive effects on the health of maltreated children entering foster care. However, evidence on cost-effectiveness of such complex interventions is limited. This protocol describes the first economic evaluation of its kind in the UK. Methods and analysis: An economic evaluation alongside the Best Services Trial (BeST?) has been prospectively designed to identify, measure and value key resource and outcome impacts arising from the New Orleans intervention model (NIM) (an infant mental health service) compared with case management (CM) (enhanced social work services as usual). A within-trial economic evaluation and long-term model from a National Health Service/Personal Social Service and a broader societal perspective will be undertaken alongside the National Institute for Health Research (NIHR)–Public Health Research Unit (PHRU)-funded randomised multicentre BeST?. BeST? aims to evaluate NIM compared with CM for maltreated children entering foster care in a UK context. Collection of the Paediatric Quality of Life Inventory (PedsQL) and the recent mapping of PedsQL to EuroQol-5-Dimensions (EQ-5D) will facilitate the estimation of quality-adjusted life years specific to the infant population for a cost–utility analysis. Other effectiveness outcomes will be incorporated into a cost-effectiveness analysis (CEA) and cost-consequences analysis (CCA). A long-term economic model and multiple economic evaluation frameworks will provide decision-makers with a comprehensive, multiperspective guide regarding cost-effectiveness of NIM. The long-term population health economic model will be developed to synthesise trial data with routine linked data and key government sector parameters informed by literature. Methods guidance for population health economic evaluation will be adopted (lifetime horizon, 1.5% discount rate for costs and benefits, CCA framework, multisector perspective). Ethics and dissemination: Ethics approval was obtained by the West of Scotland Ethics Committee. Results of the main trial and economic evaluation will be submitted for publication in a peer-reviewed journal as well as published in the peer-reviewed NIHR journals library (Public Health Research Programme). Trial registration number: NCT02653716; Pre-results. PMID:29540420
NASA Astrophysics Data System (ADS)
Dakhlaoui, H.; Ruelland, D.; Tramblay, Y.; Bargaoui, Z.
2017-07-01
To evaluate the impact of climate change on water resources at the catchment scale, not only future projections of climate are necessary but also robust rainfall-runoff models that remain fairly reliable under changing climate conditions. The aim of this study was thus to assess the robustness of three conceptual rainfall-runoff models (GR4J, HBV and IHACRES) on five basins in northern Tunisia under long-term climate variability, in the light of available future climate scenarios for this region. The robustness of the models was evaluated using a differential split-sample test based on a climate classification of the observation period that simultaneously accounted for precipitation and temperature conditions. The study catchments include the main hydrographical basins in northern Tunisia, which produce most of the surface water resources in the country. A 30-year period (1970-2000) was used to capture a wide range of hydro-climatic conditions. The calibration was based on the Kling-Gupta Efficiency (KGE) criterion, while model transferability was evaluated based on the Nash-Sutcliffe efficiency criterion and volume error. The three hydrological models were shown to behave similarly under climate variability. The models simulated the runoff pattern better when transferred to wetter and colder conditions than to drier and warmer ones. Their robustness became unacceptable when climate conditions involved a decrease of more than 25% in annual precipitation and an increase of more than 1.75 °C in annual mean temperature. The reduction in model robustness may be partly due to the climate dependence of some parameters. When compared to precipitation and temperature projections for the region, the limits of transferability obtained in this study are generally respected in the short and medium term. For long-term projections under the most pessimistic greenhouse gas emission scenarios, the limits of transferability are generally not respected, which may hamper the use of conceptual models for hydrological projections in northern Tunisia.
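For reference, the KGE objective in its standard 2009 form (sample values invented):

```python
import numpy as np

def kge(obs, sim):
    """Kling-Gupta Efficiency: 1 - Euclidean distance from the ideal point
    (r, alpha, beta) = (1, 1, 1)."""
    r = np.corrcoef(obs, sim)[0, 1]          # linear correlation
    alpha = sim.std() / obs.std()            # variability ratio
    beta = sim.mean() / obs.mean()           # bias ratio
    return 1 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

obs = np.array([3.1, 5.4, 2.2, 8.9, 6.0])
sim = np.array([2.8, 5.9, 2.5, 8.1, 6.4])
print(kge(obs, sim))
```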
Mohiuddin, Syed
2014-08-01
Bipolar disorder (BD) is a chronic and relapsing mental illness with a considerable health-related and economic burden. The primary goal of pharmacotherapeutics for BD is to improve patients' well-being. The use of decision-analytic models is key in assessing the added value of pharmacotherapeutics aimed at treating the illness, but concerns have been expressed about the appropriateness of different modelling techniques and about transparency in the reporting of economic evaluations. This paper aimed to identify and critically appraise published model-based economic evaluations of pharmacotherapeutics in BD patients. A systematic review combining common terms for BD and economic evaluation was conducted in MEDLINE, EMBASE, PSYCINFO and ECONLIT. The studies identified were summarised and critically appraised in terms of the modelling technique used, model structure and data sources. Considering the prognosis and management of BD, the possible benefits and limitations of each modelling technique are discussed. Fourteen model-based economic evaluations of pharmacotherapeutics in BD patients were identified. Of these 14 studies, nine used Markov, three used discrete-event simulation (DES) and two used decision-tree models. Most of the studies (n = 11) did not include the rationale for the choice of modelling technique undertaken. Half of the studies did not include the risk of mortality. Surprisingly, no study considered the risk of having a mixed bipolar episode. This review identified various modelling issues that could potentially reduce the comparability of one pharmacotherapeutic intervention with another. Better use and reporting of modelling techniques in future studies is essential. DES modelling appears to be a flexible and comprehensive technique for evaluating the comparability of BD treatment options because of its greater flexibility in depicting disease progression over time. However, depending on the research question, modelling techniques other than DES might also be appropriate in some cases.
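A minimal Markov cohort sketch to make the most common technique concrete; the states, transition probabilities, utilities and discount rate below are hypothetical illustrations, not estimates for bipolar disorder:

```python
import numpy as np

states = ["remission", "episode", "dead"]
P = np.array([[0.90, 0.08, 0.02],      # hypothetical annual transitions
              [0.40, 0.55, 0.05],
              [0.00, 0.00, 1.00]])     # death is absorbing
utility = np.array([0.85, 0.45, 0.0])  # hypothetical QALY weights per state

cohort = np.array([1.0, 0.0, 0.0])     # whole cohort starts in remission
qalys, rate = 0.0, 0.035
for year in range(20):
    qalys += (cohort @ utility) / (1 + rate) ** year  # discounted QALYs
    cohort = cohort @ P                                # advance one cycle
print(f"discounted QALYs over 20 years: {qalys:.2f}")
```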
You are such a bad child! Appraisals as mechanisms of parental negative and positive affect.
Gavita, Oana Alexandra; David, Daniel; DiGiuseppe, Raymond
2014-01-01
Although parent cognitions are considered important predictors that determine specific emotional reactions and parental practices, models of the cognitive strategies for regulating parental distress or positive emotions are not well developed. Our aim was to investigate the nature of the cognitions involved in parental distress and satisfaction, in terms of their specificity (parental or general) and their processing levels (inferential or evaluative cognitions). We hypothesized that parents' specific evaluative cognitions would mediate the impact of more general and inferential cognitive structures on their affective reactions. We used bootstrapping procedures to test the proposed mediation models. The results indeed show that specific evaluative parental cognitions mediate the relationship between general cognitions and parental distress. In terms of cognitive processing levels, it seems that when parents hold both low self-efficacy and negative global evaluations of the self/child, this adds significantly to their distress.
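A compact sketch of the bootstrapped indirect effect for a simple X → M → Y mediation model, on synthetic data (the study's actual measures and models are richer than this):

```python
import numpy as np

rng = np.random.default_rng(11)
n = 300
x = rng.normal(size=n)                       # e.g. general cognition score
m = 0.5 * x + rng.normal(size=n)             # specific evaluative cognition
y = 0.6 * m + 0.1 * x + rng.normal(size=n)   # parental distress

def indirect(idx):
    """Indirect effect a*b on one bootstrap resample: a from M ~ X,
    b from Y ~ M + X (least squares)."""
    xs, ms, ys = x[idx], m[idx], y[idx]
    a = np.polyfit(xs, ms, 1)[0]
    X = np.column_stack([np.ones_like(xs), ms, xs])
    b = np.linalg.lstsq(X, ys, rcond=None)[0][1]
    return a * b

boots = [indirect(rng.integers(0, n, n)) for _ in range(2000)]
print(np.percentile(boots, [2.5, 97.5]))     # CI excluding 0 -> mediation
```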
Regime-Based Evaluation of Cloudiness in CMIP5 Models
NASA Technical Reports Server (NTRS)
Jin, Daeho; Oreopoulos, Lazaros; Lee, Dong Min
2016-01-01
The concept of Cloud Regimes (CRs) is used to develop a framework for evaluating the cloudiness of 12 models from phase 5 of the Coupled Model Intercomparison Project (CMIP5). Reference CRs come from existing global International Satellite Cloud Climatology Project (ISCCP) weather states. The evaluation is made possible by the implementation in several CMIP5 models of the ISCCP simulator, which generates for each gridcell daily joint histograms of cloud optical thickness and cloud top pressure. Model performance is assessed with several metrics such as CR global cloud fraction (CF), CR relative frequency of occurrence (RFO), their product (long-term average total cloud amount [TCA]), cross-correlations of CR RFO maps, and a metric of resemblance between model and ISCCP CRs. In terms of CR global RFO, arguably the most fundamental metric, the models perform unsatisfactorily overall, except for CRs representing thick storm clouds. Because model CR CF is internally constrained by our method, RFO discrepancies also yield substantial TCA errors. Our findings support previous studies showing that CMIP5 models underestimate cloudiness. The multi-model mean performs well in matching observed RFO maps for many CRs, but is not the best for this or other metrics. When overall performance across all CRs is assessed, some models, despite their shortcomings, apparently outperform Moderate Resolution Imaging Spectroradiometer (MODIS) cloud observations when the latter are evaluated against ISCCP as if they were another model output. Lastly, cloud simulation performance is contrasted with each model's equilibrium climate sensitivity (ECS) in order to gain insight into whether good cloud simulation pairs with particular values of this parameter.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kennedy, R.P.; Short, S.A.; McDonald, J.R.
1990-06-01
The Department of Energy (DOE) and the DOE Natural Phenomena Hazards Panel have developed uniform design and evaluation guidelines for protection against natural phenomena hazards at DOE sites throughout the United States. The goal of the guidelines is to assure that DOE facilities can withstand the effects of natural phenomena such as earthquakes, extreme winds, tornadoes, and flooding. The guidelines apply to both new facilities (design) and existing facilities (evaluation, modification, and upgrading). The intended audience is primarily the civil/structural or mechanical engineers conducting the design or evaluation of DOE facilities. The likelihood of occurrence of natural phenomena hazards at each DOE site has been evaluated by the DOE Natural Phenomena Hazard Program. Probabilistic hazard models are available for earthquake, extreme wind/tornado, and flood. Alternatively, site organizations are encouraged to develop site-specific hazard models utilizing the most recent information and techniques available. In this document, performance goals and natural hazard levels are expressed in probabilistic terms, and design and evaluation procedures are presented in deterministic terms. Design/evaluation procedures conform closely to common standard practices so that the procedures will be easily understood by most engineers. Performance goals are expressed in terms of structure or equipment damage to the extent that: (1) the facility cannot function; (2) the facility would need to be replaced; or (3) personnel are endangered. 82 refs., 12 figs., 18 tabs.
ERIC Educational Resources Information Center
Kurt, Hakan
2014-01-01
The aim of this study is to evaluate biology teachers' attitudes and belief levels on classroom control in terms of teachers' sense of efficacy. The screening model was used in the study. The study group was comprised of 135 biology teachers. In this study, Teachers' Sense of Efficacy Scale (TSES) and The Attitudes and Beliefs on Classroom Control…
2002-03-01
source term. Several publications provided a thorough accounting of the accident, including "Chernobyl Record" [Mould] and the NRC technical report "...Report on the Accident at the Chernobyl Nuclear Power Station" [NUREG-1250]. The most comprehensive study of transport models to predict the... from the Chernobyl Accident: The ATMES Report" [Klug, et al.]. The Atmospheric Transport Model Evaluation Study (ATMES) report used data
A note on evaluating model tidal currents against observations
NASA Astrophysics Data System (ADS)
Cummins, Patrick F.; Thupaki, Pramod
2018-01-01
The root-mean-square magnitude of the vector difference between modeled and observed tidal ellipses is a comprehensive metric to evaluate the representation of tidal currents in ocean models. A practical expression for this difference is given in terms of the harmonic constants that are routinely used to specify current ellipses for a given tidal constituent. The resulting metric is sensitive to differences in all four current ellipse parameters, including phase.
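One standard route to such an expression is via the rotary decomposition of each ellipse; a sketch follows, with the sign conventions flagged as an assumption to be checked against the harmonic-analysis package in use (this is not necessarily the paper's exact formulation):

```python
import numpy as np

def rotary(a, b, theta, g):
    """Convert ellipse parameters (semi-major a, signed semi-minor b,
    inclination theta and phase g, both in degrees) to anticlockwise and
    clockwise rotary amplitudes, under the usual convention a = |A+| + |A-|."""
    ap = 0.5 * (a + b) * np.exp(1j * np.deg2rad(theta - g))
    am = 0.5 * (a - b) * np.exp(1j * np.deg2rad(theta + g))
    return ap, am

def rms_vector_difference(ell_mod, ell_obs):
    ap_m, am_m = rotary(*ell_mod)
    ap_o, am_o = rotary(*ell_obs)
    # Averaging |dw(t)|^2 over a constituent period removes the cross term,
    # leaving the two rotary differences in quadrature.
    return np.sqrt(abs(ap_m - ap_o) ** 2 + abs(am_m - am_o) ** 2)

print(rms_vector_difference((0.50, 0.10, 30.0, 110.0),   # "model" M2 ellipse
                            (0.45, 0.12, 35.0, 105.0)))  # "observed" M2 ellipse
```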
E. Gregory McPherson; Paula J. Peper
2012-01-01
This paper describes three long-term tree growth studies conducted to evaluate tree performance because repeated measurements of the same trees produce critical data for growth model calibration and validation. Several empirical and process-based approaches to modeling tree growth are reviewed. Modeling is more advanced in the fields of forestry and...
Applying Psychological Theories to Promote Long-Term Maintenance of Health Behaviors
Joseph, Rodney P.; Daniel, Casey L.; Thind, Herpreet; Benitez, Tanya J.; Pekmezi, Dori
2014-01-01
Behavioral health theory provides a framework for researchers to design, implement, and evaluate the effects of health promotion programs. However, limited research has examined theories used in interventions to promote long-term maintenance of health behaviors. The purpose of this review was to evaluate the available literature and identify prominent behavioral health theories used in intervention research to promote maintenance of health behaviors. We reviewed theories used in intervention research assessing long-term maintenance (≥ 6 months post-intervention) of physical activity, weight loss, and smoking cessation. Five prominent behavioral theories were referenced by the 34 studies included in the review: Self-Determination Theory, Theory of Planned Behavior, Social Cognitive Theory, Transtheoretical Model, and Social Ecological Model. Descriptions and examples of applications of these theories are provided. Implications for future research are discussed. PMID:28217036
Experimental models of tracheobronchial stenoses: a useful tool for evaluating airway stents.
Marquette, C H; Mensier, E; Copin, M C; Desmidt, A; Freitag, L; Witt, C; Petyt, L; Ramon, P
1995-09-01
Stent implantation is a conservative alternative to open operation for treating benign tracheobronchial strictures. Most of the presently available stents were primarily designed for endovascular use. Their respiratory use entails a risk of iatrogenic complications. From a scientific and from an ethical point of view these risks justify preclinical evaluation of new respiratory stents in experimental models of central airway stenoses. Therefore, an attempt was made to develop such models in piglets and adult minipigs. Tracheal stenoses were obtained by creating first a segmental tracheomalacia through extramucosal resection of cartilaginous arches. The fibrous component of the stenoses was then obtained through bronchoscopic application of a caustic agent causing progressive deep mucosal and submucosal injury. Stenoses of the main bronchi were created by topical application of the caustic agent only. These models demonstrated the typical features of benign fibromalacic tracheobronchial stenoses with constant recurrence after mechanical dilation. Preliminary experiments showed that short-term problems of tolerance of stent prototypes are easily demonstrable in these models. These experimental models, which simulate quite realistically human diseases, offer the opportunity to perfect new tracheobronchial stents specifically designed for respiratory use and to evaluate their long-term tolerance before their use in humans.
Evaluating short-term hydro-meteorological fluxes using GRACE-derived water storage changes
NASA Astrophysics Data System (ADS)
Eicker, A.; Jensen, L.; Springer, A.; Kusche, J.
2017-12-01
Atmospheric and terrestrial water budgets, which represent important boundary conditions for both climate modeling and hydrological studies, are linked by evapotranspiration (E) and precipitation (P). These fields are provided by numerical weather prediction models and atmospheric reanalyses such as ERA-Interim and MERRA-Land; yet, in particular the quality of E is still not well evaluated. Via the terrestrial water budget equation, water storage changes derived from products of the Gravity Recovery and Climate Experiment (GRACE) mission, combined with runoff (R) data can be used to assess the realism of atmospheric models. In this contribution we will investigate the closure of the water balance for short-term fluxes, i.e. the agreement of GRACE water storage changes with P-E-R flux time series from different (global and regional) atmospheric reanalyses, land surface models, as well as observation-based data sets. Missing river runoff observations will be extrapolated using the calibrated rainfall-runoff model GR2M. We will perform a global analysis and will additionally focus on selected river basins in West Africa. The investigations will be carried out for various temporal scales, focusing on short-term fluxes down to daily variations to be detected in daily GRACE time series.
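A minimal closure check of this kind on synthetic monthly series (all units and data invented): compare the storage change dS/dt from a GRACE-like series against the flux residual P - E - R.

```python
import numpy as np

rng = np.random.default_rng(5)
months = 120
p = 80 + 30 * np.sin(2 * np.pi * np.arange(months) / 12) + rng.normal(0, 8, months)
e = 50 + 20 * np.sin(2 * np.pi * (np.arange(months) - 2) / 12)
r = 0.2 * p                                   # crude runoff stand-in
flux = p - e - r                              # mm/month

s = np.cumsum(flux + rng.normal(0, 5, months))  # "GRACE" storage anomaly, mm
ds_dt = np.gradient(s)                           # mm/month, central differences

rmse = np.sqrt(np.mean((ds_dt - flux) ** 2))
corr = np.corrcoef(ds_dt, flux)[0, 1]
print(f"closure RMSE = {rmse:.1f} mm/month, r = {corr:.2f}")
```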
Osteogenic efficacy of strontium hydroxyapatite micro-granules in osteoporotic rat model.
Chandran, Sunitha; Babu S, Suresh; Vs, Hari Krishnan; Varma, H K; John, Annie
2016-10-01
Excessive demineralization in osteoporotic bones impairs their self-regeneration potential following a defect/fracture and is of great concern among the aged population. In this context, implants with inherent osteogenic ability loaded with therapeutic ions like strontium (Sr²⁺) may bring forth promising outcomes. Micro-granular strontium-incorporated hydroxyapatite scaffolds were synthesized and their in vivo osteogenic efficacy was evaluated in a long-term osteoporosis-induced aged (LOA) rat model. Micro-granules with improved surface area are anticipated to resorb faster, and together with the inherent bioactive properties of hydroxyapatite and the leaching of strontium ions from the scaffold, osteoporotic bone healing may be promoted. The LOA rat model was chosen to extrapolate the results to the clinical osteoporotic condition in the aged. Micro-granular 10% strontium-incorporated hydroxyapatite synthesized by the wet precipitation method exhibited an increased in vitro dissolution rate, and inductively coupled plasma studies confirmed a strontium ion release of 0.01 mM, proving its therapeutic potential for osteoporotic applications. Wistar rats were induced to the LOA model by ovariectomy along with a prolonged induction period of 10 months. Thereafter, the osteogenic efficacy of strontium-incorporated hydroxyapatite micro-granules was evaluated in femoral bone defects in the LOA model. Eight weeks after implantation, the in vivo regeneration efficacy ratio was highest in the strontium-incorporated hydroxyapatite group (0.92 ± 0.04) compared with the sham and hydroxyapatite groups. Micro-CT evaluation further substantiated the improved osteointegration of the strontium-incorporated hydroxyapatite implants through the density histograms. Thus, the therapeutic potential of micro-granular strontium-incorporated hydroxyapatite scaffolds becomes relevant, especially as bone void fillers in osteoporotic cases of tumor resection or trauma. © The Author(s) 2016.
1982-09-01
[Fragment of a scanned 1982 report; only partially recoverable] ... L. Anderson, formerly of the Naval Ocean Research and Development Activity, resulted in many refinements of the model. (U) The model is examined in terms of model usage and its limitations; where applicable, recommendations will be made regarding usage.
USDA-ARS?s Scientific Manuscript database
Sustainable intensification is an emerging model for agriculture designed to reconcile accelerating global demand for agricultural products with long-term environmental stewardship. Defined here as increasing agricultural production while maintaining or improving environmental quality, sustainable i...
ERIC Educational Resources Information Center
Aslan, Mecit; Saglam, Mustafa
2017-01-01
The aim of this research is to examine postgraduate theses on curriculum evaluation completed between 2006 and 2015 in Turkey in terms of various aspects such as university, year, the curriculum evaluated, curriculum evaluation model, research method, design, sample type, data collection methods, and data analysis technique. In order to…
Evaluation of Model Recognition for Grammar-Based Automatic 3d Building Model Reconstruction
NASA Astrophysics Data System (ADS)
Yu, Qian; Helmholz, Petra; Belton, David
2016-06-01
In recent years, 3D city models have been in high demand by many public and private organisations, and demand is steadily growing in terms of both quality and quantity. The quality evaluation of these 3D models is a relevant issue from both the scientific and practical points of view. In this paper, we present a method for the quality evaluation of 3D building models which are reconstructed automatically from terrestrial laser scanning (TLS) data based on an attributed building grammar. The entire evaluation is performed in all three dimensions in terms of completeness and correctness of the reconstruction. Six quality measures are introduced and applied to four datasets of reconstructed building models in order to describe the quality of the automatic reconstruction, and their validity from the evaluation point of view is also assessed.
Evaluating Differential Effects Using Regression Interactions and Regression Mixture Models
ERIC Educational Resources Information Center
Van Horn, M. Lee; Jaki, Thomas; Masyn, Katherine; Howe, George; Feaster, Daniel J.; Lamont, Andrea E.; George, Melissa R. W.; Kim, Minjung
2015-01-01
Research increasingly emphasizes understanding differential effects. This article focuses on understanding regression mixture models, which are relatively new statistical methods for assessing differential effects, by comparing results to using an interaction term in linear regression. The research questions which each model answers, their…
Application of Consider Covariance to the Extended Kalman Filter
NASA Technical Reports Server (NTRS)
Lundberg, John B.
1996-01-01
The extended Kalman filter (EKF) is the basis for many applications of filtering theory to real-time problems where estimates of the state of a dynamical system are to be computed based upon some set of observations. The form of the EKF may vary somewhat from one application to another, but the fundamental principles are typically unchanged among these various applications. As is the case in many filtering applications, models of the dynamical system (differential equations describing the state variables) and models of the relationship between the observations and the state variables are created. These models typically employ a set of constants whose values are established by means of theory or experimental procedure. Since the estimates of the state are formed assuming that the models are perfect, any modeling errors will affect the accuracy of the computed estimates. Note that the modeling errors may be errors of commission (errors in terms included in the model) or omission (errors in terms excluded from the model). Consequently, it becomes imperative when evaluating the performance of real-time filters to evaluate the effect of modeling errors on the estimates of the state.
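As a concrete illustration of that last point, the sketch below (a simplification, not the paper's consider-covariance formulation) runs a linear Kalman filter built on a constant-velocity model against a truth containing a small unmodeled acceleration, an error of omission; comparing the Monte Carlo error with the covariance the filter reports shows how modeling errors degrade the estimates while the filter remains overconfident. All constants are illustrative.

```python
import numpy as np

# Minimal sketch: a Kalman filter whose assumed dynamics omit a term present
# in the truth (an error of omission). Illustrative constants throughout.
rng = np.random.default_rng(0)
dt, n_steps, a_true, r = 1.0, 50, 0.02, 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])        # assumed constant-velocity model
H = np.array([[1.0, 0.0]])                   # position-only measurements
Q = 1e-4 * np.eye(2)                         # assumed process noise

errors = []
for _ in range(200):                         # Monte Carlo trials
    x_true, x_est, P = np.zeros(2), np.zeros(2), np.eye(2)
    for _ in range(n_steps):
        # Truth propagates with the unmodeled constant acceleration a_true
        x_true = F @ x_true + np.array([0.5 * dt**2, dt]) * a_true
        z = H @ x_true + rng.normal(0.0, np.sqrt(r))
        # Filter predicts and updates with the imperfect assumed model
        x_est, P = F @ x_est, F @ P @ F.T + Q
        S = H @ P @ H.T + r
        K = P @ H.T / S
        x_est = x_est + (K * (z - H @ x_est)).ravel()
        P = (np.eye(2) - K @ H) @ P
    errors.append(x_est - x_true)

errors = np.array(errors)
print("MC position error: bias %.3f, std %.3f" %
      (errors[:, 0].mean(), errors[:, 0].std()))
print("filter-reported std: %.3f" % np.sqrt(P[0, 0]))  # optimistic if model wrong
```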
Irvine, Kathryn M.; Miller, Scott; Al-Chokhachy, Robert K.; Archer, Erik; Roper, Brett B.; Kershner, Jeffrey L.
2015-01-01
Conceptual models are an integral facet of long-term monitoring programs. Proposed linkages between drivers, stressors, and ecological indicators are identified within the conceptual model of most mandated programs. We empirically evaluate a conceptual model developed for a regional aquatic and riparian monitoring program using causal models (i.e., Bayesian path analysis). We assess whether data gathered for regional status and trend estimation can also provide insights on why a stream may deviate from reference conditions. We target the hypothesized causal pathways for how anthropogenic drivers of road density, percent grazing, and percent forest within a catchment affect instream biological condition. We found instream temperature and fine sediments in arid sites and only fine sediments in mesic sites accounted for a significant portion of the maximum possible variation explainable in biological condition among managed sites. However, the biological significance of the direct effects of anthropogenic drivers on instream temperature and fine sediments was minimal or not detected. Consequently, there was weak to no biological support for causal pathways relating anthropogenic drivers to biological condition. With weak biological and statistical effect sizes, ignoring environmental contextual variables and covariates that explain natural heterogeneity would have resulted in no evidence of human impacts on biological integrity in some instances. For programs targeting the effects of anthropogenic activities, it is imperative to identify both land use practices and mechanisms that have led to degraded conditions (i.e., moving beyond simple status and trend estimation). Our empirical evaluation of the conceptual model underpinning the long-term monitoring program provided an opportunity for learning and, consequently, we discuss survey design elements that require modification to achieve question-driven monitoring, a necessary step in the practice of adaptive monitoring. We suspect our situation is not unique and that many programs may suffer from the same inferential disconnect. Commonly, the survey design is optimized for robust estimates of regional status and trend detection and not necessarily to provide statistical inferences on the causal mechanisms outlined in the conceptual model, even though these relationships are typically used to justify and promote the long-term monitoring of a chosen ecological indicator. Our application demonstrates a process for empirical evaluation of conceptual models and exemplifies the need for such interim assessments in order for programs to evolve and persist.
A proposal of a renormalizable Nambu-Jona-Lasinio model
NASA Astrophysics Data System (ADS)
Cabo Montes de Oca, Alejandro
2018-03-01
A local and gauge invariant gauge field model including Nambu-Jona-Lasinio (NJL) and QCD Lagrangian terms in its action is introduced. Surprisingly, it becomes power-counting renormalizable. This occurs thanks to the presence of action terms which modify the quark propagators so that they decrease faster than the Dirac propagator at large momenta, in a Lee-Wick form, implying power-counting renormalizability. The appearance of finite quark masses already in the tree approximation in this scheme follows from the fact that the new action terms explicitly break chiral invariance. In this initial work we present the renormalized Feynman diagram expansion of the model and derive the formula for the degree of divergence of the diagrams. An explanation for the usual exclusion of the added Lagrangian terms is presented. In addition, the primitive divergent graphs are identified. We start their evaluation by calculating the simpler contribution to the gluon polarization operator. The divergent and finite parts both turn out to be transverse, as required by gauge invariance. The full evaluation of the various primitive divergences, which is required for completely defining the counterterm Feynman expansion, will be considered in future work, further allowing discussion of flavour symmetry breaking and unitarity.
Evaluation of Chemistry-Climate Model Results using Long-Term Satellite and Ground-Based Data
NASA Technical Reports Server (NTRS)
Stolarski, Richard S.
2005-01-01
Chemistry-climate models attempt to bring together our best knowledge of the key processes that govern the composition of the atmosphere and its response to changes in forcing. We test these models on a process-by-process basis by comparing model results to data from many sources. A more difficult task is testing the model response to changes. One way to do this is to use the natural and anthropogenic experiments that have been done on the atmosphere and are continuing to be done. These include the volcanic eruptions of El Chichon and Pinatubo, the solar cycle, and the injection of chlorine and bromine from CFCs and methyl bromide. The test of the models' response to these experiments is their ability to reproduce the long-term variations in ozone and the trace gases that affect ozone. We now have more than 25 years of satellite ozone data. We have more than 15 years of satellite and ground-based data on HCl, HNO3, and many other gases. I will discuss the testing of models using long-term satellite data sets, long-term measurements from the Network for Detection of Stratospheric Change (NDSC), and long-term ground-based measurements of ozone.
TAMPA BAY MODEL EVALUATION AND ASSESSMENT
A long term goal of multimedia environmental management is to achieve sustainable ecological resources. Progress towards this goal rests on a foundation of science-based methods and data integrated into predictive multimedia, multi-stressor open architecture modeling systems. The...
Craven, S.W.; Peterson, J.T.; Freeman, Mary C.; Kwak, T.J.; Irwin, E.
2010-01-01
Modifications to stream hydrologic regimes can have a profound influence on the dynamics of their fish populations. Using hierarchical linear models, we examined the relations between flow regime and young-of-year fish density based on fish sampling and discharge data from three different warmwater streams in Illinois, Alabama, and Georgia. We used an information-theoretic approach to evaluate the relative support for models describing hypothesized influences of five flow regime components representing short-term high and low flows, short-term flow stability, and long-term mean flows and flow stability on fish reproductive success during fish spawning and rearing periods. We also evaluated the influence of ten fish species traits on fish reproductive success. Species traits included spawning duration, reproductive strategy, egg incubation rate, swimming locomotion morphology, general habitat preference, and food habits. Model selection results indicated that young-of-year fish density was positively related to short-term high flows during the spawning period and negatively related to flow variability during the rearing period. However, the effect of the flow regime components varied substantially among species, but was related to species traits. The effect of short-term high flows on reproductive success was lower for species that broadcast their eggs during spawning. Species with cruiser swimming locomotion morphologies (e.g., Micropterus) also were more vulnerable to variable flows during the rearing period. Our models provide insight into the conditions and timing of flows that influence the reproductive success of warmwater stream fishes and may guide decisions related to stream regulation and management. © 2010 US Government.
Source term evaluation model for high-level radioactive waste repository with decay chain build-up.
Chopra, Manish; Sunny, Faby; Oza, R B
2016-09-18
A source term model based on a two-component leach flux concept is developed for a high-level radioactive waste repository. The long-lived radionuclides associated with high-level waste may give rise to a build-up of activity because of radioactive decay chains. The ingrowth of progeny is incorporated in the model using the Bateman decay chain build-up equations. The model is applied to different radionuclides present in high-level radioactive waste which form part of decay chains (the 4n to 4n + 3 series), and the activity of the parent and daughter radionuclides leaching out of the waste matrix is estimated. Two cases are considered: one in which only the parent is initially present in the waste, and another in which daughters are also initially present in the waste matrix. Incorporating the in situ production of daughter radionuclides in the source term is important for realistic estimates. It is shown that including decay chain build-up is essential to avoid underestimation in the radiological impact assessment of the repository. The model can be a useful tool for evaluating the source term of the radionuclide transport models used for the radiological impact assessment of high-level radioactive waste repositories.
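For reference, the Bateman solution underlying the chain build-up, for a chain 1 → 2 → ⋯ with decay constants λ_j and an initially pure parent (the abstract's first case), reads:

```latex
N_i(t) \;=\; N_1(0)\,\Bigl(\prod_{j=1}^{i-1}\lambda_j\Bigr)
\sum_{k=1}^{i}\frac{e^{-\lambda_k t}}
{\prod_{\substack{l=1\\ l\neq k}}^{i}\bigl(\lambda_l-\lambda_k\bigr)} .
```

The second case, with daughters initially present, follows by superposing the analogous solutions for each initially non-zero chain member; the leach flux then scales with the resulting activities λ_i N_i(t).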
NASA Technical Reports Server (NTRS)
Shih, Tsan-Hsing; Lumley, John L.
1991-01-01
Recently, several second-order closure models have been proposed for closing the second-moment equations, in which the velocity-pressure gradient (and scalar-pressure gradient) tensor and the dissipation rate tensor are two of the most important terms. In the literature, these correlation tensors are usually decomposed into a so-called rapid term and a return-to-isotropy term. Models of these terms have been used in global flow calculations together with other modeled terms. However, their individual behavior in different flows has not been fully examined because they are not measurable in the laboratory. Recently, the development of direct numerical simulation (DNS) of turbulence has provided the opportunity for this kind of study. With direct numerical simulation, we may use the solution to calculate the exact values of these correlation terms and then directly compare them with the values from their modeled formulations (models). Here, we make direct comparisons of five representative rapid models and eight return-to-isotropy models using the DNS data of forty-five homogeneous flows computed by Rogers et al. (1986) and Lee et al. (1985). The purpose of these direct comparisons is to explore the performance of these models in different flows and to identify the ones which give the best performance. The modeling procedure, model constraints, and the various evaluated models are described. The detailed results of the direct comparisons are discussed, and a few concluding remarks on turbulence models are given.
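A minimal sketch of such a direct (a priori) comparison for one representative closure, Rotta's return-to-isotropy model, is given below; the arrays stand in for quantities extracted from a DNS database and are synthetic placeholders, and the constant C1 is an illustrative textbook value, not one from the report.

```python
import numpy as np

# Minimal a priori test sketch: compare a modeled return-to-isotropy term
# against an "exact" term. 'uu' (Reynolds stresses), 'eps' (dissipation) and
# 'phi_exact' stand in for DNS-extracted data; values are synthetic.
rng = np.random.default_rng(1)
n = 45                                   # e.g., one sample per homogeneous flow
uu = rng.uniform(0.5, 2.0, (n, 3, 3))
uu = 0.5 * (uu + uu.transpose(0, 2, 1))  # symmetrize the stress tensors
eps = rng.uniform(0.5, 1.5, n)

k = 0.5 * np.einsum('nii->n', uu)                    # turbulent kinetic energy
b = uu / (2.0 * k)[:, None, None] - np.eye(3) / 3.0  # anisotropy tensor b_ij

C1 = 1.8                                  # Rotta constant (illustrative value)
phi_model = -C1 * eps[:, None, None] * b  # Rotta return-to-isotropy model

# Placeholder for the exact slow pressure-strain term from the DNS solution
phi_exact = phi_model + 0.3 * rng.normal(size=phi_model.shape)

# Correlation coefficient between modeled and "exact" terms, all components
r = np.corrcoef(phi_model.ravel(), phi_exact.ravel())[0, 1]
print(f"model/DNS correlation: {r:.2f}")
```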
NASA Astrophysics Data System (ADS)
Campo-Bescós, M. A.; Flores-Cervantes, J. H.; Bras, R. L.; Casalí, J.; Giráldez, J. V.
2013-12-01
A large fraction of soil erosion in temperate climate systems proceeds from gully headcut growth processes. Nevertheless, headcut retreat is not well understood. Few erosion models include gully headcut growth processes, and none of the existing headcut retreat models have been tested against long-term retreat rate estimates. In this work the headcut retreat resulting from plunge pool erosion in the Channel Hillslope Integrated Landscape Development (CHILD) model is calibrated and compared to long-term evolution measurements of six gullies at the Bardenas Reales, northeast Spain. The headcut retreat module of CHILD was calibrated by adjusting the shape factor parameter to fit the observed retreat and volumetric soil loss of one gully during a 36-year period, using reported and collected field data to parameterize the rest of the model. To test the calibrated model, estimates by CHILD were compared to observations of headcut retreat from five other neighboring gullies. The differences in volumetric soil loss rates between the simulations and observations were less than 0.05 m3 yr-1 on average, with standard deviations smaller than 0.35 m3 yr-1. These results are the first evaluation of the headcut retreat module implemented in CHILD with a field data set. They also show the usefulness of the model as a tool for simulating long-term volumetric gully evolution due to plunge pool erosion.
Tomini, F; Prinzen, F; van Asselt, A D I
2016-12-01
Cardiac resynchronization therapy with a biventricular pacemaker (CRT-P) is an effective treatment for dyssynchronous heart failure (DHF). Adding an implantable cardioverter defibrillator (CRT-D) may further reduce the risk of sudden cardiac death (SCD). However, if the majority of patients do not require shock therapy, the cost-effectiveness ratio of CRT-D compared to CRT-P may be high. The objective of this study was to systematically review decision models evaluating the cost-effectiveness of CRT-D for patients with DHF, compare the structure and inputs of these models and identify the main factors influencing the ICERs for CRT-D. A comprehensive search strategy of Medline (Ovid), Embase (Ovid) and EconLit identified eight cost-effectiveness models evaluating CRT-D against optimal pharmacological therapy (OPT) and/or CRT-P. The selected economic studies differed in terms of model structure, treatment path, time horizons, and sources of efficacy data. CRT-D was found cost-effective when compared to OPT but its cost-effectiveness became questionable when compared to CRT-P. Cost-effectiveness of CRT-D may increase depending on improvement of all-cause mortality rates and HF mortality rates in patients who receive CRT-D, costs of the device, and battery life. In particular, future studies need to investigate longer-term mortality rates and identify CRT-P patients that will gain the most, in terms of life expectancy, from being treated with a CRT-D.
Evaluating Clouds in Long-Term Cloud-Resolving Model Simulations with Observational Data
NASA Technical Reports Server (NTRS)
Zeng, Xiping; Tao, Wei-Kuo; Zhang, Minghua; Peters-Lidard, Christa; Lang, Stephen; Simpson, Joanne; Kumar, Sujay; Xie, Shaocheng; Eastman, Joseph L.; Shie, Chung-Lin;
2006-01-01
Two 20-day, continental midlatitude cases are simulated with a three-dimensional (3D) cloud-resolving model (CRM) and compared to Atmospheric Radiation Measurement (ARM) data. This evaluation of long-term cloud-resolving model simulations focuses on clouds and surface fluxes. All numerical experiments, as compared to observations, simulate surface precipitation well but over-predict clouds, especially in the upper troposphere. The sensitivity of cloud properties to dimensionality and other factors is studied to isolate the origins of the over-prediction of clouds. Due to the difference in buoyancy damping between 2D and 3D models, surface precipitation fluctuates rapidly with time, and spurious dehumidification occurs near the tropopause in the 2D CRM. Surface fluxes from a land data assimilation system are compared with ARM observations. They are used in place of the ARM surface fluxes to test the sensitivity of simulated clouds to surface fluxes. Summertime simulations show that surface fluxes from the assimilation system bring about a better simulation of diurnal cloud variation in the lower troposphere.
Evaluation of Surface Flux Parameterizations with Long-Term ARM Observations
Liu, Gang; Liu, Yangang; Endo, Satoshi
2013-02-01
Surface momentum, sensible heat, and latent heat fluxes are critical for atmospheric processes such as clouds and precipitation, and are parameterized in a variety of models ranging from cloud-resolving models to large-scale weather and climate models. However, direct evaluation of the parameterization schemes for these surface fluxes is rare due to limited observations. This study takes advantage of the long-term observations of surface fluxes collected at the Southern Great Plains site by the Department of Energy Atmospheric Radiation Measurement program to evaluate the six surface flux parameterization schemes commonly used in the Weather Research and Forecasting (WRF) model and three U.S. general circulation models (GCMs). The unprecedented 7-yr-long measurements by the eddy correlation (EC) and energy balance Bowen ratio (EBBR) methods permit statistical evaluation of all six parameterizations under a variety of stability conditions, diurnal cycles, and seasonal variations. The statistical analyses show that the momentum flux parameterization agrees best with the EC observations, followed by latent heat flux, sensible heat flux, and evaporation ratio/Bowen ratio. The overall performance of the parameterizations depends on atmospheric stability, being best under neutral stratification and deteriorating toward both more stable and more unstable conditions. Further diagnostic analysis reveals that in addition to the parameterization schemes themselves, the discrepancies between observed and parameterized sensible and latent heat fluxes may stem from inadequate use of input variables such as surface temperature, moisture availability, and roughness length. The results demonstrate the need for improving the land surface models and measurements of surface properties, which would permit the evaluation of full land surface models.
Know your community: Model applications in field research
USDA-ARS?s Scientific Manuscript database
The focus of this community is to promote the application of cropping or range system models in field research to help evaluate and develop optimum agricultural systems and management to achieve long-term economic and environmental sustainability under a changing climate. Model applications to a var...
USDA-ARS?s Scientific Manuscript database
The importance of measurement uncertainty in terms of calculation of model evaluation error statistics has been recently stated in the literature. The impact of measurement uncertainty on calibration results indicates the potential vague zone in the field of watershed modeling where the assumption ...
Opportunities of probabilistic flood loss models
NASA Astrophysics Data System (ADS)
Schröter, Kai; Kreibich, Heidi; Lüdtke, Stefan; Vogel, Kristin; Merz, Bruno
2016-04-01
Traditional uni-variate damage models, for instance depth-damage curves, often fail to reproduce the variability of observed flood damage. However, reliable flood damage models are a prerequisite for the practical usefulness of model results. Innovative multi-variate probabilistic modelling approaches are promising to capture and quantify the uncertainty involved and thus to improve the basis for decision making. In this study we compare the predictive capability of two probabilistic modelling approaches, namely Bagging Decision Trees and Bayesian Networks, with traditional stage-damage functions. For model evaluation we use empirical damage data available from computer-aided telephone interviews that were compiled after the floods of 2002, 2005, 2006 and 2013 in the Elbe and Danube catchments in Germany. We carry out a split-sample test by sub-setting the damage records: one sub-set is used to derive the models and the remaining records are used to evaluate their predictive performance. Further, we stratify the sample according to catchments, which allows studying model performance in a spatial transfer context. Flood damage estimation is carried out at the scale of individual buildings in terms of relative damage. The predictive performance of the models is assessed in terms of systematic deviation (mean bias error), precision (mean absolute error), and reliability, represented by the proportion of observations that fall within the 5%-95% predictive interval. The comparison of the uni-variate stage-damage function and the multi-variate model approaches emphasises the importance of quantifying predictive uncertainty. With each explanatory variable, the multi-variate model reveals an additional source of uncertainty. However, the predictive performance in terms of bias (MBE), precision (MAE) and reliability (HR) is clearly improved in comparison to the uni-variate stage-damage function. Overall, probabilistic models provide quantitative information about prediction uncertainty, which is crucial for assessing the reliability of model predictions and improves the usefulness of model results.
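A minimal sketch of the split-sample comparison under the stated metrics, with synthetic stand-ins for the survey records; the variable names, the quadratic stage-damage fit, and the ensemble-spread interval are our assumptions, not the study's exact specification.

```python
import numpy as np
from sklearn.ensemble import BaggingRegressor
from sklearn.tree import DecisionTreeRegressor

# Synthetic placeholder records: depth plus further predictors -> relative loss
rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([
    rng.uniform(0.0, 3.0, n),    # water depth [m]
    rng.uniform(1.0, 240.0, n),  # inundation duration [h]
    rng.integers(0, 2, n),       # precaution indicator
])
rloss = np.clip(0.2 * X[:, 0] + 0.0005 * X[:, 1]
                - 0.1 * X[:, 2] + rng.normal(0, 0.05, n), 0, 1)

train, test = slice(0, 700), slice(700, None)   # split-sample test

# Uni-variate reference: a stage-damage function (quadratic fit in depth)
coef = np.polyfit(X[train, 0], rloss[train], deg=2)
pred_sdf = np.clip(np.polyval(coef, X[test, 0]), 0, 1)

# Multi-variate probabilistic model: bagged regression trees
bdt = BaggingRegressor(DecisionTreeRegressor(), n_estimators=100,
                       random_state=0).fit(X[train], rloss[train])
pred_bdt = bdt.predict(X[test])

# Per-record 5%-95% predictive interval from the ensemble spread
member_preds = np.stack([t.predict(X[test]) for t in bdt.estimators_])
lo, hi = np.percentile(member_preds, [5, 95], axis=0)
hit_rate = np.mean((rloss[test] >= lo) & (rloss[test] <= hi))

for name, pred in [("stage-damage", pred_sdf), ("bagging trees", pred_bdt)]:
    mbe = np.mean(pred - rloss[test])            # systematic deviation
    mae = np.mean(np.abs(pred - rloss[test]))    # precision
    print(f"{name:13s}  MBE={mbe:+.3f}  MAE={mae:.3f}")
print(f"bagging trees  hit rate of 5-95% interval: {hit_rate:.2f}")
```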
Part 2 of a Computational Study of a Drop-Laden Mixing Layer
NASA Technical Reports Server (NTRS)
Okongo, Nora; Bellan, Josette
2004-01-01
This second of three reports on a computational study of a mixing layer laden with evaporating liquid drops presents the evaluation of Large Eddy Simulation (LES) models. The LES models were evaluated on an existing database that had been generated using Direct Numerical Simulation (DNS). The DNS method and the database are described in the first report of this series, Part 1 of a Computational Study of a Drop-Laden Mixing Layer (NPO-30719), NASA Tech Briefs, Vol. 28, No. 7 (July 2004), page 59. The LES equations, which are derived by applying a spatial filter to the DNS set, govern the evolution of the larger scales of the flow and can therefore be solved on a coarser grid. Consistent with the reduction in grid points, the DNS drops would be represented by fewer drops, called computational drops in the LES context. The LES equations contain terms that cannot be directly computed on the coarser grid and that must instead be modeled. Two types of models are necessary: (1) those for the filtered source terms representing the effects of drops on the filtered flow field and (2) those for the sub-grid scale (SGS) fluxes arising from filtering the convective terms in the DNS equations. All of the filtered-source-term models that were developed were found to overestimate the filtered source terms. For modeling the SGS fluxes, constant-coefficient Smagorinsky, gradient, and scale-similarity models were assessed and calibrated on the DNS database. The Smagorinsky model correlated poorly with the SGS fluxes, whereas the gradient and scale-similarity models were well correlated with the SGS quantities that they represented.
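For orientation, the SGS stress these closures target, and the constant-coefficient Smagorinsky and scale-similarity forms named above, can be written as follows (overbars denote the spatial filter; the notation is standard rather than taken from the report):

```latex
\tau_{ij} = \overline{u_i u_j} - \bar{u}_i\,\bar{u}_j, \qquad
\tau_{ij} - \tfrac{1}{3}\delta_{ij}\,\tau_{kk}
   \approx -2\,(C_s\bar{\Delta})^2\,|\bar{S}|\,\bar{S}_{ij}
   \quad\text{(Smagorinsky)},
```
```latex
\tau_{ij} \approx \overline{\bar{u}_i\,\bar{u}_j} - \bar{\bar{u}}_i\,\bar{\bar{u}}_j
   \quad\text{(scale similarity)},
\qquad
\bar{S}_{ij} = \tfrac{1}{2}\bigl(\partial_j\bar{u}_i + \partial_i\bar{u}_j\bigr),
\quad |\bar{S}| = \sqrt{2\,\bar{S}_{ij}\bar{S}_{ij}} ,
```

where the double overbar denotes a second application of the filter and C_s is the Smagorinsky coefficient.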
ERIC Educational Resources Information Center
Chen, Chung-Yang; Chang, Huiju; Hsu, Wen-Chin; Sheen, Gwo-Ji
2017-01-01
This paper proposes a training model for raters, with the goal to improve the intra- and inter-consistency of evaluation quality for higher education curricula. The model, termed the learning, behaviour and reaction (LBR) circular training model, is an interdisciplinary application from the business and organisational training domain. The…
Ge Sun; Jianbiao Lu; Steven G. McNulty; James M. Vose; Devendra M. Amayta
2006-01-01
A clear understanding of the basic hydrologic processes is needed to restore and manage watersheds across the diverse physiographic gradients in the Southeastern U.S. We used a physically based, spatially distributed watershed hydrologic model, MIKE SHE/MIKE 11, to evaluate disturbance impacts on water use and yield across the region. Long-term forest...
Norman, J C; McGee, M G; Fuqua, J M; Igo, S R; Turner, S A; Sterling, R; Urrutia, C O; Frazier, O H; Clay, W C; Chambers, J A
1983-02-01
A long-term, implantable, electrically actuated left ventricular assist system (THI/Gould LVAS) is being developed and characterized in vitro and in vivo for utilization in patients with end-stage heart disease. This system consists of five major components: a long-term, implantable blood pump (THI E-type ALVAD); an electrical-mechanical energy converter (Gould Model V); a control unit with batteries; a volume compensation system; and an external power supply and monitoring unit. Two of these components (blood pump and electrical-mechanical energy converter) have been integrated, and are undergoing chronic in vivo evaluations in calves. Thus far, 44 pneumatically and electrically actuated THI/Gould LVAS evaluations have been performed. This experience has resulted in greater than 6.5 years of actuation in vivo, with durations exceeding 1 year. System in vivo performance in terms of durability, mechanical reliability, hemodynamic effectiveness, and biocompatibility has been satisfactory. Demonstration of long-term (2-year) effectiveness in supporting the circulation is the ultimate goal.
Safari, Saeed; Baratloo, Alireza; Hashemi, Behrooz; Rahmati, Farhad; Forouzanfar, Mohammad Mehdi; Motamedi, Maryam; Mirmohseni, Ladan
2016-01-01
Determining etiologic causes and prognosis can significantly improve the management of syncope patients. The present study aimed to compare the values of the San Francisco, Osservatorio Epidemiologico sulla Sincope nel Lazio (OESIL), Boston, and Risk Stratification of Syncope in the Emergency Department (ROSE) clinical decision rules in predicting the short-term serious outcome of syncope patients. The present diagnostic accuracy study with 1-week follow-up was designed to evaluate the predictive values of the four mentioned clinical decision rules. Screening performance characteristics of each model in predicting mortality, myocardial infarction (MI), and cerebrovascular accidents (CVAs) were calculated and compared. To evaluate the value of each model in predicting the outcome, sensitivity, specificity, positive likelihood ratio, and negative likelihood ratio were calculated, and receiver operating characteristic (ROC) curve analysis was done. A total of 187 patients (mean age: 64.2 ± 17.2 years) were enrolled in the study. Mortality, MI, and CVA were seen in 19 (10.2%), 12 (6.4%), and 36 (19.2%) patients, respectively. The area under the ROC curve for the OESIL, San Francisco, Boston, and ROSE models in predicting the risk of 1-week mortality, MI, and CVA was in the 30-70% range, with no significant difference among models (P > 0.05). A pooled model did not show higher accuracy in predicting mortality, MI, and CVA compared to the others (P > 0.05). This study revealed the weakness of all four evaluated models in predicting the short-term serious outcome of syncope patients referred to the emergency department, with no significant advantage of any one model over the others.
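A minimal sketch of the screening-performance quantities named above (sensitivity, specificity, likelihood ratios, area under the ROC curve), computed on illustrative placeholder data rather than the study's records:

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Placeholder data: one rule's risk score and the observed 1-week outcome
rng = np.random.default_rng(0)
outcome = rng.integers(0, 2, 187)        # observed serious outcome (0/1)
score = rng.random(187)                  # rule score per patient
flagged = (score > 0.5).astype(int)      # rule's high-risk classification

tp = np.sum((flagged == 1) & (outcome == 1))
fp = np.sum((flagged == 1) & (outcome == 0))
fn = np.sum((flagged == 0) & (outcome == 1))
tn = np.sum((flagged == 0) & (outcome == 0))

sens = tp / (tp + fn)
spec = tn / (tn + fp)
lr_pos = sens / (1 - spec)               # positive likelihood ratio
lr_neg = (1 - sens) / spec               # negative likelihood ratio
auc = roc_auc_score(outcome, score)      # area under the ROC curve

print(f"sensitivity={sens:.2f} specificity={spec:.2f} "
      f"LR+={lr_pos:.2f} LR-={lr_neg:.2f} AUC={auc:.2f}")
```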
Evaluation of Cross-Cultural Training Programs for International Students from East Europe
ERIC Educational Resources Information Center
Kovacova, Michaela; Eckert, Stefan
2010-01-01
This paper presents a comparative evaluation of didactic and experiential training in Germany carried out on a sample of international university students from Eastern Europe. The long-term evaluation was conducted by using a quasi-experimental design with a control group according to Kirkpatrick's model including three steps: reaction, learning…
Dixit, Prakash N; Telleria, Roberto
2015-04-01
Inter-annual and seasonal variability in climatic parameters, most importantly rainfall, has the potential to cause climate-induced risk in long-term crop production. Short-term field studies do not capture the full nature of such risk, nor the extent to which crop, soil and water management recommendations may be modified to mitigate it. Crop modeling studies driven by long-term daily weather data can predict the impact of climate-induced risk on crop growth and yield; however, the availability of long-term daily weather data can present serious constraints to the use of crop models. To tackle this constraint, two weather generators, namely LARS-WG and MarkSim, were evaluated to assess their ability to reproduce frequency distributions, means, variances, and dry and wet spell series of observed daily precipitation, maximum and minimum temperature, and solar radiation for eight locations across cropping areas of Northern Syria and Lebanon. Further, the application of long-term daily weather data generated with both weather generators to simulating barley growth and yield was also evaluated. We found that overall LARS-WG performed better than MarkSim in generating daily weather parameters and in 50-year continuous simulation of barley growth and yield. Our findings suggest that LARS-WG does not necessarily require long-term (e.g., >30 years) observed weather data for calibration, as generated results proved satisfactory with >10 years of observed data, except in areas at higher altitude. Evaluating these weather generators and the ability of generated weather data to support long-term simulation of crop growth and yield is an important first step to assess the impact of future climate on yields, and to identify promising technologies to make agricultural systems more resilient in the given region. Copyright © 2015 Elsevier B.V. All rights reserved.
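A minimal sketch of the occurrence-plus-amounts structure common to such weather generators, assuming a two-state first-order Markov chain for wet/dry days with gamma-distributed wet-day amounts; all parameter values are illustrative, not fitted to the Syrian or Lebanese stations.

```python
import numpy as np

# Two-state first-order Markov chain for precipitation occurrence, with
# gamma-distributed wet-day amounts. Illustrative (unfitted) parameters.
rng = np.random.default_rng(0)
p_wet_given_dry, p_wet_given_wet = 0.25, 0.60   # transition probabilities
shape, scale = 0.8, 6.0                         # gamma parameters [mm]

n_years = 50
n_days, wet = 365 * n_years, False
rain = np.zeros(n_days)
for t in range(n_days):
    p = p_wet_given_wet if wet else p_wet_given_dry
    wet = rng.random() < p                      # today's wet/dry state
    if wet:
        rain[t] = rng.gamma(shape, scale)       # wet-day amount

print(f"wet-day frequency: {np.mean(rain > 0):.2f}")
print(f"mean annual rainfall: {rain.sum() / n_years:.0f} mm")
```

In a full generator, the transition probabilities and gamma parameters would be fitted month by month to the observed station records, and analogous conditional models would generate temperature and radiation.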
Stenberg, Nicola; Furness, Penny J
2017-03-01
The outcomes of self-management interventions are commonly assessed using quantitative measurement tools, and few studies ask people with long-term conditions to explain, in their own words, what aspects of the intervention they valued. In this Grounded Theory study, a Health Trainers service in the north of England was evaluated based on interviews with eight service-users. Open, focused, and theoretical coding led to the development of a preliminary model explaining participants' experiences and perceived impact of the service. The model reflects the findings that living well with a long-term condition encompassed social connectedness, changed identities, acceptance, and self-care. Health trainers performed four related roles that were perceived to contribute to these outcomes: conceptualizer, connector, coach, and champion. The evaluation contributes a grounded theoretical understanding of a personalized self-management intervention that emphasizes the benefits of a holistic approach to enable cognitive, behavioral, emotional, and social adjustments.
Influence Diffusion Model in Text-Based Communication
NASA Astrophysics Data System (ADS)
Matsumura, Naohiro; Ohsawa, Yukio; Ishizuka, Mitsuru
Business people, especially marketing researchers, are keen to understand people's potential sense of value in order to create fascinating topics that stimulate people's interest. In this paper, we aim at finding influential people, comments, and terms contributing to the discovery of such topics. For this purpose, we propose an Influence Diffusion Model for text-based communication, where the influence of people, comments, and terms is defined as the degree of text-based relevance of messages. We apply this model to Bulletin Board Service (BBS) discussions on the Internet, and present our discoveries from experimental evaluations.
A definitional framework for the human/biometric sensor interaction model
NASA Astrophysics Data System (ADS)
Elliott, Stephen J.; Kukula, Eric P.
2010-04-01
Existing definitions for biometric testing and evaluation do not fully explain errors in a biometric system. This paper provides a definitional framework for the Human Biometric-Sensor Interaction (HBSI) model. This paper proposes six new definitions based around two classifications of presentations, erroneous and correct. The new terms are: defective interaction (DI), concealed interaction (CI), false interaction (FI), failure to detect (FTD), failure to extract (FTX), and successfully acquired samples (SAS). As with all definitions, the new terms require a modification to the general biometric model developed by Mansfield and Wayman [1].
ERIC Educational Resources Information Center
Thorn, Annabel S. C.; Gathercole, Susan E.; Frankish, Clive R.
2005-01-01
The impact of four long-term knowledge variables on serial recall accuracy was investigated. Serial recall was tested for high and low frequency words and high and low phonotactic frequency nonwords in 2 groups: monolingual English speakers and French-English bilinguals. For both groups the recall advantage for words over nonwords reflected more…
Fronts and precipitation in CMIP5 models for the austral winter of the Southern Hemisphere
NASA Astrophysics Data System (ADS)
Blázquez, Josefina; Solman, Silvina A.
2018-04-01
The wintertime front climatology and the relationship between fronts and precipitation as depicted by a group of CMIP5 models are evaluated over the Southern Hemisphere (SH). Frontal activity is represented by an index that takes into account the vorticity, the gradient of temperature and the specific humidity at the 850 hPa level. ERA-Interim reanalysis and GPCP datasets are used to assess the performance of the models in the present climate. Overall, it is found that the models adequately reproduce the main features of frontal activity and front frequency over the SH. Total precipitation is overestimated in most of the models, especially the maximum values over the mid-latitudes. This overestimation could be related to the high precipitation frequencies identified in some of the models evaluated. The relationship between fronts and precipitation has also been evaluated in terms of both the frequency of frontal precipitation and the percentage of precipitation due to fronts. In general terms, the models overestimate the proportion of frontal to total precipitation. In contrast to the frequency of total precipitation, the frequency of frontal precipitation is well reproduced by the models, with the highest values located at the mid-latitudes. The results suggest that the models represent the dynamic forcing (fronts) and the frequency of frontal precipitation very well, though the amount of precipitation due to fronts is overestimated.
Simplified ISCCP cloud regimes for evaluating cloudiness in CMIP5 models
NASA Astrophysics Data System (ADS)
Jin, Daeho; Oreopoulos, Lazaros; Lee, Dongmin
2017-01-01
We take advantage of ISCCP simulator data available for many models that participated in CMIP5 in order to introduce a framework for comparing model cloud output with corresponding ISCCP observations based on the cloud regime (CR) concept. Simplified global CRs are employed, derived from the co-variations of three variables, namely cloud optical thickness, cloud-top pressure and cloud fraction (τ, p_c, CF). Following evaluation criteria established in a companion paper of ours (Jin et al. 2016), we assess model cloud simulation performance based on how well the simplified CRs are simulated in terms of similarity of centroids, global values and map correlations of relative frequency of occurrence, and long-term total cloud amounts. Mirroring prior results, modeled clouds tend to be too optically thick and not as extensive as in observations. CRs with high-altitude clouds from storm activity are not as well simulated here compared to the previous study, but other regimes containing near-overcast low clouds show improvement. Models that performed well in the companion paper against CRs defined by joint τ-p_c histograms distinguish themselves again here, but improvements for previously underperforming models are also seen. Averaging across models does not yield a drastically better picture, except for cloud geographical locations. Cloud evaluation with simplified regimes thus seems more forgiving than that using histogram-based CRs, while still strict enough to reveal model weaknesses.
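A minimal sketch of how simplified regimes of this kind can be derived, clustering grid-cell records of (τ, p_c, CF) with k-means; the synthetic sample, the standardization choice, and the number of clusters are our assumptions, not the paper's settings.

```python
import numpy as np
from sklearn.cluster import KMeans

# Synthetic placeholder grid-cell records of the three clustering variables
rng = np.random.default_rng(0)
n = 5000
tau = rng.lognormal(mean=1.0, sigma=0.8, size=n)   # cloud optical thickness
pc = rng.uniform(100.0, 1000.0, n)                 # cloud-top pressure [hPa]
cf = rng.beta(2.0, 2.0, n)                         # cloud fraction [0-1]

# Standardize so no single variable dominates the Euclidean distance
X = np.column_stack([tau, pc, cf])
X = (X - X.mean(axis=0)) / X.std(axis=0)

# Each cluster centroid defines one cloud regime (CR)
km = KMeans(n_clusters=6, n_init=10, random_state=0).fit(X)
counts = np.bincount(km.labels_)
print("relative frequency of occurrence per regime:", counts / n)
```

Model evaluation then compares the centroids, relative-frequency-of-occurrence maps, and total cloud amounts obtained from simulator output against those from the ISCCP observations.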
In Search of the Elusive ADDIE Model.
ERIC Educational Resources Information Center
Molenda, Michael
2003-01-01
Discusses the origin of the ADDIE model of instructional design and concludes that the term came into use by word of mouth as a label for the whole family of systematic instructional development models. Examines the underlying ideas behind the acronym analysis, design, development, implementation, and evaluation. (Author/LRW)
Criteria for Reviewing District Competency Tests.
ERIC Educational Resources Information Center
Herman, Joan L.
A formative evaluation minimum competency test model is examined. The model systematically uses assessment information to support and facilitate program improvement. In terms of the model, four inter-related qualities are essential for a sound testing program. The content validity perspective looks at how well the district has defined competency…
Toxicity evaluation and prediction of toxic chemicals on activated sludge system.
Cai, Bijing; Xie, Li; Yang, Dianhai; Arcangeli, Jean-Pierre
2010-05-15
Gaps in the data available for evaluating the toxicity of new or overloaded organic chemicals on activated sludge systems create a need for toxicity-estimation methodology. In this study, 24 aromatic chemicals typically present in industrial wastewater were selected and classified into three groups: benzenes, phenols and anilines. Their toxicity to activated sludge was then investigated. Two indexes, IC(50-M) and IC(50-S), were determined from the respiration rates of activated sludge at different toxicant concentrations over mid-term (24 h) and short-term (30 min) intervals, respectively. Experimental results showed that the benzenes group was the most toxic, followed by the phenols and anilines. The IC(50-M) values of the tested chemicals were higher than the IC(50-S) values. In addition, quantitative structure-activity relationship (QSAR) models developed from IC(50-M) were more stable and accurate than those from IC(50-S). Multiple linear models based on molecular descriptors and K(ow) were more reliable than single linear models based on K(ow) alone. Among the molecular descriptors, E(lumo) was the most important factor for evaluation of mid-term toxicity. Copyright (c) 2009 Elsevier B.V. All rights reserved.
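A minimal sketch of the multiple linear QSAR form described above, regressing log(1/IC50) on log K(ow) and E(lumo) by least squares; the 24-point dataset is a synthetic placeholder, not the measured toxicity values.

```python
import numpy as np

# Synthetic placeholder descriptors and toxicity values for 24 chemicals
rng = np.random.default_rng(0)
log_kow = rng.uniform(0.5, 4.0, 24)     # octanol-water partition coefficient
e_lumo = rng.uniform(-1.5, 0.5, 24)     # LUMO energy descriptor
log_inv_ic50 = 0.6 * log_kow - 0.9 * e_lumo + rng.normal(0, 0.2, 24)

# Multiple linear regression: log(1/IC50) ~ logKow + E_lumo + intercept
A = np.column_stack([log_kow, e_lumo, np.ones(24)])
coef, *_ = np.linalg.lstsq(A, log_inv_ic50, rcond=None)
pred = A @ coef
r2 = 1 - np.sum((log_inv_ic50 - pred) ** 2) / np.sum(
    (log_inv_ic50 - log_inv_ic50.mean()) ** 2)
print(f"log(1/IC50) = {coef[0]:.2f}*logKow + {coef[1]:.2f}*E_lumo "
      f"+ {coef[2]:.2f}   (R^2 = {r2:.2f})")
```

A single linear model in K(ow) alone corresponds to dropping the E(lumo) column, which is the comparison the study reports.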
Cook, Sarah F; Roberts, Jessica K; Samiee-Zafarghandy, Samira; Stockmann, Chris; King, Amber D; Deutsch, Nina; Williams, Elaine F; Allegaert, Karel; Wilkins, Diana G; Sherwin, Catherine M T; van den Anker, John N
2016-01-01
The aims of this study were to develop a population pharmacokinetic model for intravenous paracetamol in preterm and term neonates and to assess the generalizability of the model by testing its predictive performance in an external dataset. Nonlinear mixed-effects models were constructed from paracetamol concentration-time data in NONMEM 7.2. Potential covariates included body weight, gestational age, postnatal age, postmenstrual age, sex, race, total bilirubin, and estimated glomerular filtration rate. An external dataset was used to test the predictive performance of the model through calculation of bias, precision, and normalized prediction distribution errors. The model-building dataset included 260 observations from 35 neonates with a mean gestational age of 33.6 weeks [standard deviation (SD) 6.6]. Data were well-described by a one-compartment model with first-order elimination. Weight predicted paracetamol clearance and volume of distribution, which were estimated as 0.348 L/h (5.5 % relative standard error; 30.8 % coefficient of variation) and 2.46 L (3.5 % relative standard error; 14.3 % coefficient of variation), respectively, at the mean subject weight of 2.30 kg. An external evaluation was performed on an independent dataset that included 436 observations from 60 neonates with a mean gestational age of 35.6 weeks (SD 4.3). The median prediction error was 10.1 % [95 % confidence interval (CI) 6.1-14.3] and the median absolute prediction error was 25.3 % (95 % CI 23.1-28.1). Weight predicted intravenous paracetamol pharmacokinetics in neonates ranging from extreme preterm to full-term gestational status. External evaluation suggested that these findings should be generalizable to other similar patient populations.
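A minimal sketch of the reported one-compartment model with first-order elimination, anchored to the published typical values (CL = 0.348 L/h and V = 2.46 L at 2.30 kg); the allometric weight exponents are standard assumptions, since the abstract names weight as the covariate without giving the exponents, and the dose and times are illustrative.

```python
import numpy as np

def conc_profile(dose_mg, weight_kg, t_h):
    """IV bolus concentration [mg/L] for a one-compartment model."""
    cl = 0.348 * (weight_kg / 2.30) ** 0.75   # clearance [L/h], assumed exponent
    v = 2.46 * (weight_kg / 2.30) ** 1.0      # volume [L], assumed exponent
    ke = cl / v                               # elimination rate constant [1/h]
    return (dose_mg / v) * np.exp(-ke * t_h)

# Illustrative: concentrations over 12 h for a 2.3 kg neonate
t = np.linspace(0.0, 12.0, 7)
print(np.round(conc_profile(dose_mg=20.0, weight_kg=2.3, t_h=t), 2))
```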
Kramer, Desré M; Wells, Richard P; Carlan, Nicolette; Aversa, Theresa; Bigelow, Philip P; Dixon, Shane M; McMillan, Keith
2013-01-01
Few evaluation tools are available to assess knowledge-transfer and exchange interventions. The objective of this paper is to develop and demonstrate a theory-based knowledge-transfer and exchange method of evaluation (KEME) that synthesizes 3 theoretical frameworks: the Promoting Action on Research Implementation in Health Services (PARiHS) model, the transtheoretical model of change, and a model of knowledge use. It proposes a new term, keme, to mean a unit of evidence-based transferable knowledge. The usefulness of the evaluation method is demonstrated with 4 occupational health and safety knowledge transfer and exchange (KTE) implementation case studies that are based upon the analysis of over 50 pre-existing interviews. The usefulness of the evaluation model has enabled us to better understand stakeholder feedback, frame our interpretation, and perform a more comprehensive evaluation of the knowledge use outcomes of our KTE efforts.
Combat Identification Systems COMO Integrated Air Defense Model Evaluation (CISE) Study
1989-02-01
[Fragments of a scanned study report, CAA-SR-89-3: Combat Identification Systems COMO Integrated Air Defense Model Evaluation (CISE) Study, February 1989; only partially recoverable] An appendix code change to subroutine PDECLR (1/21/88): before label 1000, insert IF (IR.GT.10) IR = 10 (use K or IR, whichever applies); an internal distribution list; and a glossary of abbreviations, acronyms, and short terms (e.g., ADM2, Air Defense Models Modification).
Deidda, Manuela; Boyd, Kathleen Anne; Minnis, Helen; Donaldson, Julia; Brown, Kevin; Boyer, Nicole R S; McIntosh, Emma
2018-03-14
Children who have experienced abuse and neglect are at increased risk of mental and physical health problems throughout life. This places an enormous burden on individuals, families and society in terms of health services, education, social care and judiciary sectors. Evidence suggests that early intervention can mitigate the negative consequences of child maltreatment, exerting long-term positive effects on the health of maltreated children entering foster care. However, evidence on cost-effectiveness of such complex interventions is limited. This protocol describes the first economic evaluation of its kind in the UK. An economic evaluation alongside the Best Services Trial (BeST?) has been prospectively designed to identify, measure and value key resource and outcome impacts arising from the New Orleans intervention model (NIM) (an infant mental health service) compared with case management (CM) (enhanced social work services as usual). A within-trial economic evaluation and long-term model from a National Health Service/Personal Social Service and a broader societal perspective will be undertaken alongside the National Institute for Health Research (NIHR)-Public Health Research Unit (PHRU)-funded randomised multicentre BeST?. BeST? aims to evaluate NIM compared with CM for maltreated children entering foster care in a UK context. Collection of Paediatric Quality of Life Inventory (PedsQL) and the recent mapping of PedsQL to EuroQol-5-Dimensions (EQ-5D) will facilitate the estimation of quality-adjusted life years specific to the infant population for a cost-utility analysis. Other effectiveness outcomes will be incorporated into a cost-effectiveness analysis (CEA) and cost-consequences analysis (CCA). A long-term economic model and multiple economic evaluation frameworks will provide decision-makers with a comprehensive, multiperspective guide regarding cost-effectiveness of NIM. The long-term population health economic model will be developed to synthesise trial data with routine linked data and key government sector parameters informed by literature. Methods guidance for population health economic evaluation will be adopted (lifetime horizon, 1.5% discount rate for costs and benefits, CCA framework, multisector perspective). Ethics approval was obtained by the West of Scotland Ethics Committee. Results of the main trial and economic evaluation will be submitted for publication in a peer-reviewed journal as well as published in the peer-reviewed NIHR journals library (Public Health Research Programme). NCT02653716; Pre-results. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
NASA Astrophysics Data System (ADS)
Radziukynas, V.; Klementavičius, A.
2016-04-01
The paper analyses the performance of the recently developed short-term forecasting suite for the Latvian power system. The system load and wind power are forecasted using ANN and ARIMA models, respectively, and the forecasting accuracy is evaluated in terms of errors, mean absolute errors and mean absolute percentage errors. The influence of additional input variables on load forecasting errors is investigated. The interplay of hourly load and wind power forecasting errors is also evaluated for the Latvian power system with historical loads (the year 2011) and planned wind power capacities (the year 2023).
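For reference, the two accuracy measures named above, computed here on illustrative hourly values rather than the Latvian series:

```python
import numpy as np

def mae(actual, forecast):
    """Mean absolute error, in the units of the series."""
    return np.mean(np.abs(actual - forecast))

def mape(actual, forecast):
    """Mean absolute percentage error [%]; assumes no zero actuals."""
    return 100.0 * np.mean(np.abs((actual - forecast) / actual))

actual = np.array([820.0, 760.0, 900.0, 1010.0])    # MW, illustrative
forecast = np.array([850.0, 740.0, 880.0, 1050.0])
print(f"MAE = {mae(actual, forecast):.1f} MW, "
      f"MAPE = {mape(actual, forecast):.2f} %")
```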
An in vivo model for evaluating the response of pulp to various biomaterials.
McClugage, S G; Holmstedt, J O; Malloy, R B
1980-09-01
An in vivo model has been designed to study the acute response of exposed or unexposed dental pulp to the topical application of various biomaterials. This model permits sequential microscopic observations of the microvascular system of dental pulp before and after application of pulp capping agents, cementing agents, or cavity liners. The use of this experimental model provides useful information related to the tolerance of dental pulp to various biomaterials used in dentistry. Furthermore, this model serves as a useful supplement to more traditional long-term methods for evaluating the biocompatibility of dental materials.
Multi-objective optimization of GENIE Earth system models.
Price, Andrew R; Myerscough, Richard J; Voutchkov, Ivan I; Marsh, Robert; Cox, Simon J
2009-07-13
The tuning of parameters in climate models is essential to provide reliable long-term forecasts of Earth system behaviour. We apply a multi-objective optimization algorithm to the problem of parameter estimation in climate models. This optimization process involves the iterative evaluation of response surface models (RSMs), followed by the execution of multiple Earth system simulations. These computations require an infrastructure that provides high-performance computing for building and searching the RSMs and high-throughput computing for the concurrent evaluation of a large number of models. Grid computing technology is therefore essential to make this algorithm practical for members of the GENIE project.
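A minimal single-objective sketch of the iterative RSM loop described above (a true multi-objective version would carry a vector of objectives and maintain a Pareto archive); the quadratic surrogate and the toy 'simulator' are stand-ins for the GENIE models, and all values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulator(x):
    """Placeholder for an expensive Earth system model run."""
    return (x - 0.3) ** 2 + 0.1 * np.sin(8 * x)

def fit_rsm(x, y):
    """Quadratic response surface model fitted to evaluated points."""
    return np.polyfit(x, y, deg=2)

x = rng.uniform(0.0, 1.0, 5)            # initial design of parameter sets
y = simulator(x)
for _ in range(5):                      # iterative refinement
    coef = fit_rsm(x, y)
    grid = np.linspace(0.0, 1.0, 201)
    x_new = grid[np.argmin(np.polyval(coef, grid))]   # search the surrogate
    # Run the expensive simulator only at the promising candidate, then refit
    x, y = np.append(x, x_new), np.append(y, simulator(x_new))

print(f"best parameter: {x[np.argmin(y)]:.3f}, objective: {y.min():.4f}")
```

The surrogate search is cheap (high-performance computing on the RSM side), while the candidate evaluations are the high-throughput simulator runs, which is the division of labour the abstract describes.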
A contribution toward rational modeling of the pressure-strain-rate correlation
NASA Technical Reports Server (NTRS)
Lee, Moon Joo
1990-01-01
A novel method of obtaining an analytical expression for the 'linear part' of the pressure-strain-rate tensor in terms of the anisotropy tensor of the Reynolds stresses has been developed, in which the coefficients of the seven independent tensor terms are functions of the invariants of the Reynolds-stress anisotropy. The coefficients are evaluated up to fourth order in the anisotropy of the Reynolds stresses to provide guidance for the development of a turbulence model.
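In standard notation, the anisotropy tensor and the invariants on which such coefficients depend are:

```latex
b_{ij} \;=\; \frac{\overline{u_i u_j}}{2k} - \frac{\delta_{ij}}{3},
\qquad k = \tfrac{1}{2}\,\overline{u_l u_l},
\qquad \mathrm{II} = b_{ij}b_{ji}, \quad \mathrm{III} = b_{ij}b_{jk}b_{ki},
```

and the linear part of the pressure-strain-rate correlation is expanded as a tensor polynomial in b_ij whose scalar coefficients are functions of II and III.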
Helen M. Maffei; Gregory M. Filip; Kristen L. Chadwick; Lance David
2008-01-01
The purpose of this analysis was to use long-term permanent plots to evaluate the short-term predictive capability of the Western Root Disease Model extension (WRDM) of the Forest Vegetation Simulator (FVS) in central Oregon mixed-conifer forests in project planning situations. Measured (1991-2002) structure and density changes on a 100-acre unmanaged area in south-...
Mugabo, Lambert; Rouleau, Dominique; Odhiambo, Jackline; Nisingizwe, Marie Paul; Amoroso, Cheryl; Barebwanuwe, Peter; Warugaba, Christine; Habumugisha, Lameck; Hedt-Gauthier, Bethany L
2015-06-09
Research is essential to identify and prioritize health needs and to develop appropriate strategies to improve health outcomes. In the last decade, non-academic research capacity strengthening trainings in sub-Saharan Africa, coupled with developing research infrastructure and the provision of individual mentorship support, have been used to build health worker skills. The objectives of this review are to describe different training approaches to research capacity strengthening in sub-Saharan Africa outside academic programs, assess methods used to evaluate research capacity strengthening activities, and learn about the challenges facing research capacity strengthening and the strategies/innovations required to overcome them. The PubMed database was searched using nine search terms and articles were included if 1) they explicitly described research capacity strengthening training activities, including information on program duration, target audience, immediate program outputs and outcomes; 2) all or part of the training program took place in sub-Saharan African countries; 3) the training activities were not a formal academic program; 4) papers were published between 2000 and 2013; and 5) both abstract and full paper were available in English. The search resulted in 495 articles, of which 450 were retained; 14 papers met all inclusion criteria and were included and analysed. In total, 4136 people were trained, of which 2939 were from Africa. Of the 14 included papers, six fell in the category of short-term evaluation period and eight in the long-term evaluation period. Conduct of evaluations and use of evaluation frameworks varied between short- and long-term models, and some trainings were not evaluated. Evaluation methods included tests, surveys, interviews, and a systems approach matrix. Research capacity strengthening activities in sub-Saharan Africa outside of academic settings provide important contributions to developing in-country capacity to participate in and lead research. Institutional support, increased funds, and dedicated time for research activities are critical factors that lead to the development of successful programs. Further, knowledge sharing through scientific articles with sufficient detail is needed to enable replication of successful models in other settings.
Challenges to achieving sustainable community health development within a donor aid business model.
Ashwell, Helen; Barclay, Lesley
2010-06-01
This paper explores the paradox of donor aid being delivered through a business model through a case study in Papua New Guinea. A retrospective review of project implementation and an outcome evaluation provided an opportunity to examine the long-term results and sustainability of a large project. Analysis was informed by data collected from 175 interviews (national, provincial, district and village), 93 community discussions and observations across 10 provinces. Problems with the business model of delivering aid were evident from implementation data and in an evaluation conducted two years after project completion (2006). Compounding the business model effect were challenges of over-ambitious project goals with limited flexibility to adapt to changing circumstances, a donor payment system requiring short-term productivity and excessive reporting requirements. An overly ambitious project design, donor dominance within the business model and limited local counterpart capacity created problems in the community initiatives component of the project. Contractual pressures can negatively influence long-term outcomes that require development of local leadership and capacity. Future planning for donor project designs needs to be flexible, smaller in scope and have a longer timeframe of seven to 10 years. Donor-funded projects need to be sufficiently flexible to apply proven principles of community development, build local ownership and allow adequate time to build counterpart knowledge and skills.
Selimkhanov, Jangir; Thompson, W Clayton; Patterson, Terrell A; Hadcock, John R; Scott, Dennis O; Maurer, Tristan S; Musante, Cynthia J
2016-01-01
The purpose of this work is to develop a mathematical model of energy balance and body weight regulation that can predict species-specific response to common pre-clinical interventions. To this end, we evaluate the ability of a previously published mathematical model of mouse metabolism to describe changes in body weight and body composition in rats in response to two short-term interventions. First, we adapt the model to describe body weight and composition changes in Sprague-Dawley rats by fitting to data previously collected from a 26-day caloric restriction study. The calibrated model is subsequently used to describe changes in rat body weight and composition in a 23-day cannabinoid receptor 1 antagonist (CB1Ra) study. While the model describes body weight data well, it fails to replicate body composition changes with CB1Ra treatment. Evaluation of a key model assumption about deposition of fat and fat-free masses shows a limitation of the model in short-term studies due to the constraint placed on the relative change in body composition components. We demonstrate that the model can be modified to overcome this limitation, and propose additional measurements to further test the proposed model predictions. These findings illustrate how mathematical models can be used to support drug discovery and development by identifying key knowledge gaps and aiding in the design of additional experiments to further our understanding of disease-relevant and species-specific physiology.
Probabilistic Modeling of Settlement Risk at Land Disposal Facilities - 12304
DOE Office of Scientific and Technical Information (OSTI.GOV)
Foye, Kevin C.; Soong, Te-Yang
2012-07-01
The long-term reliability of land disposal facility final cover systems - and therefore the overall waste containment - depends on the distortions imposed on these systems by differential settlement/subsidence. The evaluation of differential settlement is challenging because of the heterogeneity of the waste mass (caused by inconsistent compaction, void space distribution, debris-soil mix ratio, waste material stiffness, time-dependent primary compression of the fine-grained soil matrix, long-term creep settlement of the soil matrix and the debris, etc.) at most land disposal facilities. Deterministic approaches to long-term final cover settlement prediction are not able to capture the spatial variability in the waste mass and sub-grade properties which control differential settlement. An alternative, probabilistic solution is to use random fields to model the waste and sub-grade properties. The modeling effort informs the design, construction, operation, and maintenance of land disposal facilities. A probabilistic method to establish design criteria for waste placement and compaction is introduced using the model. Random fields are ideally suited to problems of differential settlement modeling of highly heterogeneous foundations, such as waste. Random fields model the seemingly random spatial distribution of a design parameter, such as compressibility. When used for design, the use of these models prompts the need for probabilistic design criteria. It also allows for a statistical approach to waste placement acceptance criteria. An example design evaluation was performed, illustrating the use of the probabilistic differential settlement simulation methodology to assemble a design guidance chart. The purpose of this design evaluation is to enable the designer to select optimal initial combinations of design slopes and quality control acceptance criteria that yield an acceptable proportion of post-settlement slopes meeting some design minimum. For this specific example, relative density, which can be determined through field measurements, was selected as the field quality control parameter for waste placement. This technique can be extended to include a rigorous performance-based methodology using other parameters (void space criteria, debris-soil mix ratio, pre-loading, etc.). As shown in this example, each parameter range, or sets of parameter ranges, can be selected such that they can result in an acceptable, long-term differential settlement according to the probabilistic model. The methodology can also be used to re-evaluate the long-term differential settlement behavior at closed land disposal facilities to identify, if any, problematic facilities so that remedial action (e.g., reinforcement of upper and intermediate waste layers) can be implemented. Considering the inherent spatial variability in waste and earth materials and the need for engineers to apply sound quantitative practices to engineering analysis, it is important to apply the available probabilistic techniques to problems of differential settlement. One such method to implement probability-based differential settlement analyses for the design of landfill final covers has been presented. The design evaluation technique presented is one tool to bridge the gap from deterministic practice to probabilistic practice. (authors)
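The random-field idea in this abstract can be illustrated with a short Monte Carlo sketch: draw spatially correlated compressibility (here, vertical strain) along a cover profile, settle the profile, and score how many post-settlement slopes still meet a design minimum. All parameter values below are invented for illustration and are not taken from the paper.

```python
import numpy as np

# Hypothetical parameters -- illustrative only, not from the paper.
n_pts, spacing = 50, 10.0      # profile points along the cover, 10 m apart
corr_len = 30.0                # correlation length of the strain field (m)
mu_eps, sd_eps = 0.15, 0.045   # mean and std. dev. of vertical strain
thickness = 20.0               # waste column height (m)
design_slope, min_slope = 0.04, 0.02
n_sims = 5000

x = np.arange(n_pts) * spacing
# Random field of compressive strain with an exponential covariance model
cov = sd_eps**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
L = np.linalg.cholesky(cov + 1e-10 * np.eye(n_pts))

rng = np.random.default_rng(0)
fractions = np.empty(n_sims)
for i in range(n_sims):
    strain = np.clip(mu_eps + L @ rng.standard_normal(n_pts), 0.0, None)
    settlement = strain * thickness                  # settlement profile (m)
    slope = design_slope - np.diff(settlement) / spacing
    fractions[i] = np.mean(slope >= min_slope)       # share meeting the minimum

print(f"mean proportion of post-settlement slopes >= {min_slope:.0%}: "
      f"{fractions.mean():.2f}")
```

Repeating this calculation over a grid of design slopes and waste-placement quality levels is how a design guidance chart of the kind described above could be assembled.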
Evaluation and prediction of long-term environmental effects of nonmetallic materials, second phase
NASA Technical Reports Server (NTRS)
1983-01-01
Changes in the functional properties of a number of nonmetallic materials were evaluated experimentally as a function of simulated space environments, and such data were used to develop models for accelerated test methods useful for predicting such behavioral changes. The effects of charged particle irradiations on candidate space materials are evaluated.
NASA Technical Reports Server (NTRS)
Silva, Walter A.; Bennett, Robert M.
1992-01-01
The Computational Aeroelasticity Program-Transonic Small Disturbance (CAP-TSD) code, developed at LaRC, is applied to the active flexible wing wind-tunnel model for prediction of transonic aeroelastic behavior. A semi-span computational model is used for evaluation of symmetric motions, and a full-span model is used for evaluation of antisymmetric motions. Static aeroelastic solutions using CAP-TSD are computed. Dynamic deformations are presented as flutter boundaries in terms of Mach number and dynamic pressure. Flutter boundaries that take into account modal refinements, vorticity and entropy corrections, antisymmetric motion, and sensitivity to the modeling of the wing tip ballast stores are also presented with experimental flutter results.
Disease model curation improvements at Mouse Genome Informatics
Bello, Susan M.; Richardson, Joel E.; Davis, Allan P.; Wiegers, Thomas C.; Mattingly, Carolyn J.; Dolan, Mary E.; Smith, Cynthia L.; Blake, Judith A.; Eppig, Janan T.
2012-01-01
Optimal curation of human diseases requires an ontology or structured vocabulary that contains terms familiar to end users, is robust enough to support multiple levels of annotation granularity, is limited to disease terms, and is stable enough to avoid extensive reannotation following updates. At Mouse Genome Informatics (MGI), we currently use disease terms from Online Mendelian Inheritance in Man (OMIM) to curate mouse models of human disease. While OMIM provides highly detailed disease records that are familiar to many in the medical community, it lacks the structure to support multilevel annotation. To improve disease annotation at MGI, we evaluated the merged Medical Subject Headings (MeSH) and OMIM disease vocabulary created by the Comparative Toxicogenomics Database (CTD) project. Overlaying MeSH onto OMIM provides hierarchical access to broad disease terms, a feature missing from OMIM. We created an extended version of the vocabulary to meet the genetic disease-specific curation needs at MGI. Here we describe our evaluation of the CTD application and the extensions made by MGI, and discuss the strengths and weaknesses of this approach. Database URL: http://www.informatics.jax.org/ PMID:22434831
Evaluation of Current Planetary Boundary Layer Retrieval Capabilities from Space
NASA Technical Reports Server (NTRS)
Santanello, Joseph A., Jr.; Schaefer, Alexander J.; Blaisdell, John; Yorks, John
2016-01-01
The PBL over land remains a significant gap in our understanding of the water and energy cycle from space. This work combines unique NASA satellite and model products to demonstrate the ability of current sensors (advanced IR sounding and lidar) to retrieve PBL properties, and in turn their potential to be used globally to evaluate and improve weather and climate prediction models. While incremental progress has been made in recent AIRS retrieval versions, vertical resolution remains insufficient for detecting PBL properties. Lidar shows promise for detecting vertical gradients (and PBL height, PBLh) in the lower troposphere, but daytime conditions over land remain a challenge due to noise, and coverage is limited to return times of approximately two weeks or longer.
Evaluating Vertical Moisture Structure of the Madden-Julian Oscillation in Contemporary GCMs
NASA Astrophysics Data System (ADS)
Guan, B.; Jiang, X.; Waliser, D. E.
2013-12-01
The Madden-Julian Oscillation (MJO) remains a major challenge in our understanding and modeling of tropical convection and circulation. Many models have trouble realistically simulating key characteristics of the MJO, such as its strength, period, and eastward propagation. For models that do simulate aspects of the MJO, it remains to be understood which parameters and processes are the most critical in determining the quality of the simulations. This study focuses on the vertical structure of moisture in MJO simulations, with the aim of identifying and understanding the relationship between MJO simulation quality and key moisture-related parameters. A series of 20-year simulations conducted with 26 GCMs are analyzed, including four that are coupled to ocean models and two that have a two-dimensional cloud-resolving model embedded (i.e., superparameterized). TRMM precipitation and the ERA-Interim reanalysis are used to evaluate the model simulations. MJO simulation quality is evaluated based on pattern correlations of lead/lag regressions of precipitation - a measure of the model representation of the eastward-propagating MJO convection. Models with the strongest and weakest MJOs (top and bottom quartiles) are compared in terms of differences in moisture content, moisture convergence, moistening rate, and moist static energy. It is found that models with the strongest MJOs have better representations of the observed vertical tilt of moisture. The relative importance of convection, advection, the boundary layer, and large-scale convection/precipitation is discussed in terms of their contributions to the moistening process. The results highlight the overall importance of vertical moisture structure in MJO simulations. The work contributes to the climatological component of the joint WCRP-WWRP/THORPEX YOTC MJO Task Force and the GEWEX Atmosphere System Study (GASS) global model evaluation project focused on the vertical structure and diabatic processes of the MJO.
NASA Astrophysics Data System (ADS)
Sawada, Masataka; Nishimoto, Soshi; Okada, Tetsuji
2017-01-01
In high-level radioactive waste disposal repositories, there are long-term complex thermal, hydraulic, and mechanical (T-H-M) phenomena that involve the generation of heat from the waste, the infiltration of ground water, and swelling of the bentonite buffer. The ability to model such coupled phenomena is of particular importance to the repository design and assessments of its safety. We have developed a T-H-M-coupled analysis program that evaluates the long-term behavior around the repository (called "near-field"). We have also conducted centrifugal model tests that model the long-term T-H-M-coupled behavior in the near-field. In this study, we conduct H-M-coupled numerical simulations of the centrifugal near-field model tests. We compare numerical results with each other and with results obtained from the centrifugal model tests. From the comparison, we deduce that: (1) in the numerical simulation, water infiltration in the rock mass was in agreement with the experimental observation. (2) The constant-stress boundary condition in the centrifugal model tests may cause a larger expansion of the rock mass than in the in situ condition, but the mechanical boundary condition did not affect the buffer behavior in the deposition hole. (3) The numerical simulation broadly reproduced the measured bentonite pressure and the overpack displacement, but did not reproduce the decreasing trend of the bentonite pressure after 100 equivalent years. This indicates the effect of the time-dependent characteristics of the surrounding rock mass. Further investigations are needed to determine the effect of initial heterogeneity in the deposition hole and the time-dependent behavior of the surrounding rock mass.
NASA Astrophysics Data System (ADS)
Uddameri, V.
2007-01-01
Reliable forecasts of monthly and quarterly fluctuations in groundwater levels are necessary for short- and medium-term planning and management of aquifers to ensure proper service of seasonal demands within a region. Development of physically based transient mathematical models at this time scale poses considerable challenges due to the lack of suitable data and other uncertainties. Artificial neural networks (ANN) possess flexible mathematical structures and are capable of mapping highly nonlinear relationships. Feed-forward neural network models were constructed and trained using the back-propagation algorithm to forecast monthly and quarterly time-series water levels at a well that taps into the deeper Evangeline formation of the Gulf Coast aquifer in Victoria, TX. Unlike unconfined formations, no causal relationships exist between water levels and hydro-meteorological variables measured near the vicinity of the well. As such, an endogenous forecasting model using dummy variables to capture short-term seasonal fluctuations and longer-term (decadal) trends was constructed. The root mean square error, mean absolute deviation, and correlation coefficient (R) were 1.40 m, 0.33 m, and 0.77, respectively, for an evaluation dataset of quarterly measurements, and 1.17 m, 0.46 m, and 0.88 for a monthly evaluation dataset not used to train or test the model. These statistics were better for the ANN model than for models developed using statistical regression techniques.
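As a rough illustration of the endogenous setup described above (not the authors' code or data), a feed-forward network can be trained on seasonal dummy variables, a normalized trend term, and the previous period's level. The sketch uses scikit-learn's MLPRegressor as a stand-in for a back-propagation-trained network, with synthetic monthly water levels.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Synthetic monthly water levels with seasonality and a slow decadal trend
rng = np.random.default_rng(1)
n = 240                                   # 20 years of monthly data
t = np.arange(n)
level = 50 - 0.01 * t + 1.5 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.3, n)

month = t % 12
X = np.column_stack([
    np.eye(12)[month],                    # dummy variables for seasonal effects
    t / n,                                # normalized trend term for decadal drift
    np.roll(level, 1),                    # previous month's level (endogenous input)
])[1:]                                    # drop the first row (invalid lag)
y = level[1:]

split = int(0.8 * len(y))
model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
model.fit(X[:split], y[:split])
pred = model.predict(X[split:])
rmse = np.sqrt(np.mean((pred - y[split:]) ** 2))
print(f"hold-out RMSE: {rmse:.3f} m")
```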
A Model for Collaborative Working to Facilitate Knowledge Mobilisation in Public Health
ERIC Educational Resources Information Center
McCabe, Karen Elizabeth; Wallace, Annie; Crosland, Ann
2015-01-01
This paper introduces a model for collaborative working to facilitate knowledge mobilisation in public health. The model has been developed by university researchers who worked collaboratively with public health commissioners and strategic partners to evaluate a portfolio of short-term funded interventions to inform re-commissioning. Within this…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xie, Shaocheng; Tang, Shuaiqi; Zhang, Yunyan
2016-07-01
Single-Column Model (SCM) Forcing Data are derived from the ARM facility observational data using the constrained variational analysis approach (Zhang and Lin, 1997; Zhang et al., 2001). The resulting products include both the large-scale forcing terms and the evaluation fields, which can be used for driving SCMs and Cloud Resolving Models (CRMs) and for validating model simulations.
Modeling the hydrologic impacts of forest harvesting on Florida flatwoods
Ge Sun; Hans Rierkerk; Nicholas B. Comerford
1998-01-01
The great temporal and spatial variability of pine flatwoods hydrology suggests traditional short-term field methods may not be effective in evaluating the hydrologic effects of forest management. The flatwoods model was developed, calibrated and validated specifically for the cypress wetland-pine upland landscape. The model was applied to two typical flatwoods sites...
Forecasting Techniques and Library Circulation Operations: Implications for Management.
ERIC Educational Resources Information Center
Ahiakwo, Okechukwu N.
1988-01-01
Causal regression and time series models were developed using six years of data for home borrowing, average readership, and books consulted at a university library. The models were tested for efficacy in producing short-term planning and control data. Combined models were tested in establishing evaluation measures. (10 references) (Author/MES)
The impact of climate change on surface-level ozone is examined through a multiscale modeling effort that linked global and regional climate models to drive air quality model simulations. Results are quantified in terms of the relative response factor (RRFE), which estimates the ...
Evaluation of impact of length of calibration time period on the APEX model streamflow simulation
USDA-ARS?s Scientific Manuscript database
Due to resource constraints, continuous long-term measured data for model calibration and validation (C/V) are rare. As a result, most hydrologic and water quality models are calibrated and, if possible, validated using limited available measured data. However, little research has been carried out t...
Long-term archiving of the annotated three-dimensional digital mock-up
NASA Astrophysics Data System (ADS)
Kheddouci, Fawzi
The use of engineering drawings in the development of mechanical products, including the exchange of engineering data as well as for archiving, is common industry practice. Traditionally, paper has been the means of meeting those needs. However, these practices have evolved in favour of computerized tools and methods for the creation, diffusion and preservation of data involved in the process of developing aeronautical products characterized by life cycles that can exceed 70 years. Therefore, it is necessary to redefine how to maintain this data in a context whereby engineering drawings are being replaced by the 3D annotated digital mock-up. This thesis addresses the issue of long-term archiving of 3D annotated digital mock-ups, which include geometric and dimensional tolerances as well as other notes and specifications, in compliance with the requirements formulated by the aviation industry, including regulatory and legal requirements. First, we review the requirements imposed by the aviation industry in the context of long-term archiving of 3D annotated digital mock-ups. We then consider alternative solutions. We begin by identifying the theoretical approach behind the choice of a conceptual model for digital long-term archiving. Then we evaluate, among the proposed alternatives, an archiving format that will guarantee the preservation of the integrity of the 3D annotated model (geometry, tolerances and other metadata) and its sustainability. The evaluation of 3D PDF PRC as a potential archiving format is carried out on a sample of 185 3D CATIA V5 models (parts and assemblies) provided by industrial partners. This evaluation is guided by a set of criteria including the transfer of geometry, 3D annotations, views, captures and part positioning in assemblies. The results indicate that the exact geometry is maintained successfully when transferring CATIA V5 models to 3D PDF PRC. Concerning the transfer of 3D annotations, we observed degradation associated with their display on the 3D model. This problem can, however, be solved by converting the native model to STEP first, and then to 3D PDF PRC. In view of current tools, 3D PDF PRC is considered a potential solution for long-term archiving of 3D annotated models for individual parts. However, this solution is currently not deemed adequate for archiving assemblies. The practice of 2D drawing will thus remain, in the short term, for assemblies.
A merged model of quality improvement and evaluation: maximizing return on investment.
Woodhouse, Lynn D; Toal, Russ; Nguyen, Trang; Keene, DeAnna; Gunn, Laura; Kellum, Andrea; Nelson, Gary; Charles, Simone; Tedders, Stuart; Williams, Natalie; Livingood, William C
2013-11-01
Quality improvement (QI) and evaluation are frequently considered to be alternative approaches for monitoring and assessing program implementation and impact. The emphasis on third-party evaluation, particularly associated with summative evaluation, and the grounding of evaluation in the social and behavioral sciences contrast with the integration of the QI process within programs or organizations and its origins in management science and industrial engineering. Working with a major philanthropic organization in Georgia, we illustrate how a QI model is integrated with evaluation for five asthma prevention and control sites serving poor and underserved communities in rural and urban Georgia. A primary foundation of this merged model of QI and evaluation is a refocusing of the evaluation from an intimidating report-card summative evaluation by external evaluators to an internally engaged program focus on developmental evaluation. The benefits of the merged model to both QI and evaluation are discussed. The use of evaluation-based logic models can help anchor a QI program in evidence-based practice and provide linkage between processes and outputs and the longer-term distal outcomes. Merging the QI approach with evaluation has major advantages, particularly related to enhancing the funder's return on investment. We illustrate how a Plan-Do-Study-Act model of QI can (a) be integrated with evaluation-based logic models, (b) help refocus emphasis from summative to developmental evaluation, (c) enhance program ownership and engagement in evaluation activities, and (d) increase the role of evaluators in providing technical assistance and support.
NASA Astrophysics Data System (ADS)
Zarei, Moslem
2016-06-01
In conventional model-independent approaches, the power spectrum of primordial perturbations is characterized by free parameters such as the spectral index, its running, the running of the running, and the tensor-to-scalar ratio. In this work we show that, at least for simple inflationary potentials, one can find the primordial scalar and tensor power spectra exactly by resumming over all the running terms. In this model-dependent method, we expand the power spectra about the pivot scale to find the series terms as functions of the e-folding number for some single-field models of inflation. Interestingly, for the viable models studied here, one can sum over all the terms and evaluate the exact form of the power spectra. This in turn gives a more accurate parametrization of the specific models studied in this work. We finally compare our results with recent cosmic microwave background data and find that our new power spectra are in good agreement with the data.
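For reference, the model-independent expansion the abstract refers to is conventionally written as a power law about the pivot scale k_* with running corrections in the exponent:

```latex
\mathcal{P}_s(k) = A_s \left(\frac{k}{k_*}\right)^{\,n_s - 1
  + \frac{\alpha_s}{2}\ln\frac{k}{k_*}
  + \frac{\beta_s}{6}\ln^{2}\frac{k}{k_*} + \cdots},
\qquad r \equiv \frac{\mathcal{P}_t(k_*)}{\mathcal{P}_s(k_*)}
```

Resumming "all the running terms" amounts to summing the full series in ln(k/k_*) rather than truncating it at the first few coefficients.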
GEM-CEDAR Study of Ionospheric Energy Input and Joule Dissipation
NASA Technical Reports Server (NTRS)
Rastaetter, Lutz; Kuznetsova, Maria M.; Shim, Jasoon
2012-01-01
We are studying ionospheric model performance for six events selected for the GEM-CEDAR modeling challenge. DMSP measurements of electric and magnetic fields are converted into Poynting flux values that estimate the energy input into the ionosphere. Models generate rates of ionospheric Joule dissipation that are compared to the energy influx. The models include the ionosphere models CTIPe and Weimer and the ionospheric electrodynamic outputs of the global magnetosphere models SWMF, LFM, and OpenGGCM. This study evaluates model performance in terms of the overall balance between energy influx and dissipation, and tests the assumption that Joule dissipation occurs locally where electromagnetic energy flux enters the ionosphere. We present results in terms of skill scores now commonly used in metrics and validation studies, and we measure agreement in terms of the temporal and spatial distribution of dissipation (i.e., the location of auroral activity) along passes of the DMSP satellite, as a function of the passes' proximity to the magnetic pole and the solar wind activity level.
A synoptic view of the Third Uniform California Earthquake Rupture Forecast (UCERF3)
Field, Edward; Jordan, Thomas H.; Page, Morgan T.; Milner, Kevin R.; Shaw, Bruce E.; Dawson, Timothy E.; Biasi, Glenn; Parsons, Thomas E.; Hardebeck, Jeanne L.; Michael, Andrew J.; Weldon, Ray; Powers, Peter; Johnson, Kaj M.; Zeng, Yuehua; Bird, Peter; Felzer, Karen; van der Elst, Nicholas; Madden, Christopher; Arrowsmith, Ramon; Werner, Maximillan J.; Thatcher, Wayne R.
2017-01-01
Probabilistic forecasting of earthquake‐producing fault ruptures informs all major decisions aimed at reducing seismic risk and improving earthquake resilience. Earthquake forecasting models rely on two scales of hazard evolution: long‐term (decades to centuries) probabilities of fault rupture, constrained by stress renewal statistics, and short‐term (hours to years) probabilities of distributed seismicity, constrained by earthquake‐clustering statistics. Comprehensive datasets on both hazard scales have been integrated into the Uniform California Earthquake Rupture Forecast, Version 3 (UCERF3). UCERF3 is the first model to provide self‐consistent rupture probabilities over forecasting intervals from less than an hour to more than a century, and it is the first capable of evaluating the short‐term hazards that result from multievent sequences of complex faulting. This article gives an overview of UCERF3, illustrates the short‐term probabilities with aftershock scenarios, and draws some valuable scientific conclusions from the modeling results. In particular, seismic, geologic, and geodetic data, when combined in the UCERF3 framework, reject two types of fault‐based models: long‐term forecasts constrained to have local Gutenberg–Richter scaling, and short‐term forecasts that lack stress relaxation by elastic rebound.
Safari, Saeed; Baratloo, Alireza; Hashemi, Behrooz; Rahmati, Farhad; Forouzanfar, Mohammad Mehdi; Motamedi, Maryam; Mirmohseni, Ladan
2016-01-01
Background: Determining etiologic causes and prognosis can significantly improve the management of syncope patients. The present study aimed to compare the values of the San Francisco, Osservatorio Epidemiologico sulla Sincope nel Lazio (OESIL), Boston, and Risk Stratification of Syncope in the Emergency Department (ROSE) clinical decision rules in predicting the short-term serious outcome of syncope patients. Materials and Methods: The present diagnostic accuracy study with 1-week follow-up was designed to evaluate the predictive values of the four mentioned clinical decision rules. Screening performance characteristics of each model in predicting mortality, myocardial infarction (MI), and cerebrovascular accidents (CVAs) were calculated and compared. To evaluate the value of each aforementioned model in predicting the outcome, sensitivity, specificity, positive likelihood ratio, and negative likelihood ratio were calculated, and receiver operating characteristic (ROC) curve analysis was done. Results: A total of 187 patients (mean age: 64.2 ± 17.2 years) were enrolled in the study. Mortality, MI, and CVA were seen in 19 (10.2%), 12 (6.4%), and 36 (19.2%) patients, respectively. The area under the ROC curve for the OESIL, San Francisco, Boston, and ROSE models in predicting the risk of 1-week mortality, MI, and CVA was in the 30-70% range, with no significant difference among models (P > 0.05). The pooled model did not show higher accuracy in predicting mortality, MI, and CVA compared to the others (P > 0.05). Conclusion: This study revealed the weakness of all four evaluated models in predicting the short-term serious outcome of syncope patients referred to the emergency department, without any significant advantage for one model over the others. PMID:27904602
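The screening performance characteristics named in this abstract follow directly from a 2x2 confusion table. The sketch below uses invented scores and outcomes, since the patient data are not public.

```python
import numpy as np

def screening_stats(score, outcome, threshold):
    """Sensitivity, specificity, and likelihood ratios for a decision rule.

    `score` holds rule scores, `outcome` binary serious-outcome labels;
    a patient is test-positive when score >= threshold.
    """
    pred = np.asarray(score) >= threshold
    truth = np.asarray(outcome).astype(bool)
    tp = np.sum(pred & truth)
    fn = np.sum(~pred & truth)
    tn = np.sum(~pred & ~truth)
    fp = np.sum(pred & ~truth)
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return sens, spec, sens / (1 - spec), (1 - sens) / spec

# Hypothetical ROSE-like risk scores and 1-week serious outcomes
sens, spec, lr_pos, lr_neg = screening_stats(
    score=[3, 1, 0, 2, 4, 1, 0, 3], outcome=[1, 0, 0, 1, 1, 0, 0, 0], threshold=2)
print(f"Se={sens:.2f} Sp={spec:.2f} LR+={lr_pos:.2f} LR-={lr_neg:.2f}")
```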
Use of a business excellence model to improve conservation programs.
Black, Simon; Groombridge, Jim
2010-12-01
The current shortfall in effectiveness within conservation biology is illustrated by increasing interest in "evidence-based conservation," whose proponents have identified the need to benchmark conservation initiatives against actions that lead to proven positive effects. The effectiveness of conservation policies, approaches, and evaluation is under increasing scrutiny, and in these areas models of excellence used in business could prove valuable. Typically, conservation programs require years of effort and involve rigorous long-term implementation processes. Successful balance of long-term efforts alongside the achievement of short-term goals is often compromised by management or budgetary constraints, a situation also common in commercial businesses. "Business excellence" is an approach many companies have used over the past 20 years to ensure continued success. Various business excellence evaluations have been promoted that include concepts that could be adapted and applied in conservation programs. We describe a conservation excellence model that shows how scientific processes and results can be aligned with financial and organizational measures of success. We applied the model to two well-documented species conservation programs. In the first, the Po'ouli program, several aspects of improvement were identified, such as more authority for decision making in the field and better integration of habitat management and population recovery processes. The second example, the black-footed ferret program, could have benefited from leadership effort to reduce bureaucracy and to encourage use of best-practice species recovery approaches. The conservation excellence model enables greater clarity in goal setting, more-effective identification of job roles within programs, better links between technical approaches and measures of biological success, and more-effective use of resources. The model could improve evaluation of a conservation program's effectiveness and may be used to compare different programs, for example during reviews of project performance by sponsoring organizations. © 2010 Society for Conservation Biology.
Valentine, William J; Pollock, Richard F; Saunders, Rhodri; Bae, Jay; Norrbacka, Kirsi; Boye, Kristina
Recent publications describing long-term follow-up from landmark trials and diabetes registries represent an opportunity to revisit modeling options in type 1 diabetes mellitus (T1DM). The objective was to develop a new product-independent model capable of predicting long-term clinical and cost outcomes. After a systematic literature review to identify clinical trial and registry data, a model was developed (the PRIME Diabetes Model) to simulate T1DM progression and complication onset. The model runs as a patient-level simulation, making use of covariance matrices for cohort generation and risk factor progression, and simulating myocardial infarction, stroke, angina, heart failure, nephropathy, retinopathy, macular edema, neuropathy, amputation, hypoglycemia, ketoacidosis, mortality, and risk factor evolution. Several approaches novel to T1DM modeling were used, including patient characteristics and risk factor covariance, a glycated hemoglobin progression model derived from patient-level data, and model averaging approaches to evaluate complication risk. Validation analyses comparing modeled outcomes with published studies demonstrated that the PRIME Diabetes Model projects long-term patient outcomes consistent with those reported for a number of long-term studies. Macrovascular end points were reliably reproduced across five different populations, and microvascular complication risk was accurately predicted on the basis of comparisons with landmark studies and published registry data. The PRIME Diabetes Model is product-independent, available online, and has been developed in line with good practice guidelines. Validation has indicated that outcomes from long-term studies can be reliably reproduced. The model offers new approaches to long-standing challenges in diabetes modeling and may become a valuable tool for informing health care policy. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Huestis, David L.
2006-01-01
We propose to establish a long-term program of critical evaluation by domain experts of the rates and cross sections for atomic and molecular processes that are needed for understanding and modeling the atmospheres in the solar system. We envision data products resembling those of the JPL/NASA Panel for Data Evaluation and the similar efforts of the international combustion modeling community funded by US DoE and its European counterpart.
Marsh, John E.; Pilgrim, Lea K.; Sörqvist, Patrik
2013-01-01
Serial short-term memory is impaired by irrelevant sound, particularly when the sound changes acoustically. This acoustic effect is larger when the sound is presented to the left compared to the right ear (a left-ear disadvantage). Serial memory appears relatively insensitive to distraction from the semantic properties of a background sound. In contrast, short-term free recall of semantic-category exemplars is impaired by the semantic properties of background speech and is relatively insensitive to the sound's acoustic properties. This semantic effect is larger when the sound is presented to the right compared to the left ear (a right-ear disadvantage). In this paper, we outline a speculative neurocognitive fine-coarse model of these hemispheric differences in relation to short-term memory and selective attention, and explicate empirical directions in which this model can be critically evaluated. PMID:24399988
Dynamic evaluation of two decades of WRF-CMAQ ozone simulations over the contiguous United States
NASA Astrophysics Data System (ADS)
Astitha, Marina; Luo, Huiying; Rao, S. Trivikrama; Hogrefe, Christian; Mathur, Rohit; Kumar, Naresh
2017-09-01
Dynamic evaluation of the fully coupled Weather Research and Forecasting (WRF)-Community Multi-scale Air Quality (CMAQ) model ozone simulations over the contiguous United States (CONUS) using two decades of simulations covering the period from 1990 to 2010 is conducted to assess how well the changes in observed ozone air quality are simulated by the model. The changes induced by variations in meteorology and/or emissions are also evaluated during the same timeframe using spectral decomposition of observed and modeled ozone time series with the aim of identifying the underlying forcing mechanisms that control ozone exceedances and making informed recommendations for the optimal use of regional-scale air quality models. The evaluation is focused on the warm season's (i.e., May-September) daily maximum 8-hr (DM8HR) ozone concentrations, the 4th highest (4th) and average of top 10 DM8HR ozone values (top10), as well as the spectrally-decomposed components of the DM8HR ozone time series using the Kolmogorov-Zurbenko (KZ) filter. Results of the dynamic evaluation are presented for six regions in the U.S., consistent with the National Oceanic and Atmospheric Administration (NOAA) climatic regions. During the earlier 11-yr period (1990-2000), the simulated and observed regional average trends are not statistically significant. During the more recent 2000-2010 period, all observed trends are statistically significant and WRF-CMAQ captures the observed downward trend in the Southwest and Midwest but under-predicts the downward trends in observations for the other regions. Observational analysis reveals that it is the magnitude of the long-term forcing that dictates the maximum ozone exceedance potential; there is a strong linear relationship between the long-term forcing and the 4th highest or the average of the top10 ozone concentrations in both observations and model output. This finding indicates that improving the model's ability to reproduce the long-term component will also enable better simulation of ozone extreme values that are of interest to regulatory agencies.
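The Kolmogorov-Zurbenko (KZ) filter used here for spectral decomposition is simply an iterated centered moving average; the minimal sketch below uses illustrative parameters, not necessarily those of the study.

```python
import numpy as np

def kz_filter(x, window, iterations):
    """Kolmogorov-Zurbenko filter: an iterated centered moving average.

    Repeated smoothing splits a time series into spectral components; e.g.,
    applying KZ(15,5) to daily ozone isolates a baseline (long-term plus
    seasonal) component, and the residual gives the short-term, synoptic
    component.
    """
    for _ in range(iterations):
        x = np.convolve(x, np.ones(window) / window, mode="same")
    return x

# Synthetic daily ozone: seasonal cycle plus synoptic-scale noise
days = np.arange(3 * 365)
rng = np.random.default_rng(0)
ozone = 45 + 15 * np.sin(2 * np.pi * days / 365) + rng.normal(0, 8, days.size)

baseline = kz_filter(ozone, window=15, iterations=5)   # long-term + seasonal
short_term = ozone - baseline                          # synoptic forcing
print(f"baseline std: {baseline.std():.1f}, short-term std: {short_term.std():.1f}")
```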
Turbulence measurements in axisymmetric jets of air and helium. I - Air jet. II - Helium jet
NASA Technical Reports Server (NTRS)
Panchapakesan, N. R.; Lumley, J. L.
1993-01-01
Results are presented of measurements on turbulent round jets of air and of helium with the same nozzle momentum efflux. For the air jets, x-wire hot-wire probes mounted on a moving shuttle were used; for the helium jets, a composite probe consisting of a Way-Libby interference probe and an x-probe was used. Current models for scalar triple moments were evaluated. The performance of the model termed the Full model, which includes all terms except advection, was found to be very good for both the air and the helium jets.
Cure Models as a Useful Statistical Tool for Analyzing Survival
Othus, Megan; Barlogie, Bart; LeBlanc, Michael L.; Crowley, John J.
2013-01-01
Cure models are a popular topic within statistical literature but are not as widely known in the clinical literature. Many patients with cancer can be long-term survivors of their disease, and cure models can be a useful tool to analyze and describe cancer survival data. The goal of this article is to review what a cure model is, explain when cure models can be used, and use cure models to describe multiple myeloma survival trends. Multiple myeloma is generally considered an incurable disease, and this article shows that by using cure models, rather than the standard Cox proportional hazards model, we can evaluate whether there is evidence that therapies at the University of Arkansas for Medical Sciences induce a proportion of patients to be long-term survivors. PMID:22675175
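For context, the most common formulation reviewed in work like this is the mixture cure model, which splits the population into a cured fraction pi and an uncured fraction whose survival S_u(t) decays as usual (this is one standard variant; the article discusses cure models generally):

```latex
S(t) = \pi + (1 - \pi)\, S_u(t), \qquad \lim_{t \to \infty} S(t) = \pi > 0
```

In contrast, the standard Cox proportional hazards model forces S(t) to approach zero, so a plateau in an observed survival curve is evidence that a cure-type model may describe the data better.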
Markov models in dentistry: application to resin-bonded bridges and review of the literature.
Mahl, Dominik; Marinello, Carlo P; Sendi, Pedram
2012-10-01
Markov models are mathematical models that can be used to describe disease progression and evaluate the cost-effectiveness of medical interventions. Markov models allow projecting clinical and economic outcomes into the future and are therefore frequently used to estimate long-term outcomes of medical interventions. The purpose of this paper is to demonstrate its use in dentistry, using the example of resin-bonded bridges to replace missing teeth, and to review the literature. We used literature data and a four-state Markov model to project long-term outcomes of resin-bonded bridges over a time horizon of 60 years. In addition, the literature was searched in PubMed Medline for research articles on the application of Markov models in dentistry.
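A minimal Markov cohort simulation of the kind described can be written in a few lines. The four states and annual transition probabilities below are invented for illustration and are not values from the paper.

```python
import numpy as np

# Hypothetical four-state model for a resin-bonded bridge (illustrative only)
states = ["intact", "debonded/recemented", "failed/replaced", "tooth lost"]
P = np.array([
    [0.95, 0.03, 0.015, 0.005],   # intact
    [0.00, 0.90, 0.08,  0.02],    # debonded, recemented
    [0.00, 0.00, 0.97,  0.03],    # failed, replaced by another prosthesis
    [0.00, 0.00, 0.00,  1.00],    # absorbing state
])

cohort = np.array([1.0, 0.0, 0.0, 0.0])   # everyone starts with an intact bridge
for year in range(60):                     # 60-year horizon, as in the paper
    cohort = cohort @ P                    # one annual transition

for state, share in zip(states, cohort):
    print(f"{state:>22s}: {share:.3f}")
```

Attaching costs and utilities to each state and discounting over the cycles turns the same state-occupancy vector into the cost-effectiveness outputs the abstract mentions.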
A percolation model for electrical conduction in wood with implications for wood-water relations
Samuel L. Zelinka; Samuel V. Glass; Donald S. Stone
2008-01-01
The first models used to describe electrical conduction in cellulosic materials involved conduction pathways through free water. These models were abandoned in the middle of the 20th century. This article re-evaluates the theory of conduction in wood by using a percolation model that describes electrical conduction in terms of overlapping paths of loosely bound or...
Evaluating digital libraries in the health sector. Part 1: measuring inputs and outputs.
Cullen, Rowena
2003-12-01
This is the first part of a two-part paper which explores methods that can be used to evaluate digital libraries in the health sector. In this first part, some approaches to evaluation that have been proposed for mainstream digital information services are examined for their suitability to provide models for the health sector. The paper summarizes some major national and collaborative initiatives to develop measures for digital libraries, and analyses these approaches in terms of their relationship to traditional measures of library performance, which are focused on inputs and outputs, and their relevance to current debates among health information specialists. The second part looks more specifically at evaluative models based on outcomes, and models being developed in the health sector.
Watershed scale response to climate change--Yampa River Basin, Colorado
Hay, Lauren E.; Battaglin, William A.; Markstrom, Steven L.
2012-01-01
General Circulation Model simulations of future climate through 2099 project a wide range of possible scenarios. To determine the sensitivity and potential effect of long-term climate change on the freshwater resources of the United States, the U.S. Geological Survey Global Change study, "An integrated watershed scale response to global change in selected basins across the United States", was started in 2008. The long-term goal of this national study is to provide the foundation for hydrologically based climate change studies across the nation. Fourteen basins for which the Precipitation Runoff Modeling System has been calibrated and evaluated were selected as study sites. The Precipitation Runoff Modeling System is a deterministic, distributed-parameter watershed model developed to evaluate the effects of various combinations of precipitation, temperature, and land use on streamflow and general basin hydrology. Output from five General Circulation Model simulations and four emission scenarios was used to develop an ensemble of climate-change scenarios for each basin. These ensembles were simulated with the corresponding Precipitation Runoff Modeling System model. This fact sheet summarizes the hydrologic effect and sensitivity of the Precipitation Runoff Modeling System simulations to climate change for the Yampa River Basin at Steamboat Springs, Colorado.
USDA-ARS?s Scientific Manuscript database
Agricultural research increasingly is expected to provide precise, quantitative information with an explicit geographic coverage. Limited availability of continuous daily meteorological records often constrains efforts to provide such information through integrated use of simulation models, spatial ...
Small Business Training Models for Community Growth.
ERIC Educational Resources Information Center
Jellison, Holly M., Ed.
Nine successful community college programs for small business management training are described in this report in terms of their college and economic context, purpose, offerings, delivery modes, operating and marketing strategies, community outreach, support services, faculty and staff, evaluation, and future directions. The model programs are…
Maurer, Max; Lienert, Judit
2017-01-01
We compare the use of multi-criteria decision analysis (MCDA), or more precisely models used in multi-attribute value theory (MAVT), to integrated assessment (IA) models for supporting long-term water supply planning in a small town case study in Switzerland. They are used to evaluate thirteen system-scale water supply alternatives in four future scenarios regarding forty-four objectives, covering technical, social, environmental, and economic aspects. The alternatives encompass both conventional and unconventional solutions and differ in technical, spatial, and organizational characteristics. This paper focuses on the impact assessment and final evaluation steps of the structured MCDA decision support process. We analyze the performance of the alternatives for ten stakeholders. We demonstrate the implications of model assumptions by comparing two IA and three MAVT evaluation model layouts of different complexity. For this comparison, we focus on the validity (ranking stability), desirability (value), and distinguishability (value range) of the alternatives given the five model layouts. These layouts exclude or include stakeholder preferences and uncertainties. Even though all five led us to identify the same best alternatives, they did not produce identical rankings. We found that the MAVT-type models provide higher distinguishability and a more robust basis for discussion than the IA-type models. The needed complexity of the model, however, should be determined based on the intended use of the model within the decision support process. The best-performing alternatives had consistently strong performance for all stakeholders and future scenarios, whereas the current water supply system was outperformed in all evaluation layouts. The best-performing alternatives comprise proactive pipe rehabilitation, adapted firefighting provisions, and decentralized water storage and/or treatment. We present recommendations for possible ways of improving water supply planning in the case study and beyond. PMID:28481881
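For reference, MAVT layouts of this kind typically aggregate an alternative's performance with an additive value model; the paper's forty-four objectives would supply the attribute values x_i and the stakeholder elicitation the weights w_i (this is the standard form, not necessarily the exact variant used):

```latex
V(a) \;=\; \sum_{i=1}^{n} w_i \, v_i\!\bigl(x_i(a)\bigr),
\qquad \sum_{i=1}^{n} w_i = 1, \quad 0 \le v_i \le 1
```

Here each v_i is a single-attribute value function rescaling an objective's outcome to [0, 1], and V(a) is the overall value of alternative a for one stakeholder and scenario.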
Schedl, Markus
2012-01-01
Different term weighting techniques such as tf-idf or BM25 have been used intensely for manifold text-based information retrieval tasks. Their use for modeling term profiles for named entities and the subsequent calculation of similarities between these named entities have been studied to a much smaller extent. The recent trend of microblogging has made available massive amounts of information about almost every topic around the world. Therefore, microblogs represent a valuable source for text-based named entity modeling. In this paper, we present a systematic and comprehensive evaluation of different term weighting measures, normalization techniques, query schemes, index term sets, and similarity functions for the task of inferring similarities between named entities, based on data extracted from microblog posts. We analyze several thousand combinations of choices for the above mentioned dimensions, which influence the similarity calculation process, and we investigate in which way they impact the quality of the similarity estimates. Evaluation is performed using three real-world data sets: two collections of microblogs related to music artists and one related to movies. For the music collections, we present results of genre classification experiments using as benchmark genre information from allmusic.com. For the movie collection, we present results of multi-class classification experiments using as benchmark categories from IMDb. We show that microblogs can indeed be exploited to model named entity similarity with remarkable accuracy, provided the correct settings for the analyzed aspects are used. We further compare the results to those obtained when using Web pages as data source.
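A compact sketch of one configuration in this design space (aggregate each entity's microblog terms, weight them with BM25, compare profiles with cosine similarity) might look as follows; the toy data and the k1/b defaults are illustrative, not the paper's settings.

```python
import math
from collections import Counter

def bm25_weights(doc_terms, corpus, k1=1.2, b=0.75):
    """BM25 term weights for one entity's aggregated microblog profile.

    `corpus` is a list of term lists, one per entity; k1 and b are the
    conventional defaults.
    """
    N = len(corpus)
    avgdl = sum(len(d) for d in corpus) / N
    df = Counter(t for d in corpus for t in set(d))
    tf = Counter(doc_terms)
    weights = {}
    for t, f in tf.items():
        idf = math.log((N - df[t] + 0.5) / (df[t] + 0.5) + 1.0)
        weights[t] = idf * f * (k1 + 1) / (
            f + k1 * (1 - b + b * len(doc_terms) / avgdl))
    return weights

def cosine(u, v):
    dot = sum(u[t] * v.get(t, 0.0) for t in u)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

# Toy entity profiles (e.g., terms mined from posts about three music artists)
corpus = [["guitar", "rock", "tour"], ["rock", "metal", "guitar"], ["piano", "jazz"]]
profiles = [bm25_weights(d, corpus) for d in corpus]
print(cosine(profiles[0], profiles[1]), cosine(profiles[0], profiles[2]))
```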
What do we gain with Probabilistic Flood Loss Models?
NASA Astrophysics Data System (ADS)
Schroeter, K.; Kreibich, H.; Vogel, K.; Merz, B.; Lüdtke, S.
2015-12-01
The reliability of flood loss models is a prerequisite for their practical usefulness. Oftentimes, traditional uni-variate damage models such as depth-damage curves fail to reproduce the variability of observed flood damage. Innovative multi-variate probabilistic modelling approaches are promising to capture and quantify the uncertainty involved and thus to improve the basis for decision making. In this study we compare the predictive capability of two probabilistic modelling approaches, namely Bagging Decision Trees and Bayesian Networks, and traditional stage-damage functions which are cast in a probabilistic framework. For model evaluation we use empirical damage data which are available from computer-aided telephone interviews that were compiled after the floods of 2002, 2005, 2006 and 2013 in the Elbe and Danube catchments in Germany. We carry out a split-sample test by sub-setting the damage records. One sub-set is used to derive the models and the remaining records are used to evaluate the predictive performance of the model. Further, we stratify the sample according to catchments, which allows studying model performance in a spatial transfer context. Flood damage estimation is carried out on the scale of individual buildings in terms of relative damage. The predictive performance of the models is assessed in terms of systematic deviations (mean bias), precision (mean absolute error), and reliability, which is represented by the proportion of observations that fall within the 5%-95% predictive interval. The reliability of the probabilistic predictions within validation runs decreases only slightly and achieves a very good coverage of observations within the predictive interval. Probabilistic models provide quantitative information about prediction uncertainty, which is crucial to assess the reliability of model predictions and improves the usefulness of model results.
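The reliability measure used in this and the following flood-loss study (the share of observations falling inside the 5%-95% predictive interval) can be computed directly from an ensemble of probabilistic predictions; the sketch below uses synthetic data.

```python
import numpy as np

def interval_reliability(samples, observed, lo=0.05, hi=0.95):
    """Share of observations inside the [5%, 95%] predictive interval.

    `samples` has shape (n_buildings, n_draws) of predicted relative damage,
    e.g. draws from bagged trees or a Bayesian network.
    """
    ql = np.quantile(samples, lo, axis=1)
    qh = np.quantile(samples, hi, axis=1)
    return np.mean((observed >= ql) & (observed <= qh))

rng = np.random.default_rng(42)
pred = rng.beta(2, 8, size=(200, 500))   # ensemble of relative-damage draws
obs = rng.beta(2, 8, size=200)           # observed relative damage per building
print(f"coverage of the 5-95% interval: {interval_reliability(pred, obs):.2f}")
```

A well-calibrated model yields coverage close to the nominal 90%; values far below indicate overconfident predictive intervals.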
Confronting uncertainty in flood damage predictions
NASA Astrophysics Data System (ADS)
Schröter, Kai; Kreibich, Heidi; Vogel, Kristin; Merz, Bruno
2015-04-01
Reliable flood damage models are a prerequisite for the practical usefulness of the model results. Oftentimes, traditional uni-variate damage models such as depth-damage curves fail to reproduce the variability of observed flood damage. Innovative multi-variate probabilistic modelling approaches are promising to capture and quantify the uncertainty involved and thus to improve the basis for decision making. In this study we compare the predictive capability of two probabilistic modelling approaches, namely Bagging Decision Trees and Bayesian Networks. For model evaluation we use empirical damage data which are available from computer-aided telephone interviews that were compiled after the floods of 2002, 2005 and 2006 in the Elbe and Danube catchments in Germany. We carry out a split-sample test by sub-setting the damage records. One sub-set is used to derive the models and the remaining records are used to evaluate the predictive performance of the model. Further, we stratify the sample according to catchments, which allows studying model performance in a spatial transfer context. Flood damage estimation is carried out on the scale of individual buildings in terms of relative damage. The predictive performance of the models is assessed in terms of systematic deviations (mean bias), precision (mean absolute error), and reliability, which is represented by the proportion of observations that fall within the 5%-95% predictive interval. The reliability of the probabilistic predictions within validation runs decreases only slightly and achieves a very good coverage of observations within the predictive interval. Probabilistic models provide quantitative information about prediction uncertainty, which is crucial to assess the reliability of model predictions and improves the usefulness of model results.
Evaluation of Data-Driven Models for Predicting Solar Photovoltaics Power Output
Moslehi, Salim; Reddy, T. Agami; Katipamula, Srinivas
2017-09-10
This research was undertaken to evaluate different inverse models for predicting power output of solar photovoltaic (PV) systems under different practical scenarios. In particular, we have investigated whether PV power output prediction accuracy can be improved if module/cell temperature was measured in addition to climatic variables, and also the extent to which prediction accuracy degrades if solar irradiation is not measured on the plane of array but only on a horizontal surface. We have also investigated the significance of different independent or regressor variables, such as wind velocity and incident angle modifier, in predicting PV power output and cell temperature. The inverse regression model forms have been evaluated both in terms of their goodness-of-fit, and their accuracy and robustness in terms of their predictive performance. Given the accuracy of the measurements, expected CV-RMSE of hourly power output prediction over the year varies between 3.2% and 8.6% when only climatic data are used. Depending on what type of measured climatic and PV performance data is available, different scenarios have been identified and the corresponding appropriate modeling pathways have been proposed. The corresponding models are to be implemented on a controller platform for optimum operational planning of microgrids and integrated energy systems.
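For clarity, the CV-RMSE quoted above (3.2%-8.6%) is conventionally defined as the root mean square prediction error normalized by the mean observed output:

```latex
\mathrm{CV\text{-}RMSE} \;=\; \frac{1}{\bar{y}}
\sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(\hat{y}_i - y_i\right)^{2}} \times 100\%
```

where y_i are the measured hourly power outputs, \hat{y}_i the model predictions, and \bar{y} the mean of the measurements.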
NASA Astrophysics Data System (ADS)
Li, Y.; McDougall, T. J.
2016-02-01
Coarse resolution ocean models lack knowledge of spatial correlations between variables on scales smaller than the grid scale. Some researchers have shown that these spatial correlations play a role in the poleward heat flux. In order to evaluate the poleward transport induced by the spatial correlations at a fixed horizontal position, an equation is obtained to calculate the approximate transport from velocity gradients. The equation involves two terms that can be added to the quasi-Stokes streamfunction (based on temporal correlations) to incorporate the contribution of spatial correlations. Moreover, these new terms do not need to be parameterized and are ready to be evaluated using model data directly. In this study, data from a high resolution ocean model have been used to estimate the accuracy of this HRM approach for improving the horizontal property fluxes in coarse-resolution ocean models. A coarse grid is formed by sub-sampling and box-car averaging the fine grid scale. The transport calculated on the coarse grid is then compared to the transport on the original high resolution grid accumulated over a corresponding number of grid boxes. The preliminary results show that the estimated transports on coarse resolution grids roughly match the corresponding transports on high resolution grids.
Evaluation of Stress Management Education: The University of Maryland Model.
ERIC Educational Resources Information Center
Allen, Roger J.
This study evaluated the efficacy of the undergraduate service program "Controlling Stress & Tension" at the University of Maryland in terms of improving the health status of participants across biomedical stress reactivity and psychometric variables. Six hundred fifty-three participants were compared to 264 control subjects for pre-…
USDA-ARS?s Scientific Manuscript database
Bidirectional Reflectance Distribution Function (BRDF) model parameters, Albedo quantities, and Nadir BRDF Adjusted Reflectance (NBAR) products derived from the Visible Infrared Imaging Radiometer Suite (VIIRS), on the Suomi-NPP (National Polar-orbiting Partnership) satellite are evaluated through c...
Kaur, A; Takhar, P S; Smith, D M; Mann, J E; Brashears, M M
2008-10-01
A fractional differential equation (FDE)-based theory involving 1- and 2-term equations was developed to predict the nonlinear survival and growth curves of foodborne pathogens. It is interesting to note that the solution of the 1-term FDE leads to the Weibull model. Nonlinear regression (Gauss-Newton method) was performed to calculate the parameters of the 1-term and 2-term FDEs. The experimental inactivation data of a Salmonella cocktail in ground turkey breast, ground turkey thigh, and pork shoulder, and of a cocktail of Salmonella, E. coli, and Listeria monocytogenes in ground beef, exposed to isothermal cooking conditions of 50 to 66 degrees C, were used for validation. To evaluate the performance of the 2-term FDE in predicting growth curves, data on the growth of Salmonella typhimurium, Salmonella Enteritidis, and background flora in ground pork and boneless pork chops, and of E. coli O157:H7 in ground beef, in the temperature range of 22.2 to 4.4 degrees C, were chosen. A program was written in Matlab to predict the model parameters and the survival and growth curves. The 2-term FDE was more successful in describing the complex shapes of microbial survival and growth curves than the linear and Weibull models. Predicted curves of the 2-term FDE had higher magnitudes of R(2) (0.89 to 0.99) and lower magnitudes of root mean square error (0.0182 to 0.5461) for all experimental cases in comparison to the linear and Weibull models. This model was capable of predicting the tails in survival curves, which was not possible using the Weibull and linear models. The developed model can be used for other foodborne pathogens in a variety of food products to study their destruction and growth behavior.
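The 1-term case reduces to the familiar Weibull survival model, log10(N/N0) = -(t/delta)^p, which can be fitted by nonlinear regression; the sketch below uses SciPy and synthetic data rather than the paper's Matlab code and measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_log_survival(t, delta, p):
    """log10 survival ratio under the Weibull model: log10(N/N0) = -(t/delta)**p.

    delta is the time to the first decimal reduction and p shapes the curve
    (p < 1 gives tailing, p > 1 gives shouldering).
    """
    return -(t / delta) ** p

# Synthetic isothermal inactivation data showing tailing (illustrative only)
t = np.array([1, 2, 4, 6, 8, 10, 12], dtype=float)          # minutes
log_n = np.array([-0.5, -0.9, -1.5, -2.0, -2.4, -2.7, -3.0])  # log10(N/N0)

(delta, p), _ = curve_fit(weibull_log_survival, t, log_n, p0=(2.0, 1.0))
print(f"delta = {delta:.2f} min, p = {p:.2f}")
```

The 2-term FDE adds a second relaxation term, which is what lets it capture the pronounced tails that a single Weibull term cannot.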
Matsushima, Kazuhide; Peng, Monica; Velasco, Carlos; Schaefer, Eric; Diaz-Arrastia, Ramon; Frankel, Heidi
2012-04-01
Significant glycemic excursions (so-called glucose variability) affect the outcome of generic critically ill patients but have not been well studied in patients with traumatic brain injury (TBI). The purpose of this study was to evaluate the impact of glucose variability on the long-term functional outcome of patients with TBI. A noncomputerized tight glucose control protocol was used in our intensivist-model surgical intensive care unit. The relationship between glucose variability and long-term (a median of 6 months after injury) functional outcome, defined by the extended Glasgow Outcome Scale (GOSE), was analyzed using ordinal logistic regression models. Glucose variability was defined by SD and percentage of excursion (POE) from the preset range glucose level. A total of 109 patients with TBI under tight glucose control had long-term GOSE evaluated. In univariable analysis, there was a significant association between lower GOSE score and higher mean glucose, higher SD, POE more than 60, POE 80 to 150, and a single episode of glucose less than 60 mg/dL, but not POE 80 to 110. After adjusting for possible confounding variables in multivariable ordinal logistic regression models, higher SD, POE more than 60, POE 80 to 150, and a single episode of glucose less than 60 mg/dL were significantly associated with lower GOSE score. Glucose variability was significantly associated with poorer long-term functional outcome in patients with TBI as measured by the GOSE score. Well-designed protocols to minimize glucose variability may be key in improving long-term functional outcome. Copyright © 2012 Elsevier Inc. All rights reserved.
Lee, Eun Gyung; Harper, Martin; Bowen, Russell B; Slaven, James
2009-07-01
The current study evaluated the Control of Substances Hazardous to Health (COSHH) Essentials model for short-term task-based exposures and full-shift exposures using measured concentrations of three volatile organic chemicals at a small printing plant. A total of 188 exposure measurements of isopropanol and 187 measurements of acetone were collected, and each measurement took approximately 60 min. Historically collected time-weighted average concentrations (seven results) were evaluated for methylene chloride. The COSHH Essentials model recommended general ventilation control for both isopropanol and acetone. There was good agreement between the task-based exposure measurements and the COSHH Essentials predicted exposure range (PER) for cleaning and print preparation with isopropanol and for cleaning with acetone. For the other tasks and for full-shift exposures, agreement between the exposure measurements and the PER was either moderate or poor. However, for both isopropanol and acetone, our findings suggested that the COSHH Essentials model worked reasonably well, because the probabilities of short-term exposure measurements exceeding short-term occupational exposure limits (OELs), or of full-shift exposures exceeding the corresponding full-shift OELs, were <0.05 under the recommended control strategy. For methylene chloride, the COSHH Essentials model recommended containment control, but a follow-up study could not be performed because the substance had already been replaced with a less hazardous one (acetone), which was considered a more acceptable alternative to increasing the level of control.
Improved climate model evaluation using a new, 750-year Antarctic-wide snow accumulation product
NASA Astrophysics Data System (ADS)
Medley, B.; Thomas, E. R.
2017-12-01
Snow that accumulates over the cold, dry grounded ice of Antarctica is an important component of its mass balance, mitigating the ice sheet's contribution to sea level. Secular trends in accumulation not only produce trends in the mass balance of the Antarctic Ice Sheet, but also directly and indirectly impact surface height changes. Long-term and spatiotemporally complete records of snow accumulation are needed to understand past and present Antarctic-wide mass balance, to convert from altimetry-derived volume change to mass change, and to evaluate the ability of climate models to reproduce the observed climate change. We need measurements in both time and space, yet they typically sample one dimension at the expense of the other. Here, we develop a spatially complete, annually resolved snow accumulation product for the Antarctic Ice Sheet over the past 750 years by combining a newly compiled database of ice core accumulation records with climate model output. We mainly focus on climate model evaluation. Because the product spans several centuries, we can evaluate model ability to represent preindustrial as well as present-day accumulation change. Significant long-term trends in snow accumulation are found over the Ross and Bellingshausen Sea sectors of West Antarctica, the Antarctic Peninsula, and several sectors in East Antarctica. These results suggest that change is more complex over the Antarctic Ice Sheet than a simple uniform change (i.e., more snowfall in a warming world), which highlights the importance of atmospheric circulation as a major driver of change. By evaluating several climate models' ability to reproduce the observed trends, we can deduce whether their projections are reasonable or potentially biased, where the latter would result in a misrepresentation of the Antarctic contribution to sea level.
Directly Comparing Computer and Human Performance in Language Understanding and Visual Reasoning.
ERIC Educational Resources Information Center
Baker, Eva L.; And Others
Evaluation models are being developed for assessing artificial intelligence (AI) systems in terms of similar performance by groups of people. Natural language understanding and vision systems are the areas of concentration. In simplest terms, the goal is to norm a given natural language system's performance on a sample of people. The specific…
Donald J. Brown; Christine A. Ribic; Deahn M. Donner; Mark D. Nelson; Carol I. Bocetti; Christie M. Deloria-Sheffield; Des Thompson
2017-01-01
Long-term management planning for conservation-reliant migratory songbirds is particularly challenging because habitat quality in different stages and geographic locations of the annual cycle can have direct and carry-over effects that influence the population dynamics. The Neotropical migratory songbird Kirtland's warbler Setophaga kirtlandii...
A human osteoarthritis osteochondral organ culture model for cartilage tissue engineering.
Yeung, P; Zhang, W; Wang, X N; Yan, C H; Chan, B P
2018-04-01
In vitro human osteoarthritis (OA)-mimicking models enabling pathophysiological studies and evaluation of emerging therapies such as cartilage tissue engineering are of great importance. We describe the development and characterization of a human OA osteochondral organ culture. We also apply this model to evaluate the phenotype maintenance of a human MSC-derived engineered cartilage, as an example of emerging therapeutics, under long-term exposure to the OA-mimicking environment. We also test the sensitivity of the model to a series of external factors and a potential disease-modifying agent, in terms of chondrogenic phenotype maintenance of the engineered cartilage, under the OA-mimicking environment. Excised joint tissues from total knee replacement surgeries were carved into numerous miniaturized and standardized osteochondral plugs for subsequent OA organ culture. The organ cultures were characterized in detail before being co-cultured with a tissue engineered cartilage. The chondrogenic phenotype of the tissue engineered cartilage co-cultured for up to 8 weeks under this OA-mimicking microenvironment was evaluated. Using the same co-culture model, we also screened a number of biomimetic environmental factors, including oxygen tension, the presence of serum, and the application of compression loading. Finally, we studied the effect of a matrix metalloprotease inhibitor, as an example of potential disease-modifying agents, on the co-cultured engineered cartilage. We demonstrate that cells in the OA organ culture were viable, while both the typical chondrogenic phenotype and the characteristic OA phenotype were maintained for a long period of time. We then demonstrate that upon co-culture with the OA-mimicking organ culture, the engineered cartilage initially exhibited a more fibrocartilage-like phenotype but progressively reverted to the chondrogenic phenotype upon long-term co-culture up to 8 weeks. The engineered cartilage was also found to be sensitive to all biomimetic environmental factors screened (oxygen tension, serum, and compression). Moreover, under the effect of an MMP inhibitor, the chondrogenic phenotype of the engineered cartilage was better maintained. We demonstrated the development of a human OA osteochondral organ culture and tested the feasibility and potential of using this model as an in vitro evaluation tool for emerging cartilage therapies. Copyright © 2018 Elsevier Ltd. All rights reserved.
The impact of climate change on surface level ozone is examined through a multi-scale modeling effort that linked global and regional climate models to drive air quality model simulations. Results are quantified in terms of the Relative Response Factor (RRF), which es...
Join the Revolution: How Montessori for Aging and Dementia can Change Long-Term Care Culture.
Bourgeois, Michelle S; Brush, Jennifer; Elliot, Gail; Kelly, Anne
2015-08-01
Efforts to improve the quality of life of persons with dementia in long-term care through the implementation of various approaches to person-centered care have been underway for the past two decades. Studies have yielded conflicting reports evaluating the evidence for these approaches. The purpose of this article is to outline the findings of several systematic reviews of this literature, highlighting areas in need of improvement, and to describe a new person-centered care model, DementiAbility Methods: The Montessori Way. This model focuses on the abilities, needs, interests, and strengths of the person, creating worthwhile and meaningful roles, routines, and activities for the person within a supportive physical environment. This is accomplished through gaining the commitment of the facility's leaders, training staff, and monitoring program implementation. The potential for a culture change in long-term care environments is dependent on the development and rigorous evaluation of person-centered care approaches.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanz Rodrigo, Javier; Chávez Arroyo, Roberto Aurelio; Moriarty, Patrick
The increasing size of wind turbines, with rotors already spanning more than 150 m diameter and hub heights above 100 m, requires proper modeling of the atmospheric boundary layer (ABL) from the surface to the free atmosphere. Furthermore, large wind farm arrays create their own boundary layer structure with unique physics. This poses significant challenges to traditional wind engineering models that rely on surface-layer theories and engineering wind farm models to simulate the flow in and around wind farms. However, adopting an ABL approach offers the opportunity to better integrate wind farm design tools and meteorological models. The challenge is how to build the bridge between atmospheric and wind engineering model communities and how to establish a comprehensive evaluation process that identifies relevant physical phenomena for wind energy applications with modeling and experimental requirements. A framework for model verification, validation, and uncertainty quantification is established to guide this process by a systematic evaluation of the modeling system at increasing levels of complexity. In terms of atmospheric physics, 'building the bridge' means developing models for the so-called 'terra incognita,' a term used to designate the turbulent scales that transition from mesoscale to microscale. This range of scales within atmospheric research deals with the transition from parameterized to resolved turbulence and the improvement of surface boundary-layer parameterizations. The coupling of meteorological and wind engineering flow models and the definition of a formal model evaluation methodology is a strong area of research for the next generation of wind conditions assessment and wind farm and wind turbine design tools. Some fundamental challenges are identified in order to guide future research in this area.
Vreck, D; Gernaey, K V; Rosen, C; Jeppsson, U
2006-01-01
In this paper, implementation of the Benchmark Simulation Model No 2 (BSM2) within Matlab-Simulink is presented. The BSM2 is developed for plant-wide WWTP control strategy evaluation on a long-term basis. It consists of a pre-treatment process, an activated sludge process, and sludge treatment processes. Extended evaluation criteria are proposed for plant-wide control strategy assessment. Default open-loop and closed-loop strategies are also proposed to be used as references with which to compare other control strategies. Simulations indicate that the BSM2 is an appropriate tool for plant-wide control strategy evaluation.
Towards cleaner combustion engines through groundbreaking detailed chemical kinetic models
Battin-Leclerc, Frédérique; Blurock, Edward; Bounaceur, Roda; Fournet, René; Glaude, Pierre-Alexandre; Herbinet, Olivier; Sirjean, Baptiste; Warth, V.
2013-01-01
In the context of limiting the environmental impact of transportation, this paper reviews new directions which are being followed in the development of more predictive and more accurate detailed chemical kinetic models for the combustion of fuels. In the first part, the performance of current models, especially in terms of the prediction of pollutant formation, is evaluated. In the following parts, recent methods and ways to improve these models are described. Emphasis is given to the development of detailed models based on elementary reactions, to the production of the related thermochemical and kinetic parameters, and to the experimental techniques available to produce the data necessary to evaluate model predictions under well-defined conditions. PMID:21597604
Evaluation of a non-Arrhenius model for therapeutic monoclonal antibody aggregation.
Kayser, Veysel; Chennamsetty, Naresh; Voynov, Vladimir; Helk, Bernhard; Forrer, Kurt; Trout, Bernhardt L
2011-07-01
Understanding antibody aggregation is of great significance for the pharmaceutical industry. We studied the aggregation of five different therapeutic monoclonal antibodies (mAbs) with size-exclusion chromatography-high-performance liquid chromatography (SEC-HPLC), fluorescence spectroscopy, electron microscopy, and light scattering methods at various temperatures, with the aim of gaining insight into the aggregation process and developing models of it. In particular, we find that the kinetics can be described by a second-order model and are non-Arrhenius. Thus, we develop a non-Arrhenius model to connect accelerated aggregation experiments at high temperature to long-term storage experiments at low temperature. We evaluate our model by predicting mAb aggregation and comparing it with long-term behavior. Our results suggest that the number of monomers and mAb conformations within aggregates vary with the size and age of the aggregates, and that only certain sizes of aggregates are populated in the solution. We also propose a kinetic model based on conformational changes of proteins and monomer peak loss kinetics from SEC-HPLC. This model could be employed for a detailed analysis of mAb aggregation kinetics. Copyright © 2011 Wiley-Liss, Inc. and the American Pharmacists Association
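A minimal sketch of the modelling idea, under stated assumptions: monomer loss follows second-order kinetics, and the temperature dependence of the rate constant is captured by an assumed non-Arrhenius form (here a quadratic in 1/T, which may differ from the published parametrization). The rate constants and storage condition are illustrative.

```python
import numpy as np

# Second-order monomer loss: dM/dt = -k * M**2  =>  1/M = 1/M0 + k*t
def k_from_monomer_loss(t, M):
    """Estimate the second-order rate constant from monomer-fraction
    data via a linear fit of 1/M against time."""
    slope, _ = np.polyfit(t, 1.0 / M, 1)
    return slope

# Illustrative accelerated-stability results: k (1/day) at high temperatures
T = np.array([318.0, 323.0, 328.0, 333.0])      # K
k = np.array([2.1e-4, 6.5e-4, 1.9e-3, 5.2e-3])  # fitted rate constants

# Assumed non-Arrhenius form: ln k = a + b/T + c/T**2
# (the published model may use a different expression)
coeffs = np.polyfit(1.0 / T, np.log(k), 2)

# Extrapolate to 5 degC storage and predict monomer fraction after 2 years
T_store = 278.0
k_store = np.exp(np.polyval(coeffs, 1.0 / T_store))
t_days = 730.0
M0 = 1.0
M = 1.0 / (1.0 / M0 + k_store * t_days)
print(f"k(5 degC) = {k_store:.3e} 1/day, monomer after 2 y = {M:.4f}")
```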
Empirically evaluating decision-analytic models.
Goldhaber-Fiebert, Jeremy D; Stout, Natasha K; Goldie, Sue J
2010-08-01
Model-based cost-effectiveness analyses support decision-making. To augment model credibility, evaluation via comparison to independent, empirical studies is recommended. We developed a structured reporting format for model evaluation and conducted a structured literature review to characterize current model evaluation recommendations and practices. As an illustration, we applied the reporting format to evaluate a microsimulation of human papillomavirus and cervical cancer. The model's outputs and uncertainty ranges were compared with multiple outcomes from a study of long-term progression from high-grade precancer (cervical intraepithelial neoplasia [CIN]) to cancer. Outcomes included 5 to 30-year cumulative cancer risk among women with and without appropriate CIN treatment. Consistency was measured by model ranges overlapping study confidence intervals. The structured reporting format included: matching baseline characteristics and follow-up, reporting model and study uncertainty, and stating metrics of consistency for model and study results. Structured searches yielded 2963 articles with 67 meeting inclusion criteria and found variation in how current model evaluations are reported. Evaluation of the cervical cancer microsimulation, reported using the proposed format, showed a modeled cumulative risk of invasive cancer for inadequately treated women of 39.6% (30.9-49.7) at 30 years, compared with the study: 37.5% (28.4-48.3). For appropriately treated women, modeled risks were 1.0% (0.7-1.3) at 30 years, study: 1.5% (0.4-3.3). To support external and projective validity, cost-effectiveness models should be iteratively evaluated as new studies become available, with reporting standardized to facilitate assessment. Such evaluations are particularly relevant for models used to conduct comparative effectiveness analyses.
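A minimal sketch of the consistency metric described above (model ranges overlapping study confidence intervals), using the 30-year cumulative-risk figures quoted in the abstract; the helper name is ours.

```python
def ranges_overlap(model_range, study_ci):
    """Consistency check from the reporting format sketched above:
    does the model's uncertainty range overlap the study's CI?"""
    (m_lo, m_hi), (s_lo, s_hi) = model_range, study_ci
    return m_lo <= s_hi and s_lo <= m_hi

# 30-year cumulative cancer risk (percent), inadequately treated women
print(ranges_overlap((30.9, 49.7), (28.4, 48.3)))  # True -> consistent
# 30-year cumulative cancer risk (percent), appropriately treated women
print(ranges_overlap((0.7, 1.3), (0.4, 3.3)))      # True -> consistent
```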
Health economic evaluation: important principles and methodology.
Rudmik, Luke; Drummond, Michael
2013-06-01
To discuss health economic evaluation and improve the understanding of common methodology. This article discusses the methodology for the following types of economic evaluations: cost-minimization, cost-effectiveness, cost-utility, cost-benefit, and economic modeling. Topics include health-state utility measures, the quality-adjusted life year (QALY), uncertainty analysis, discounting, decision tree analysis, and Markov modeling. Economic evaluation is the comparative analysis of alternative courses of action in terms of both their costs and consequences. With increasing health care expenditure and limited resources, it is important for physicians to consider the economic impact of their interventions. Understanding common methodology involved in health economic evaluation will improve critical appraisal of the literature and optimize future economic evaluations. Copyright © 2012 The American Laryngological, Rhinological and Otological Society, Inc.
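As a small worked example of two of the listed concepts, the sketch below discounts yearly cost and QALY streams and computes an incremental cost-effectiveness ratio (ICER); the 3% discount rate and all numbers are illustrative assumptions, not figures from the article.

```python
def discounted_total(values, rate=0.03):
    """Present value of a yearly stream of costs or QALYs,
    discounted at the given annual rate (3% is a common choice)."""
    return sum(v / (1.0 + rate) ** t for t, v in enumerate(values))

# Illustrative 5-year cost and QALY streams for two interventions
cost_new, qaly_new = [12000, 900, 900, 900, 900], [0.85] * 5
cost_std, qaly_std = [3000, 700, 700, 700, 700], [0.78] * 5

d_cost = discounted_total(cost_new) - discounted_total(cost_std)
d_qaly = discounted_total(qaly_new) - discounted_total(qaly_std)
icer = d_cost / d_qaly  # incremental cost per QALY gained
print(f"ICER = {icer:.0f} per QALY")
```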
Effects of distributed database modeling on evaluation of transaction rollbacks
NASA Technical Reports Server (NTRS)
Mukkamala, Ravi
1991-01-01
Data distribution, degree of data replication, and transaction access patterns are key factors in determining the performance of distributed database systems. In order to simplify the evaluation of performance measures, database designers and researchers tend to make simplistic assumptions about the system. Here, researchers investigate the effect of modeling assumptions on the evaluation of one such measure, the number of transaction rollbacks in a partitioned distributed database system. The researchers developed six probabilistic models and expressions for the number of rollbacks under each of these models. Essentially, the models differ in terms of the available system information. The analytical results obtained are compared to results from simulation. It was concluded that most of the probabilistic models yield overly conservative estimates of the number of rollbacks. The effect of transaction commutativity on system throughput is also grossly undermined when such models are employed.
Effects of Inertial and Geometric Nonlinearities in the Simulation of Flexible Aircraft Dynamics
NASA Astrophysics Data System (ADS)
Bun Tse, Bosco Chun
This thesis examines the relative importance of the inertial and geometric nonlinearities in modelling the dynamics of a flexible aircraft. Inertial nonlinearities are derived by employing an exact definition of the velocity distribution and lead to coupling between the rigid body and elastic motions. The geometric nonlinearities are obtained by applying the nonlinear theory of elasticity to the deformations. Peters' finite state unsteady aerodynamic model is used to evaluate the aerodynamic forces. Three approximate models, obtained by excluding certain combinations of nonlinear terms, are compared with the complete dynamics equations to obtain an indication of which terms are required for an accurate representation of the flexible aircraft behavior. A generic business jet model is used for the analysis. The results indicate that the nonlinear terms have a significant effect for more flexible aircraft, especially the geometric nonlinearities, which lead to increased damping in the dynamics.
NASA Astrophysics Data System (ADS)
Sivavaraprasad, G.; Venkata Ratnam, D.
2017-07-01
Ionospheric delay is one of the major atmospheric effects on the performance of satellite-based radio navigation systems. It limits the accuracy and availability of Global Positioning System (GPS) measurements, related to critical societal and safety applications. The temporal and spatial gradients of ionospheric total electron content (TEC) are driven by several a priori unknown geophysical conditions and solar-terrestrial phenomena. Hence, the prediction of ionospheric delay is challenging, especially over the Indian subcontinent, and an appropriate short/long-term ionospheric delay forecasting model is necessary. The intent of this paper is to forecast ionospheric delays by considering day-to-day, monthly and seasonal ionospheric TEC variations. GPS-TEC data (January 2013-December 2013) are extracted from a multi-frequency GPS receiver established at K L University, Vaddeswaram, Guntur station (geographic: 16.37°N, 80.37°E; geomagnetic: 7.44°N, 153.75°E), India. An evaluation, in terms of forecasting capabilities, of three ionospheric time delay models - an Auto Regressive Moving Average (ARMA) model, an Auto Regressive Integrated Moving Average (ARIMA) model, and a Holt-Winters model - is presented. The performances of these models are evaluated through error measurement analysis during both geomagnetically quiet and disturbed days. It is found that the ARMA model forecasts the ionospheric delay with an accuracy of 82-94%, about 10% better than the ARIMA and Holt-Winters models. Moreover, modeled VTEC derived from the International Reference Ionosphere (IRI-2012) model and the new global TEC model, the Neustrelitz TEC Model (NTCM-GL), was compared with the forecasted VTEC values of the ARMA, ARIMA and Holt-Winters models during geomagnetically quiet days. The forecast results indicate that the ARMA model would be useful for setting up an early warning system for ionospheric disturbances at low-latitude regions.
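For readers unfamiliar with the workflow, a minimal ARMA forecasting sketch using statsmodels is shown below; an ARMA(p, q) model is ARIMA(p, 0, q). The synthetic VTEC series and the model order (2, 0, 1) are assumptions for illustration, not the data or order used in the study.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Illustrative daily VTEC series (TECU); real data would come from a
# GPS receiver as in the study.
rng = np.random.default_rng(1)
vtec = pd.Series(30 + 5 * np.sin(np.arange(365) * 2 * np.pi / 27)
                 + rng.normal(0, 1.5, 365))

# ARMA(p, q) is ARIMA(p, 0, q); order (2, 0, 1) is an assumed choice,
# not the order identified in the paper.
model = ARIMA(vtec, order=(2, 0, 1)).fit()
forecast = model.forecast(steps=24)   # next 24 samples
print(forecast.head())
```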
Marshall, Emily Gard; Boudreau, Michelle Anne; Jensen, Jan L; Edgecombe, Nancy; Clarke, Barry; Burge, Frederick; Archibald, Greg; Taylor, Anthony; Andrew, Melissa K
2013-11-29
Prior to the implementation of a new model of care in long-term care facilities in the Capital District Health Authority, Halifax, Nova Scotia, residents entering long-term care were responsible for finding their own family physician. As a result, care was provided by many family physicians responsible for a few residents, leading to care coordination and continuity challenges. In 2009, Capital District Health Authority (CDHA) implemented a new model of long-term care called "Care by Design" which includes: a dedicated family physician per floor, 24/7 on-call physician coverage, implementation of a standardized geriatric assessment tool, and an interdisciplinary team approach to care. In addition, a new Emergency Health Services program was implemented shortly after, in which specially trained paramedics dedicated to long-term care responses are able to address urgent care needs. These changes were implemented to improve primary and emergency care for vulnerable residents. Here we describe a comprehensive mixed methods research study designed to assess the impact of these programs on care delivery and resident outcomes. The results of this research will be important to guide primary care policy for long-term care. We aim to evaluate the impact of introducing a new model of a dedicated primary care physician and team approach to long-term care facilities in the CDHA using a mixed methods approach. As a mixed methods study, the quantitative and qualitative data findings will inform each other. Quantitatively, we will measure a number of indicators of care in CDHA long-term care facilities pre- and post-implementation of the new model. In the qualitative phase of the study we will explore the experience under the new model from the perspectives of stakeholders, including family doctors, nurses, administration and staff, as well as residents and family members. The proposed mixed methods study seeks to evaluate and make policy recommendations related to primary care in long-term care facilities with a focus on end-of-life care and dementia. This is a mixed methods study with concurrent quantitative and qualitative phases. In the quantitative phase, a retrospective time series study is being conducted. Planned analyses will measure indicators of clinical, system, and health outcomes across three time periods and assess the effect of Care by Design as a whole and its component parts. The qualitative methods explore the experiences of stakeholders (i.e., physicians, nurses, paramedics, care assistants, administrators, residents, and family members) through focus groups and in-depth individual interviews. Data collection will be completed in fall 2013. This study will generate a considerable amount of outcome data with applications for care providers and health care systems, and applications for program evaluation and quality improvement. Using the mixed methods design, this study will provide important results for stakeholders, as well as other health systems considering similar programs. In addition, this study will advance methods used to research new multifaceted interdisciplinary health delivery models using multiple and varied data sources, and contribute to the discussion on evidence-based health policy and program development.
Singh, Jai
2013-01-01
The objective of this study was a thorough reconsideration, within the framework of Newtonian mechanics and work-energy relationships, of the empirically interpreted relationships employed within the CRASH3 damage analysis algorithm regarding linearity between barrier equivalent velocity (BEV) or peak collision force magnitude and residual damage depth. The CRASH3 damage analysis algorithm was considered, first in terms of collisions that produce no residual damage, in order to properly explain the damage onset speed and crush resistance terms. Under the modeling constraints of the collision partners representing a closed system and the a priori assumption of linearity between BEV or peak collision force magnitude and residual damage depth, the equations for the sole realistic model were derived. Evaluation of the work-energy relationships for collisions at or below the elastic limit revealed that the BEV and peak collision force magnitude relationships are bifurcated based upon the residual damage depth. Rather than being additive terms from the linear curve fits employed in the CRASH3 damage analysis algorithm, the Campbell b0 and CRASH3 A·L terms represent the maximum values that can be ascribed to the BEV and peak collision force magnitude, respectively, for collisions that produce zero residual damage. Collisions resulting in non-zero residual damage depth already account for the surpassing of the elastic limit during closure, and the secondary addition of the elastic-limit terms therefore represents a double accounting of the same quantity. This evaluation shows that the current energy-absorbed formulation utilized in the CRASH3 damage analysis algorithm extraneously includes terms associated with the A and G stiffness coefficients. This sole realistic model, however, is limited, secondary to reducing the coefficient of restitution to a constant value for all cases in which the residual damage depth is nonzero. Linearity between BEV or peak collision force magnitude and residual damage depth may be applicable for particular ranges of residual damage depth for any given region of any given vehicle. Within the modeling construct employed by the CRASH3 damage algorithm, the case of uniform and ubiquitous linearity cannot be supported. Considerations regarding the inclusion of internal work recovered and restitution for modeling the separation-phase change in velocity magnitude should account not only for the effects present during the evaluation of a vehicle-to-vehicle collision of interest but also for the approach taken in modeling the force-deflection response of each collision partner.
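For reference, the sketch below implements the standard CRASH3 energy-absorbed formulation that the abstract critiques, with per-unit-width energy density A·C + B·C²/2 + G and G = A²/(2B), integrated across the crush profile; BEV then follows from E = ½·m·BEV². The stiffness coefficients and crush measurements are illustrative assumptions.

```python
import numpy as np

def crash3_energy(A, B, crush, width):
    """Absorbed energy from the standard CRASH3 formulation:
    per-unit-width energy density A*C + B*C**2/2 + G, with
    G = A**2 / (2*B), integrated over the damage width
    (trapezoidal rule across the crush profile)."""
    G = A ** 2 / (2.0 * B)
    density = A * crush + 0.5 * B * crush ** 2 + G
    return np.trapz(density, dx=width / (len(crush) - 1))

# Illustrative stiffness coefficients and a 6-point crush profile
A, B = 40000.0, 1.2e6        # N/m and N/m^2 (assumed values)
crush = np.array([0.10, 0.18, 0.25, 0.24, 0.16, 0.08])  # m
E = crash3_energy(A, B, crush, width=1.5)

mass = 1500.0                # kg
bev = np.sqrt(2.0 * E / mass)
print(f"E = {E:.0f} J, BEV = {bev:.2f} m/s")
```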
Moore, Lynne; Lavoie, André; Bourgeois, Gilles; Lapointe, Jean
2015-06-01
According to Donabedian's health care quality model, improvements in the structure of care should lead to improvements in clinical processes that should in turn improve patient outcome. This model has been widely adopted by the trauma community but has not yet been validated in a trauma system. The objective of this study was to assess the performance of an integrated trauma system in terms of structure, process, and outcome and evaluate the correlation between quality domains. Quality of care was evaluated for patients treated in a Canadian provincial trauma system (2005-2010; 57 centers, n = 63,971) using quality indicators (QIs) developed and validated previously. Structural performance was measured by transposing on-site accreditation visit reports onto an evaluation grid according to American College of Surgeons criteria. The composite process QI was calculated as the average sum of proportions of conformity to 15 process QIs derived from literature review and expert opinion. Outcome performance was measured using risk-adjusted rates of mortality, complications, and readmission as well as hospital length of stay (LOS). Correlation was assessed with Pearson's correlation coefficients. Statistically significant correlations were observed between structure and process QIs (r = 0.33), and process and outcome QIs (r = -0.33 for readmission, r = -0.27 for LOS). Significant positive correlations were also observed between outcome QIs (r = 0.37 for mortality-readmission; r = 0.39 for mortality-LOS and readmission-LOS; r = 0.45 for mortality-complications; r = 0.34 for readmission-complications; 0.63 for complications-LOS). Significant correlations between quality domains observed in this study suggest that Donabedian's structure-process-outcome model is a valid model for evaluating trauma care. Trauma centers that perform well in terms of structure also tend to perform well in terms of clinical processes, which in turn has a favorable influence on patient outcomes. Prognostic study, level III.
Metz, Thomas; Walewski, Joachim; Kaminski, Clemens F
2003-03-20
Evaluation schemes, e.g., least-squares fitting, are not generally applicable to all types of experiments. If the evaluation scheme is not derived from a measurement model that properly describes the experiment to be evaluated, poorer precision or accuracy than attainable from the measured data can result. We outline ways in which statistical data evaluation schemes should be derived for all types of experiment, and we demonstrate them for laser-spectroscopic experiments, in which pulse-to-pulse fluctuations of the laser power cause correlated variations of laser intensity and generated signal intensity. The method of maximum likelihood is demonstrated in the derivation of an appropriate fitting scheme for this type of experiment. Statistical data evaluation contains the following steps. First, one has to provide a measurement model that considers statistical variation of all enclosed variables. Second, an evaluation scheme applicable to this particular model has to be derived or provided. Third, the scheme has to be characterized in terms of accuracy and precision. A criterion for accepting an evaluation scheme is that it have accuracy and precision as close as possible to the theoretical limit. The fitting scheme derived for experiments with pulsed lasers is compared to well-established schemes in terms of fitting power and rational functions. The precision is found to be as much as three times better than for simple least-squares fitting. Our scheme also suppresses the bias on the estimated model parameters that other methods may exhibit if they are applied in an uncritical fashion. We focus on experiments in nonlinear spectroscopy, but the fitting scheme derived is applicable in many scientific disciplines.
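The sketch below illustrates the general recipe (measurement model first, then a maximum-likelihood scheme derived from it) on a deliberately simplified model of our own: signal proportional to pulse energy, with noise whose standard deviation scales with pulse energy. It is not the paper's scheme, only an example of deriving a fit from a stated model.

```python
import numpy as np
from scipy.optimize import minimize

# Assumed, simplified measurement model: each laser shot i has pulse
# energy P_i; the signal is S_i = a * P_i + noise, with noise std
# proportional to P_i (pulse-to-pulse correlated fluctuations).
rng = np.random.default_rng(2)
P = rng.normal(1.0, 0.2, 200)              # fluctuating pulse energies
a_true, sigma_rel = 3.0, 0.05
S = a_true * P + rng.normal(0, sigma_rel * P)

def neg_log_likelihood(theta):
    """Gaussian NLL with shot-dependent variance (sigma_i = s * P_i)."""
    a, s = theta
    var = (s * P) ** 2
    return 0.5 * np.sum((S - a * P) ** 2 / var + np.log(var))

res = minimize(neg_log_likelihood, x0=[1.0, 0.1], method="Nelder-Mead")
print(res.x)  # expected near [3.0, 0.05]
```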
An Expert System for the Evaluation of Cost Models
1990-09-01
Surviving fragments of the system's help screens define key regression diagnostics: non-constant error variance stands in contrast to the condition of equal error variance, called homoscedasticity, and error terms correlated over time are said to be autocorrelated or serially correlated (Reference: Applied Linear Regression Models by John Neter).
Long-Term Durability Analysis of a 100,000+ Hr Stirling Power Convertor Heater Head
NASA Technical Reports Server (NTRS)
Bartolotta, Paul A.; Bowman, Randy R.; Krause, David L.; Halford, Gary R.
2000-01-01
DOE and NASA have identified Stirling Radioisotope Power Systems (SRPS) as the power supply for the deep space exploration missions Europa Orbiter and Solar Probe. As a part of this effort, NASA has initiated a long-term durability project for critical hot-section components of the Stirling power convertor to qualify flight hardware. This project will develop a life prediction methodology that utilizes short-term (t < 20,000 hr) test data to verify long-term (t > 100,000 hr) design life. The project consists of generating a materials database for the specific alloy, evaluation of critical hermetically sealed joints, life model characterization, and model verification. This paper will describe the qualification methodology being developed and provide a status for this effort.
NASA Astrophysics Data System (ADS)
Leong, W. K.; Lai, S. H.
2017-06-01
Due to the effects of climate change and the increasing demand for water, sustainable development in terms of water resources management has become a major challenge. In this context, the application of simulation models is useful to deal with the uncertainty and complexity of water systems by providing stakeholders with the best solution. This paper outlines an integrated management planning network developed based on the Water Evaluation and Planning (WEAP) system to evaluate the current and future water management system of the Langat River Basin, Malaysia under various scenarios. The WEAP model is an integrated decision support system used to investigate major stresses on demand and supply in terms of water availability at the catchment scale. In fact, WEAP is applicable to simulating complex systems including various sectors within a single catchment or a transboundary river system. To construct the model, taking account of the Langat catchment and the corresponding demand points, we divided the hydrological model into 10 sub-catchments and 17 demand points, including the export of treated water to major cities outside the catchment. The model is calibrated and verified by several quantitative statistics (coefficient of determination, R2; Nash-Sutcliffe efficiency, NSE; and percent bias, PBIAS). The trend of supply and demand in the catchment is evaluated under three scenarios to 2050: (1) population growth, (2) demand-side management (DSM), and (3) a combination of DSM and reduction of non-revenue water (NRW). Results show that reducing NRW together with proper DSM can significantly reduce unmet demand.
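The calibration statistics named above are simple to compute; a minimal sketch follows, using common conventions for NSE and PBIAS (sign conventions for PBIAS vary between authors). The streamflow values are invented for illustration.

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 indicates a perfect fit."""
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def pbias(obs, sim):
    """Percent bias, here as 100 * sum(obs - sim) / sum(obs);
    positive values indicate underestimation under this convention."""
    return 100.0 * np.sum(obs - sim) / np.sum(obs)

def r_squared(obs, sim):
    """Coefficient of determination (squared Pearson correlation)."""
    return np.corrcoef(obs, sim)[0, 1] ** 2

# Illustrative monthly streamflow (m^3/s): observed vs. simulated
obs = np.array([12.0, 18.5, 25.1, 20.3, 15.2, 10.8])
sim = np.array([11.2, 19.0, 23.8, 21.5, 14.6, 11.5])
print(nse(obs, sim), pbias(obs, sim), r_squared(obs, sim))
```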
Whyte, Sophie; Harnan, Susan
2014-06-01
A campaign to increase the awareness of the signs and symptoms of colorectal cancer (CRC) and encourage self-presentation to a GP was piloted in two regions of England in 2011. Short-term data from the pilot evaluation on campaign cost and changes in GP attendances/referrals, CRC incidence, and CRC screening uptake were available. The objective was to estimate the effectiveness and cost-effectiveness of a CRC awareness campaign by using a mathematical model which extrapolates short-term outcomes to predict long-term impacts on cancer mortality, quality-adjusted life-years (QALYs), and costs. A mathematical model representing England (aged 30+) for a lifetime horizon was developed. Long-term changes to cancer incidence, cancer stage distribution, cancer mortality, and QALYs were estimated. Costs were estimated incorporating costs associated with delivering the campaign, additional GP attendances, and changes in CRC treatment. Data from the pilot campaign suggested that the awareness campaign caused a 1-month 10 % increase in presentation rates. Based on this, the model predicted the campaign to cost £5.5 million, prevent 66 CRC deaths and gain 404 QALYs. The incremental cost-effectiveness ratio compared to "no campaign" was £13,496 per QALY. Results were sensitive to the magnitude and duration of the increase in presentation rates and to disease stage. The effectiveness and cost-effectiveness of a cancer awareness campaign can be estimated based on short-term data. Such predictions will aid policy makers in prioritizing between cancer control strategies. Future cost-effectiveness studies would benefit from campaign evaluations reporting as follows: data completeness, duration of impact, impact on emergency presentations, and comparison with non-intervention regions.
Carney, John P; Zhang, Lindsey M; Larson, Jeffrey J; Lahti, Matthew T; Robinson, Nicholas A; Dalmasso, Agustin P; Bianco, Richard W
2017-07-01
Xenograft conduits have been used successfully to repair congenital heart defects, but are prone to failure over time. Hence, in order to improve patient outcomes, better xenografts are being developed. When evaluating a conduit's performance and safety it must first be compared against a clinically available control in a large animal model. The study aim was to evaluate a clinically available xenograft conduit used in right ventricular outflow tract (RVOT) reconstruction in a sheep model. RVOT reconstruction was performed in 13 adult and juvenile sheep, using the Medtronic Hancock® Bioprosthetic Valved Conduit (Hancock conduit). The method had previously been used on patients, and a newly modified variant termed 'RVOT Extraction' was employed to facilitate the surgical procedure. Animals were monitored over predetermined terms of 70 to 140 days. Serial transthoracic echocardiography, intracardiac pressure measurements and angiography were performed. On study completion the animals were euthanized and necropsies performed. Two animals died prior to their designated study term due to severe valvular stenosis and distal conduit narrowing, respectively. Thus, 11 animals survived the study term, with few or no complications. Generally, maximal and mean transvalvular pressure gradients across the implanted conduits were increased throughout the postoperative course. Among 11 full-term animals, seven conduits were patent with mild or no pseudointimal proliferation and with flexible leaflets maintaining the hemodynamic integrity of the valve. RVOT reconstruction using the Hancock conduit was shown to be successful in sheep, with durable and efficient performances. With its extensive clinical use in patients, and ability for long-term use in sheep (as described in the present study) it can be concluded that the Hancock conduit is an excellent control device for the evaluation of new xenografts in future preclinical studies.
Self-Directed Learning in the Process of Work: Conceptual Considerations--Empirical Evidences.
ERIC Educational Resources Information Center
Straka, Gerald A.; Schaefer, Cornelia
With reference to the literature on adult self-directed learning, a model termed the "Two-Shell Model of Motivated Self-Directed Learning" was formulated that differentiates sociohistorical environmental conditions, internal conditions, and activities related to four concepts (interest, learning strategies, control, and evaluation). The…
Schmidt, Wiebke; Evers-King, Hayley L.; Campos, Carlos J. A.; Jones, Darren B.; Miller, Peter I.; Davidson, Keith; Shutler, Jamie D.
2018-01-01
Microbiological contamination or elevated marine biotoxin concentrations within shellfish can result in temporary closure of shellfish aquaculture harvesting, leading to financial loss for the aquaculture business and a potential reduction in consumer confidence in shellfish products. We present a method for predicting short-term variations in shellfish concentrations of Escherichia coli and biotoxin (okadaic acid and its derivatives dinophysistoxins and pectenotoxins). The approach was evaluated for 2 contrasting shellfish harvesting areas. Through a meta-data analysis and using environmental data (in situ, satellite observations and meteorological nowcasts and forecasts), key environmental drivers were identified and used to develop models to predict E. coli and biotoxin concentrations within shellfish. Models were trained and evaluated using independent datasets, and the best models were identified as those exhibiting the lowest root mean square error. The best biotoxin model was able to provide 1 wk forecasts with an accuracy of 86%, a 0% false positive rate and a 0% false discovery rate (n = 78 observations) when used to predict the closure of shellfish beds due to biotoxin. The best E. coli models were used to predict the European hygiene classification of the shellfish beds to an accuracy of 99% (n = 107 observations) and 98% (n = 63 observations) for a bay (St Austell Bay) and an estuary (Turnaware Bar), respectively. This generic approach enables high-accuracy short-term farm-specific forecasts, based on readily accessible environmental data and observations. PMID:29805719
Application of Support Vector Machine to Forex Monitoring
NASA Astrophysics Data System (ADS)
Kamruzzaman, Joarder; Sarker, Ruhul A.
Previous studies have demonstrated superior performance of artificial neural network (ANN) based forex forecasting models over traditional regression models. This paper applies support vector machines to build a forecasting model from the historical data using six simple technical indicators and presents a comparison with an ANN-based model trained by the scaled conjugate gradient (SCG) learning algorithm. The models are evaluated and compared on the basis of five commonly used performance metrics that measure closeness of prediction as well as correctness in directional change. Forecasting results for six different currencies against the Australian dollar reveal superior performance of the SVM model using a simple linear kernel over the ANN-SCG model in terms of all the evaluation metrics. The effect of SVM parameter selection on prediction performance is also investigated and analyzed.
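A minimal sketch of this kind of SVM forecasting setup, using scikit-learn's SVR with a linear kernel, is given below. The six "indicators" are random stand-ins for the technical indicators, and the two reported metrics are simplified versions of the five used in the paper.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics import mean_squared_error

# Illustrative setup: predict next-period exchange rate from six
# technical indicators (random stand-ins for MA, momentum, etc.)
rng = np.random.default_rng(3)
X = rng.standard_normal((300, 6))          # six indicators per period
y = X @ rng.standard_normal(6) + 0.1 * rng.standard_normal(300)

X_train, X_test = X[:250], X[250:]
y_train, y_test = y[:250], y[250:]

model = SVR(kernel="linear", C=1.0, epsilon=0.01)
model.fit(X_train, y_train)
pred = model.predict(X_test)

nmse = mean_squared_error(y_test, pred) / np.var(y_test)
direction = np.mean(np.sign(np.diff(pred)) == np.sign(np.diff(y_test)))
print(f"NMSE = {nmse:.3f}, directional accuracy = {direction:.2f}")
```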
A Reflective Learning Framework to Evaluate CME Effects on Practice Reflection
ERIC Educational Resources Information Center
Leung, Kit H.; Pluye, Pierre; Grad, Roland; Weston, Cynthia
2010-01-01
Introduction: The importance of reflective practice is recognized by the adoption of a reflective learning model in continuing medical education (CME), but little is known about how to evaluate reflective learning in CME. Reflective learning seldom is defined in terms of specific cognitive processes or observable performances. Competency-based…
FEES: design of a Fire Economics Evaluation System
Thomas J. Mills; Frederick W. Bratten
1982-01-01
The Fire Economics Evaluation System (FEES)--a simulation model--is being designed for long-term planning application by all public agencies with wildland fire management responsibilities. A fully operational version of FEES will be capable of estimating the economic efficiency, fire-induced changes in resource outputs, and risk characteristics of a range of fire...
ERIC Educational Resources Information Center
Pittenger, Amy L.
2011-01-01
The purpose of this study was to evaluate the feasibility and effectiveness of implementing interprofessional education to students from six health professional programs through use of an online social networking platform. Specifically, three pedagogical models (Minimally Structured, Facilitated, Highly Structured) were evaluated for impact on…
ERIC Educational Resources Information Center
Stirling, Keith
2000-01-01
Describes a session on information retrieval systems that planned to discuss relevance measures with Web-based information retrieval; retrieval system performance and evaluation; probabilistic independence of index terms; vector-based models; metalanguages and digital objects; how users assess the reliability, timeliness and bias of information;…
ExxonMobil's Social Responsibility Messaging--2002-2013 CEO Letters
ERIC Educational Resources Information Center
Grantham, Susan; Vieira, Edward T., Jr.
2018-01-01
The purpose of this study was to evaluate ExxonMobil's social responsibility/social responsiveness (SR) communication to determine the company's social responsibility messaging in terms of the social responsibility themes of the Triple Bottom Line model (profit, people, planet) and to evaluate whether the messaging changed in response to external…
DOT National Transportation Integrated Search
2010-09-01
This project focused on the evaluation of traffic sign sheeting performance in terms of meeting nighttime driver needs. The goal was to develop a nighttime driver needs specification for traffic signs. The researchers used nighttime sign legi...
Modeling the turbulent kinetic energy equation for compressible, homogeneous turbulence
NASA Technical Reports Server (NTRS)
Aupoix, B.; Blaisdell, G. A.; Reynolds, William C.; Zeman, Otto
1990-01-01
The turbulent kinetic energy transport equation, which is the basis of turbulence models, is investigated for homogeneous, compressible turbulence using direct numerical simulations performed at CTR. It is shown that the partition between dilatational and solenoidal modes is very sensitive to initial conditions for isotropic decaying turbulence but not for sheared flows. The importance of the dilatational dissipation and of the pressure-dilatation term is evidenced by the simulations, and a transport equation is proposed to describe the evolution of the pressure-dilatation term. This transport equation seems to work well for sheared flows but does not account for initial-condition sensitivity in isotropic decay. An improved model is proposed.
Use of the Box and Jenkins time series technique in traffic forecasting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nihan, N.L.; Holmesland, K.O.
The use of recently developed time series techniques for short-term traffic volume forecasting is examined. A data set containing monthly volumes on a freeway segment for 1968-76 is used to fit a time series model. The resultant model is used to forecast volumes for 1977. The forecast volumes are then compared with actual volumes in 1977. Time series techniques can be used to develop highly accurate and inexpensive short-term forecasts. The feasibility of using these models to evaluate the effects of policy changes or other outside impacts is considered. (1 diagram, 1 map, 14 references, 2 tables)
A proposed model for economic evaluations of major depressive disorder.
Haji Ali Afzali, Hossein; Karnon, Jonathan; Gray, Jodi
2012-08-01
In countries like the UK and Australia, the comparability of model-based analyses is an essential aspect of reimbursement decisions for new pharmaceuticals, medical services and technologies. Within disease areas, the use of models with alternative structures, types of modelling techniques and/or data sources for common parameters reduces the comparability of evaluations of alternative technologies for the same condition. The aim of this paper is to propose a decision-analytic model to evaluate long-term costs and benefits of alternative management options in patients with depression. The structure of the proposed model is based on the natural history of depression and includes clinical events that are important from both clinical and economic perspectives. Considering its greater flexibility with respect to handling time, discrete event simulation (DES) is an appropriate simulation platform for modelling studies of depression. We argue that the proposed model can be used as a reference model in model-based studies of depression, improving the quality and comparability of studies.
NASA Astrophysics Data System (ADS)
Sen, O.; Gaul, N. J.; Davis, S.; Choi, K. K.; Jacobs, G.; Udaykumar, H. S.
2018-02-01
Macroscale models of shock-particle interactions require closure terms for unresolved solid-fluid momentum and energy transfer. These comprise the effects of mean as well as fluctuating fluid-phase velocity fields in the particle cloud. Mean drag and Reynolds stress equivalent terms (also known as pseudo-turbulent terms) appear in the macroscale equations. Closure laws for the pseudo-turbulent terms are constructed in this work from ensembles of high-fidelity mesoscale simulations. The computations are performed over a wide range of Mach numbers (M) and particle volume fractions (φ ) and are used to explicitly compute the pseudo-turbulent stresses from the Favre average of the velocity fluctuations in the flow field. The computed stresses are then used as inputs to a Modified Bayesian Kriging method to generate surrogate models. The surrogates can be used as closure models for the pseudo-turbulent terms in macroscale computations of shock-particle interactions. It is found that the kinetic energy associated with the velocity fluctuations is comparable to that of the mean flow—especially for increasing M and φ . This work is a first attempt to quantify and evaluate the effect of velocity fluctuations for problems of shock-particle interactions.
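As an illustration of the surrogate-modelling step, the sketch below fits a Gaussian-process regressor to (M, φ) training points as a stand-in for the Modified Bayesian Kriging method named in the abstract, and queries it with predictive uncertainty. The training values are invented, not the mesoscale-simulation data.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Training inputs: (Mach number, particle volume fraction) pairs from
# mesoscale runs; output: one pseudo-turbulent stress component
# (illustrative values only).
X = np.array([[1.2, 0.05], [1.2, 0.20], [2.0, 0.05],
              [2.0, 0.20], [3.0, 0.10], [3.0, 0.30]])
y = np.array([0.02, 0.11, 0.05, 0.19, 0.12, 0.35])

kernel = ConstantKernel(1.0) * RBF(length_scale=[1.0, 0.1])
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

# Query the surrogate (closure model) at a new (M, phi) point,
# with predictive uncertainty from the posterior.
mean, std = gp.predict(np.array([[2.5, 0.15]]), return_std=True)
print(mean, std)
```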
Impact of noise and air pollution on pregnancy outcomes.
Gehring, Ulrike; Tamburic, Lillian; Sbihi, Hind; Davies, Hugh W; Brauer, Michael
2014-05-01
Motorized traffic is an important source of both air pollution and community noise. While there is growing evidence for an adverse effect of ambient air pollution on reproductive health, little is known about the association between traffic noise and pregnancy outcomes. We evaluated the impact of residential noise exposure on small for gestational age, preterm birth, term birth weight, and low birth weight at term in a population-based cohort study, for which we previously reported associations between air pollution and pregnancy outcomes. We also evaluated potential confounding of air pollution effects by noise and vice versa. Linked administrative health data sets were used to identify 68,238 singleton births (1999-2002) in Vancouver, British Columbia, Canada, with complete covariate data (sex, ethnicity, parity, birth month and year, income, and education) and maternal residential history. We estimated exposure to noise with a deterministic model (CadnaA) and exposure to air pollution using temporally adjusted land-use regression models and inverse distance weighting of stationary monitors for the entire pregnancy. Noise exposure was negatively associated with term birth weight (mean difference = -19 [95% confidence interval = -23 to -15] g per 6 dB(A)). In joint air pollution-noise models, associations between noise and term birth weight remained largely unchanged, whereas associations decreased for all air pollutants. Traffic may affect birth weight through exposure to both air pollution and noise.
Zekry, Dina; Herrmann, François R; Graf, Christophe E; Giannelli, Sandra; Michel, Jean-Pierre; Gold, Gabriel; Krause, Karl-Heinz
2011-01-01
The relative weight of various etiologies of dementia as predictors of long-term mortality after other risk factors have been taken into account remains unclear. We investigated the 5-year mortality risk associated with dementia in elderly people after discharge from acute care, taking into account comorbid conditions and functionality. A prospective cohort study of 444 patients (mean age: 85 years; 74% female) discharged from the acute geriatric unit of Geneva University Hospitals was conducted. On admission, each subject underwent a standardized diagnostic evaluation covering demographic variables, cognition, comorbid medical conditions and functional status. Patients were followed yearly by the same team. Predictors of survival at 5 years were evaluated by Cox proportional hazards models. The univariate model showed that being older and male, and having vascular and severe dementia, comorbidity and functional disability, were predictive of shorter survival. However, in the full multivariate model adjusted for age and sex, the effect of dementia type or severity completely disappeared when all the variables were added. In multivariate analysis, the best predictor was a higher comorbidity score, followed by functional status (R² = 23%). The identification of comorbidity and functional impairment effects as predictive factors for long-term mortality independent of cognitive status may increase the accuracy of long-term discharge planning.
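For readers unfamiliar with the method, here is a sketch of the kind of multivariate Cox model described above, using the lifelines package; the toy records and column names are fabricated placeholders, not study data:

```python
import pandas as pd
from lifelines import CoxPHFitter

# Fabricated toy records mirroring the reported predictors; not study data.
df = pd.DataFrame({
    "years":       [1.2, 5.0, 3.4, 0.8, 5.0, 2.1, 4.2, 1.7],
    "died":        [1,   0,   1,   1,   0,   1,   0,   1],
    "age":         [86,  82,  90,  88,  80,  85,  83,  91],
    "male":        [1,   0,   0,   1,   0,   1,   1,   0],
    "comorbidity": [6,   2,   5,   7,   1,   4,   2,   6],
    "disability":  [3,   1,   2,   3,   0,   2,   1,   3],
})

# A small penalizer guards against separation in this tiny illustrative sample.
cph = CoxPHFitter(penalizer=0.1).fit(df, duration_col="years", event_col="died")
cph.print_summary()   # hazard ratios and confidence intervals per predictor
```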
Exploring the dynamics of balance data — movement variability in terms of drift and diffusion
NASA Astrophysics Data System (ADS)
Gottschall, Julia; Peinke, Joachim; Lippens, Volker; Nagel, Volker
2009-02-01
We introduce a method to analyze postural control on a balance board by reconstructing the underlying dynamics in terms of a Langevin model. Drift and diffusion coefficients are directly estimated from the data and fitted by a suitable parametrization. The governing parameters are utilized to evaluate balance performance and the impact of supra-postural tasks on it. We show that the proposed method of analysis not only gives self-consistent results but also provides a plausible model for the reconstruction of balance dynamics.
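The estimation step is easy to sketch: the drift and diffusion coefficients follow from the first and second conditional moments of the increments, D1(x) = ⟨Δx|x⟩/Δt and D2(x) = ⟨Δx²|x⟩/(2Δt). The following sketch applies that binning estimator to a synthetic Ornstein-Uhlenbeck series standing in for balance-board data:

```python
import numpy as np

rng = np.random.default_rng(1)
dt, n = 0.01, 100_000
x = np.empty(n)
x[0] = 0.0
for i in range(n - 1):            # synthetic Ornstein-Uhlenbeck: dx = -x dt + 0.5 dW
    x[i + 1] = x[i] - x[i] * dt + 0.5 * np.sqrt(dt) * rng.normal()

dx = np.diff(x)
edges = np.linspace(-1.0, 1.0, 21)
idx = np.digitize(x[:-1], edges)
for b in range(1, len(edges)):
    mask = idx == b
    if mask.sum() < 200:          # skip sparsely populated bins
        continue
    centre = 0.5 * (edges[b - 1] + edges[b])
    d1 = dx[mask].mean() / dt                # drift estimate, expect ~ -x
    d2 = (dx[mask] ** 2).mean() / (2 * dt)   # diffusion estimate, expect ~ 0.125
    print(f"x={centre:+.2f}  D1={d1:+.3f}  D2={d2:.3f}")
```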
NASA Technical Reports Server (NTRS)
Wickens, Christopher; Sebok, Angelia; Keller, John; Peters, Steve; Small, Ronald; Hutchins, Shaun; Algarin, Liana; Gore, Brian Francis; Hooey, Becky Lee; Foyle, David C.
2013-01-01
NextGen operations are associated with a variety of changes to the national airspace system (NAS), including changes to the allocation of roles and responsibilities among operators and automation, the use of new technologies and automation, additional information presented on the flight deck, and the entire concept of operations (ConOps). In the transition to NextGen airspace, aviation and air operations designers need to consider the implications of design or system changes on human performance and the potential for error. To ensure continued safety of the NAS, it will be necessary for researchers to evaluate design concepts and potential NextGen scenarios well before implementation. One approach for such evaluations is through human performance modeling. Human performance models (HPMs) provide effective tools for predicting and evaluating operator performance in systems. HPMs offer significant advantages over empirical, human-in-the-loop testing in that (1) they allow detailed analyses of systems that have not yet been built, (2) they offer great flexibility for extensive data collection, and (3) they do not require experimental participants, and thus can offer cost and time savings. HPMs differ in their ability to predict performance and safety with NextGen procedures, equipment and ConOps. Models also vary in terms of how they approach human performance (e.g., some focus on cognitive processing, others focus on discrete tasks performed by a human, while others consider perceptual processes), and in terms of their associated validation efforts. The objectives of this research effort were to support the Federal Aviation Administration (FAA) in identifying HPMs that are appropriate for predicting pilot performance in NextGen operations, to provide guidance on how to evaluate the quality of different models, and to identify gaps in pilot performance modeling research that could guide future research opportunities. This research effort is intended to help the FAA evaluate pilot modeling efforts and select the appropriate tools for future modeling efforts to predict pilot performance in NextGen operations.
NASA Astrophysics Data System (ADS)
Senkpiel, Charlotte; Biener, Wolfgang; Shammugam, Shivenes; Längle, Sven
2018-02-01
Energy system models serve as a basis for long-term system planning. Joint optimization of electricity generating technologies, storage systems and the electricity grid leads to lower total system cost compared to an approach in which the grid expansion follows a given technology portfolio and their distribution. Modelers often face the problem of finding a good tradeoff between computational time and the level of detail that can be modeled. This paper analyses the differences between a transport model and a DC load flow model to evaluate the validity of using a simple but faster transport model within the system optimization model in terms of system reliability. The main findings in this paper are that a higher regional resolution of a system leads to better results compared to an approach in which regions are clustered, as more overloads can be detected. An aggregation of lines between two model regions, compared to a line-sharp representation, has little influence on grid expansion within a system optimizer. In a DC load flow model, overloads can be detected in the line-sharp case, which is therefore preferred. Overall, the regions that need to reinforce the grid are identified within the system optimizer. Finally, the paper recommends using a load flow model to test the validity of the model results.
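A minimal sketch of the DC load flow check the paper recommends, on a three-bus example: solve the reduced susceptance system B'θ = P for bus angles (slack angle fixed at zero) and flag lines whose flow exceeds its rating. Susceptances, injections and limits are illustrative placeholders:

```python
import numpy as np

# (from_bus, to_bus, reactance x, thermal limit), all in per unit; illustrative.
lines = [(0, 1, 0.1, 1.0), (1, 2, 0.2, 0.8), (0, 2, 0.2, 0.6)]
P = np.array([1.5, -0.7, -0.8])   # net injections; bus 0 is the slack

n = len(P)
B = np.zeros((n, n))
for i, j, x, _ in lines:          # assemble the nodal susceptance matrix
    B[i, i] += 1 / x; B[j, j] += 1 / x
    B[i, j] -= 1 / x; B[j, i] -= 1 / x

theta = np.zeros(n)
theta[1:] = np.linalg.solve(B[1:, 1:], P[1:])   # slack angle fixed at 0

for i, j, x, limit in lines:      # line-sharp flows and overload flags
    flow = (theta[i] - theta[j]) / x
    print(f"line {i}-{j}: flow {flow:+.2f} p.u., overloaded: {abs(flow) > limit}")
```

On these numbers, line 0-2 carries 0.62 p.u. against a 0.6 p.u. rating, the kind of overload an aggregated transport model can miss.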
Airframe noise prediction evaluation
NASA Technical Reports Server (NTRS)
Yamamoto, Kingo J.; Donelson, Michael J.; Huang, Shumei C.; Joshi, Mahendra C.
1995-01-01
The objective of this study is to evaluate the accuracy and adequacy of current airframe noise prediction methods using available airframe noise measurements from tests of a narrow body transport (DC-9) and a wide body transport (DC-10), in addition to scale model test data. General features of the airframe noise from these aircraft and models are outlined. The results of assessing two airframe noise prediction methods, Fink's and Munson's, against flight test data from these aircraft and scale model wind tunnel test data are presented. These methods were extensively evaluated against measured data from several configurations, including clean, slat deployed, landing gear deployed, flap deployed, and landing configurations of both DC-9 and DC-10. They were also assessed against a limited number of configurations of scale models. The evaluation was conducted in terms of overall sound pressure level (OASPL), tone corrected perceived noise level (PNLT), and one-third-octave band sound pressure level (SPL).
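Of the three metrics, OASPL is simple to illustrate: one-third-octave band levels combine on an energy (mean-square pressure) basis. A sketch with made-up band levels, not DC-9/DC-10 data:

```python
import math

# Illustrative one-third-octave band levels in dB (not measured DC-9/DC-10 data).
band_spl_db = [78.0, 81.5, 83.0, 80.0, 76.5, 72.0]

# Bands combine on an energy basis: OASPL = 10 log10(sum 10^(SPL_i/10)).
oaspl = 10.0 * math.log10(sum(10.0 ** (spl / 10.0) for spl in band_spl_db))
print(f"OASPL = {oaspl:.1f} dB")
```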
Louise Loudermilk; Robert Scheller; Peter Weisberg; Jian Yang; Thomas Dilts; Sarah Karam; Carl Skinner
2013-01-01
Understanding how climate change may influence forest carbon (C) budgets requires knowledge of forest growth relationships with regional climate, long-term forest succession, and past and future disturbances, such as wildfires and timber harvesting events. We used a landscape-scale model of forest succession, wildfire, and C dynamics (LANDIS-II) to evaluate the effects...
USDA-ARS?s Scientific Manuscript database
Long-term hydrologic data sets are required to quantify the impacts of management, and climate on runoff at the field scale where management practices are applied. This study was conducted to evaluate the impacts of long-term management and climate on runoff from a small watershed managed with no-ti...
Effects of Short- and Long-Term Changes in Auditory Feedback on Vowel and Sibilant Contrasts
ERIC Educational Resources Information Center
Lane, Harlan; Matthies, Melanie L.; Guenther, Frank H.; Denny, Margaret; Perkell, Joseph S.; Stockmann, Ellen; Tiede, Mark; Vick, Jennell; Zandipour, Majid
2007-01-01
Purpose: To assess the effects of short- and long-term changes in auditory feedback on vowel and sibilant contrasts and to evaluate hypotheses arising from a model of speech motor planning. Method: The perception and production of vowel and sibilant contrasts were measured in 8 postlingually deafened adults prior to activation of their cochlear…
Using logic models in a community-based agricultural injury prevention project.
Helitzer, Deborah; Willging, Cathleen; Hathorn, Gary; Benally, Jeannie
2009-01-01
The National Institute for Occupational Safety and Health has long promoted the logic model as a useful tool in an evaluator's portfolio. Because a logic model supports a systematic approach to designing interventions, it is equally useful for program planners. Undertaken with community stakeholders, a logic model process articulates the underlying foundations of a particular programmatic effort and enhances program design and evaluation. Most often presented as sequenced diagrams or flow charts, logic models demonstrate relationships among the following components: statement of a problem, various causal and mitigating factors related to that problem, available resources to address the problem, theoretical foundations of the selected intervention, intervention goals and planned activities, and anticipated short- and long-term outcomes. This article describes a case example of how a logic model process was used to help community stakeholders on the Navajo Nation conceive, design, implement, and evaluate agricultural injury prevention projects.
Climate change effects on landslides in southern B.C.
NASA Astrophysics Data System (ADS)
Jakob, M.
2009-04-01
Two mechanisms that contribute to the temporal occurrence of landslides in coastal British Columbia are antecedent rainfall and short-term intense rainfall. These two quantities can be extracted from the precipitation regimes simulated by climate models. This makes such models an attractive tool for use in the investigation of the effect of global warming on landslide frequencies. In order to provide some measure of the reliability of models used to address the landslide question, the present-day simulation of the antecedent precipitation and short-term rainfall using the daily data from the Canadian Centre for Climate Modelling and Analysis model (CGCM) is compared to observations along the south coast of British Columbia. This evaluation showed that the model was reasonably successful in simulating statistics of the antecedent rainfall but was less successful in simulating the short-term rainfall. The monthly mean precipitation data from an ensemble of 19 of the world's global climate models were available to study potential changes in landslide frequencies with global warming. Most of the models were used to produce simulations with three scenarios with different levels of prescribed greenhouse gas concentrations during the twenty-first century. The changes in the antecedent precipitation were computed from the resulting monthly and seasonal means. In order to deal with models' suspected difficulties in simulating the short-term precipitation and lack of daily data, a statistical procedure was used to relate the short-term precipitation to the monthly means. The qualitative model results agree reasonably well, and when averaged over all models and the three scenarios, the change in the antecedent precipitation is predicted to be about 10% and the change in the short-term precipitation about 6%. Because the antecedent precipitation and the short-term precipitation contribute to the occurrence of landslides, the results of this study support the prediction of increased landslide frequency along the British Columbia south coast during the twenty-first century.
Stephen P. Prisley; Michael J. Mortimer
2004-01-01
Forest modeling has moved beyond the realm of scientific discovery into the policy arena. The example that motivates this review is the application of models for forest carbon accounting. As negotiations determine the terms under which forest carbon will be accounted, reported, and potentially traded, guidelines and standards are being developed to ensure consistency,...
First-Order Frameworks for Managing Models in Engineering Optimization
NASA Technical Reports Server (NTRS)
Alexandrov, Natalia M.; Lewis, Robert Michael
2000-01-01
Approximation/model management optimization (AMMO) is a rigorous methodology for attaining solutions of high-fidelity optimization problems with minimal expense in high-fidelity function and derivative evaluation. First-order AMMO frameworks allow for a wide variety of models and underlying optimization algorithms. Recent demonstrations with aerodynamic optimization achieved three-fold savings in terms of high-fidelity function and derivative evaluation in the case of variable-resolution models and five-fold savings in the case of variable-fidelity physics models. The savings are problem dependent but certain trends are beginning to emerge. We give an overview of the first-order frameworks, current computational results, and an idea of the scope of the first-order framework applicability.
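A sketch of the first-order consistency idea behind such frameworks, using simple one-dimensional stand-ins for the high- and low-fidelity models: the cheap model is corrected so that its value and gradient match the expensive model at the current iterate, which is the property the surrounding trust-region iteration relies on. The functions below are illustrative, not from the paper:

```python
f_hi = lambda x: x**4 + x          # stand-in "high-fidelity" objective
g_hi = lambda x: 4 * x**3 + 1      # and its derivative
f_lo = lambda x: x**2 + x          # stand-in cheap model
g_lo = lambda x: 2 * x + 1

def corrected_lo(x, x0):
    """Additive correction: matches f_hi and g_hi at x0 (first-order consistency)."""
    a = f_hi(x0) - f_lo(x0)        # zeroth-order offset
    b = g_hi(x0) - g_lo(x0)        # first-order slope correction
    return f_lo(x) + a + b * (x - x0)

x0 = 1.0
for x in (0.8, 1.0, 1.2):          # corrected model vs truth near the iterate
    print(f"x={x:.1f}  corrected={corrected_lo(x, x0):+.3f}  high-fi={f_hi(x):+.3f}")
```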
NASA Technical Reports Server (NTRS)
Gardner, B. R.; Blad, B. L.; Thompson, D. R.; Henderson, K. E.
1985-01-01
Reflectance and agronomic Thematic Mapper (TM) data were analyzed to determine possible data transformations for evaluating several plant parameters of corn. Three transformation forms were used: the ratio of two TM bands, logarithms of two-band ratios, and normalized differences of two bands. Normalized differences and logarithms of two-band ratios responded similarly in the equations for estimating the plant growth parameters evaluated in this study. Two-term equations were required to obtain the maximum predictability of percent ground cover, canopy moisture content, and total wet phytomass. Standard error of estimate values were 15-26 percent lower for two-term estimates of these parameters than for one-term estimates. The terms log(TM4/TM2) and (TM4/TM5) produced the maximum predictability for leaf area and dry green leaf weight, respectively. The middle infrared bands TM5 and TM7 are essential for maximizing predictability for all measured plant parameters except leaf area index. The estimating models were evaluated over bare soil to discriminate between equations which are statistically similar. Qualitative interpretations of the resulting prediction equations are consistent with general agronomic and remote sensing theory.
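The three transformation forms are straightforward to reproduce; a sketch with synthetic reflectances (not the study's TM data) that builds the candidate predictors and fits a two-term equation by least squares:

```python
import numpy as np

rng = np.random.default_rng(0)
tm2, tm4, tm5 = (rng.uniform(0.05, 0.40, 50) for _ in range(3))  # synthetic reflectances
ground_cover = 40 * np.log(tm4 / tm2) + 30 * (tm4 / tm5) + rng.normal(0, 2, 50)

x1 = np.log(tm4 / tm2)                    # log two-band ratio
x2 = tm4 / tm5                            # simple two-band ratio
nd42 = (tm4 - tm2) / (tm4 + tm2)          # normalized difference of two bands

A = np.column_stack([np.ones_like(x1), x1, x2])
coef, *_ = np.linalg.lstsq(A, ground_cover, rcond=None)   # two-term fit
print("intercept, log(TM4/TM2), TM4/TM5:", coef.round(2))
print("corr(log ratio, normalized difference):", np.corrcoef(x1, nd42)[0, 1].round(3))
```

The high correlation between the log ratio and the normalized difference illustrates why the two forms responded similarly in the study's equations.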
NASA Astrophysics Data System (ADS)
Ichii, K.; Kondo, M.; Ueyama, M.; Kato, T.; Ito, A.; Sasai, T.; Sato, H.; Kobayashi, H.; Saigusa, N.
2014-12-01
Long-term records of satellite-based terrestrial vegetation observations are important to evaluate terrestrial carbon cycle models. In this study, we demonstrate how multiple satellite observations can be used for evaluating past changes in gross primary productivity (GPP) and detecting robust anomalies in the terrestrial carbon cycle in Asia through our model-data synthesis analysis, Asia-MIP. We focused on two different temporal coverages: long-term (30 years; 1982-2011) and decadal (10 years; 2001-2011; data intensive period) scales. We used a NOAA/AVHRR NDVI record for long-term analysis and multiple satellite data and products (e.g. Terra-MODIS, SPOT-VEGETATION) as historical satellite data, and multiple terrestrial carbon cycle models (e.g. BEAMS, Biome-BGC, ORCHIDEE, SEIB-DGVM, and VISIT). In the long-term (30-year) trend analysis, satellite-based time-series data showed that approximately 40% of the area has experienced a significant increase in the NDVI, while only a few areas have experienced a significant decreasing trend over the last 30 years. The increases in the NDVI were dominant in the sub-continental regions of Siberia, East Asia, and India. Simulations using the terrestrial biosphere models also showed significant increases in GPP, similar to the results for the NDVI, in boreal and temperate regions. A modeled sensitivity analysis showed that the increases in GPP are explained by increased temperature and precipitation in Siberia. Precipitation, solar radiation, CO2 fertilization and land cover changes are important factors in the tropical regions. However, the relative contributions of each factor to GPP changes are different among the models. Year-to-year variations of terrestrial GPP were overall consistently captured by the satellite data and terrestrial carbon cycle models if the anomalies are large (e.g. 2003 summer GPP anomalies in East Asia and 2002 spring GPP anomalies in mid to high latitudes). The underlying mechanisms can be consistently explained by the models when the anomalies arise in low-temperature regions (e.g. spring in Northern Asia). However, water-driven and radiation-driven GPP anomalies lack a consistent explanation among the models. Therefore, terrestrial carbon cycle models require improved representation of the sensitivity of the carbon cycle to climate anomalies.
Medvigy, David; Moorcroft, Paul R
2012-01-19
Terrestrial biosphere models are important tools for diagnosing both the current state of the terrestrial carbon cycle and forecasting terrestrial ecosystem responses to global change. While there are a number of ongoing assessments of the short-term predictive capabilities of terrestrial biosphere models using flux-tower measurements, to date there have been relatively few assessments of their ability to predict longer term, decadal-scale biomass dynamics. Here, we present the results of a regional-scale evaluation of the Ecosystem Demography version 2 (ED2) structured terrestrial biosphere model, evaluating the model's predictions against forest inventory measurements for the northeast USA and Quebec from 1985 to 1995. Simulations were conducted using a default parametrization, which used parameter values from the literature, and a constrained model parametrization, which had been developed by constraining the model's predictions against 2 years of measurements from a single site, Harvard Forest (42.5° N, 72.1° W). The analysis shows that the constrained model parametrization offered marked improvements over the default model formulation, capturing large-scale variation in patterns of biomass dynamics despite marked differences in climate forcing, land-use history and species composition across the region. These results imply that data-constrained parametrizations of structured biosphere models such as ED2 can be successfully used for regional-scale ecosystem prediction and forecasting. We also assess the model's ability to capture sub-grid scale heterogeneity in the dynamics of biomass growth and mortality of different sizes and types of trees, and then discuss the implications of these analyses for further reducing the remaining biases in the model's predictions.
Li, Lianfa; Laurent, Olivier; Wu, Jun
2016-02-05
Epidemiological studies suggest that air pollution is adversely associated with pregnancy outcomes. Such associations may be modified by spatially-varying factors including socio-demographic characteristics, land-use patterns and unaccounted exposures. Yet, few studies have systematically investigated the impact of these factors on spatial variability of the air pollution's effects. This study aimed to examine spatial variability of the effects of air pollution on term birth weight across Census tracts and the influence of tract-level factors on such variability. We obtained over 900,000 birth records from 2001 to 2008 in Los Angeles County, California, USA. Air pollution exposure was modeled at individual level for nitrogen dioxide (NO2) and nitrogen oxides (NOx) using spatiotemporal models. Two-stage Bayesian hierarchical non-linear models were developed to (1) quantify the associations between air pollution exposure and term birth weight within each tract; and (2) examine the socio-demographic, land-use, and exposure-related factors contributing to the between-tract variability of the associations between air pollution and term birth weight. Higher air pollution exposure was associated with lower term birth weight (average posterior effects: -14.7 (95 % CI: -19.8, -9.7) g per 10 ppb increment in NO2 and -6.9 (95 % CI: -12.9, -0.9) g per 10 ppb increment in NOx). The variation of the association across Census tracts was significantly influenced by the tract-level socio-demographic, exposure-related and land-use factors. Our models captured the complex non-linear relationship between these factors and the associations between air pollution and term birth weight: we observed the thresholds from which the influence of the tract-level factors was markedly exacerbated or attenuated. Exacerbating factors might reflect additional exposure to environmental insults or lower socio-economic status with higher vulnerability, whereas attenuating factors might indicate reduced exposure or higher socioeconomic status with lower vulnerability. Our Bayesian models effectively combined a priori knowledge with training data to infer the posterior association of air pollution with term birth weight and to evaluate the influence of the tract-level factors on spatial variability of such association. This study contributes new findings about non-linear influences of socio-demographic factors, land-use patterns, and unaccounted exposures on spatial variability of the effects of air pollution.
Parvaneh, Khalil; Shariati, Alireza
2017-09-07
In this study, a new modification of the perturbed chain-statistical associating fluid theory (PC-SAFT) has been proposed by incorporating the lattice fluid theory of Guggenheim as an additional term to the original PC-SAFT terms. As the proposed model has one more term than the PC-SAFT, a new mixing rule has been developed especially for the new additional term, while for the conventional terms of the PC-SAFT, the one-fluid mixing rule is used. In order to evaluate the proposed model, the vapor-liquid equilibria were estimated for binary CO2 mixtures with 16 different ionic liquids (ILs) of the 1-alkyl-3-methylimidazolium family with various anions consisting of bis(trifluoromethylsulfonyl) imide, hexafluorophosphate, tetrafluoroborate, and trifluoromethanesulfonate. For a comprehensive comparison, three different modes (different adjustable parameters) of the proposed model were compared with the conventional PC-SAFT. Results indicate that the proposed modification of the PC-SAFT EoS is generally more reliable than the conventional PC-SAFT in all three proposed modes of vapor-liquid equilibria, giving good agreement with literature data.
NASA Astrophysics Data System (ADS)
Bonan, G. B.; Wieder, W. R.
2012-12-01
Decomposition is a large term in the global carbon budget, but models of the earth system that simulate carbon cycle-climate feedbacks are largely untested with respect to litter decomposition. Here, we demonstrate a protocol to document model performance with respect to both long-term (10 year) litter decomposition and steady-state soil carbon stocks. First, we test the soil organic matter parameterization of the Community Land Model version 4 (CLM4), the terrestrial component of the Community Earth System Model, with data from the Long-term Intersite Decomposition Experiment Team (LIDET). The LIDET dataset is a 10-year study of litter decomposition at multiple sites across North America and Central America. We show results for 10-year litter decomposition simulations compared with LIDET for 9 litter types and 20 sites in tundra, grassland, and boreal, conifer, deciduous, and tropical forest biomes. We show additional simulations with DAYCENT, a version of the CENTURY model, to ask how well an established ecosystem model matches the observations. The results reveal a large discrepancy between the laboratory microcosm studies used to parameterize the CLM4 litter decomposition and the LIDET field study. Simulated carbon loss is more rapid than the observations across all sites, despite using the LIDET-provided climatic decomposition index to constrain temperature and moisture effects on decomposition. Nitrogen immobilization is similarly biased high. Closer agreement with the observations requires much lower decomposition rates, obtained with the assumption that nitrogen severely limits decomposition. DAYCENT better replicates the observations, for both carbon mass remaining and nitrogen, without requirement for nitrogen limitation of decomposition. Second, we compare global observationally-based datasets of soil carbon with simulated steady-state soil carbon stocks for both models. The model simulations were forced with observationally-based estimates of annual litterfall and model-derived climatic decomposition index. While comparison with the LIDET 10-year litterbag study reveals sharp contrasts between CLM4 and DAYCENT, simulations of steady-state soil carbon show less difference between models. Both CLM4 and DAYCENT significantly underestimate soil carbon. Sensitivity analyses highlight causes of the low soil carbon bias. The terrestrial biogeochemistry of earth system models must be critically tested with observations, and the consequences of particular model choices must be documented. Long-term litter decomposition experiments such as LIDET provide a real-world process-oriented benchmark to evaluate models and can critically inform model development. Analysis of steady-state soil carbon estimates reveals additional, but here different, inferences about model performance.
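The litterbag benchmark essentially reduces to comparing decay constants; a sketch assuming a single-pool exponential model M(t) = exp(-kt), with synthetic mass-remaining fractions standing in for LIDET data and a hypothetical model-derived rate:

```python
import numpy as np
from scipy.optimize import curve_fit

t = np.arange(0, 11, dtype=float)                       # years in the field
rng = np.random.default_rng(2)
obs = np.exp(-0.35 * t) + rng.normal(0, 0.02, t.size)   # synthetic mass remaining

(k_obs,), _ = curve_fit(lambda t, k: np.exp(-k * t), t, obs, p0=[0.3])
k_model = 0.60                                          # hypothetical simulated rate
faster = "faster" if k_model > k_obs else "slower"
print(f"observed k = {k_obs:.2f}/yr, model k = {k_model:.2f}/yr -> model is {faster}")
```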
Harlow C. Landphair
1979-01-01
This paper relates the evolution of an empirical model used to predict public response to scenic quality objectively. The text relates the methods used to develop the visual quality index model, explains the terms used in the equation and briefly illustrates how the model is applied and how it is tested. While the technical application of the model relies heavily on...
Modeling abundance effects in distance sampling
Royle, J. Andrew; Dawson, D.K.; Bates, S.
2004-01-01
Distance-sampling methods are commonly used in studies of animal populations to estimate population density. A common objective of such studies is to evaluate the relationship between abundance or density and covariates that describe animal habitat or other environmental influences. However, little attention has been focused on methods of modeling abundance covariate effects in conventional distance-sampling models. In this paper we propose a distance-sampling model that accommodates covariate effects on abundance. The model is based on specification of the distance-sampling likelihood at the level of the sample unit in terms of local abundance. This model is augmented with a Poisson regression model for local abundance that is parameterized in terms of available covariates. Maximum-likelihood estimation of detection and density parameters is based on the integrated likelihood, wherein local abundance is removed from the likelihood by integration. We provide an example using avian point-transect data of Ovenbirds (Seiurus aurocapillus) collected using a distance-sampling protocol and two measures of habitat structure (understory cover and basal area of overstory trees). The model yields a sensible description (positive effect of understory cover, negative effect of basal area) of the relationship between habitat and Ovenbird density that can be used to evaluate the effects of habitat management on Ovenbird populations.
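A sketch of the integrated likelihood for the point-transect case, assuming a half-normal detection function and a log-linear Poisson abundance model lambda_i = exp(b0 + b1 * cover_i): with a Poisson prior on local abundance, summing over N gives counts n_i ~ Poisson(lambda_i * pbar(sigma)), plus a density term for the observed distances. All data below are synthetic placeholders, not the Ovenbird data:

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import minimize

w = 100.0                                     # truncation radius (m)
rng = np.random.default_rng(3)
cover = rng.uniform(0, 1, 40)                 # understory cover at each point
n = rng.poisson(np.exp(0.5 + 1.0 * cover) * 0.4)   # synthetic counts per point
dists = rng.uniform(0, w, n.sum())            # synthetic detection distances

def nll(theta):
    b0, b1, log_sigma = theta
    sigma = np.exp(log_sigma)
    g = lambda r: np.exp(-r**2 / (2 * sigma**2))     # half-normal detection
    norm = quad(lambda r: g(r) * r, 0, w)[0]         # integral of g(r) r dr
    pbar = 2 * norm / w**2                           # mean detection probability
    lam = np.exp(b0 + b1 * cover)                    # Poisson abundance model
    count_part = np.sum(lam * pbar - n * np.log(lam * pbar))
    dist_part = -np.sum(np.log(g(dists) * dists / norm))
    return count_part + dist_part                    # N integrated out analytically

fit = minimize(nll, x0=[0.0, 0.0, np.log(30.0)], method="Nelder-Mead")
b0, b1, sigma = fit.x[0], fit.x[1], np.exp(fit.x[2])
print(f"b0={b0:.2f}, cover effect b1={b1:.2f}, sigma={sigma:.1f} m")
```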
Duguy, Beatriz; Alloza, José Antonio; Baeza, M Jaime; De la Riva, Juan; Echeverría, Maite; Ibarra, Paloma; Llovet, Juan; Cabello, Fernando Pérez; Rovira, Pere; Vallejo, Ramon V
2012-12-01
Forest fires represent a major driver of change at the ecosystem and landscape levels in the Mediterranean region. Environmental features and vegetation are key factors to estimate the ecological vulnerability to fire; defined as the degree to which an ecosystem is susceptible to, and unable to cope with, adverse effects of fire (provided a fire occurs). Given the predicted climatic changes for the region, it is urgent to validate spatially explicit tools for assessing this vulnerability in order to support the design of new fire prevention and restoration strategies. This work presents an innovative GIS-based modelling approach to evaluate the ecological vulnerability to fire of an ecosystem, considering its main components (soil and vegetation) and different time scales. The evaluation was structured in three stages: short-term (focussed on soil degradation risk), medium-term (focussed on changes in vegetation), and coupling of the short- and medium-term vulnerabilities. The model was implemented in two regions: Aragón (inland North-eastern Spain) and Valencia (eastern Spain). Maps of the ecological vulnerability to fire were produced at a regional scale. We partially validated the model in a study site combining two complementary approaches that focused on testing the adequacy of model's predictions in three ecosystems, all very common in fire-prone landscapes of eastern Spain: two shrublands and a pine forest. Both approaches were based on the comparison of model's predictions with values of NDVI (Normalized Difference Vegetation Index), which is considered a good proxy for green biomass. Both methods showed that the model's performance is satisfactory when applied to the three selected vegetation types.
Forecast of drifter trajectories using a Rapid Environmental Assessment based on CTD observations
NASA Astrophysics Data System (ADS)
Sorgente, R.; Tedesco, C.; Pessini, F.; De Dominicis, M.; Gerin, R.; Olita, A.; Fazioli, L.; Di Maio, A.; Ribotti, A.
2016-11-01
A high resolution submesoscale resolving ocean model was implemented in a limited area north of the Island of Elba where a maritime exercise, named Serious Game 1 (SG1), took place in May 2014 in the framework of the project MEDESS-4MS (Mediterranean Decision Support System for Marine Safety). During the exercise, CTD data were collected in response to the need for a Rapid Environmental Assessment, i.e. a rapid evaluation of the marine conditions able to provide relevant information for the initialisation of modelling tools in the scenario of possible maritime accidents. The aim of this paper is to evaluate the impact of such mesoscale-resolving CTD observations on short-term forecasts of the surface currents, within the framework of possible oil-spill related emergencies. For this reason, modelling outputs were compared with Lagrangian observations at sea: the high resolution modelled currents, together with those of the coarser sub-regional model WMED, are used to force the MEDSLIK-II oil-spill model to simulate drifter trajectories. Both ocean models have been assessed by comparing the prognostic scalar and vector fields with an independent CTD data set and with real drifter trajectories acquired during SG1. The diagnosed and prognosed circulation reveals that the area was characterised by water masses of Atlantic origin influenced by small mesoscale cyclonic and anti-cyclonic eddies, which govern the spatial and temporal evolution of the drifter trajectories and of the water mass distribution. The assimilation of CTD data into the initial conditions of the high resolution model greatly improves the accuracy of the short-term forecast in terms of the location and structure of the thermocline and positively influences the ability of the model to reproduce the observed paths of the surface drifters.
Statistical evaluation of substorm strength and onset times in a global MHD model
NASA Astrophysics Data System (ADS)
Haiducek, J. D.; Welling, D. T.; Morley, S.; Ganushkina, N. Y.
2016-12-01
Magnetospheric substorms are characterized by an explosive release of energy stored in the magnetotail, resulting in a tailward plasmoid release, magnetic field perturbations which reach the ground, and a brightening of the aurora. The basic energy release process has been reproduced in magnetohydrodynamic (MHD) models of the global magnetosphere, but previous studies of substorms using MHD have been limited to case studies covering one or a few events. The lack of large-scale validation studies, and the fact that most MHD models rely on numerical or ad-hoc resistivity to produce the reconnection necessary for substorms, has led some to question the suitability of MHD for studying substorms. However, MHD models are able to capture global implications of substorms, including magnetospheric and ionospheric current systems, dipolarizations, and magnetic field perturbations at the surface, providing a compelling motivation to understand and improve substorm physics in global MHD. The present work seeks to assess the capabilities and limitations of MHD with respect to capturing substorms. We identify substorms in long (one month of simulation time) simulations and compare these to observations during the same time period. To reduce the risk of mis-identifying other phenomena as substorms, we use multiple signatures for the identification, including ground-based magnetic field in mid and high latitudes, plasmoid releases, dipolarization signatures, particle injections, and auroral imagery. We evaluate the model in terms of substorm frequency, strength, location, and timing. We model the same time period using the Minimal Substorm Model, which solves an energy balance equation based on solar wind input. This model has been previously shown to produce substorms at a realistic frequency given solar wind conditions; by comparing it to the MHD we are able to assess the relative importance of MHD physics in terms of substorm timing and occurrence rate. We compute a superposed epoch analysis (SEA) of the substorm "hits" (events that occurred in both the model and observations), "misses" (events that occurred only in observations), and false positives. The SEA result serves as a representative scenario with which we evaluate new model configurations in terms of their ability to reproduce substorm dynamics.
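The superposed epoch analysis itself is compact; a sketch that stacks fixed windows of a signal around each event time and averages across events, using a synthetic AL-like index with injected substorm bays rather than real data:

```python
import numpy as np

rng = np.random.default_rng(4)
minutes = 60 * 24 * 30                        # one month at 1-min cadence
al = rng.normal(-100.0, 30.0, minutes)        # synthetic AL-like index (nT)
onsets = rng.choice(np.arange(200, minutes - 200), size=25, replace=False)
for t0 in onsets:                             # inject substorm-like bays
    al[t0:t0 + 60] -= 300.0 * np.exp(-np.arange(60) / 20.0)

window = np.arange(-120, 181)                 # minutes relative to onset
stack = np.array([al[t0 + window] for t0 in onsets])
sea = stack.mean(axis=0)                      # superposed epoch average

print("pre-onset mean %.0f nT, minimum %.0f nT at t = %+d min"
      % (sea[:120].mean(), sea.min(), window[sea.argmin()]))
```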
Network Virtualization - Opportunities and Challenges for Operators
NASA Astrophysics Data System (ADS)
Carapinha, Jorge; Feil, Peter; Weissmann, Paul; Thorsteinsson, Saemundur E.; Etemoğlu, Çağrı; Ingþórsson, Ólafur; Çiftçi, Selami; Melo, Márcio
In the last few years, the concept of network virtualization has gained a lot of attention both from industry and research projects. This paper evaluates the potential of network virtualization from an operator's perspective, with the short-term goal of optimizing service delivery and rollout, and on a longer term as an enabler of technology integration and migration. Based on possible scenarios for implementing and using network virtualization, new business roles and models are examined. Open issues and topics for further evaluation are identified. In summary, the objective is to identify the challenges but also new opportunities for telecom operators raised by network virtualization.
Remote sensing evaluation of CLM4 GPP for the period 2000 to 2009
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mao, Jiafu; Thornton, Peter E; Shi, Xiaoying
2012-01-01
The ability of a process-based ecosystem model like Version 4 of the Community Land Model (CLM4) to provide accurate estimates of CO2 flux is a top priority for researchers, modelers and policy makers. Remote sensing can provide long-term and large scale products suitable for ecosystem model evaluation. Global estimations of gross primary production (GPP) at the 1 km spatial resolution from years 2000 to 2009 from the MODIS (Moderate Resolution Imaging Spectroradiometer) sensor offer a unique opportunity for evaluating the temporal and spatial patterns of global GPP and its relationship with climate for CLM4. We compare monthly GPP simulated by CLM4 at half-degree resolution with satellite estimates of GPP from the MODIS GPP (MOD17) dataset for the 10-yr period, January 2000-December 2009. The assessment is presented in terms of long-term mean carbon assimilation, seasonal mean distributions, amplitude and phase of the annual cycle, and intra-annual and inter-annual GPP variability and their responses to climate variables. For the long-term annual and seasonal means, major GPP patterns are clearly demonstrated by both products. Compared to the MODIS product, CLM4 overestimates the magnitude of GPP for tropical evergreen forests. CLM4 has longer carbon uptake period than MODIS for most plant functional types (PFTs) with an earlier onset of GPP in spring and later decline of GPP in autumn. Empirical Orthogonal Function (EOF) analysis of the monthly GPP changes indicates that on the intra-annual scale, both CLM4 and MODIS display similar spatial representations and temporal patterns for most terrestrial ecosystems except in northeast Russia and the very dry region in central Australia. For 2000-2009, CLM4 simulates increases in annual averaged GPP over both hemispheres, however estimates from MODIS suggest a reduction in the Southern Hemisphere (-0.2173 PgC/year) balancing the significant increase over the Northern Hemisphere (0.2157 PgC/year).
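A sketch of the EOF step, assuming the usual construction: subtract the time mean at each grid cell and take the SVD of the (time x space) anomaly matrix, so the rows of V-transpose are spatial patterns and U*S the principal-component time series. The GPP field here is random, purely to show the mechanics:

```python
import numpy as np

rng = np.random.default_rng(5)
gpp = rng.normal(size=(120, 500))            # 120 months x 500 grid cells (random)

anom = gpp - gpp.mean(axis=0)                # monthly anomaly at each cell
u, s, vt = np.linalg.svd(anom, full_matrices=False)
explained = s**2 / np.sum(s**2)

eof1 = vt[0]                                 # leading spatial pattern
pc1 = u[:, 0] * s[0]                         # its time series
print(f"EOF1 explains {100 * explained[0]:.1f}% of the variance")
```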
NASA Astrophysics Data System (ADS)
KIM, J.; Bastidas, L. A.
2011-12-01
We evaluate, calibrate and diagnose the performance of the National Weather Service RDHM distributed model over the Durango River Basin in Colorado, simultaneously using in situ and remotely sensed information: discharge from gaging stations (USGS), in situ snow cover (SCV) and snow water equivalent (SWE) from several SNOTEL sites, and snow information distributed over the catchment from remote sensing (NOAA-NASA). In the process of evaluation we attempt to establish, by calibration, the optimal degree of parameter distribution over the catchment. A multi-criteria approach based on traditional measures (RMSE) and similarity-based pattern comparisons using the Hausdorff and Earth Mover's Distance approaches is used for the overall evaluation of the model performance. These pattern-based approaches (shape matching) are found to be extremely relevant for accounting for the relatively large degree of inaccuracy in the remotely sensed SWE (judged inaccurate in terms of the value but reliable in terms of the distribution pattern) and the high reliability of the SCV (a yes/no situation), while at the same time allowing for an evaluation that quantifies the accuracy of the model over the entire catchment considering the different types of observations. The Hausdorff norm, due to its intrinsically multi-dimensional nature, allows variables such as terrain elevation to be incorporated into the evaluation. The EMD, because of its extremely high computational overburden, requires the mapping of the set of evaluation variables into a two-dimensional matrix for computation.
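Both similarity measures are available off the shelf; a sketch computing a symmetric Hausdorff distance between simulated and observed point patterns and a one-dimensional Earth Mover's Distance between SWE distributions, with synthetic inputs standing in for the catchment data:

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(6)
obs_pts = rng.uniform(0, 10, (200, 2))       # e.g. locations of snow-covered cells
sim_pts = obs_pts + rng.normal(0, 0.3, obs_pts.shape)

h = max(directed_hausdorff(obs_pts, sim_pts)[0],
        directed_hausdorff(sim_pts, obs_pts)[0])   # symmetric Hausdorff distance

swe_obs = rng.gamma(2.0, 50.0, 300)          # SWE samples (mm), synthetic
swe_sim = rng.gamma(2.2, 48.0, 300)
emd = wasserstein_distance(swe_obs, swe_sim) # 1-D Earth Mover's Distance

print(f"Hausdorff = {h:.2f} (map units), EMD = {emd:.1f} mm")
```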
Approximate inference on planar graphs using loop calculus and belief progagation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chertkov, Michael; Gomez, Vicenc; Kappen, Hilbert
We introduce novel results for approximate inference on planar graphical models using the loop calculus framework. The loop calculus (Chertkov and Chernyak, 2006b) makes it possible to express the exact partition function Z of a graphical model as a finite sum of terms that can be evaluated once the belief propagation (BP) solution is known. In general, full summation over all correction terms is intractable. We develop an algorithm for the approach presented in Chertkov et al. (2008) which represents an efficient truncation scheme on planar graphs and a new representation of the series in terms of Pfaffians of matrices. We analyze in detail both the loop series and the Pfaffian series for models with binary variables and pairwise interactions, and show that the first term of the Pfaffian series can provide very accurate approximations. The algorithm outperforms previous truncation schemes of the loop series and is competitive with other state-of-the-art methods for approximate inference.
ERIC Educational Resources Information Center
Ahtee, Maija, Ed.; And Others
The main purpose of this symposium was to find new ideas and resources for the evaluation and improvement of physics education on all levels. The papers included in this document are entitled: (1) "Quality of Physics Teaching Through Building Models and Advancing Research Skills"; (2) "Evaluation of Physics Education in Terms of Its…
ERIC Educational Resources Information Center
Dodd, Carol Ann
This study explores a technique for evaluating teacher education programs in terms of teaching competencies, as applied to the Indiana University Mathematics Methods Program (MMP). The evaluation procedures formulated for the study include a process product design in combination with a modification of Pophan's performance test paradigm and Gage's…
Hong, Eun-Mi; Shelton, Daniel; Pachepsky, Yakov A; Nam, Won-Ho; Coppock, Cary; Muirhead, Richard
2017-02-01
Knowledge of the microbial quality of irrigation waters is extremely limited. For this reason, the US FDA has promulgated the Produce Rule, mandating the testing of irrigation water sources for many farms. The rule requires the collection and analysis of at least 20 water samples over two to four years to adequately evaluate the quality of water intended for produce irrigation. The objective of this work was to evaluate the effect of interannual weather variability on surface water microbial quality. We used the Soil and Water Assessment Tool model to simulate E. coli concentrations in the Little Cove Creek; this is a perennial creek located in an agricultural watershed in south-eastern Pennsylvania. The model performance was evaluated using the US FDA regulatory microbial water quality metrics of geometric mean (GM) and the statistical threshold value (STV). Using the 90-year time series of weather observations, we simulated and randomly sampled the time series of E. coli concentrations. We found that weather conditions of a specific year may strongly affect the evaluation of microbial quality and that the long-term assessment of microbial water quality may be quite different from the evaluation based on short-term observations. The variations in microbial concentrations and water quality metrics were affected by location, wetness of the hydrological years, and seasonality, with 15.7-70.1% of samples exceeding the regulatory threshold. The results of this work demonstrate the value of using modeling to design and evaluate monitoring protocols to assess the microbial quality of water used for produce irrigation.
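A sketch of the two regulatory metrics, assuming the common lognormal form in which the STV is the 90th percentile, GM * 10^(1.282 * s) with s the standard deviation of the log10 samples; the sample values are invented, and the 126/410 CFU per 100 mL limits are the Produce Rule criteria for generic E. coli:

```python
import numpy as np

# Invented sample concentrations (CFU/100 mL) from >= 20 irrigation-water samples.
samples = np.array([35, 120, 60, 480, 22, 90, 310, 75, 150, 40,
                    65, 210, 18, 55, 130, 95, 28, 260, 70, 45], dtype=float)

logs = np.log10(samples)
gm = 10 ** logs.mean()                                   # geometric mean
stv = 10 ** (logs.mean() + 1.282 * logs.std(ddof=1))     # lognormal 90th percentile
print(f"GM = {gm:.0f} (limit 126), STV = {stv:.0f} (limit 410) CFU/100 mL")
```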
The FIM-iHYCOM Model in SubX: Evaluation of Subseasonal Errors and Variability
NASA Astrophysics Data System (ADS)
Green, B.; Sun, S.; Benjamin, S.; Grell, G. A.; Bleck, R.
2017-12-01
NOAA/ESRL/GSD has produced both real-time and retrospective forecasts for the Subseasonal Experiment (SubX) using the FIM-iHYCOM model. FIM-iHYCOM couples the atmospheric Flow-following finite volume Icosahedral Model (FIM) to an icosahedral-grid version of the Hybrid Coordinate Ocean Model (HYCOM). This coupled model is unique in terms of its grid structure: in the horizontal, the icosahedral meshes are perfectly matched for FIM and iHYCOM, eliminating the need for a flux interpolator; in the vertical, both models use adaptive arbitrary Lagrangian-Eulerian hybrid coordinates. For SubX, FIM-iHYCOM initializes four time-lagged ensemble members around each Wednesday, which are integrated forward to provide 32-day forecasts. While it has already been shown that this model has similar predictive skill as NOAA's operational CFSv2 in terms of the RMM index, FIM-iHYCOM is still fairly new and thus its overall performance needs to be thoroughly evaluated. To that end, this study examines model errors as a function of forecast lead week (1-4) - i.e., model drift - for key variables including 2-m temperature, precipitation, and SST. Errors are evaluated against two reanalysis products: CFSR, from which FIM-iHYCOM initial conditions are derived, and the quasi-independent ERA-Interim. The week 4 error magnitudes are similar between FIM-iHYCOM and CFSv2, albeit with different spatial distributions. Also, intraseasonal variability as simulated in these two models will be compared with reanalyses. The impact of hindcast frequency (4 times per week, once per week, or once per day) on the model climatology is also examined to determine the implications for systematic error correction in FIM-iHYCOM.
A multi-scalar PDF approach for LES of turbulent spray combustion
NASA Astrophysics Data System (ADS)
Raman, Venkat; Heye, Colin
2011-11-01
A comprehensive joint-scalar probability density function (PDF) approach is proposed for large eddy simulation (LES) of turbulent spray combustion and tests are conducted to analyze the validity and modeling requirements. The PDF method has the advantage that the chemical source term appears closed but requires models for the small scale mixing process. A stable and consistent numerical algorithm for the LES/PDF approach is presented. To understand the modeling issues in the PDF method, direct numerical simulation of a spray flame at three different fuel droplet Stokes numbers and an equivalent gaseous flame are carried out. Assumptions in closing the subfilter conditional diffusion term in the filtered PDF transport equation are evaluated for various model forms. In addition, the validity of evaporation rate models in high Stokes number flows is analyzed.
Validation of a Preclinical Spinal Safety Model: Effects of Intrathecal Morphine in the Neonatal Rat
Westin, B. David; Walker, Suellen M.; Deumens, Ronald; Grafe, Marjorie; Yaksh, Tony L.
2010-01-01
Background: Preclinical studies demonstrate increased neuroapoptosis after general anesthesia in early life. Neuraxial techniques may minimize potential risks, but there has been no systematic evaluation of spinal analgesic safety in developmental models. We aimed to validate a preclinical model for evaluating dose-dependent efficacy, spinal cord toxicity, and long term function following intrathecal morphine in the neonatal rat. Methods: Lumbar intrathecal injections were performed in anesthetized rats aged postnatal day (P)3, 10 and 21. The relationship between injectate volume and segmental spread was assessed post mortem and by in-vivo imaging. To determine the antinociceptive dose, mechanical withdrawal thresholds were measured at baseline and 30 minutes following intrathecal morphine. To evaluate toxicity, doses up to the maximum tolerated were administered, and spinal cord histopathology, apoptosis and glial response were evaluated 1 and 7 days following P3 or P21 injection. Sensory thresholds and gait analysis were evaluated at P35. Results: Intrathecal injection can be reliably performed at all postnatal ages and injectate volume influences segmental spread. Intrathecal morphine produced spinally-mediated analgesia at all ages with lower dose requirements in younger pups. High dose intrathecal morphine did not produce signs of spinal cord toxicity or alter long-term function. Conclusions: The therapeutic ratio for intrathecal morphine (toxic dose / antinociceptive dose) was at least 300 at P3, and at least 20 at P21 (latter doses limited by side effects). This data provides relative efficacy and safety data for comparison with other analgesic preparations and contributes supporting evidence for the validity of this preclinical neonatal safety model. PMID:20526189
Source term evaluation for combustion modeling
NASA Technical Reports Server (NTRS)
Sussman, Myles A.
1993-01-01
A modification is developed for application to the source terms used in combustion modeling. The modification accounts for the error of the finite difference scheme in regions where chain-branching chemical reactions produce exponential growth of species densities. The modification is first applied to a one-dimensional scalar model problem. It is then generalized to multiple chemical species, and used in quasi-one-dimensional computations of shock-induced combustion in a channel. Grid refinement studies demonstrate the improved accuracy of the method using this modification. The algorithm is applied in two spatial dimensions and used in simulations of steady and unsteady shock-induced combustion. Comparisons with ballistic range experiments give confidence in the numerical technique and the 9-species hydrogen-air chemistry model.
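The error being corrected can be illustrated in one dimension (a sketch of the underlying issue, not the paper's scheme): for chain-branching growth dy/dt = λy, an explicit finite-difference update systematically lags the exact exponential, while a locally analytic (exponential) update is exact for this model problem:

```python
import numpy as np

lam, dt, steps = 50.0, 0.01, 10          # stiff growth rate, coarse time step
y_euler = y_exp = 1.0
for _ in range(steps):
    y_euler += lam * y_euler * dt        # explicit update: grows as (1 + lam dt)^n
    y_exp *= np.exp(lam * dt)            # locally analytic update: exact e^(lam n dt)

exact = np.exp(lam * dt * steps)
print(f"Euler {y_euler:.1f}, exponential {y_exp:.1f}, exact {exact:.1f}")
```

On this example the explicit update reaches about 58 where the true solution is about 148, the kind of under-prediction that motivates a source-term modification in regions of exponential species growth.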
DOT National Transportation Integrated Search
2017-06-01
The purpose of this study was to evaluate if the Surrogate Safety Assessment Model (SSAM) could be used to assess the safety of a highway segment or an intersection in terms of the number and type of conflicts and to compare the safety effects of mul...
DOT National Transportation Integrated Search
1978-02-01
Ride-quality models for city buses and intercity trains are presented and discussed in terms of their ability to predict passenger comfort and ride acceptability. The report, the last of three volumes, contains procedural guidelines to be employed by...
Self-Monitoring Success and Failure: Evidence for a Mediating Mechanism.
ERIC Educational Resources Information Center
Susser, Howard S.
Two theories, the closed loop model (divides self-regulation into self-monitoring, self-evaluation, and self-reinforcement) and the non-mediational model (defines self-regulation as behavior that is controlled by its long-term and observable consequences), have been proposed to explain why behavior changes when self-monitoring occurs. Both…
NASA Astrophysics Data System (ADS)
Ma, W.; Ma, Y.; Hu, Z.; Zhong, L.
2017-12-01
In this study, a land-atmosphere model was initialized by ingesting AMSR-E products, and the results were compared with the default model configuration and with in situ long-term CAMP/Tibet observations. The field observation sites, operated by ITPCAS (Institute of Tibetan Plateau Research, Chinese Academy of Sciences), are introduced first. The differences between the AMSR-E-initialized model runs, the default model configuration and the in situ data showed an apparent inconsistency in the model-simulated land surface heat fluxes. The results showed that the soil moisture was sensitive to the specific model configuration. To evaluate and verify the model stability, a long-term modeling study with AMSR-E soil moisture data ingestion was performed. Based on test simulations, AMSR-E data were assimilated into an atmospheric model for July and August 2007. The results showed that the land surface fluxes agreed well with both the in situ data and the results of the default model configuration. Therefore, the simulation can be used to retrieve land surface heat fluxes from an atmospheric model over the Tibetan Plateau.
NASA Astrophysics Data System (ADS)
Seneca, S. M.; Rabideau, A. J.; Bandilla, K.
2010-12-01
Experimental and modeling studies are in progress to evaluate the long-term performance of a permeable treatment wall composed of zeolite-rich rock for the removal of strontium-90 from groundwater. Multiple column tests were performed at the University at Buffalo and on site at West Valley Environmental Services (WVES); columns were supplied with synthetic groundwater formulated to match anticipated field conditions and, at WVES, with radioactive groundwater. The primary focus of this work is on quantifying the competitive ion exchange among five cations (Na+, K+, Ca2+, Mg2+, and Sr2+); the data obtained from the column studies are used to support the robust estimation of zeolite cation exchange parameters. This research will produce a five-solute cation exchange model describing the removal efficiency of the zeolite, using the various column tests to calibrate and validate the geochemical transport model. The field-scale transport model provides the flexibility to explore design parameters and potential variations in groundwater geochemistry to investigate the long-term performance of a full-scale treatment wall at the Western New York nuclear facility.
He, Yong; Hou, Lingling; Wang, Hong; Hu, Kelin; McConkey, Brian
2014-07-30
Soil surface texture is an important environmental factor that influences crop productivity because of its direct effect on soil water and complex interactions with other environmental factors. Using 30-year data, an agricultural system model (DSSAT-CERES-Wheat) was calibrated and validated. After validation, the modelled yield and water use (WU) of spring wheat (Triticum aestivum L.) from two soil textures (silt loam and clay) under rain-fed conditions were analyzed. Regression analysis showed that wheat grown in silt loam soil is more sensitive to WU than wheat grown in clay soil, indicating that the wheat grown in clay soil has higher drought tolerance than that grown in silt loam. Yield variation can be explained by WU rather than by precipitation use (PU). These results demonstrated that the DSSAT-CERES-Wheat model can be used to evaluate the WU of different soil textures and assess the feasibility of wheat production under various conditions. These outcomes can improve our understanding of the long-term effect of soil texture on spring wheat productivity under rain-fed conditions.
Simulating eroded soil organic carbon with the SWAT-C model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Xuesong
The soil erosion and associated lateral movement of eroded carbon (C) have been identified as a possible mechanism explaining the elusive terrestrial C sink of ca. 1.7-2.6 Pg C yr^-1. Here we evaluated the SWAT-C model for simulating long-term soil erosion and associated eroded C yields. Our method couples the CENTURY carbon cycling processes with a Modified Universal Soil Loss Equation (MUSLE) to estimate C losses associated with soil erosion. The results show that SWAT-C simulates long-term average eroded C yields well and correctly estimates the relative magnitude of eroded C yields across crop rotations. We also evaluated three methods of calculating the C enrichment ratio in mobilized sediments, and found that errors associated with enrichment ratio estimation represent a significant uncertainty in SWAT-C simulations. Furthermore, we discussed limitations and future development directions for SWAT-C to advance C cycling modeling and assessment.
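As a rough illustration of the coupling described in this abstract, the sketch below computes event sediment yield with a SWAT-style MUSLE expression and multiplies it by a topsoil carbon fraction and an enrichment ratio to estimate eroded C. All coefficients, units, and the enrichment-ratio handling are illustrative assumptions, not the SWAT-C implementation.

```python
# Hedged sketch: SWAT-style MUSLE event sediment yield and the associated
# eroded-C estimate. The 11.8/0.56 coefficients follow the common SWAT
# formulation; the enrichment ratio value is assumed for illustration only.

def musle_sediment_yield(q_surf_mm, q_peak_m3s, area_ha, k, c, p, ls, cfrg=1.0):
    """Event sediment yield (metric tons), SWAT-style MUSLE."""
    return 11.8 * (q_surf_mm * q_peak_m3s * area_ha) ** 0.56 * k * c * p * ls * cfrg

def eroded_carbon_t(sed_t, soil_c_frac, enrichment_ratio):
    """Carbon mobilized with sediment: sediment mass x topsoil C fraction x ER."""
    return sed_t * soil_c_frac * enrichment_ratio

sed = musle_sediment_yield(q_surf_mm=25.0, q_peak_m3s=0.8, area_ha=12.0,
                           k=0.28, c=0.2, p=1.0, ls=1.1)
print(sed, eroded_carbon_t(sed, soil_c_frac=0.02, enrichment_ratio=1.5))
```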
Deep Recurrent Neural Network-Based Autoencoders for Acoustic Novelty Detection
Vesperini, Fabio; Schuller, Björn
2017-01-01
In the emerging field of acoustic novelty detection, most research efforts are devoted to probabilistic approaches such as mixture models or state-space models. Only recent studies have introduced (pseudo-)generative models for acoustic novelty detection with recurrent neural networks in the form of an autoencoder. In these approaches, auditory spectral features of the next short-term frame are predicted from the previous frames by means of Long Short-Term Memory recurrent denoising autoencoders. The reconstruction error between the input and the output of the autoencoder is used as an activation signal to detect novel events. No prior study has compared these efforts to automatically recognize novel events from audio signals or given a broad and in-depth evaluation of recurrent neural network-based autoencoders. The present contribution consistently evaluates our recent approaches to fill this gap in the literature and provides insight through extensive evaluations carried out on three databases: A3Novelty, PASCAL CHiME, and PROMETHEUS. Besides providing an extensive analysis of novel and state-of-the-art methods, the article shows how RNN-based autoencoders outperform statistical approaches by up to an absolute improvement of 16.4% in average F-measure over the three databases. PMID:28182121
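A minimal PyTorch sketch of the reconstruction-error detector described above follows; the layer sizes, feature dimension, and thresholding rule are assumptions, and the published architecture is not reproduced here.

```python
# Hedged sketch: an LSTM autoencoder is trained on "normal" audio frames only;
# a frame is flagged as novel when its reconstruction error is high.
import torch
import torch.nn as nn

class LSTMAutoencoder(nn.Module):
    def __init__(self, n_features=26, hidden=64):
        super().__init__()
        self.encoder = nn.LSTM(n_features, hidden, batch_first=True)
        self.decoder = nn.LSTM(hidden, n_features, batch_first=True)

    def forward(self, x):                     # x: (batch, time, n_features)
        z, _ = self.encoder(x)                # compressed sequence
        recon, _ = self.decoder(z)            # reconstruction
        return recon

def novelty_scores(model, frames):
    """Per-frame mean squared reconstruction error (higher = more novel)."""
    with torch.no_grad():
        recon = model(frames)
        return ((frames - recon) ** 2).mean(dim=-1)

model = LSTMAutoencoder()
x = torch.randn(1, 100, 26)                   # stand-in spectral features
print(novelty_scores(model, x).shape)         # torch.Size([1, 100])
```

In practice a threshold such as mean + 2*std of errors on held-out normal data (an assumed rule) would turn the scores into event detections.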
Goldhaber-Fiebert, Jeremy D.; Snowden, Lonnie R.; Wulczyn, Fred; Landsverk, John; Horwitz, Sarah M.
2011-01-01
Objectives With over 1 million children served by the U.S. Child Welfare system at a cost of $20 billion annually, this study examines the economic evaluation literature on interventions to improve outcomes for children at risk for and currently involved with the system, identifies areas where additional research is needed, and discusses the use of decision-analytic modeling to advance Child Welfare policy and practice. Methods The review included 19 repositories of peer-reviewed and non-peer-reviewed "gray" literature, including items in English published before November 2009. Original research articles were included if they evaluated interventions based on costs and outcomes. Review articles were included to assess the relevance of these techniques over time and to highlight the increasing discussion of methods needed to undertake such research. Items were categorized by their focus on: interventions for the U.S. Child Welfare system; primary prevention of entry into the system; and use of models to make long-term projections of costs and outcomes. Results Searches identified 2,640 articles, with 49 ultimately included (19 reviews and 30 original research articles). Between 1988 and 2009, reviews consistently advocated economic evaluation and increasingly provided methodological guidance. 21 of the original research articles focused on Child Welfare, while 9 focused on child mental health. Of the 21 Child Welfare articles, 81% (17) focused on the U.S. system. 47% (8/17) focused exclusively on primary prevention, though 83% (5/6) of the peer-reviewed U.S.-system articles focused exclusively on prevention. 9 of the 17 articles included empirical follow-up (mean sample size: 264 individuals; mean follow-up: 3.8 years). 10 of the 17 articles used modeling to project longer-term outcomes, but 80% of the articles using modeling were not peer-reviewed. Although 60% of modeling studies included interventions for children in the system, all peer-reviewed modeling articles focused on prevention. Conclusions Methodological guidance for economic evaluations in Child Welfare is increasingly available. Such analyses are feasible given the availability of nationally representative data on children involved with Child Welfare and evidence-based interventions. Practice Implications Policy analyses considering the long-term costs and effects of interventions to improve Child Welfare outcomes are scarce, feasible, and urgently needed. PMID:21944552
ERIC Educational Resources Information Center
Christie, Lu S.; McKenzie, Hugh S.
Discussed is the use of minimum behavioral objectives to provide evaluation of special education in regular classrooms. Literature which supports the mainstreaming of moderately handicapped children is reviewed briefly. Application of the behavioral model of education on the community level is considered in terms of the basic skills which comprise…
ERIC Educational Resources Information Center
Oakland, Thomas
New strategies for evaluating criterion-referenced measures (CRM) are discussed. These strategies examine the following issues: (1) the use of norm-referenced measures (NRM) as CRM and then estimating the reliability and validity of such measures in terms of variance from an arbitrarily specified criterion score, (2) estimation of the…
DOT National Transportation Integrated Search
2010-09-01
This report presents data and technical analyses for Texas Department of Transportation Project 0-5235. This project focused on the evaluation of traffic sign sheeting performance in terms of meeting nighttime driver needs. The goal was to de...
ERIC Educational Resources Information Center
Gunn, Patrick; Loy, Dan
2015-01-01
Effectively measuring short-term impact, particularly a change in knowledge resulting from Extension programming, can prove to be challenging. Clicker-based technology, when used properly, is one alternative that may allow educators to better evaluate this aspect of the logic model. While the potential interface between clicker technology and…
Quasi-elastic nuclear scattering at high energies
NASA Technical Reports Server (NTRS)
Cucinotta, Francis A.; Townsend, Lawrence W.; Wilson, John W.
1992-01-01
The quasi-elastic scattering of two nuclei is considered in the high-energy optical model. Energy loss and momentum transfer spectra for projectile ions are evaluated in terms of an inelastic multiple-scattering series corresponding to multiple knockout of target nucleons. The leading-order correction to the coherent projectile approximation is evaluated. Calculations are compared with experiments.
Zong Bo Shang; Hong S. He; Weimin Xi; Stephen R. Shifley; Brian J. Palik
2012-01-01
Public forest management requires consideration of numerous objectives including protecting ecosystem health, sustaining habitats for native communities, providing sustainable forest products, and providing noncommodity ecosystem services. It is difficult to evaluate the long-term, cumulative effects of and tradeoffs among these and other associated management objectives. To...
Modeling Renewable Penetration Using a Network Economic Model
NASA Astrophysics Data System (ADS)
Lamont, A.
2001-03-01
This paper evaluates the accuracy of a network economic modeling approach in designing energy systems with renewable and conventional generators. The network approach models the system as a network of processes such as demands, generators, markets, and resources. The model reaches a solution by exchanging price and quantity information between the nodes of the system. This formulation is very flexible, and models take very little time to build and modify. This paper reports an experiment in designing a system with photovoltaic generation and base- and peak-load fossil generators. The level of PV penetration as a function of its price, and the capacities of the fossil generators, were determined using the network approach and using an exact, analytic approach. It is found that the two methods agree very closely in terms of the optimal capacities and are nearly identical in terms of annual system costs.
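The price/quantity exchange described above can be illustrated with a toy market-clearing loop; the step supply rule, cost figures, and demand level below are assumptions, not the paper's model.

```python
# Toy sketch of price/quantity exchange between nodes: a market node raises
# price until generator supply meets demand (simple tatonnement).

def supply(price, marginal_cost, capacity):
    """A generator offers its full capacity when price covers its marginal cost."""
    return capacity if price >= marginal_cost else 0.0

def clear_market(demand, generators, price=0.0, step=0.5, tol=1e-6, max_iter=10000):
    """Raise price until total offered supply meets demand."""
    for _ in range(max_iter):
        q = sum(supply(price, mc, cap) for mc, cap in generators)
        if q >= demand - tol:
            return price, q
        price += step
    raise RuntimeError("market failed to clear")

# Example nodes: PV (near-zero marginal cost), base fossil, peak fossil.
generators = [(0.01, 50.0), (20.0, 100.0), (60.0, 80.0)]  # (cost $/MWh, MW)
print(clear_market(demand=160.0, generators=generators))
```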
Yap, Keong; Gibbs, Amy L; Francis, Andrew J P; Schuster, Sharynn E
2016-01-01
The Bivalent Fear of Evaluation (BFOE) model of social anxiety proposes that fear of negative evaluation (FNE) and fear of positive evaluation (FPE) play distinct roles in social anxiety. Research is, however, lacking on how FPE relates to perfectionism and how these constructs interact to predict social anxiety. Participants were 382 individuals from the general community, including an oversampling of individuals with social anxiety. Measures of FPE, FNE, perfectionism, and social anxiety were administered. Results were mostly consistent with the predictions of the BFOE model: after accounting for confounding variables, FPE correlated negatively with high standards but positively with maladaptive perfectionism. FNE was also positively correlated with maladaptive perfectionism, but there was no significant relationship between FNE and high standards. Also consistent with the BFOE model, both FNE and FPE significantly moderated the relationship between maladaptive perfectionism and social anxiety, with the relationship strengthened at high levels of FPE and FNE. These findings provide additional support for the BFOE model, and implications are discussed.
Multi-regime transport model for leaching behavior of heterogeneous porous materials.
Sanchez, F; Massry, I W; Eighmy, T; Kosson, D S
2003-01-01
Utilization of secondary materials in civil engineering applications (e.g. as substitutes for natural aggregates or binder constituents) requires assessment of the physical and environmental properties of the product. Environmental assessment often necessitates evaluation of the potential for constituent release through leaching. Currently, most leaching models used to estimate long-term field performance assume that the species of concern is uniformly dispersed in a homogeneous porous material. However, waste materials are often composed of distinct components such as coarse or fine aggregates in a cement concrete or waste encapsulated in a stabilized matrix. The specific objectives of the research presented here were to (1) develop a one-dimensional, multi-regime transport model (i.e. MRT model) to describe the release of species from heterogeneous porous materials and, (2) evaluate simple limit cases using the model for species whose release is not dependent on pH. Two different idealized model systems were considered: (1) a porous material contaminated with the species of interest and containing inert aggregates and, (2) a porous material containing the contaminant of interest only in the aggregates. The effect of three factors on constituent release was examined: (1) volume fraction of material occupied by the aggregates compared to a homogeneous porous material, (2) aggregate size and, (3) differences in mass transfer rates between the binder and the aggregates. Simulation results confirmed that assuming a homogeneous material when evaluating the release of contaminants from porous waste materials may result in erroneous long-term field performance assessments.
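For the homogeneous limit case mentioned above, diffusion-controlled release from a monolith is commonly approximated with a 1-D semi-infinite solution in which cumulative release grows with the square root of time; the sketch below assumes that limit case and uses placeholder parameter values.

```python
# Hedged sketch of the classic diffusion-controlled release limit:
# M(t) = 2 * C0 * sqrt(D_obs * t / pi), per unit exposed surface area.
import math

def cumulative_release_mg_m2(c0_mg_m3, d_obs_m2_s, t_s):
    """Cumulative mass released per unit area (mg/m^2), semi-infinite slab."""
    return 2.0 * c0_mg_m3 * math.sqrt(d_obs_m2_s * t_s / math.pi)

# Placeholder values: C0 = 1000 mg/m^3 leachable content, D_obs = 1e-12 m^2/s,
# evaluated at 100 days.
print(cumulative_release_mg_m2(1000.0, 1e-12, t_s=100 * 24 * 3600.0))
```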
NASA Astrophysics Data System (ADS)
Anderson, O. Roger
The rate of information processing during science learning and the efficiency of the learner in mobilizing relevant information in long-term memory as an aid in transmitting newly acquired information to stable storage in long-term memory are fundamental aspects of science content acquisition. These cognitive processes, moreover, may be substantially related in tempo and quality of organization to the efficiency of higher thought processes such as divergent thinking and problem-solving ability that characterize scientific thought. As a contribution to our quantitative understanding of these fundamental information processes, a mathematical model of information acquisition is presented and empirically evaluated in comparison to evidence obtained from experimental studies of science content acquisition. Computer-based models are used to simulate variations in learning parameters and to generate the theoretical predictions to be empirically tested. The initial tests of the predictive accuracy of the model show close agreement between predicted and actual mean recall scores in short-term learning tasks. Implications of the model for human information acquisition and possible future research are discussed in the context of the unique theoretical framework of the model.
ERIC Educational Resources Information Center
Hacker, Douglas J.; Dole, Janice A.; Ferguson, Monica; Adamson, Sharon; Roundy, Linda; Scarpulla, Laura
2015-01-01
Our purpose for this quasi-experimental study was to evaluate the short-term and maintenance effects of the self-regulated strategy development writing instructional model by Graham and Harris with 7th-grade students in an urban, ethnically diverse Title I middle school. We compared the writing skills of our intervention students with those of…
Long-term hydrology and water quality of a drained pine plantation in North Carolina
D.M. Amatya; R.W. Skaggs
2011-01-01
Long-term data provide a basis for understanding natural variability, reducing uncertainty in model inputs and parameter estimation, and developing new hypotheses. This article evaluates 21 years (1988-2008) of hydrologic data and 17 years (1988-2005) of water quality data from a drained pine plantation in eastern North Carolina. The plantation age was 14 years at the...
ERIC Educational Resources Information Center
Hampton, Scott E.
Research has shown that student mid-term feedback has significantly increased subsequent ratings of teacher effectiveness, student achievement, and student attitudes when the feedback results were accompanied by expert consultation. A gap in the literature is an instrument intended to provide specific feedback on systematic planning and delivery…
Investigation into Text Classification With Kernel Based Schemes
2010-03-01
Abbreviations: TDM, term-document matrix; TMG, Text to Matrix Generator; TN, true negative; TP, true positive; VSM, vector space model. Documents are represented as a term-document matrix; common evaluation metrics are introduced along with the indexing capabilities of the Text to Matrix Generator (TMG) toolbox, on which specific attention is placed.
Information Warfare: Evaluation of Operator Information Processing Models
1997-10-01
Declarative memory holds information that people can describe or report, including both episodic and semantic information, represented as a network of knowledge. A second dimension corresponds roughly to the distinction between episodic and semantic memory that is commonly made in cognitive psychology. Partition 3 is long-term memory for the discourse, a subset of episodic memory; Partition 4 is long-term semantic memory, or the knowledge base.
Kristofer Johnson; Frederick N. Scatena; Yude Pan
2010-01-01
The long-term response of total soil organic carbon pools ('total SOC', i.e. soil and dead wood) to different harvesting scenarios in even-aged northern hardwood forest stands was evaluated using two soil carbon models, CENTURY and YASSO, that were calibrated with forest plot empirical data in the Green Mountains of Vermont. Overall, 13 different harvesting...
Dachir, Shlomit; Cohen, Maayan; Kamus-Elimeleh, Dikla; Fishbine, Eliezer; Sahar, Rita; Gez, Rellie; Brandeis, Rachel; Horwitz, Vered; Kadar, Tamar
2012-01-01
Sulfur mustard induces severe acute and prolonged damage to the skin, and only partially effective treatments are available. We have previously validated the use of hairless guinea pigs as an experimental model for skin lesions. The present study aimed to characterize a model of a deep dermal lesion and to compare it with the previously described superficial lesion. Clinical evaluation of the lesions was conducted using reflectance colorimetry, trans-epidermal water loss and wound area measurements. Prostaglandin E(2) content, matrix metalloproteinase-2 and 9 activity, and histopathology were assessed up to 4 weeks post-exposure. Sulfur mustard skin injury, including erythema and edema, impairment of the skin barrier and wounds, developed in a dose-dependent manner. Prostaglandin E(2) content and matrix metalloproteinase-2 and 9 activities were elevated during wound development and the healing process. Histological evaluation revealed severe damage to the epidermis and deep dermis and vesications. At 4 weeks post-exposure, healing was not complete: a significantly impaired stratum corneum, absence of hair follicles, and epidermal hyperplasia were observed. These results confirm the use of the superficial and deep dermal skin injuries in hairless guinea pigs as suitable models that can be utilized for the investigation of the pathological processes of acute as well as long-term injuries. These models will be further used to develop treatments to improve the healing process and prevent skin damage and long-term effects. © 2012 by the Wound Healing Society.
Two Decades of WRF/CMAQ simulations over the continental ...
Confidence in the application of models for forecasting and regulatory assessments is furthered by conducting four types of model evaluation: operational, dynamic, diagnostic, and probabilistic. Operational model evaluation alone does not reveal the confidence limits that can be associated with modeled air quality concentrations. This paper presents novel approaches for performing dynamic model evaluation and for evaluating the confidence limits of ozone exceedances using WRF/CMAQ model simulations over the continental United States for the period from 1990 to 2010. The methodology entails spectral decomposition of ozone time series using the KZ filter to assess the variations in the strengths of the synoptic (i.e., weather-induced variation) and baseline (i.e., long-term variation attributable to emissions, policy, and trends) forcings embedded in the modeled and observed concentrations. A method is presented in which future-year observations are estimated from the changes in the concentrations predicted by the model applied to the current-year observations. The proposed method can provide confidence limits for ozone exceedances for a given emission reduction scenario. We present and discuss these new approaches to identify the strengths of the model in representing the changes in simulated O3 air quality over the 21-year period.
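A minimal sketch of the Kolmogorov-Zurbenko (KZ) filter used for the spectral decomposition described above; the window lengths and iteration counts shown are common choices for ozone baseline/synoptic separation, assumed here for illustration.

```python
# Hedged sketch: KZ(m, k) is k passes of a centered m-point moving average.
import numpy as np

def kz_filter(x, window, iterations):
    """Iterated centered moving average; NaNs are ignored within each window."""
    y = np.asarray(x, dtype=float)
    half = window // 2
    for _ in range(iterations):
        out = np.full_like(y, np.nan)
        for i in range(len(y)):
            lo, hi = max(0, i - half), min(len(y), i + half + 1)
            out[i] = np.nanmean(y[lo:hi])
        y = out
    return y

daily_o3 = np.sin(np.linspace(0, 40, 2000)) + np.random.default_rng(0).normal(0, 0.3, 2000)
baseline = kz_filter(daily_o3, 15, 5)          # long-term component
synoptic = kz_filter(daily_o3, 3, 3) - baseline  # weather-scale component
print(baseline[:3], synoptic[:3])
```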
LOX/hydrocarbon auxiliary propulsion system study
NASA Technical Reports Server (NTRS)
Orton, G. F.; Mark, T. D.; Weber, D. D.
1982-01-01
Liquid oxygen/hydrocarbon propulsion systems applicable to a second generation orbiter OMS/RCS were compared, and major system/component options were evaluated. A large number of propellant combinations and system concepts were evaluated. The ground rules were defined in terms of candidate propellants, system/component design options, and design requirements. System and engine component math models were incorporated into existing computer codes for system evaluations. The detailed system evaluations and comparisons were performed to identify the recommended propellant combination and system approach.
Evaluation of the UnTRIM model for 3-D tidal circulation
Cheng, R.T.; Casulli, V.; ,
2001-01-01
A family of numerical models, known as the TRIM models, shares the same modeling philosophy for solving the shallow water equations. A characteristic analysis of the shallow water equations points out that numerical instability is controlled by the gravity wave terms in the momentum equations and by the transport terms in the continuity equation. A semi-implicit finite-difference scheme has been formulated so that these terms and the vertical diffusion terms are treated implicitly and the remaining terms explicitly to control numerical stability, and the computations are carried out over a uniform finite-difference computational mesh without invoking horizontal or vertical coordinate transformations. An unstructured-grid version of the TRIM model is introduced, UnTRIM (pronounced "you trim"), which preserves these basic numerical properties and the same modeling philosophy, but carries out the computations over an unstructured orthogonal grid. The unstructured grid offers flexibility in representing complex study areas, so that fine grid resolution can be placed in regions of interest and coarse grids can cover the remaining domain. Thus, the computational effort is concentrated in areas of importance, and an overall computational saving can be achieved because the total number of grid points is dramatically reduced. To use this modeling approach, an unstructured grid mesh must be generated to properly reflect the properties of the domain under investigation. The new flexibility in grid structure is accompanied by new challenges in grid generation. To take full advantage of this flexibility, model grid generation should be guided by insight into the physics of the problem, and the insight needed may require a higher degree of modeling skill.
NASA Astrophysics Data System (ADS)
Zhang, Qian; Ball, William P.
2017-04-01
Regression-based approaches are often employed to estimate riverine constituent concentrations and fluxes based on typically sparse concentration observations. One such approach is the recently developed WRTDS ("Weighted Regressions on Time, Discharge, and Season") method, which has been shown to provide more accurate estimates than prior approaches in a wide range of applications. Centered on WRTDS, this work was aimed at developing improved models for constituent concentration and flux estimation by accounting for antecedent discharge conditions. Twelve modified models were developed and tested, each of which contains one additional flow variable to represent antecedent conditions and which can be directly derived from the daily discharge record. High-resolution (∼daily) data at nine diverse monitoring sites were used to evaluate the relative merits of the models for estimation of six constituents - chloride (Cl), nitrate-plus-nitrite (NOx), total Kjeldahl nitrogen (TKN), total phosphorus (TP), soluble reactive phosphorus (SRP), and suspended sediment (SS). For each site-constituent combination, 30 concentration subsets were generated from the original data through Monte Carlo subsampling and then used to evaluate model performance. For the subsampling, three sampling strategies were adopted: (A) 1 random sample each month (12/year), (B) 12 random monthly samples plus an additional 8 random samples per year (20/year), and (C) flow-stratified sampling with 12 regular (non-storm) and 8 storm samples per year (20/year). Results reveal that estimation performance varies with both model choice and sampling strategy. In terms of model choice, the modified models show general improvement over the original model under all three sampling strategies. Major improvements were achieved for NOx by the long-term flow-anomaly model and for Cl by the ADF (average discounted flow) model and the short-term flow-anomaly model. Moderate improvements were achieved for SS, TP, and TKN by the ADF model. By contrast, no such improvement was achieved for SRP by any proposed model. In terms of sampling strategy, performance of all models (including the original) was generally best using strategy C and worst using strategy A, especially for SS, TP, and SRP, confirming the value of routinely collecting stormflow samples. Overall, this work provides a comprehensive set of statistical evidence supporting the incorporation of antecedent discharge conditions into the WRTDS model for estimation of constituent concentration and flux, thereby combining the advantages of two recent developments in water quality modeling.
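One of the antecedent-flow covariates named above, the average discounted flow (ADF), can be sketched as an exponentially weighted running average of daily discharge; the discount rate below is an assumed illustrative value, not the paper's calibrated one.

```python
# Hedged sketch of an ADF-style antecedent-discharge covariate.
import numpy as np

def average_discounted_flow(q, rate=0.01):
    """ADF_t = (1 - rate) * ADF_{t-1} + rate * q_t, seeded with the first value."""
    q = np.asarray(q, dtype=float)
    adf = np.empty_like(q)
    adf[0] = q[0]
    for t in range(1, len(q)):
        adf[t] = (1.0 - rate) * adf[t - 1] + rate * q[t]
    return adf

# ln(ADF) would then enter the regression alongside the WRTDS time,
# discharge, and season terms.
print(average_discounted_flow([10.0, 12.0, 50.0, 11.0])[-1])
```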
NASA Technical Reports Server (NTRS)
Johnson, R. W.
1974-01-01
A mathematical model of an ecosystem is developed. Secondary productivity is evaluated in terms of man-related and controllable factors. Information from an existing physical parameters model is used as well as pertinent biological measurements. Predictive information of value to estuarine management is presented. Biological, chemical, and physical parameters measured in order to develop models of ecosystems are identified.
A low-order model of the equatorial ocean-atmosphere system
NASA Astrophysics Data System (ADS)
Legnani, Roberto
A low order model of the equatorial ocean-atmosphere coupled system is presented. The model atmosphere includes a hydrological cycle with cloud-radiation interaction. The model ocean is based on mixed layer dynamics with a parameterization of entrainment processes. The coupling takes place via transfer of momentum, sensible heat, latent heat and short wave and long wave radiation through the ocean surface. The dynamical formulation is that of the primitive equations of an equatorial beta-plane, with zonally periodic and meridionally infinite geometry. The system is expanded into the set of normal modes pertinent to the linear problem and severely truncated to a few modes; 54 degrees of freedom are retained. Some nonlinear terms of the equations are evaluated in physical space and then projected onto the functional space; other terms are evaluated directly in the functional space. Sensitivity tests to variations of the parameters are performed, and some results from 10-year initial value simulations are presented. The model is capable of supporting oscillations of different time scales, ranging from a few days to a few years; it prefers a particular zonally asymmetric state, but temporarily switches to a different (opposite) zonally asymmetric state in an event-like fashion.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhatt, Uma S.; Wackerbauer, Renate; Polyakov, Igor V.
The goal of this research was to apply fractional and non-linear analysis techniques in order to develop a more complete characterization of climate change and variability for the oceanic, sea ice and atmospheric components of the Earth System. This research applied two measures of the dynamical characteristics of time series, the R/S method of calculating the Hurst exponent and Renyi entropy, to observational and modeled climate data in order to evaluate how well climate models capture the long-term dynamics evident in observations. Fractional diffusion analysis was applied to ARGO ocean buoy data to quantify ocean transport. Self-organized maps were applied to North Pacific sea level pressure and analyzed in ways to improve seasonal predictability for Alaska fire weather. This body of research shows that these methods can be used to evaluate climate models and shed light on climate mechanisms (i.e., understanding why something happens). With further research, these methods show promise for improving seasonal to longer time scale forecasts of climate.
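A compact sketch of the rescaled-range (R/S) estimate of the Hurst exponent applied in this work; the window sizes are illustrative, and the production analyses presumably used more careful preprocessing.

```python
# Hedged sketch: compute R/S over windows of increasing size and fit the
# slope of log(R/S) versus log(window length); the slope approximates H.
import numpy as np

def rs_statistic(x):
    """Rescaled range R/S of one window."""
    z = np.cumsum(x - np.mean(x))
    r = z.max() - z.min()
    s = np.std(x)
    return r / s if s > 0 else np.nan

def hurst_exponent(x, windows=(8, 16, 32, 64, 128)):
    x = np.asarray(x, dtype=float)
    log_n, log_rs = [], []
    for n in windows:
        chunks = [x[i:i + n] for i in range(0, len(x) - n + 1, n)]
        rs = np.nanmean([rs_statistic(c) for c in chunks])
        log_n.append(np.log(n)); log_rs.append(np.log(rs))
    return np.polyfit(log_n, log_rs, 1)[0]

x = np.random.default_rng(0).normal(size=1024)
print(hurst_exponent(x))   # white noise: H near 0.5
```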
A Case Study on Sepsis Using PubMed and Deep Learning for Ontology Learning.
Arguello Casteleiro, Mercedes; Maseda Fernandez, Diego; Demetriou, George; Read, Warren; Fernandez Prieto, Maria Jesus; Des Diz, Julio; Nenadic, Goran; Keane, John; Stevens, Robert
2017-01-01
We investigate the application of distributional semantics models for facilitating unsupervised extraction of biomedical terms from unannotated corpora. Term extraction is used as the first step of an ontology learning process that aims at the (semi-)automatic annotation of biomedical concepts and relations from more than 300K PubMed titles and abstracts. We experimented with traditional distributional semantics methods such as Latent Semantic Analysis (LSA) and Latent Dirichlet Allocation (LDA) as well as the neural language models CBOW and Skip-gram from Deep Learning. The evaluation concentrates on sepsis, a major life-threatening condition, and shows that the Deep Learning models outperform LSA and LDA with much higher precision.
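A hedged sketch of the Skip-gram experiment described above using the gensim Word2Vec implementation; the toy corpus and hyperparameters below are stand-ins for the 300K PubMed titles and abstracts.

```python
# Hedged sketch: train Skip-gram embeddings and query nearest terms.
from gensim.models import Word2Vec

sentences = [["septic", "shock", "requires", "early", "antibiotics"],
             ["sepsis", "is", "a", "life", "threatening", "condition"]]  # toy corpus

model = Word2Vec(sentences, vector_size=100, window=5, min_count=1, sg=1)  # sg=1: Skip-gram
# With a corpus this small the neighbours are meaningless; on a large corpus
# the nearest neighbours of a seed term are candidate related terms.
print(model.wv.most_similar("sepsis", topn=3))
```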
Linking agent-based models and stochastic models of financial markets
Feng, Ling; Li, Baowen; Podobnik, Boris; Preis, Tobias; Stanley, H. Eugene
2012-01-01
It is well-known that financial asset returns exhibit fat-tailed distributions and long-term memory. These empirical features are the main objectives of modeling efforts using (i) stochastic processes to quantitatively reproduce these features and (ii) agent-based simulations to understand the underlying microscopic interactions. After reviewing selected empirical and theoretical evidence documenting the behavior of traders, we construct an agent-based model to quantitatively demonstrate that “fat” tails in return distributions arise when traders share similar technical trading strategies and decisions. Extending our behavioral model to a stochastic model, we derive and explain a set of quantitative scaling relations of long-term memory from the empirical behavior of individual market participants. Our analysis provides a behavioral interpretation of the long-term memory of absolute and squared price returns: They are directly linked to the way investors evaluate their investments by applying technical strategies at different investment horizons, and this quantitative relationship is in agreement with empirical findings. Our approach provides a possible behavioral explanation for stochastic models for financial systems in general and provides a method to parameterize such models from market data rather than from statistical fitting. PMID:22586086
Valsecchi, M G; Silvestri, D; Sasieni, P
1996-12-30
We consider methodological problems in evaluating long-term survival in clinical trials. In particular, we examine several methods that extend the basic Cox regression analysis. With long-term follow-up, the proportional hazards (PH) assumption may easily be violated, and a few long-term survivors may have a large effect on parameter estimates. We consider both model selection and robust estimation in a data set of 474 ovarian cancer patients enrolled in a clinical trial and followed for between 7 and 12 years after randomization. Two diagnostic plots for assessing goodness of fit are introduced. One shows the variation in time of the parameter estimates and is an alternative to PH checking based on time-dependent covariates. The other takes advantage of the martingale residual process in time to represent lack of fit with a metric of the 'observed minus expected' number of events type. Robust estimation is carried out by maximizing a weighted partial likelihood that downweights the contribution of influential observations. This type of complementary analysis of the long-term results of clinical studies is useful in assessing the soundness of conclusions about treatment effect. In the example analysed here, the difference in survival between treatments was mostly confined to individuals who survived at least two years beyond randomization.
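The PH diagnostics discussed above can be reproduced in spirit with the lifelines library (not the authors' software); the bundled Rossi recidivism dataset stands in for the ovarian cancer trial data.

```python
# Hedged sketch: fit a Cox model and run Schoenfeld-residual-based PH checks.
from lifelines import CoxPHFitter
from lifelines.datasets import load_rossi

df = load_rossi()                       # stand-in dataset, not the ovarian trial
cph = CoxPHFitter().fit(df, duration_col="week", event_col="arrest")
cph.check_assumptions(df, p_value_threshold=0.05)   # flags covariates whose
                                                    # effects appear to vary in time
```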
Implementation of an Integrated On-Board Aircraft Engine Diagnostic Architecture
NASA Technical Reports Server (NTRS)
Armstrong, Jeffrey B.; Simon, Donald L.
2012-01-01
An on-board diagnostic architecture for aircraft turbofan engine performance trending, parameter estimation, and gas-path fault detection and isolation has been developed and evaluated in a simulation environment. The architecture incorporates two independent models: a real-time self-tuning performance model providing parameter estimates and a performance baseline model for diagnostic purposes reflecting long-term engine degradation trends. This architecture was evaluated using flight profiles generated from a nonlinear model with realistic fleet engine health degradation distributions and sensor noise. The architecture was found to produce acceptable estimates of engine health and unmeasured parameters, and the integrated diagnostic algorithms were able to perform correct fault isolation in approximately 70 percent of the tested cases.
Gustafson, William Jr; Vogelmann, Andrew; Endo, Satoshi; Toto, Tami; Xiao, Heng; Li, Zhijin; Cheng, Xiaoping; Kim, Jinwon; Krishna, Bhargavi
2015-08-31
The Alpha 2 release is the second release from the LASSO Pilot Phase and builds upon the Alpha 1 release. Alpha 2 contains additional diagnostics in the data bundles and focuses on cases from spring-summer 2016. A data bundle is a unified package consisting of LASSO LES input and output, observations, evaluation diagnostics, and model skill scores. LES inputs include model configuration information and forcing data. LES outputs include profile statistics and full-domain fields of cloud and environmental variables. Model evaluation data consist of LES output and ARM observations co-registered on the same grid and sampling frequency. Model performance is quantified by skill scores and diagnostics in terms of cloud and environmental variables.
NASA Astrophysics Data System (ADS)
Merker, L.; Costi, T. A.
2012-08-01
We introduce a method to obtain the specific heat of quantum impurity models via a direct calculation of the impurity internal energy requiring only the evaluation of local quantities within a single numerical renormalization group (NRG) calculation for the total system. For the Anderson impurity model we show that the impurity internal energy can be expressed as a sum of purely local static correlation functions and a term that involves also the impurity Green function. The temperature dependence of the latter can be neglected in many cases, thereby allowing the impurity specific heat C_imp to be calculated accurately from local static correlation functions; specifically via C_imp = ∂E_ionic/∂T + (1/2) ∂E_hyb/∂T, where E_ionic and E_hyb are the energy of the (embedded) impurity and the hybridization energy, respectively. The term involving the Green function can also be evaluated in cases where its temperature dependence is non-negligible, adding an extra term to C_imp. For the nondegenerate Anderson impurity model, we show by comparison with exact Bethe ansatz calculations that the results recover accurately both the Kondo induced peak in the specific heat at low temperatures as well as the high-temperature peak due to the resonant level. The approach applies to multiorbital and multichannel Anderson impurity models with arbitrary local Coulomb interactions. An application to the Ohmic two-state system and the anisotropic Kondo model is also given, with comparisons to Bethe ansatz calculations. The approach could also be of interest within other impurity solvers, for example, within quantum Monte Carlo techniques.
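For readability, the specific-heat decomposition quoted in this abstract, set in LaTeX:

```latex
C_{\mathrm{imp}} \;=\; \frac{\partial E_{\mathrm{ionic}}}{\partial T}
\;+\; \frac{1}{2}\,\frac{\partial E_{\mathrm{hyb}}}{\partial T}
```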
Evaluation of Subgrid-Scale Models for Large Eddy Simulation of Compressible Flows
NASA Technical Reports Server (NTRS)
Blaisdell, Gregory A.
1996-01-01
The objective of this project was to evaluate and develop subgrid-scale (SGS) turbulence models for large eddy simulations (LES) of compressible flows. During the first phase of the project, results from LES using the dynamic SGS model were compared to those of direct numerical simulations (DNS) of compressible homogeneous turbulence. The second phase of the project involved implementing the dynamic SGS model in a NASA code for simulating supersonic flow over a flat plate. The model has been successfully coded and a series of simulations has been completed. One of the major findings of the work is that numerical errors associated with the finite differencing scheme used in the code can overwhelm the SGS model and adversely affect the LES results. Attached to this overview are three submitted papers: 'Evaluation of the Dynamic Model for Simulations of Compressible Decaying Isotropic Turbulence'; 'The effect of the formulation of nonlinear terms on aliasing errors in spectral methods'; and 'Large-Eddy Simulation of a Spatially Evolving Compressible Boundary Layer Flow'.
Methods for evaluating the predictive accuracy of structural dynamic models
NASA Technical Reports Server (NTRS)
Hasselman, Timothy K.; Chrostowski, Jon D.
1991-01-01
Modeling uncertainty is defined in terms of the difference between predicted and measured eigenvalues and eigenvectors. Data compiled from 22 sets of analysis/test results were used to create statistical databases for large truss-type space structures and both pretest and posttest models of conventional satellite-type space structures. Modeling uncertainty is propagated through the model to produce intervals of uncertainty on frequency response functions, both amplitude and phase. This methodology was used successfully to evaluate the predictive accuracy of several structures, including the NASA CSI Evolutionary Structure tested at Langley Research Center. Test measurements for this structure were within ± one-sigma intervals of predicted accuracy for the most part, demonstrating the validity of the methodology and computer code.
The Impact of Credit on Village Economies
Kaboski, Joseph P.; Townsend, Robert M.
2011-01-01
This paper evaluates the short-term impact of Thailand’s ‘Million Baht Village Fund’ program, among the largest-scale government microfinance initiatives in the world, using pre- and post-program panel data and quasi-experimental cross-village variation in credit per household. We find that the village funds have increased total short-term credit, consumption, agricultural investment, and income growth (from business and labor), but decreased overall asset growth. We also find a positive impact on wages, an important general equilibrium effect. The findings are broadly consistent qualitatively with models of credit-constrained household behavior and models of intermediation and growth. PMID:22844546
Rocket exhaust effluent modeling for tropospheric air quality and environmental assessments
NASA Technical Reports Server (NTRS)
Stephens, J. B.; Stewart, R. B.
1977-01-01
The various techniques for diffusion predictions to support air quality predictions and environmental assessments for aerospace applications are discussed in terms of limitations imposed by atmospheric data. This affords an introduction to the rationale behind the selection of the National Aeronautics and Space Administration (NASA)/Marshall Space Flight Center (MSFC) Rocket Exhaust Effluent Diffusion (REED) program. The models utilized in the NASA/MSFC REED program are explained. This program is then evaluated in terms of some results from a joint MSFC/Langley Research Center/Kennedy Space Center Titan Exhaust Effluent Prediction and Monitoring Program.
Modeling Vortex Generators in the Wind-US Code
NASA Technical Reports Server (NTRS)
Dudek, Julianne C.
2010-01-01
A source term model which simulates the effects of vortex generators was implemented into the Wind-US Navier-Stokes code. The source term added to the Navier-Stokes equations simulates the lift force which would result from a vane-type vortex generator in the flowfield. The implementation is user-friendly, requiring the user to specify only three quantities for each desired vortex generator: the range of grid points over which the force is to be applied and the planform area and angle of incidence of the physical vane. The model behavior was evaluated for subsonic flow in a rectangular duct with a single-vane vortex generator, supersonic flow in a rectangular duct with a counter-rotating vortex generator pair, and subsonic flow in an S-duct with 22 co-rotating vortex generators. The validation results indicate that the source term vortex generator model provides a useful tool for screening vortex generator configurations and gives comparable results to solutions computed using a gridded vane.
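The three user inputs named above (grid-point range, planform area, incidence angle) map naturally onto a per-cell momentum source; the sketch below is a generic vane-force model in that spirit, with a thin-airfoil lift slope assumed, and is not the Wind-US implementation.

```python
# Hedged sketch of a vane-type vortex-generator source term: compute one
# vane's lift-type force and spread it evenly over the tagged grid cells.
import numpy as np

def vg_per_cell_force(rho, u, planform_area, incidence_deg, n_cells):
    """Per-cell momentum source (N) for one vane; 2*pi/rad lift slope assumed."""
    alpha = np.radians(incidence_deg)
    cl = 2.0 * np.pi * alpha                      # thin-airfoil approximation
    lift = 0.5 * rho * u ** 2 * planform_area * cl
    return lift / n_cells

print(vg_per_cell_force(rho=1.2, u=60.0, planform_area=4e-4,
                        incidence_deg=16.0, n_cells=8))
```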
A critical evaluation of monkey models of amnesia and dementia.
Ridley, R M; Baker, H F
1991-01-01
In this review we consider various models of amnesia and dementia in monkeys and examine the validity of such models. In Section 2 we describe the various types of memory tests (tasks) available for use with monkeys and discuss the extent to which these tasks assess different facets of memory according to present theories of human memory. We argue that the rules which govern correct task performance are best regarded as a form of semantic rather than procedural memory, and that when information about stimulus attributes or reward associations is stored long-term then that knowledge is semantic. The demonstration of episodic memory in monkeys is problematic and the term recognition memory has been used too loosely. In particular, it is difficult to dissociate episodic memory for stimulus events from the use of semantic memory for the rule of the task, since dysfunction of either can produce impairment on performance of the same task. Tasks can also be divided into those which assess memory for stimulus-reward associations (evaluative memory) and those which tax stimulus-response associations including spatial and conditional responding (non-evaluative memory). This dissociation cuts across the distinction between semantic and episodic memory. In Section 3 we examine the usefulness of the classification of tasks described in Section 2 in clarifying our understanding of the contribution of the temporal lobes and the cholinergic system to memory. We conclude that evaluative and non-evaluative memory are mediated by separate parallel systems involving the amygdala and hippocampus, respectively.
A critical evaluation of two-equation models for near wall turbulence
NASA Technical Reports Server (NTRS)
Speziale, Charles G.; Abid, Ridha; Anderson, E. Clay
1990-01-01
A variety of two-equation turbulence models, including several versions of the K-epsilon model as well as the K-omega model, are analyzed critically for near-wall turbulent flows from a theoretical and computational standpoint. It is shown that the K-epsilon model has two major problems associated with it: the lack of natural boundary conditions for the dissipation rate and the appearance of higher-order correlations in the balance of terms for the dissipation rate at the wall. Insofar as the former problem is concerned, either physically inconsistent boundary conditions have been used or the boundary conditions for the dissipation rate have been tied to higher-order derivatives of the turbulent kinetic energy, which leads to numerical stiffness. The K-omega model can alleviate these problems since the asymptotic behavior of omega is known in more detail and since its near-wall balance involves only exact viscous terms. However, the modeled form of the omega equation that is used in the literature is incomplete: an exact viscous term is missing, which causes the model to behave in an asymptotically inconsistent manner. By including this viscous term and by introducing new wall damping functions with improved asymptotic behavior, a new K-tau model (where tau ≡ 1/omega is a turbulent time scale) is developed. It is demonstrated that this new model is computationally robust and yields improved predictions for turbulent boundary layers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neary, Vincent Sinclair; Yang, Zhaoqing; Wang, Taiping
A wave model test bed is established to benchmark, test and evaluate spectral wave models and modeling methodologies (i.e., best practices) for predicting the wave energy resource parameters recommended by the International Electrotechnical Commission, IEC TS 62600-101 Ed. 1.0 (2015). Among other benefits, the model test bed can be used to investigate the suitability of different models, specifically which source terms should be included in spectral wave models under different wave climate conditions and for different classes of resource assessment. The overarching goal is to use these investigations to provide industry guidance for model selection and modeling best practices depending on the wave site conditions and the desired class of resource assessment. Modeling best practices are reviewed, and limitations and knowledge gaps in predicting wave energy resource parameters are identified.
Newfoundland and Labrador: 80/20 staffing model pilot in a long-term care facility.
Stuckless, Trudy; Power, Margaret
2012-03-01
This project, based in Newfoundland and Labrador's Central Regional Health Authority, is the first application of an 80/20 staffing model to a long-term care facility in Canada. The model allows nurse participants to spend 20% of their paid time pursuing a professional development activity instead of providing direct patient care. Newfoundland and Labrador has the highest aging demographic in Canada owing, in part, to the out-migration of younger adults. Recruiting and retaining nurses to work in long-term care in the province is difficult; at the same time, the increasing acuity of long-term care residents and their complex care needs mean that nurses must assume greater leadership roles in these facilities. This project set out to increase capacity for registered nurse (RN) leadership, training and support and to enhance the profile of long-term care as a place to work. Six RNs and one licensed practical nurse (LPN) participated and engaged in a range of professional development activities. Several of the participants are now pursuing further nursing educational activities. Central Health plans to continue a 90/10 model for one RN and one LPN per semester, with the timeframe to be determined. The model will be evaluated and, if it is deemed successful, the feasibility of implementing it in other sites throughout the region will be explored.
A comparison of linear interpolation models for iterative CT reconstruction.
Hahn, Katharina; Schöndube, Harald; Stierstorfer, Karl; Hornegger, Joachim; Noo, Frédéric
2016-12-01
Recent reports indicate that model-based iterative reconstruction methods may improve image quality in computed tomography (CT). One difficulty with these methods is the number of options available to implement them, including the selection of the forward projection model and the penalty term. Currently, the literature is fairly scarce in terms of guidance regarding this selection step, even though these options impact image quality. Here, the authors investigate the merits of three forward projection models that rely on linear interpolation: the distance-driven method, Joseph's method, and the bilinear method. The authors' selection is motivated by three factors: (1) in CT, linear interpolation is often seen as a suitable trade-off between discretization errors and computational cost, (2) the first two methods are popular with manufacturers, and (3) the third method enables assessing the importance of a key assumption in the other methods. One approach to evaluate forward projection models is to inspect their effect on discretized images, as well as the effect of their transpose on data sets, but the significance of such studies is unclear since the matrix and its transpose are always jointly used in iterative reconstruction. Another approach is to investigate the models in the context they are used, i.e., together with statistical weights and a penalty term. Unfortunately, this approach requires the selection of a preferred objective function and does not provide clear information on features that are intrinsic to the model. The authors adopted the following two-stage methodology. First, the authors analyze images that progressively include components of the singular value decomposition of the model in a reconstructed image without statistical weights and penalty term. Next, the authors examine the impact of weights and penalty on observed differences. Image quality metrics were investigated for 16 different fan-beam imaging scenarios that enabled probing various aspects of all models. The metrics include a surrogate for computational cost, as well as bias, noise, and an estimation task, all at matched resolution. The analysis revealed fundamental differences in terms of both bias and noise. Task-based assessment appears to be required to appreciate the differences in noise; the estimation task the authors selected showed that these differences balance out to yield similar performance. Some scenarios highlighted merits for the distance-driven method in terms of bias but with an increase in computational cost. Three combinations of statistical weights and penalty term showed that the observed differences remain the same, but a strong edge-preserving penalty can dramatically reduce the magnitude of these differences. In many scenarios, Joseph's method seems to offer an interesting compromise between cost and computational effort. The distance-driven method offers the possibility to reduce bias but with an increase in computational cost. The bilinear method indicated that a key assumption in the other two methods is highly robust. Last, a strong edge-preserving penalty can act as a compensator for insufficiencies in the forward projection model, bringing all models to similar levels in the most challenging imaging scenarios. Also, the authors find that their evaluation methodology helps in appreciating how model, statistical weights, and penalty term interplay together.
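Of the three interpolation models compared above, Joseph's method is the simplest to sketch: step along the ray's dominant axis and linearly interpolate in the other. The 2-D, mostly-horizontal-ray case below assumes unit pixel spacing; near-vertical rays would be handled by swapping axes (omitted).

```python
# Hedged sketch of Joseph's forward projection for a 2-D image.
import numpy as np

def joseph_line_integral(img, slope, intercept):
    """Integral of img along y = slope*x + intercept (|slope| <= 1 assumed).
    Steps one column at a time and linearly interpolates between the two
    neighbouring rows; unit pixel spacing."""
    nx, ny = img.shape
    total = 0.0
    for ix in range(nx):
        y = slope * ix + intercept
        iy0 = int(np.floor(y))
        w = y - iy0
        if 0 <= iy0 < ny - 1:
            total += (1.0 - w) * img[ix, iy0] + w * img[ix, iy0 + 1]
    return total * np.sqrt(1.0 + slope ** 2)   # path length per column step

phantom = np.zeros((64, 64)); phantom[24:40, 24:40] = 1.0
print(joseph_line_integral(phantom, slope=0.2, intercept=20.0))
```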
NASA Technical Reports Server (NTRS)
French, V. (Principal Investigator)
1982-01-01
An evaluation was made of Thompson-type models which use trend terms (as a surrogate for technology), meteorological variables based on monthly average temperature, and total precipitation to forecast and estimate corn yields in Iowa, Illinois, and Indiana. Pooled and unpooled Thompson-type models were compared. Neither was found to be consistently superior to the other. Yield reliability indicators show that the models are of limited use for large area yield estimation. The models are objective and consistent with scientific knowledge. Timely yield forecasts and estimates can be made during the growing season by using normals or long range weather forecasts. The models are not costly to operate and are easy to use and understand. The model standard errors of prediction do not provide a useful current measure of modeled yield reliability.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kumar, Prashant, E-mail: prashantkumar@csio.res.in; Academy of Scientific and Innovative Research—CSIO, Chandigarh 160030; Bansod, Baban K.S.
2015-02-15
Groundwater vulnerability maps are useful for decision making in land use planning and water resource management. This paper reviews the various groundwater vulnerability assessment models developed across the world. Each model has been evaluated in terms of its pros and cons and the environmental conditions of its application. The paper further discusses the validation techniques used for the vulnerability maps generated by the various models. Implicit challenges associated with the development of groundwater vulnerability assessment models have also been identified, with scientific consideration of parameter relations and their selection. - Highlights: • Various index-based groundwater vulnerability assessment models have been discussed. • A comparative analysis of the models and their applicability in different hydrogeological settings has been discussed. • Research problems of underlying vulnerability assessment models are also reported in this review paper.
Evaluation of trends in wheat yield models
NASA Technical Reports Server (NTRS)
Ferguson, M. C.
1982-01-01
Trend terms in models for wheat yield in the U.S. Great Plains for the years 1932 to 1976 are evaluated. The subset of meteorological variables yielding the largest adjusted R² is selected using the method of leaps and bounds. Latent root regression is used to eliminate multicollinearities, and generalized ridge regression is used to introduce bias to provide stability in the data matrix. The regression model used provides for two trends in each of two models: a dependent model in which the trend line is piecewise continuous, and an independent model in which the trend line is discontinuous at the year of the slope change. It was found that the trend lines best describing the wheat yields consisted of combinations of increasing, decreasing, and constant trends: four combinations for the dependent model and seven for the independent model.
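A toy sketch of the model family evaluated above: a piecewise-linear trend (with a slope change at an assumed year) plus meteorological covariates, fitted with ridge regularization to stabilize a collinear design. The synthetic data stand in for the 1932-1976 yields; this is not the original workflow.

```python
# Hedged sketch: piecewise trend + weather covariates with ridge shrinkage.
import numpy as np
from sklearn.linear_model import Ridge

years = np.arange(1932, 1977)
trend1 = years - years.min()
trend2 = np.maximum(years - 1955, 0)          # slope change at an assumed year
X_met = np.random.default_rng(1).normal(size=(len(years), 3))  # stand-in weather vars
X = np.column_stack([trend1, trend2, X_met])
y = 20 + 0.3 * trend1 + 0.1 * trend2 + X_met @ np.array([1.0, -0.5, 0.2])  # synthetic yields

model = Ridge(alpha=1.0).fit(X, y)            # bias introduced for stability
print(model.coef_)
```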
Tangen, Brian A.; Gleason, Robert A.; Stamm, John F.
2013-01-01
Many wetland impoundments managed by the U.S. Fish and Wildlife Service (USFWS) National Wildlife Refuge System throughout the northern Great Plains rely on rivers as a primary water source. A large number of these impoundments currently are being stressed from changes in water supplies and quality, and these problems are forecast to worsen because of projected changes to climate and land use. For example, many managed wetlands in arid regions have become degraded owing to the long-term accumulation of salts and increased salinity associated with evapotranspiration. A primary goal of the USFWS is to provide aquatic habitats for a diversity of waterbirds; thus, wetland managers would benefit from a tool that facilitates evaluation of wetland habitat quality in response to current and anticipated impacts of altered hydrology and salt balances caused by factors such as climate change, water availability, and management actions. A spreadsheet model that simulates the overall water and salinity balance (WSB model) of managed wetland impoundments is presented. The WSB model depicts various habitat metrics, such as water depth, salinity, and surface areas (inundated, dry), which can be used to evaluate alternative management actions under various water-availability and climate scenarios. The WSB model uses widely available spreadsheet software, is relatively simple to use, relies on widely available inputs, and is readily adaptable to specific locations. The WSB model was validated using data from three National Wildlife Refuges with direct and indirect connections to water resources associated with rivers, and common data limitations are highlighted. The WSB model also was used to conduct simulations based on hypothetical climate and management scenarios to demonstrate the utility of the model for evaluating alternative management strategies and climate futures. The WSB model worked well across a range of National Wildlife Refuges and could be a valuable tool for USFWS staff when evaluating system state and management alternatives and establishing long-term goals and objectives.
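The bookkeeping at the core of a water-and-salinity-balance model like the WSB can be sketched in a few lines; the variable names and simplifying assumptions (well-mixed pool, salt leaves only with outflow) are illustrative, not the published spreadsheet logic.

```python
# Hedged sketch: one time step of a water/salt mass balance. ET removes water
# but not salt, so salinity concentrates when ET dominates the outflows.

def step(volume, salt_mass, inflow, inflow_conc, precip, et, outflow):
    """Advance one step; returns (volume, salt_mass, salinity)."""
    volume_new = max(volume + inflow + precip - et - outflow, 0.0)
    conc = salt_mass / volume if volume > 0 else 0.0
    salt_new = max(salt_mass + inflow * inflow_conc - outflow * conc, 0.0)
    salinity = salt_new / volume_new if volume_new > 0 else float("nan")
    return volume_new, salt_new, salinity

v, s = 1.0e6, 2.0e6                  # m^3 of water, grams of salt
for _ in range(12):                  # one year of identical monthly fluxes (demo)
    v, s, sal = step(v, s, inflow=2.0e5, inflow_conc=2.0,
                     precip=5.0e4, et=1.5e5, outflow=1.0e5)
print(v, sal)                        # salinity creeps upward step by step
```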
Watershed Models for Decision Support for Inflows to Potholes Reservoir, Washington
Mastin, Mark C.
2009-01-01
A set of watershed models for four basins (Crab Creek, Rocky Ford Creek, Rocky Coulee, and Lind Coulee), draining into Potholes Reservoir in east-central Washington, was developed as part of a decision support system to aid the U.S. Department of the Interior, Bureau of Reclamation, in managing water resources in east-central Washington State. The project is part of the U.S. Geological Survey and Bureau of Reclamation collaborative Watershed and River Systems Management Program. A conceptual model of hydrology is outlined for the study area that highlights the significant processes that must be captured to accurately simulate discharge under a wide range of conditions. The conceptual model identified the following factors as significant for accurate discharge simulations: (1) influence of frozen ground on peak discharge, (2) evaporation and ground-water flow as major pathways in the system, (3) channel losses, and (4) influence of irrigation practices on reducing or increasing discharge. The Modular Modeling System was used to create a watershed model for the four study basins by combining standard Precipitation Runoff Modeling System modules with modified modules from a previous study and newly modified modules. The model proved unreliable in simulating peak-flow discharge because the index used to track frozen ground conditions was not reliable. Simulated mean monthly and mean annual discharges were more reliable. Data from seven USGS streamflow-gaging stations were used for comparison with simulated discharge in model calibration and evaluation. Mean annual differences between simulated and observed discharge varied from 1.2 to 13.8 percent for all stations used in the comparisons except one station on a regional ground-water discharge stream. Two-thirds of the mean monthly percent differences between the simulated mean and the observed mean discharge for these six stations were between -20 and 240 percent, or in absolute terms, between -0.8 and 11 cubic feet per second. A graphical user interface was developed so the user can easily run the model, make runoff forecasts, and evaluate the results. The models, however, are not reliable for managing short-term operations because of their demonstrated inability to match individual storm peaks and individual monthly discharge values. Short-term forecasting may be improved with real-time monitoring of the extent of frozen ground and the snow-water equivalent in the basin. Despite the models' unreliability for short-term runoff forecasts, they are useful in providing long-term, time-series discharge data where no observed data exist.
Bi-local holography in the SYK model: Perturbations
Jevicki, Antal; Suzuki, Kenta
2016-11-08
We continue the study of the Sachdev-Ye-Kitaev model in the large N limit. Following our formulation in terms of bi-local collective fields with dynamical reparametrization symmetry, we perform perturbative calculations around the conformal IR point. These calculations are organized as an ε expansion, which allows for analytical evaluation of correlators and finite-temperature quantities.
Evaluation of Teaching the IS-LM Model through a Simulation Program
ERIC Educational Resources Information Center
Pablo-Romero, Maria del Populo; Pozo-Barajas, Rafael; Gomez-Calero, Maria de la Palma
2012-01-01
The IS-LM model is a basic tool used in the teaching of short-term macroeconomics. Teaching is essentially done through the use of graphs. However, the way these graphs are traditionally taught does not allow the learner to easily visualise changes in the curves. The IS-LM simulation program overcomes difficulties encountered in understanding the…
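For readers who want to see what such a simulator computes behind the graphs, a minimal sketch is given below: it solves the linear IS and LM equations for equilibrium income and interest rate, then shifts government spending to move the IS curve. All coefficients are illustrative textbook-style values, not those of the program reviewed:

```python
# Linear IS-LM equilibrium sketch (all parameter values are assumptions).
import numpy as np

def islm(a=200, b=0.75, T=100, c=300, d=25, G=400, M=1000, P=2, k=0.5, h=50):
    # IS: Y = a + b(Y - T) + c - d r + G  ->  (1 - b) Y + d r = a - bT + c + G
    # LM: M/P = k Y - h r                 ->  k Y - h r = M/P
    A = np.array([[1 - b, d], [k, -h]])
    rhs = np.array([a - b * T + c + G, M / P])
    Y, r = np.linalg.solve(A, rhs)
    return Y, r

Y0, r0 = islm()
Y1, r1 = islm(G=450)  # fiscal expansion shifts IS to the right
print(f"baseline: Y={Y0:.0f}, r={r0:.2f}  ->  G+50: Y={Y1:.0f}, r={r1:.2f}")
```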
Round Robin evaluation of soil moisture retrieval models for the MetOp-A ASCAT Instrument
NASA Astrophysics Data System (ADS)
Gruber, Alexander; Paloscia, Simonetta; Santi, Emanuele; Notarnicola, Claudia; Pasolli, Luca; Smolander, Tuomo; Pulliainen, Jouni; Mittelbach, Heidi; Dorigo, Wouter; Wagner, Wolfgang
2014-05-01
Global soil moisture observations are crucial to understand hydrologic processes, earth-atmosphere interactions and climate variability. ESA's Climate Change Initiative (CCI) project aims to create a globally consistent long-term soil moisture data set based on the merging of the best available active and passive satellite-based microwave sensors and retrieval algorithms. Within the CCI, a Round Robin evaluation of existing retrieval algorithms for both active and passive instruments was carried out. In this study we present the comparison of five different retrieval algorithms covering three different modelling principles applied to active MetOp-A ASCAT L1 backscatter data. These models include statistical models (Bayesian Regression and Support Vector Regression, provided by the Institute for Applied Remote Sensing, Eurac Research, Italy, and an Artificial Neural Network, provided by the Institute of Applied Physics, CNR-IFAC, Italy), a semi-empirical model (provided by the Finnish Meteorological Institute), and a change detection model (provided by the Vienna University of Technology). The algorithms were applied on L1 backscatter data within the period of 2007-2011, resampled to a 12.5 km grid. The evaluation was performed over 75 globally distributed, quality controlled in situ stations drawn from the International Soil Moisture Network (ISMN), using surface soil moisture data from the Global Land Data Assimilation System (GLDAS) Noah land surface model as a second independent reference. The temporal correlation between the data sets was analyzed and random errors of the different algorithms were estimated using the triple collocation method. Absolute soil moisture values as well as soil moisture anomalies were considered, including both long-term anomalies from the mean seasonal cycle and short-term anomalies from a five-week moving-average window. Results show a very high agreement between all five algorithms for most stations. A slight vegetation dependency of the errors and a spatial decorrelation of the performance patterns of the different algorithms were found. We conclude that future research should focus on understanding, combining and exploiting the advantages of all available modelling approaches rather than trying to optimize one approach to fit every possible condition.
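The triple collocation step mentioned above can be illustrated compactly. Under the classical assumptions (three collocated series with mutually independent, additive errors on a common scale), the error variance of each series follows from pairwise covariances; the series below are synthetic stand-ins for a retrieval, an in situ station, and GLDAS-Noah:

```python
# Classical triple collocation error estimation on synthetic data.
import numpy as np

rng = np.random.default_rng(1)
truth = rng.normal(0.25, 0.05, 2000)            # "true" soil moisture
x = truth + rng.normal(0, 0.02, truth.size)     # e.g. satellite retrieval
y = truth + rng.normal(0, 0.03, truth.size)     # e.g. in situ station
z = truth + rng.normal(0, 0.04, truth.size)     # e.g. land surface model

C = np.cov(np.vstack([x, y, z]))
ex2 = C[0, 0] - C[0, 1] * C[0, 2] / C[1, 2]     # error variance of x
ey2 = C[1, 1] - C[0, 1] * C[1, 2] / C[0, 2]
ez2 = C[2, 2] - C[0, 2] * C[1, 2] / C[0, 1]
print("estimated error std dev:", np.round(np.sqrt([ex2, ey2, ez2]), 3))
# expected roughly [0.02, 0.03, 0.04]
```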
NASA Technical Reports Server (NTRS)
Tuttle, M. E.; Brinson, H. F.
1986-01-01
The impact of slight errors in measured viscoelastic parameters on subsequent long-term viscoelastic predictions is numerically evaluated using the Schapery nonlinear viscoelastic model. Of the seven Schapery parameters, the results indicated that long-term predictions were most sensitive to errors in the power law parameter n. Although errors in the other parameters were significant as well, errors in n dominated all other factors at long times. The process of selecting an appropriate short-term test cycle so as to ensure an accurate long-term prediction was considered, and a short-term test cycle was selected using material properties typical for T300/5208 graphite-epoxy at 149 C. The process of selection is described, and its individual steps are itemized.
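A small numerical experiment makes the dominance of n plausible. Using a generic power-law creep compliance D(t) = D0 + D1·t^n with round placeholder values (not T300/5208 properties), a +5% perturbation of each parameter shows that the effect of an error in n grows with time, while errors in D0 and D1 stay bounded:

```python
# Sensitivity of a power-law creep compliance to +5% parameter errors.
D0, D1, n = 0.5, 0.05, 0.2        # placeholder values, arbitrary units

def D(t, D0=D0, D1=D1, n=n):
    return D0 + D1 * t**n

for t in (1e2, 1e5, 1e8):         # short- vs long-term
    base = D(t)
    print(f"t={t:.0e}:",
          f"dD0 {D(t, D0=1.05*D0)/base - 1:+.3f}",
          f"dD1 {D(t, D1=1.05*D1)/base - 1:+.3f}",
          f"dn  {D(t, n=1.05*n)/base - 1:+.3f}")
```

At t = 1e8 the +5% error in n changes the prediction by roughly 16%, versus about 4% for D1 and 1% for D0, mirroring the conclusion of the study.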
Low Reynolds number k-epsilon modelling with the aid of direct simulation data
NASA Technical Reports Server (NTRS)
Rodi, W.; Mansour, N. N.
1993-01-01
The constant C sub mu and the near-wall damping function f sub mu in the eddy-viscosity relation of the k-epsilon model are evaluated from direct numerical simulation (DNS) data for developed channel and boundary layer flow at two Reynolds numbers each. Various existing f sub mu model functions are compared with the DNS data, and a new function is fitted to the high-Reynolds-number channel flow data. The epsilon-budget is computed for the fully developed channel flow. The relative magnitude of the terms in the epsilon-equation is analyzed with the aid of scaling arguments, and the parameter governing this magnitude is established. Models for the sum of all source and sink terms in the epsilon-equation are tested against the DNS data, and an improved model is proposed.
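The evaluation of f sub mu from simulation data reduces to comparing the "exact" eddy viscosity -(u'v')/(dU/dy) with the modeled C sub mu k^2/epsilon. The sketch below applies that ratio to stand-in profiles constructed to satisfy a Van Driest-type damping shape (an assumption for illustration; real use would load the DNS channel profiles):

```python
# Extracting the damping function f_mu implied by channel data.
import numpy as np

C_MU = 0.09

def eval_f_mu(uv, dUdy, k, eps):
    """f_mu = nu_t_exact / (C_mu k^2 / eps), nu_t_exact = -<u'v'>/(dU/dy)."""
    nu_t_exact = -uv / dUdy
    return nu_t_exact * eps / (C_MU * k**2)

# Stand-in "DNS" profiles, built consistent with an assumed damping shape.
y_plus = np.array([5., 15., 30., 60., 120., 250.])
k = 3.3 * (1 - np.exp(-y_plus / 12))**2
eps = 1.0 / (0.41 * y_plus)
dUdy = 1.0 / (0.41 * y_plus)
f_true = (1 - np.exp(-y_plus / 26))**2       # assumed Van Driest-type shape
uv = -C_MU * f_true * k**2 / eps * dUdy      # shear stress consistent with f_true
print(np.round(eval_f_mu(uv, dUdy, k, eps), 3))   # recovers f_true
```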
[Citizen constitution and social representations: reflecting about health care models].
da Silva, Sílvio Eder Dias; Ramos, Flávia Regina Souza; Martins, Cleusa Rios; Padilha, Maria Itayra; Vasconcelos, Esleane Vilela
2010-12-01
This article presents a reflection on the meaning of the terms citizenship and health, addressing the Theory of Social Representations as a strategy for implementing and evaluating health care models in Brazil. First, a brief history of the concept of citizenship is presented; the article then addresses the principles of freedom and equality according to Kant; the third section presents health as a right of the citizen and a duty of the state. Finally, the Theory of Social Representations is emphasized as a strategy to evaluate and implement the health services provided to citizens by the current health care models in Brazil.
Drug Evaluation in the Plasmodium Falciparum - Aotus Model
1994-03-15
falciparum infections. Although erythromycin is inactive against chloroquine-resistant falciparum infections, an analogue, azithromycin, is effective in vitro...response to chloroquine, and then expand the evaluation of WR 238605, a primaquine analogue, against infections. Each cryopreserved sample was thawed rapidly...confirmed chloroquine-sensitive P. vivax strain was not infective for unaltered Panamanian Aotus.
Svetlana A. (Kushch) Schroder; Sandor F. Toth; Robert L. Deal; Gregory J. Ettl
2016-01-01
Forest owners worldwide are increasingly interested in managing forests to provide a broad suite of ecosystem services, balancing multiple objectives and evaluating management activities in terms of potential tradeoffs. We describe a multi-objective mathematical programming model to quantify tradeoffs in expected sediment delivery and the preservation of Northern...
An Evaluation of the Impact of a Wellness Course in the Undergraduate Psychology Curriculum.
ERIC Educational Resources Information Center
Kushner, Richard I.; Hartigan, Phyllis
Wellness and holistic health models, which focus on lifestyle as a major component of long-term health, are thriving throughout the United States. To evaluate the impact of an undergraduate psychology course dealing with health enhancement, wellness, and prevention issues, 24 college freshmen enrolled in one of two courses for a 10-week period: a…
Evaluation of Long-term Performance of Enhanced Anaerobic Source Zone Bioremediation using mass flux
NASA Astrophysics Data System (ADS)
Haluska, A.; Cho, J.; Hatzinger, P.; Annable, M. D.
2017-12-01
Chlorinated ethene DNAPL source zones in groundwater act as potential long-term sources of contamination as they dissolve, yielding concentrations well above MCLs and posing an ongoing public health risk. Enhanced bioremediation has been applied to treat many source zones with significant promise, but the long-term sustainability of this technology has not been thoroughly assessed. This study evaluated the long-term effectiveness of enhanced anaerobic source zone bioremediation at chloroethene-contaminated sites to determine if the treatment prevented contaminant rebound and removed NAPL from the source zone. Long-term performance was evaluated based on achieving MCL-based contaminant mass fluxes in parent compound concentrations during different monitoring periods. Groundwater concentration versus time data were compiled for six sites, and post-remedial contaminant mass flux data were then measured using passive flux meters at wells both within and down-gradient of the source zone. Post-remedial mass flux data were then combined with pre-remedial water quality data to estimate pre-remedial mass flux. This information was used to characterize a DNAPL dissolution source strength function, such as the Power Law Model and the Equilibrium Streamtube Model. The six sites characterized for this study were (1) Former Charleston Air Force Base, Charleston, SC; (2) Dover Air Force Base, Dover, DE; (3) Treasure Island Naval Station, San Francisco, CA; (4) Former Raritan Arsenal, Edison, NJ; (5) Naval Air Station, Jacksonville, FL; and (6) Former Naval Air Station, Alameda, CA. Contaminant mass fluxes decreased for all the sites by the end of the post-treatment monitoring period, and rebound was limited within the source zone. Post-remedial source strength function estimates suggest that decreases in contaminant mass flux will continue to occur at these sites, but a mass flux consistent with MCL levels may never be achieved. Thus, site clean-up goals should be evaluated as order-of-magnitude reductions. Additionally, sites may require monitoring for a minimum of 5 years in order to sufficiently evaluate remedial performance. The study shows that enhanced anaerobic source zone bioremediation contributed to a modest reduction of source zone contaminant mass discharge and appears to have mitigated rebound of chlorinated ethenes.
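The power-law source strength function referenced above relates flux-averaged concentration to remaining source mass, C/C0 = (M/M0)^Γ, and can be stepped forward in time to show how Γ shapes the depletion curve. All quantities in this sketch are illustrative, not values from the six sites:

```python
# Power-law DNAPL source depletion sketch (Euler time stepping, assumed values).
def deplete(M0=100.0, C0=50.0, Q=1000.0, gamma=1.0, years=30, dt=0.01):
    """M0 in kg, C0 in mg/L, Q in m3/yr. Returns (year, C) sampled yearly."""
    M, series = M0, []
    for i in range(int(years / dt)):
        C = C0 * (M / M0) ** gamma if M > 0 else 0.0
        if i % int(1 / dt) == 0:
            series.append((round(i * dt, 1), round(C, 2)))
        M = max(M - Q * C * 1e-3 * dt, 0.0)   # Q*C is g/yr; 1e-3 converts to kg/yr
    return series

for gamma in (0.5, 1.0, 2.0):   # gamma > 1: flux drops early, long tail
    print(gamma, deplete(gamma=gamma)[::10])
```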
Monitoring and modeling of long-term settlements of an experimental landfill in Brazil.
Simões, Gustavo Ferreira; Catapreta, Cícero Antônio Antunes
2013-02-01
Settlement evaluation in sanitary landfills is a complex process, owing to waste heterogeneity, time-varying properties, and the influencing factors and mechanisms involved, such as mechanical compression due to load application and creep, and the physical-chemical and biological processes caused by waste decomposition. Many empirical models for the analysis of long-term settlement in landfills are reported in the literature. This paper presents the results of a settlement monitoring program carried out over 6 years at the Belo Horizonte experimental landfill. Different sets of field data were used to calibrate three long-term settlement prediction models (rheological, hyperbolic and composite). The parameters obtained in the calibration were used to predict the settlements and to compare with actual field data. During the 6-year monitoring period, significant vertical strains (of up to 31%) were observed relative to the initial height of the experimental landfill. The long-term settlement predictions obtained by the hyperbolic and rheological models significantly underestimate the settlements, regardless of the period of data used in the calibration. The best fits were obtained with the composite model, except when 1 year of field data was used in the calibration. The results of the composite model indicate settlement stabilization at larger times and with larger final settlements compared to the hyperbolic and rheological models.
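Calibrating one of these empirical models is a small curve-fitting exercise. The sketch below fits the hyperbolic form S(t) = t/(a + b·t), whose asymptote 1/b is the ultimate settlement, to synthetic stand-in monitoring data (the rheological and composite forms would be fitted the same way):

```python
# Hyperbolic settlement model calibration on synthetic monitoring data.
import numpy as np
from scipy.optimize import curve_fit

def hyperbolic(t, a, b):
    return t / (a + b * t)

t_obs = np.array([0.5, 1, 2, 3, 4, 5, 6])                 # years
s_obs = np.array([0.9, 1.5, 2.2, 2.6, 2.8, 3.0, 3.1])     # settlement, m

(a, b), _ = curve_fit(hyperbolic, t_obs, s_obs, p0=(0.5, 0.3))
print(f"a={a:.3f}, b={b:.3f}, ultimate settlement ~ {1/b:.2f} m")
print("predicted S(20 yr) =", round(hyperbolic(20, a, b), 2), "m")
```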
Electromagnetic interference modeling and suppression techniques in variable-frequency drive systems
NASA Astrophysics Data System (ADS)
Yang, Le; Wang, Shuo; Feng, Jianghua
2017-11-01
Electromagnetic interference (EMI) causes electromechanical damage to the motors and degrades the reliability of variable-frequency drive (VFD) systems. Unlike fundamental-frequency components in motor drive systems, high-frequency EMI noise, coupled with the parasitic parameters of the whole system, is difficult to analyze and reduce. In this article, EMI modeling techniques for different functional units in a VFD system, including induction motors, motor bearings, and rectifier-inverters, are reviewed and evaluated in terms of applied frequency range, model parameterization, and model accuracy. The EMI models for the motors are categorized based on modeling techniques and model topologies. Motor bearing and shaft models are also reviewed, and techniques used to eliminate bearing current are evaluated. Modeling techniques for conventional rectifier-inverter systems are also summarized. EMI noise suppression techniques, including passive filters, Wheatstone bridge balance, active filters, and optimized modulation, are reviewed and compared based on the VFD system models.
Li, F Y; Yan, S Q; Huang, K; Mao, L J; Pan, W J; Ge, X; Han, Y; Hao, J H; Tao, F B
2017-12-10
Objective: To evaluate the relations between hypertensive disorders in pregnancy (HDP) and early-term birth. Methods: A total of 3 474 pregnant women were consecutively recruited. Demographic information was collected in early pregnancy. HDP was diagnosed in the first, second and third trimesters, respectively. On the basis of precise evaluation of gestational age, early-term birth was defined as gestational age of 37 weeks to 38 weeks+6 days. Logistic regression models were used to examine the associations between HDP and early-term birth. Results: The current study included 3 260 pregnant women, with rates of HDP, pregnancy-induced hypertension syndrome and pre-eclampsia of 6.0% (n=194), 4.2% (n=137) and 1.8% (n=57), respectively. After controlling for potential confounders, no significant association between pregnancy-induced hypertension syndrome and early-term birth (OR=1.49, 95%CI: 0.94-2.36) was found. Pre-eclampsia appeared to have increased the risk of early-term birth (OR=4.46, 95%CI: 2.09-9.54). Conclusion: Pre-eclampsia could significantly increase the risk of early-term birth. This finding suggests that early detection and intervention programs would be helpful in reducing the risk of early-term birth.
Climate-tree growth models in relation to long-term growth trends of white oak in Pennsylvania
D. D. Davis; R. P. Long
2003-01-01
We examined long-term growth trends of white oak by comparing tree-ring chronologies developed from an old-growth stand, where the average tree age was 222 years, with a second-growth stand where average tree age was 78 years. Evaluation of basal area growth trends suggested that an anomalous decrease in basal area increment trend occurred in both stands during the...
NASA Astrophysics Data System (ADS)
Dugger, A. L.; Rafieeinasab, A.; Gochis, D.; Yu, W.; McCreight, J. L.; Karsten, L. R.; Pan, L.; Zhang, Y.; Sampson, K. M.; Cosgrove, B.
2016-12-01
Evaluation of physically-based hydrologic models applied across large regions can provide insight into dominant controls on runoff generation and how these controls vary based on climatic, biological, and geophysical setting. To make this leap, however, we need to combine knowledge of regional forcing skill, model parameter and physics assumptions, and hydrologic theory. If we can successfully do this, we also gain information on how well our current approximations of these dominant physical processes are represented in continental-scale models. In this study, we apply this diagnostic approach to a 5-year retrospective implementation of the WRF-Hydro community model configured for the U.S. National Weather Service's National Water Model (NWM). The NWM is a water prediction model in operations over the contiguous U.S. as of summer 2016, providing real-time estimates and forecasts out to 30 days of streamflow across 2.7 million stream reaches as well as distributed snowpack, soil moisture, and evapotranspiration at 1-km resolution. The WRF-Hydro system permits not only the standard simulation of vertical energy and water fluxes common in continental-scale models, but augments these processes with lateral redistribution of surface and subsurface water, simple groundwater dynamics, and channel routing. We evaluate 5 years of NLDAS-2 precipitation forcing and WRF-Hydro streamflow and evapotranspiration simulation across the contiguous U.S. at a range of spatial (gage, basin, ecoregion) and temporal (hourly, daily, monthly) scales and look for consistencies and inconsistencies in performance in terms of bias, timing, and extremes. Leveraging results from other CONUS-scale hydrologic evaluation studies, we translate our performance metrics into a matrix of likely dominant process controls and error sources (forcings, parameter estimates, and model physics). We test our hypotheses in a series of controlled model experiments on a subset of representative basins from distinct "problem" environments (Southeast U.S. Coastal Plain, Central and Coastal Texas, Northern Plains, and Arid Southwest). The results from these longer-term model diagnostics will inform future improvements in forcing bias correction, parameter calibration, and physics developments in the National Water Model.
Model-free adaptive speed control on travelling wave ultrasonic motor
NASA Astrophysics Data System (ADS)
Di, Sisi; Li, Huafeng
2018-01-01
This paper introduces a new data-driven control (DDC) method for the speed control of an ultrasonic motor (USM). The model-free adaptive control (MFAC) strategy is presented in terms of its principles, algorithms, and parameter selection. To verify the efficiency of the proposed method, a speed-frequency-time model, which contains all the measurable nonlinearity and uncertainties based on experimental data, was established for simulation to mimic the USM operation. Furthermore, the model was identified using the particle swarm optimization (PSO) method. Then, the control of the simulated system using MFAC was evaluated under different setpoints in terms of overshoot, rise time, and steady-state error. Finally, the MFAC results were compared with those of proportional-integral-derivative (PID) control to demonstrate its advantages in controlling a general random system.
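For orientation, a compact-form MFAC loop has two parts: a projection update of the pseudo-partial derivative (PPD) estimate and an incremental control law. The sketch below drives a toy nonlinear plant to a speed setpoint; the plant and tuning constants are invented, not the USM test rig or the paper's parameters:

```python
# Compact-form MFAC sketch: PPD estimation plus incremental control law.
eta, mu, rho, lam = 0.8, 1.0, 0.6, 2.0     # assumed tuning constants
phi, u_prev, du_prev, y_prev = 1.0, 0.0, 0.0, 0.0

def plant(y, u):
    """Toy stand-in for the identified speed-frequency-time model."""
    return 0.6 * y + 0.8 * u / (1 + y**2 / 400)

y, setpoint = 0.0, 30.0
for k in range(200):
    dy = y - y_prev
    # PPD estimation (projection algorithm)
    phi += eta * du_prev / (mu + du_prev**2) * (dy - phi * du_prev)
    if abs(phi) < 1e-5 or phi < 0:
        phi = 1.0                          # usual reset to the initial estimate
    # incremental MFAC control law
    u = u_prev + rho * phi / (lam + phi**2) * (setpoint - y)
    y_prev, du_prev, u_prev = y, u - u_prev, u
    y = plant(y, u)

print("speed after 200 steps:", round(y, 2), "(setpoint", setpoint, ")")
```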
Modeling of Long-Term Evolution of Hydrophysical Fields of the Black Sea
NASA Astrophysics Data System (ADS)
Dorofeyev, V. L.; Sukhikh, L. I.
2017-11-01
The long-term evolution of the Black Sea dynamics (1980-2020) is reconstructed by numerical simulation. The model of the Black Sea circulation has 4.8 km horizontal spatial resolution and 40 levels in z-coordinates. The mixing processes in the upper layer are parameterized by the Mellor-Yamada turbulence model. For the sea surface boundary conditions, atmospheric forcing functions were used, provided for the Black Sea region by the Euro-Mediterranean Center on Climate Change (CMCC) from the COSMO-CLM regional climate model. These data have a spatial resolution of 14 km and a daily temporal resolution. To evaluate the quality of the hydrodynamic fields derived from the simulation, they were compared with in-situ hydrological measurements and similar results from a physical reanalysis of the Black Sea.
Pereira, T; Armada-da Silva, P A S; Amorim, I; Rêma, A; Caseiro, A R; Gärtner, A; Rodrigues, M; Lopes, M A; Bártolo, P J; Santos, J D; Luís, A L; Maurício, A C
2014-01-01
Skeletal muscle has good regenerative capacity, but the extent of muscle injury and the developed fibrosis might prevent complete regeneration. The in vivo application of human mesenchymal stem cells (HMSCs) of the umbilical cord and the conditioned media (CM) where the HMSCs were cultured and expanded, associated with different vehicles to induce muscle regeneration, was evaluated in a rat myectomy model. Two commercially available vehicles and a spherical hydrogel developed by our research group were used. The treated groups obtained interesting results in terms of muscle regeneration, both in the histological and in the functional assessments. A less evident scar tissue, demonstrated by collagen type I quantification, was present in the muscles treated with HMSCs or their CM. In terms of the histological evaluation performed by ISO 10993-6 scoring, it was observed that HMSCs apparently have a long-term negative effect, since the groups treated with CM presented better scores. CM could be considered an alternative to the in vivo transplantation of these cells, as it can benefit from the local tissue response to secreted molecules with similar results in terms of muscular regeneration. Searching for an optimal vehicle might be the key point in the future of skeletal muscle tissue engineering.
Monsanto, Pedro; Almeida, Nuno; Lérias, Clotilde; Pina, José Eduardo; Sofia, Carlos
2013-01-01
Maddrey discriminant function (DF) is the traditional model for evaluating the severity and prognosis of alcoholic hepatitis (AH). However, MELD has also been used for this purpose. We aimed to determine the predictive parameters and compare the ability of Maddrey DF and MELD to predict short-term mortality in patients with AH. Retrospective study of 45 patients admitted to our department with AH between 2000 and 2010. Demographic, clinical and laboratory parameters were collected. MELD and Maddrey DF were calculated on admission. Short-term mortality was assessed at 30 and 90 days. Student's t-test, χ2 test, univariate analysis, logistic regression and receiver operating characteristic curves were performed. Thirty-day and 90-day mortality rates were 27% and 42%, respectively. In multivariate analysis, Maddrey DF was the only independent predictor of mortality for these two periods. Receiver operating characteristic curves for Maddrey DF revealed an excellent discriminatory ability to predict 30-day and 90-day mortality for a Maddrey DF greater than 65 and 60, respectively. The discriminatory ability of MELD to predict 30-day and 90-day mortality was low. AH remains associated with high short-term mortality. Maddrey DF is a more valuable model than MELD to predict short-term mortality in patients with AH.
M. T. Kiefer; S. Zhong; W. E. Heilman; J. J. Charney; X. Bian
2013-01-01
Efforts to develop a canopy flow modeling system based on the Advanced Regional Prediction System (ARPS) model are discussed. The standard version of ARPS is modified to account for the effect of drag forces on mean and turbulent flow through a vegetation canopy, via production and sink terms in the momentum and subgrid-scale turbulent kinetic energy (TKE) equations....
Computer-Based Model Calibration and Uncertainty Analysis: Terms and Concepts
2015-07-01
uncertainty analyses throughout the lifecycle of planning, designing, and operating of Civil Works flood risk management projects as described in...value 95% of the time. In the frequentist approach to PE, model parameters are regarded as having true values, and their estimate is based on the...in catchment models.
Quality of Life Changes Following Surgery for Hyperhidrosis.
de Campos, José Ribas Milanez; da Fonseca, Hugo Veiga Sampaio; Wolosker, Nelson
2016-11-01
The best way to evaluate the impact of primary hyperhidrosis on quality of life (QL) is through specific questionnaires, avoiding generic models that do not appropriately evaluate individuals. QL improves significantly in the short term after sympathectomy. In the longer term, a sustained and stable improvement is seen, although there is a small decline in the numbers; after 5 and even at 10 years of follow-up it shows virtually the same numerical distribution. Compensatory hyperhidrosis is a major side effect and the main aggravating factor in postoperative QL, requiring attention to its management and prevention.
NASA Astrophysics Data System (ADS)
Manowitz, D. H.; Schwab, D. E.; Izaurralde, R. C.
2010-12-01
As bioenergy production continues to increase, it is important to be able to predict not only the crop yields that are expected from future production, but also the various environmental impacts that will accompany it. Therefore, models that can be used to make such predictions must be validated against as many of these agricultural outputs as possible. The Environmental Policy Integrated Climate (EPIC) model is a widely used and tested model for simulating many agricultural ecosystem processes including plant growth, crop yield, carbon and nutrient cycling, wind and water erosion, runoff, leaching, as well as changes in soil physical and chemical properties. This model has undergone many improvements, including the addition of a process-based denitrification submodel. Here we evaluate the performance of EPIC in its ability to simulate nitrous oxide (N2O) fluxes and related variables as observed in selected treatments of the Long-Term Ecological Research (LTER) cropping systems study at Kellogg Biological Station (KBS). We will provide a brief description of the EPIC model in the context of bioenergy production, describe the denitrification submodel, and compare simulated and observed values of crop yields, N2O emissions, soil carbon dynamics, and soil moisture.
Chen, Chieh-Fan; Ho, Wen-Hsien; Chou, Huei-Yin; Yang, Shu-Mei; Chen, I-Te; Shi, Hon-Yi
2011-01-01
This study analyzed meteorological, clinical and economic factors in terms of their effects on monthly ED revenue and visitor volume. Monthly data from January 1, 2005 to September 30, 2009 were analyzed. Spearman correlation and cross-correlation analyses were performed to identify the correlation between each independent variable, ED revenue, and visitor volume. Autoregressive integrated moving average (ARIMA) model was used to quantify the relationship between each independent variable, ED revenue, and visitor volume. The accuracies were evaluated by comparing model forecasts to actual values with mean absolute percentage of error. Sensitivity of prediction errors to model training time was also evaluated. The ARIMA models indicated that mean maximum temperature, relative humidity, rainfall, non-trauma, and trauma visits may correlate positively with ED revenue, but mean minimum temperature may correlate negatively with ED revenue. Moreover, mean minimum temperature and stock market index fluctuation may correlate positively with trauma visitor volume. Mean maximum temperature, relative humidity and stock market index fluctuation may correlate positively with non-trauma visitor volume. Mean maximum temperature and relative humidity may correlate positively with pediatric visitor volume, but mean minimum temperature may correlate negatively with pediatric visitor volume. The model also performed well in forecasting revenue and visitor volume.
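The modeling workflow described above, fitting an ARIMA model with an exogenous covariate and scoring a holdout by mean absolute percentage error, looks roughly like the following sketch with simulated series standing in for the hospital data:

```python
# ARIMA with an exogenous weather covariate, scored by MAPE on a holdout.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(7)
n = 57                                          # Jan 2005 .. Sep 2009, monthly
temp = 25 + 8 * np.sin(2 * np.pi * (np.arange(n) % 12) / 12)
revenue = 500 + 12 * temp + rng.normal(0, 30, n)   # simulated monthly revenue

train, test = slice(None, -6), slice(-6, None)
fit = ARIMA(revenue[train], exog=temp[train], order=(1, 0, 1)).fit()
pred = fit.forecast(steps=6, exog=temp[test])
mape = np.mean(np.abs((revenue[test] - pred) / revenue[test])) * 100
print(f"MAPE over the 6-month holdout: {mape:.1f}%")
```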
Uncertainty in Analyzed Water and Energy Budgets at Continental Scales
NASA Technical Reports Server (NTRS)
Bosilovich, Michael G.; Robertson, F. R.; Mocko, D.; Chen, J.
2011-01-01
Operational analyses and retrospective analyses provide all the physical terms of water and energy budgets, guided by the assimilation of atmospheric observations. However, there is significant reliance on the numerical models, and so uncertainty in the budget terms is always present. Here, we use a recently developed data set consisting of a mix of 10 analyses (both operational and retrospective) to quantify the uncertainty of analyzed water and energy budget terms for GEWEX continental-scale regions, following the evaluation approach of Dr. John Roads, who used individual reanalysis data sets.
NASA Astrophysics Data System (ADS)
Memmesheimer, M.; Friese, E.; Jakobs, H. J.; Feldmann, H.; Ebel, A.; Kerschgens, M. J.
During recent years the interest in long-term applications of air quality modeling systems (AQMS) has strongly increased. Most of these models were developed for application to photo-oxidant episodes during the last decade. In this contribution a long-term application of the EURAD modeling system to the year 1997 is presented. Atmospheric particles are included using the Modal Aerosol Dynamics Model for Europe (MADE). Meteorological fields are simulated by the mesoscale meteorological model MM5; gas-phase chemistry is treated with the RACM mechanism. The nesting option is used to zoom in on areas of specific interest. Horizontal grid sizes are 125 km for the regional scale and 5 km for the local scale covering the area of North Rhine-Westphalia (NRW). The results have been compared to observations of the air quality network of the environmental agency of NRW for the year 1997. The model results have been evaluated using the data quality objectives of the EU directive 99/30. Further improvement for application of regional-scale air quality models is needed with respect to emission data bases, coupling to global models to improve the boundary values, interaction between aerosols and clouds, and multiphase modeling.
A survey of Applied Psychological Services' models of the human operator
NASA Technical Reports Server (NTRS)
Siegel, A. I.; Wolf, J. J.
1979-01-01
A historical perspective is presented in terms of the major features and status of two families of computer simulation models in which the human operator plays the primary role. Both task oriented and message oriented models are included. Two other recent efforts are summarized which deal with visual information processing. They involve not whole model development but a family of subroutines customized to add the human aspects to existing models. A global diagram of the generalized model development/validation process is presented and related to 15 criteria for model evaluation.
Cohen, J; Millier, A; Karray, S; Toumi, M
2013-01-01
Switching drugs from prescription to non-prescription status (Rx-to-OTC) presents a unique set of challenges and opportunities to policy-makers and the industry in terms of managing health outcomes, pharmaceutical spending, and steering of consumer choices of therapy. Decision-analytic models are used to address uncertainty and produce reasonable estimates of the economic impact of switches for payers. This article presents a critical literature review of existing models which assess the economic impact of Rx-to-OTC switches, and provides guidelines on how future economic evaluations of Rx-to-OTC switches could be improved. A comprehensive search strategy was implemented in Medline and Embase to retrieve published economic evaluations on Rx-to-OTC switches from 1995-2010. The research digest of the International Society for Pharmacoeconomics and Outcomes Research (ISPOR) was reviewed for potentially relevant abstracts for the past 3 years. Each model used was critically evaluated in terms of structure, relevance of inputs, methodology used, and robustness of results. Worldwide, the economic impact of Rx-to-OTC switches has only been evaluated in a total of 12 peer-reviewed publications. Ten of the 12 studies were US-based, and two European-based. The models covered various disease categories, including allergy, hypercholesterolemia, gastroenterology, contraception, pulmonology, and virology. Seventy-five percent of the models predicted cost savings for payers and patients. Limitations of the models mainly included use of strong assumptions and non-inclusion of specific populations due to lack of data. Guidelines were developed to help future model development. They cover structural issues on decision context, health states, and clinical outcomes, and other considerations for model specifications. Although the reviewed studies were of limited quality, this review of economic evidence of Rx-to-OTC switches suggests that switches may produce cost savings to public and private payers. This is especially important in light of the trend towards more switches.
Modeling Vortex Generators in a Navier-Stokes Code
NASA Technical Reports Server (NTRS)
Dudek, Julianne C.
2011-01-01
A source-term model that simulates the effects of vortex generators was implemented into the Wind-US Navier-Stokes code. The source term added to the Navier-Stokes equations simulates the lift force that would result from a vane-type vortex generator in the flowfield. The implementation is user-friendly, requiring the user to specify only three quantities for each desired vortex generator: the range of grid points over which the force is to be applied and the planform area and angle of incidence of the physical vane. The model behavior was evaluated for subsonic flow in a rectangular duct with a single vane vortex generator, subsonic flow in an S-duct with 22 corotating vortex generators, and supersonic flow in a rectangular duct with a counter-rotating vortex-generator pair. The model was also used to successfully simulate microramps in supersonic flow by treating each microramp as a pair of vanes with opposite angles of incidence. The validation results indicate that the source-term vortex-generator model provides a useful tool for screening vortex-generator configurations and gives comparable results to solutions computed using gridded vanes.
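As a rough sketch of how such a vane source term can be formed from the three user inputs (cell range, planform area, incidence angle), the function below computes a lift force from the local velocity using a thin-airfoil lift slope and distributes it over the flagged cells; the lift model and distribution are assumptions for illustration, not the Wind-US implementation:

```python
# Hypothetical vane-type vortex-generator lift source term (per flagged cell).
import numpy as np

def vg_source(u_cell, rho, S, alpha_deg, n_cells,
              span_dir=np.array([0., 0., 1.])):
    """Return the lift-force source (N) to add in each flagged cell."""
    alpha = np.radians(alpha_deg)
    q = 0.5 * rho * np.dot(u_cell, u_cell)     # dynamic pressure
    cl = 2 * np.pi * np.sin(alpha)             # thin-airfoil lift estimate
    lift_dir = np.cross(span_dir, u_cell)      # perpendicular to local flow
    lift_dir /= np.linalg.norm(lift_dir)
    return q * S * cl * lift_dir / n_cells     # spread evenly over the cells

print(vg_source(u_cell=np.array([100., 0., 0.]), rho=1.2, S=4e-4,
                alpha_deg=16, n_cells=8))
```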
da Fonseca Neto, João Viana; Abreu, Ivanildo Silva; da Silva, Fábio Nogueira
2010-04-01
Toward the synthesis of state-space controllers, a neural-genetic model based on the linear quadratic regulator design for the eigenstructure assignment of multivariable dynamic systems is presented. The neural-genetic model represents a fusion of a genetic algorithm and a recurrent neural network (RNN) to perform the selection of the weighting matrices and the algebraic Riccati equation solution, respectively. A fourth-order electric circuit model is used to evaluate the convergence of the computational intelligence paradigms and the control design method performance. The genetic search convergence evaluation is performed in terms of the fitness function statistics and the RNN convergence, which is evaluated by landscapes of the energy and norm, as a function of the parameter deviations. The control problem solution is evaluated in the time and frequency domains by the impulse response, singular values, and modal analysis.
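The two tasks the neural-genetic scheme divides between the GA and the RNN, weight selection and Riccati solution, can be mimicked with standard tools for comparison. Below, the weights are fixed by hand (where a GA would search) and the algebraic Riccati equation is solved directly (where the RNN iterates); the fourth-order system is illustrative, not the paper's circuit:

```python
# LQR design via the algebraic Riccati equation for a 4th-order system.
import numpy as np
from scipy.linalg import solve_continuous_are

A = np.array([[0, 1, 0, 0],
              [-2, -0.5, 1, 0],
              [0, 0, 0, 1],
              [1, 0, -3, -0.4]], dtype=float)
B = np.array([[0, 0], [1, 0], [0, 0], [0, 1]], dtype=float)
Q = np.diag([10.0, 1.0, 10.0, 1.0])   # weighting matrices a GA would search over
R = np.eye(2)

P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)       # optimal state feedback u = -K x
print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K).round(3))
```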
Quantitative comparison between crowd models for evacuation planning and evaluation
NASA Astrophysics Data System (ADS)
Viswanathan, Vaisagh; Lee, Chong Eu; Lees, Michael Harold; Cheong, Siew Ann; Sloot, Peter M. A.
2014-02-01
Crowd simulation is rapidly becoming a standard tool for evacuation planning and evaluation. However, the many crowd models in the literature are structurally different, and few have been rigorously calibrated against real-world egress data, especially in emergency situations. In this paper we describe a procedure to quantitatively compare different crowd models or between models and real-world data. We simulated three models: (1) the lattice gas model, (2) the social force model, and (3) the RVO2 model, and obtained the distributions of six observables: (1) evacuation time, (2) zoned evacuation time, (3) passage density, (4) total distance traveled, (5) inconvenience, and (6) flow rate. We then used the DISTATIS procedure to compute the compromise matrix of statistical distances between the three models. Projecting the three models onto the first two principal components of the compromise matrix, we find the lattice gas and RVO2 models are similar in terms of the evacuation time, passage density, and flow rates, whereas the social force and RVO2 models are similar in terms of the total distance traveled. Most importantly, we find that the zoned evacuation times of the three models to be very different from each other. Thus we propose to use this variable, if it can be measured, as the key test between different models, and also between models and the real world. Finally, we compared the model flow rates against the flow rate of an emergency evacuation during the May 2008 Sichuan earthquake, and found the social force model agrees best with this real data.
KNGEOID14: A national hybrid geoid model in Korea
NASA Astrophysics Data System (ADS)
Kang, S.; Sung, Y. M.; KIM, H.; Kim, Y. S.
2016-12-01
This study briefly describes the construction of a national hybrid geoid model in Korea, KNGEOID14, which can be used as an accurate vertical datum in and around Korea. The hybrid geoid model is determined by fitting the gravimetric geoid to the geometric geoid undulations from GNSS/Leveling data, which represent the local vertical datum. For developing the gravimetric geoid model, we determined all frequency parts (long, middle, and short) of the gravimetric geoid using all available data with an optimal remove-restore technique based on the EGM2008 reference surface. In the remove-restore technique, the EGM2008 model to degree 360 and the RTM reduction method were used to calculate the long- and short-frequency parts of the gravimetric geoid, respectively. The gravity data compiled for modeling the middle-frequency part (the residual geoid) comprised 8,866 gravity points over land and ocean areas. The DEM data, gridded at 100 m × 100 m, were used for the short-frequency part, i.e., the topographic effect on the geoid generated by the RTM method. The accuracy of the gravimetric geoid model, evaluated by comparison with GNSS/Leveling data, was about -0.362 m ± 0.055 m. Finally, we developed the national hybrid geoid model in Korea, KNGEOID14, by correcting the gravimetric geoid with a correction term obtained by fitting about 1,200 GNSS/Leveling points on Korean benchmarks. The correction term is modeled using the difference between GNSS/Leveling-derived geoidal heights and gravimetric geoidal heights. The stochastic model used in the calculation of the correction term is the LSC technique based on a second-order Markov covariance function. The post-fit error (mean and standard deviation) of the KNGEOID14 model was evaluated as 0.001 m ± 0.033 m. Based on these results, an accurate orthometric height at any point in Korea can be easily and precisely calculated by combining the geoidal height from KNGEOID14 with the ellipsoidal height from GPS observations.
Langtimm, Catherine A.; Kendall, William L.; Beck, Cathy A.; Kochman, Howard I.; Teague, Amy L.; Meigs-Friend, Gaia; Peñaloza, Claudia L.
2016-11-30
This report provides supporting details and evidence for the rationale, validity and efficacy of a new mark-recapture model, the Barker Robust Design, to estimate regional manatee survival rates used to parameterize several components of the 2012 version of the Manatee Core Biological Model (CBM) and Threats Analysis (TA). The CBM and TA provide scientific analyses on population viability of the Florida manatee subspecies (Trichechus manatus latirostris) for U.S. Fish and Wildlife Service’s 5-year reviews of the status of the species as listed under the Endangered Species Act. The model evaluation is presented in a standardized reporting framework, modified from the TRACE (TRAnsparent and Comprehensive model Evaluation) protocol first introduced for environmental threat analyses. We identify this new protocol as TRACE-MANATEE SURVIVAL and this model evaluation specifically as TRACE-MANATEE SURVIVAL, Barker RD version 1. The longer-term objectives of the manatee standard reporting format are to (1) communicate to resource managers consistent evaluation information over sequential modeling efforts; (2) build understanding and expertise on the structure and function of the models; (3) document changes in model structures and applications in response to evolving management objectives, new biological and ecological knowledge, and new statistical advances; and (4) provide greater transparency for management and research review.
Leffa, Douglas Teixeira; Bellaver, Bruna; Salvi, Artur Alban; de Oliveira, Carla; Caumo, Wolnei; Grevet, Eugenio Horacio; Fregni, Felipe; Quincozes-Santos, André; Rohde, Luis Augusto; Torres, Iraci L S
2018-04-05
Transcranial direct current stimulation (tDCS) is a technique that modulates neuronal activity and has been proposed as a potential therapeutic tool for attention-deficit/hyperactivity disorder (ADHD) symptoms. Although pilot studies have shown evidence of efficacy, its mechanism of action remains unclear. We evaluated the effects of tDCS on behavioral (working and long-term memory) and neurochemical (oxidative and inflammatory parameters) outcomes related to ADHD pathophysiology. We used the most widely accepted animal model of ADHD: spontaneously hypertensive rats (SHR). The selected behavioral outcomes have been shown to be altered in both ADHD patients and animal models, and were chosen for their relation to the proposed mechanistic action of tDCS. Adult male SHR and their control, Wistar Kyoto rats (WKY), were subjected to 20 min of bicephalic tDCS or sham stimulation for 8 consecutive days. Working memory, long-term memory, and neurochemical outcomes were evaluated. tDCS improved the long-term memory deficits presented by the SHR. No change in working memory performance was observed. In the hippocampus, tDCS increased both the production of reactive oxygen species in SHR and the levels of the antioxidant molecule glutathione in both strains. tDCS also modulated the inflammatory response in the brains of WKY by downregulating pro-inflammatory cytokines. tDCS had significant effects that were specific to strain and to the type of behavioral and neurochemical outcome. The long-term memory improvement in the SHR may point to a possible therapeutic role of tDCS in ADHD that does not seem to be mediated by inflammatory markers. Additionally, the anti-inflammatory effects observed in the brain of WKY after tDCS need to be further explored.
Bijleveld, Yuma A; de Haan, Timo R; van der Lee, Johanna H; Groenendaal, Floris; Dijk, Peter H; van Heijst, Arno; de Jonge, Rogier C J; Dijkman, Koen P; van Straaten, Henrica L M; Rijken, Monique; Zonnenberg, Inge A; Cools, Filip; Zecic, Alexandra; Nuytemans, Debbie H G M; van Kaam, Anton H; Mathôt, Ron A A
2018-04-01
The pharmacokinetic (PK) properties of intravenous (i.v.) benzylpenicillin in term neonates undergoing moderate hypothermia after perinatal asphyxia were evaluated, as they were previously unknown. A system-specific modeling approach was applied, in which our recently developed covariate model describing developmental and temperature-induced changes in amoxicillin clearance (CL) in the same patient study population was incorporated into a population PK model of benzylpenicillin with a priori birthweight (BW)-based allometric scaling. Pediatric population covariate models describing the developmental changes in drug elimination may constitute system-specific information and may therefore be incorporated into PK models of drugs cleared through the same pathway. The performance of this system-specific model was compared to that of a reference model. Furthermore, Monte Carlo simulations were performed to evaluate the optimal dose. The system-specific model performed as well as the reference model. Significant correlations were found between CL and postnatal age (PNA), gestational age (GA), body temperature (TEMP), urine output (UO; system-specific model), and multiorgan failure (reference model). For a typical patient with a GA of 40 weeks, BW of 3,000 g, PNA of 2 days (TEMP, 33.5°C), and normal UO (2 ml/kg/h), benzylpenicillin CL was 0.48 liter/h (interindividual variability [IIV] of 49%) and the volume of distribution of the central compartment was 0.62 liter/kg (IIV of 53%) in the system-specific model. Based on simulations, we advise a benzylpenicillin i.v. dose regimen of 75,000 IU/kg/day every 8 h (q8h), 150,000 IU/kg/day q8h, and 200,000 IU/kg/day q6h for patients with GAs of 36 to 37 weeks, 38 to 41 weeks, and ≥42 weeks, respectively. The system-specific model may be used for other drugs cleared through the same pathway, accelerating model development.
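The a priori allometric scaling at the core of such models is compact enough to show directly. The sketch below is a simplified illustration: the typical clearance and reference covariate values echo the abstract, but the linear temperature-effect form and its coefficient are placeholders, not the published covariate model:

```python
# Simplified allometric clearance sketch (covariate form and theta_temp assumed).
def clearance(bw_g, temp_c, cl_typ=0.48, bw_ref=3000.0,
              temp_ref=33.5, theta_temp=0.05):
    """CL (L/h) = typical CL * BW allometry (exponent 0.75) * temperature effect."""
    return cl_typ * (bw_g / bw_ref) ** 0.75 * (1 + theta_temp * (temp_c - temp_ref))

for bw, temp in [(2500, 33.5), (3000, 33.5), (3000, 37.0)]:
    print(bw, "g at", temp, "C ->", round(clearance(bw, temp), 3), "L/h")
```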
NASA Technical Reports Server (NTRS)
Matsui, Toshihisa; Chern, Jiun-Dar; Tao, Wei-Kuo; Lang, Stephen E.; Satoh, Masaki; Hashino, Tempei; Kubota, Takuji
2016-01-01
A 14-year climatology of Tropical Rainfall Measuring Mission (TRMM) collocated multi-sensor signal statistics reveals a distinct land-ocean contrast as well as geographical variability of precipitation type, intensity, and microphysics. Microphysics information inferred from the TRMM precipitation radar and Microwave Imager (TMI) shows a large land-ocean contrast for the deep category, suggesting continental convective vigor. Over land, TRMM shows higher echo-top heights and larger maximum echoes, suggesting taller storms and more intense precipitation, as well as larger microwave scattering, suggesting the presence of more and larger frozen convective hydrometeors. This strong land-ocean contrast in deep convection is invariant over seasonal and multi-year time scales. Consequently, relatively short-term simulations from two global storm-resolving models can be evaluated in terms of their land-ocean statistics using the TRMM Triple-sensor Three-step Evaluation via a satellite simulator. The models evaluated are the NASA Multi-scale Modeling Framework (MMF) and the Non-hydrostatic Icosahedral Cloud Atmospheric Model (NICAM). While both simulations can represent convective land-ocean contrasts in warm precipitation to some extent, near-surface conditions over land are relatively moister in NICAM than in MMF, which appears to be the key driver of the divergent warm precipitation results between the two models. Both the MMF and NICAM produced similar frequencies of large CAPE between land and ocean. The dry MMF boundary layer enhanced microwave scattering signals over land, but only NICAM had an enhanced deep convection frequency over land. Neither model could reproduce a realistic land-ocean contrast in deep convective precipitation microphysics. A realistic contrast between land and ocean remains an issue in global storm-resolving modeling.
Decision analytic models for Alzheimer's disease: state of the art and future directions.
Cohen, Joshua T; Neumann, Peter J
2008-05-01
Decision analytic policy models for Alzheimer's disease (AD) enable researchers and policy makers to investigate questions about the costs and benefits of a wide range of existing and potential screening, testing, and treatment strategies. Such models permit analysts to compare existing alternatives, explore hypothetical scenarios, and test the strength of underlying assumptions in an explicit, quantitative, and systematic way. Decision analytic models can best be viewed as complementing clinical trials both by filling knowledge gaps not readily addressed by empirical research and by extrapolating beyond the surrogate markers recorded in a trial. We identified and critiqued 13 distinct AD decision analytic policy models published since 1997. Although existing models provide useful insights, they also have a variety of limitations. (1) They generally characterize disease progression in terms of cognitive function and do not account for other distinguishing features, such as behavioral symptoms, functional performance, and the emotional well-being of AD patients and caregivers. (2) Many describe disease progression in terms of a limited number of discrete states, thus constraining the level of detail that can be used to characterize both changes in patient status and the relationships between disease progression and other factors, such as residential status, that influence outcomes of interest. (3) They have focused almost exclusively on evaluating drug treatments, thus neglecting other disease management strategies and combinations of pharmacologic and nonpharmacologic interventions. Future AD models should facilitate more realistic and compelling evaluations of various interventions to address the disease. An improved model will allow decision makers to better characterize the disease, to better assess the costs and benefits of a wide range of potential interventions, and to better evaluate the incremental costs and benefits of specific interventions used in conjunction with other disease management strategies.
FY15 Report on Thermomechanical Testing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hansen, Francis D.; Buchholz, Stuart
2015-08-01
Sandia is participating in the third phase of a United States (US)-German Joint Project that compares constitutive models and simulation procedures on the basis of model calculations of the thermomechanical behavior and healing of rock salt (Salzer et al. 2015). The first goal of the project is to evaluate the ability of numerical modeling tools to correctly describe the relevant deformation phenomena in rock salt under various influences. Among the numerical modeling tools required to address this are constitutive models that are used in computer simulations for the description of the thermal, mechanical, and hydraulic behavior of the host rock under various influences and for the long-term prediction of this behavior. Achieving this goal will lead to increased confidence in the results of numerical simulations related to the secure disposal of radioactive wastes in rock salt. Results of the Joint Project may ultimately be used to make various assertions regarding stability analysis of an underground repository in salt during the operating phase as well as long-term integrity of the geological barrier in the post-operating phase. A primary evaluation of constitutive model capabilities comes by way of predicting large-scale field tests. The Joint Project partners decided to model Waste Isolation Pilot Plant (WIPP) Rooms B and D, which are full-scale rooms with the same dimensions. Room D deformed under natural, ambient conditions while Room B was thermally driven by an array of waste-simulating heaters (Munson et al. 1988; 1990). Existing laboratory test data for WIPP salt were carefully scrutinized, and the partners decided that additional testing would be needed to help evaluate advanced features of the constitutive models. The German partners performed over 140 laboratory tests on WIPP salt at no charge to the US Department of Energy (DOE).
IT vendor selection model by using structural equation model & analytical hierarchy process
NASA Astrophysics Data System (ADS)
Maitra, Sarit; Dominic, P. D. D.
2012-11-01
Selecting and evaluating the right vendors is imperative for an organization's global marketplace competitiveness. Improper selection and evaluation of potential vendors can dwarf an organization's supply chain performance. Numerous studies have demonstrated that firms consider multiple criteria when selecting key vendors. This research intends to develop a new hybrid model for vendor selection process with better decision making. The new proposed model provides a suitable tool for assisting decision makers and managers to make the right decisions and select the most suitable vendor. This paper proposes a Hybrid model based on Structural Equation Model (SEM) and Analytical Hierarchy Process (AHP) for long-term strategic vendor selection problems. The five steps framework of the model has been designed after the thorough literature study. The proposed hybrid model will be applied using a real life case study to assess its effectiveness. In addition, What-if analysis technique will be used for model validation purpose.
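The AHP half of such a hybrid is straightforward to sketch: priority weights come from the principal eigenvector of a pairwise-comparison matrix, and the consistency ratio guards against incoherent judgments. The criteria and judgments below are made-up inputs, not the case-study data:

```python
# AHP priority weights and consistency check for three criteria.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])   # Saaty-scale judgments: cost, quality, delivery

eigvals, eigvecs = np.linalg.eig(A)
i = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, i].real)
w /= w.sum()                                  # priority vector

n = A.shape[0]
ci = (eigvals[i].real - n) / (n - 1)          # consistency index
cr = ci / 0.58                                # random index RI = 0.58 for n = 3
print("weights:", w.round(3), "CR =", round(cr, 3))   # accept if CR < 0.1
```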
Ayres, D R; Pereira, R J; Boligon, A A; Silva, F F; Schenkel, F S; Roso, V M; Albuquerque, L G
2013-12-01
Cattle resistance to ticks is measured by the number of ticks infesting the animal. The model used for the genetic analysis of cattle resistance to ticks frequently requires logarithmic transformation of the observations. The objective of this study was to evaluate the predictive ability and goodness of fit of different models for the analysis of this trait in crossbred Hereford x Nellore cattle. Three models were tested: a linear model using logarithmic transformation of the observations (MLOG); a linear model without transformation of the observations (MLIN); and a generalized linear Poisson model with a residual term (MPOI). All models included the classificatory effects of contemporary group and genetic group and the covariates age of animal at the time of recording and individual heterozygosity, as well as additive genetic effects as random effects. Heritability estimates were 0.08 ± 0.02, 0.10 ± 0.02 and 0.14 ± 0.04 for the MLIN, MLOG and MPOI models, respectively. The model fit quality, verified by the deviance information criterion (DIC) and residual mean square, indicated the superiority of the MPOI model. The predictive ability of the models was compared by a validation test in an independent sample. The MPOI model was slightly superior in terms of goodness of fit and predictive ability, whereas the correlations between observed and predicted tick counts were practically the same for all models. A higher rank correlation between breeding values was observed between the MLOG and MPOI models. The Poisson model can therefore be used for the selection of tick-resistant animals. © 2013 Blackwell Verlag GmbH.
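As a minimal sketch of the Poisson approach (MPOI), the snippet below fits a Poisson GLM with a log link to simulated tick counts using statsmodels. The actual study fits an animal model that also includes contemporary group, genetic group, and additive genetic random effects, all of which are omitted here; the data and coefficients are invented.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
age = rng.uniform(12, 30, n)             # age at recording (months), simulated
het = rng.uniform(0, 1, n)               # individual heterozygosity, simulated
eta = 1.5 - 0.02 * age - 0.8 * het       # hypothetical linear predictor
ticks = rng.poisson(np.exp(eta))         # simulated tick counts

X = sm.add_constant(np.column_stack([age, het]))
# The log link plays the role of the log transformation in MLOG, but the
# Poisson likelihood respects the count nature of the data.
poisson_fit = sm.GLM(ticks, X, family=sm.families.Poisson()).fit()
print(poisson_fit.summary())
```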
NASA Technical Reports Server (NTRS)
French, V. (Principal Investigator)
1982-01-01
The CEAS models evaluated use historic trend and meteorological and agroclimatic variables to forecast soybean yields in Iowa, Illinois, and Indiana. Indicators of yield reliability and current measures of modeled yield reliability were obtained from bootstrap tests on the end-of-season models. Indicators of yield reliability show that the state models are consistently better than the crop reporting district (CRD) models. One CRD model is especially poor. At the state level, the bias of each model is less than one half quintal/hectare. The standard deviation is between one and two quintals/hectare. The models are adequate in terms of coverage and are to a certain extent consistent with scientific knowledge. Timely yield estimates can be made during the growing season using truncated models. The models are easy to understand and use and are not costly to operate. Other than the specification of values used to determine evapotranspiration, the models are objective. Because the method of variable selection used in the model development is not adequately documented, no evaluation can be made of the objectivity and cost of redevelopment of the model.
Alviz, Antistio Aníbal; Salas, Rubén Darío; Franco, Luis Alberto
2013-01-01
Ceratopteris pteridoides is a semiaquatic fern of the Parkeriacea family, widely used in Colombian folk medicine as a diuretic and cholelithiasic, for which there are no scientific reports that validate its popular use. The aim of this study was to evaluate the acute and short-term repeated-dose diuretic effect of the ethanolic and aqueous extracts of C. pteridoides in an in vivo model. The total ethanolic extract was obtained by maceration of the whole plant of C. pteridoides with ethanol, and the aqueous extract by decoction at 60°C for 15 minutes. Both extracts were evaluated by preliminary phytochemical analysis and by histological studies after administration of the extracts for 8 consecutive days (1000 mg/kg). The diuretic effect was evaluated using Wistar rats treated with the extracts (500 mg/kg), using an acute and a short-term repeated-dose model, and quantifying water elimination, sodium and potassium excretion by atomic absorption spectrophotometry, and chloride excretion by mercurimetric titration. In the acute model both extracts showed significant diuretic, natriuretic, and kaliuretic effects compared to the control group. In contrast, short-term repeated-dose administration showed a diuretic effect without elimination of electrolytes. The histopathologic study did not suggest a toxic effect in liver or kidney. The results represent evidence of the diuretic activity of C. pteridoides and support the popular use of this plant on the north coast of Colombia. Further studies are required to isolate and identify the compounds responsible for the activity and the mechanism of action involved.
Depeursinge, Adrien; Kurtz, Camille; Beaulieu, Christopher; Napel, Sandy; Rubin, Daniel
2014-08-01
We describe a framework to model visual semantics of liver lesions in CT images in order to predict the visual semantic terms (VST) reported by radiologists in describing these lesions. Computational models of VST are learned from image data using linear combinations of high-order steerable Riesz wavelets and support vector machines (SVM). In a first step, these models are used to predict the presence of each semantic term that describes liver lesions. In a second step, the distances between all VST models are calculated to establish a nonhierarchical computationally-derived ontology of VST containing inter-term synonymy and complementarity. A preliminary evaluation of the proposed framework was carried out using 74 liver lesions annotated with a set of 18 VSTs from the RadLex ontology. A leave-one-patient-out cross-validation resulted in an average area under the ROC curve of 0.853 for predicting the presence of each VST. The proposed framework is expected to foster human-computer synergies for the interpretation of radiological images while using rotation-covariant computational models of VSTs to 1) quantify their local likelihood and 2) explicitly link them with pixel-based image content in the context of a given imaging domain.
A logical foundation for representation of clinical data.
Campbell, K E; Das, A K; Musen, M A
1994-01-01
OBJECTIVE: A general framework for representation of clinical data that provides a declarative semantics of terms and that allows developers to define explicitly the relationships among both terms and combinations of terms. DESIGN: Use of conceptual graphs as a standard representation of logic and of an existing standardized vocabulary, the Systematized Nomenclature of Medicine (SNOMED International), for lexical elements. Concepts such as time, anatomy, and uncertainty must be modeled explicitly in a way that allows relation of these foundational concepts to surface-level clinical descriptions in a uniform manner. RESULTS: The proposed framework was used to model a simple radiology report, which included temporal references. CONCLUSION: Formal logic provides a framework for formalizing the representation of medical concepts. Actual implementations will be required to evaluate the practicality of this approach. PMID:7719805
A diagram for evaluating multiple aspects of model performance in simulating vector fields
NASA Astrophysics Data System (ADS)
Xu, Zhongfeng; Hou, Zhaolu; Han, Ying; Guo, Weidong
2016-12-01
Vector quantities, e.g., vector winds, play an extremely important role in climate systems. The energy and water exchanges between different regions are strongly dominated by wind, which in turn shapes the regional climate. Thus, how well climate models can simulate vector fields directly affects model performance in reproducing the nature of a regional climate. This paper devises a new diagram, termed the vector field evaluation (VFE) diagram, which is a generalized Taylor diagram and able to provide a concise evaluation of model performance in simulating vector fields. The diagram can measure how well two vector fields match each other in terms of three statistical variables, i.e., the vector similarity coefficient, root mean square length (RMSL), and root mean square vector difference (RMSVD). Similar to the Taylor diagram, the VFE diagram is especially useful for evaluating climate models. The pattern similarity of two vector fields is measured by a vector similarity coefficient (VSC) that is defined by the arithmetic mean of the inner product of normalized vector pairs. Examples are provided, showing that VSC can identify how close one vector field resembles another. Note that VSC can only describe the pattern similarity, and it does not reflect the systematic difference in the mean vector length between two vector fields. To measure the vector length, RMSL is included in the diagram. The third variable, RMSVD, is used to identify the magnitude of the overall difference between two vector fields. Examples show that the VFE diagram can clearly illustrate the extent to which the overall RMSVD is attributed to the systematic difference in RMSL and how much is due to the poor pattern similarity.
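A minimal sketch of the three VFE statistics as defined in the abstract, for two wind fields sampled at N grid points. Area weighting and anomaly handling are omitted, and the published normalization of VSC may differ in detail; treat this as an illustration of the definitions, not the authors' code.

```python
import numpy as np

def vfe_stats(u1, v1, u2, v2):
    """Compute the three VFE-diagram statistics for two vector fields
    sampled at N points (nonzero vectors assumed; no area weighting)."""
    A = np.column_stack([u1, v1])    # field 1, shape (N, 2)
    B = np.column_stack([u2, v2])    # field 2, shape (N, 2)
    # Vector similarity coefficient: mean inner product of normalized pairs
    An = A / np.linalg.norm(A, axis=1, keepdims=True)
    Bn = B / np.linalg.norm(B, axis=1, keepdims=True)
    vsc = np.mean(np.sum(An * Bn, axis=1))
    # Root mean square length of each field
    rmsl1 = np.sqrt(np.mean(np.sum(A ** 2, axis=1)))
    rmsl2 = np.sqrt(np.mean(np.sum(B ** 2, axis=1)))
    # Root mean square vector difference between the fields
    rmsvd = np.sqrt(np.mean(np.sum((A - B) ** 2, axis=1)))
    return vsc, rmsl1, rmsl2, rmsvd
```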
Wiethege, J; Ommen, O; Ernstmann, N; Pfaff, H
2010-10-01
Currently, elements of managed care are being implemented in the German health-care system. The legal basis for these innovations is § 140, § 73, § 137, and §§ 63 et seq. of the German Social Code - Part 5 (SGB V). For the model projects according to §§ 63 et seq. of the German Social Code, a scientific evaluation and publication of the evaluation results is mandatory. The present study examines the status of evaluation of German model projects. The study has a mixed-method design: a mail and telephone survey of the German Federal Social Insurance Authority, the health insurance funds, and the regional Associations of Statutory Health Insurance Physicians was conducted. Furthermore, an internet search on "Medpilot" and "Google" was carried out to find model projects and their evaluation reports. 34 model projects met the inclusion criteria. 13 of these projects had been completed by 30/9/2008. 6 of them have published an evaluation report. 4 model projects have published substantial documents. One model project in progress has published a meaningful interim report. 12 model projects failed to give information concerning the evaluator or the duration of the model project. The results show a significant deficit in the mandatory reporting of the evaluation of model projects in Germany. There is a need for action by the legislator and the health insurance funds in terms of promoting the evaluation and the publication of results. The institutions evaluating the model projects should commit themselves to publishing the evaluation results. Publication is an essential precondition for the development of managed care structures in the health-care system and for the development of scientific evaluation methods. © Georg Thieme Verlag KG Stuttgart · New York.
Pouillot, Régis; Chen, Yuhuan; Hoelzer, Karin
2015-02-01
When developing quantitative risk assessment models, a fundamental consideration for risk assessors is to decide whether to evaluate changes in bacterial levels in terms of concentrations or in terms of bacterial numbers. Although modeling bacteria in terms of integer numbers may be regarded as a more intuitive and rigorous choice, modeling bacterial concentrations is more popular as it is generally less mathematically complex. We tested three different modeling approaches in a simulation study. The first approach considered bacterial concentrations; the second considered the number of bacteria in contaminated units, and the third considered the expected number of bacteria in contaminated units. Simulation results indicate that modeling concentrations tends to overestimate risk compared to modeling the number of bacteria. A sensitivity analysis using a regression tree suggests that processes which include drastic scenarios consisting of combinations of large bacterial inactivation followed by large bacterial growth frequently lead to a >10-fold overestimation of the average risk when modeling concentrations as opposed to bacterial numbers. Alternatively, the approach of modeling the expected number of bacteria in positive units generates results similar to the second method and is easier to use, thus potentially representing a promising compromise. Published by Elsevier Ltd.
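The following toy simulation illustrates why the concentration approach can overestimate risk under the drastic inactivation-then-growth scenario described above. All parameter values and the exponential dose-response are hypothetical; the point is only that units losing their last cell during inactivation stay sterile in the number-based model but not in the concentration-based one.

```python
import numpy as np

rng = np.random.default_rng(1)
n_units = 100_000
c0 = 2.0                 # initial mean number of cells per unit (hypothetical)
kill = 1e-3              # 3-log inactivation step
growth = 1e3             # 3-log growth step

# Concentration-based model: every unit keeps a (fractional) concentration,
# so inactivation followed by growth cancels out exactly.
conc = np.full(n_units, c0) * kill * growth          # back to c0 everywhere
risk_conc = np.mean(1 - np.exp(-conc))               # toy exponential dose-response

# Number-based model: integer cells per unit; units that lose all cells
# during inactivation stay sterile no matter how much growth follows.
n = rng.poisson(c0, n_units)                          # initial integer loads
survivors = rng.binomial(n, kill)                     # random inactivation
final = survivors * growth                            # regrowth of survivors only
risk_num = np.mean(1 - np.exp(-final))

print(f"concentration model risk: {risk_conc:.3f}")
print(f"number model risk:        {risk_num:.3f}")
```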
NASA Astrophysics Data System (ADS)
Bitew, M. M.; Goodrich, D. C.; Demaria, E.; Heilman, P.; Kautz, M. A.
2017-12-01
Walnut Gulch is a semi-arid experimental watershed and Long-Term Agroecosystem Research (LTAR) site managed by the USDA-ARS Southwest Watershed Research Center, for which high-resolution, long-term hydro-climatic data are available across its 150 km2 drainage area. In this study, we present an analysis of 50 years of continuous hourly rainfall data to evaluate runoff control and generation processes, with the aim of improving the QA-QC plans of Walnut Gulch and creating a high-quality data set that is critical for reducing water balance uncertainties. Multiple linear regression models were developed to relate rainfall properties, runoff characteristics, and watershed properties. The rainfall properties were summarized as event-based total depth, maximum intensity, duration, location of the storm center with respect to the outlet, and storm size normalized to watershed area. We evaluated the interaction between rainfall and runoff in terms of antecedent moisture condition (AMC), antecedent runoff condition (ARC), and runoff depth and duration for each rainfall event. We summarized watershed properties such as contributing area, slope, shape, channel length, stream density, channel flow area, and percent area of retention stock ponds for each of the nested catchments in Walnut Gulch. Evaluation of the model using basic and categorical statistics showed good predictive skill throughout the watersheds. The model produced correlation coefficients ranging from 0.4 to 0.94, Nash efficiency coefficients up to 0.77, and Kling-Gupta coefficients ranging from 0.4 to 0.98. The model predicted 92% of all runoff-generating events and 98% of non-events across all sub-watersheds in Walnut Gulch. The regression model also indicated good potential to complement the QA-QC procedures in place for Walnut Gulch dataset publications developed over the years since the 1960s through identification of inconsistencies in rainfall-runoff relations.
Inference evaluation in a finite evidence domain
NASA Astrophysics Data System (ADS)
Ratway, Michael J.; Bellomo, Carryn
2000-08-01
Modeling of a target starts with a subject matter expert (SME) analysis of the available sensor(s) data. The SME then forms relationships between the data and known target attributes, called evidence, to support modeling of different types of targets or target activity. Speeds in the interval 10 to 30 knots and ranges less than 30 nautical miles are two samples of target evidence derived from sensor data. Evidence is then organized into sets to define the activities of a target and/or to distinguish different types of targets. For example, near an airport, target activities of takeoff, landing, and holding need to be evaluated in addition to target classification of civilian or commercial aircraft. This paper discusses a method for evaluation of the inferred activities over the finite evidence domain formed from the collection of models under consideration. The methodology accounts for repeated use of evidence in different models. For example, 'near an airport' is a required piece of evidence used repeatedly in the takeoff, landing, and holding models of a wide area sensor. Properties of the activity model evaluator methodology are discussed in terms of model construction and informal results are presented in a Boolean evidence type of problem domain.
NASA Astrophysics Data System (ADS)
Rajaram, H.; Birdsell, D.; Lackey, G.; Karra, S.; Viswanathan, H. S.; Dempsey, D.
2015-12-01
The dramatic increase in the extraction of unconventional oil and gas resources using horizontal wells and hydraulic fracturing (fracking) technologies has raised concerns about potential environmental impacts. Large volumes of hydraulic fracturing fluids are injected during fracking. Incidents of stray gas occurrence in shallow aquifers overlying shale gas reservoirs have been reported; whether these are in any way related to fracking continues to be debated. Computational models serve as useful tools for evaluating potential environmental impacts. We present modeling studies of hydraulic fracturing fluid and gas migration during the various stages of well operation, production, and subsequent plugging. The fluid migration models account for overpressure in the gas reservoir, density contrast between injected fluids and brine, imbibition into partially saturated shale, and well operations. Our results highlight the importance of representing the different stages of well operation consistently. Most importantly, well suction and imbibition both play a significant role in limiting upward migration of injected fluids, even in the presence of permeable connecting pathways. In an overall assessment, our fluid migration simulations suggest very low risk to groundwater aquifers when the vertical separation from a shale gas reservoir is of the order of 1000' or more. Multi-phase models of gas migration were developed to couple flow and transport in compromised wellbores and subsurface formations. These models are useful for evaluating both short-term and long-term scenarios of stray methane release. We present simulation results to evaluate mechanisms controlling stray gas migration, and explore relationships between bradenhead pressures and the likelihood of methane release and transport.
Nonextensive Thomas-Fermi model
NASA Astrophysics Data System (ADS)
Shivamoggi, Bhimsen; Martinenko, Evgeny
2007-11-01
The nonextensive Thomas-Fermi model was further investigated in the following directions. Heavy atom in a strong magnetic field: following Shivamoggi's work on the extension of the Kadomtsev equation, we applied the nonextensive formalism to further generalize the TF model for very strong magnetic fields (of order 10^12 G). The generalized TF equation and the binding energy of the atom were calculated; both contain a new nonextensive term dominating the classical one. The binding energy of a heavy atom was also evaluated. The Thomas-Fermi equations in N dimensions are technically the same as in Shivamoggi (1998), but the behavior is different; in the interesting 2D case, nonextensivity prevents the equation from reducing to a linear ODE as in the classical case. The effect of nonextensivity on dielectric screening reveals itself in the reduction of the envelope radius. It was shown that nonextensivity is in each case responsible for a new term dominating the classical thermal correction term by an order of magnitude, a term which vanishes in the limit q -> 1. It therefore appears that the nonextensive term is ubiquitous for a wide range of systems, and further work is needed to understand its origin.
pyJac: Analytical Jacobian generator for chemical kinetics
NASA Astrophysics Data System (ADS)
Niemeyer, Kyle E.; Curtis, Nicholas J.; Sung, Chih-Jen
2017-06-01
Accurate simulations of combustion phenomena require the use of detailed chemical kinetics in order to capture limit phenomena such as ignition and extinction as well as predict pollutant formation. However, the chemical kinetic models for hydrocarbon fuels of practical interest typically have large numbers of species and reactions and exhibit high levels of mathematical stiffness in the governing differential equations, particularly for larger fuel molecules. In order to integrate the stiff equations governing chemical kinetics, generally reactive-flow simulations rely on implicit algorithms that require frequent Jacobian matrix evaluations. Some in situ and a posteriori computational diagnostics methods also require accurate Jacobian matrices, including computational singular perturbation and chemical explosive mode analysis. Typically, finite differences numerically approximate these, but for larger chemical kinetic models this poses significant computational demands since the number of chemical source term evaluations scales with the square of species count. Furthermore, existing analytical Jacobian tools do not optimize evaluations or support emerging SIMD processors such as GPUs. Here we introduce pyJac, a Python-based open-source program that generates analytical Jacobian matrices for use in chemical kinetics modeling and analysis. In addition to producing the necessary customized source code for evaluating reaction rates (including all modern reaction rate formulations), the chemical source terms, and the Jacobian matrix, pyJac uses an optimized evaluation order to minimize computational and memory operations. As a demonstration, we first establish the correctness of the Jacobian matrices for kinetic models of hydrogen, methane, ethylene, and isopentanol oxidation (number of species ranging 13-360) by showing agreement within 0.001% of matrices obtained via automatic differentiation. We then demonstrate the performance achievable on CPUs and GPUs using pyJac via matrix evaluation timing comparisons; the routines produced by pyJac outperformed first-order finite differences by 3-7.5 times and the existing analytical Jacobian software TChem by 1.1-2.2 times on a single-threaded basis. It is noted that TChem is not thread-safe, while pyJac is easily parallelized, and hence can greatly outperform TChem on multicore CPUs. The Jacobian matrix generator we describe here will be useful for reducing the cost of integrating chemical source terms with implicit algorithms in particular and algorithms that require an accurate Jacobian matrix in general. Furthermore, the open-source release of the program and Python-based implementation will enable wide adoption.
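For contrast with the generated analytical code, here is the first-order finite-difference Jacobian that tools like pyJac are designed to replace; it needs n+1 source-term evaluations for n species, which is the quadratic cost the abstract refers to. This is a generic sketch, not pyJac's output or API.

```python
import numpy as np

def fd_jacobian(f, y, eps=1e-8):
    """First-order finite-difference approximation of J = df/dy.

    Requires len(y) + 1 evaluations of the source term f, which is why
    the cost grows roughly with the square of the species count; analytical
    Jacobian generation avoids this repeated evaluation.
    """
    n = len(y)
    f0 = f(y)
    J = np.empty((n, n))
    for j in range(n):
        yp = y.copy()
        h = eps * max(abs(y[j]), 1.0)     # scale step to the state magnitude
        yp[j] += h
        J[:, j] = (f(yp) - f0) / h        # one column per perturbed species
    return J
```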
Liu, Peigui; Elshall, Ahmed S.; Ye, Ming; ...
2016-02-05
Evaluating marginal likelihood is the most critical and computationally expensive task when conducting Bayesian model averaging to quantify parametric and model uncertainties. The evaluation is commonly done by using Laplace approximations to evaluate semianalytical expressions of the marginal likelihood or by using Monte Carlo (MC) methods to evaluate the arithmetic or harmonic mean of a joint likelihood function. This study introduces a new MC method, i.e., thermodynamic integration, which has not been attempted in environmental modeling. Instead of using samples only from the prior parameter space (as in arithmetic mean evaluation) or the posterior parameter space (as in harmonic mean evaluation), the thermodynamic integration method uses samples generated gradually from the prior to the posterior parameter space. This is done through a path sampling that conducts Markov chain Monte Carlo simulation with different power coefficient values applied to the joint likelihood function. The thermodynamic integration method is evaluated using three analytical functions by comparing the method with two variants of the Laplace approximation method and three MC methods, including the nested sampling method that was recently introduced into environmental modeling. The thermodynamic integration method outperforms the other methods in terms of accuracy, convergence, and consistency. The thermodynamic integration method is also applied to a synthetic case of groundwater modeling with four alternative models. The application shows that model probabilities obtained using the thermodynamic integration method improve the predictive performance of Bayesian model averaging. The thermodynamic integration method is thus mathematically rigorous, and its MC implementation is computationally general for a wide range of environmental problems.
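A minimal sketch of thermodynamic integration on a toy problem: with a conjugate Gaussian model the power posterior at each inverse temperature can be sampled exactly, so the Markov chain Monte Carlo step used in the study is replaced here by direct sampling. The model, prior, and temperature schedule are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
y = rng.normal(1.0, 1.0, size=50)        # simulated data with known sigma = 1
tau2 = 10.0                              # prior: theta ~ N(0, tau2)
n = len(y)

def loglik(theta):
    # Gaussian log-likelihood summed over the data, for an array of thetas
    return (-0.5 * n * np.log(2 * np.pi)
            - 0.5 * ((y[None, :] - theta[:, None]) ** 2).sum(axis=1))

# Temperature schedule denser near beta = 0, a common choice in path sampling
betas = np.linspace(0.0, 1.0, 21) ** 5

e_loglik = []
for b in betas:
    # Conjugate power posterior p(theta | D, b) = N(mu_b, 1/prec_b)
    prec = 1.0 / tau2 + b * n
    mu = b * y.sum() / prec
    theta = rng.normal(mu, np.sqrt(1.0 / prec), size=5000)
    e_loglik.append(loglik(theta).mean())

# Trapezoidal rule over the path: log Z = integral of E_beta[log L] d(beta)
e = np.array(e_loglik)
log_Z = np.sum(0.5 * (e[1:] + e[:-1]) * np.diff(betas))
print(f"log marginal likelihood estimate: {log_Z:.2f}")
```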
Technical Note: On the use of nudging for aerosol-climate model intercomparison studies
NASA Astrophysics Data System (ADS)
Zhang, K.; Wan, H.; Liu, X.; Ghan, S. J.; Kooperman, G. J.; Ma, P.-L.; Rasch, P. J.; Neubauer, D.; Lohmann, U.
2014-08-01
Nudging as an assimilation technique has seen increased use in recent years in the development and evaluation of climate models. Constraining the simulated wind and temperature fields using global weather reanalysis facilitates more straightforward comparison between simulation and observation, and reduces uncertainties associated with natural variabilities of the large-scale circulation. On the other hand, the forcing introduced by nudging can be strong enough to change the basic characteristics of the model climate. In the paper we show that for the Community Atmosphere Model version 5 (CAM5), due to the systematic temperature bias in the standard model and the sensitivity of simulated ice formation to anthropogenic aerosol concentration, nudging towards reanalysis results in substantial reductions in the ice cloud amount and the impact of anthropogenic aerosols on long-wave cloud forcing. In order to reduce discrepancies between the nudged and unconstrained simulations, and meanwhile take the advantages of nudging, two alternative experimentation methods are evaluated. The first one constrains only the horizontal winds. The second method nudges both winds and temperature, but replaces the long-term climatology of the reanalysis by that of the model. Results show that both methods lead to substantially improved agreement with the free-running model in terms of the top-of-atmosphere radiation budget and cloud ice amount. The wind-only nudging is more convenient to apply, and provides higher correlations of the wind fields, geopotential height and specific humidity between simulation and reanalysis. Results from both CAM5 and a second aerosol-climate model ECHAM6-HAM2 also indicate that compared to the wind-and-temperature nudging, constraining only winds leads to better agreement with the free-running model in terms of the estimated shortwave cloud forcing and the simulated convective activities. This suggests nudging the horizontal winds but not temperature is a good strategy for the investigation of aerosol indirect effects since it provides well-constrained meteorology without strongly perturbing the model's mean climate.
A Functional Response Metric for the Temperature Sensitivity of Tropical Ecosystems
Keppel-Aleks, Gretchen; Basile, Samantha J.; Hoffman, Forrest M.
2018-04-23
Earth system models (ESMs) simulate a large spread in carbon cycle feedbacks to climate change, particularly in their prediction of cumulative changes in terrestrial carbon storage. Evaluating the performance of ESMs against observations and assessing the likelihood of long-term climate predictions are crucial for model development. Here, we assessed the use of atmospheric CO2 growth rate variations to evaluate the sensitivity of tropical ecosystem carbon fluxes to interannual temperature variations. We found that the temperature sensitivity of the observed CO2 growth rate depended on the time scales over which atmospheric CO2 observations were averaged. The temperature sensitivity of the CO2 growth rate during Northern Hemisphere winter is most directly related to the tropical carbon flux sensitivity since winter variations in Northern Hemisphere carbon fluxes are relatively small. This metric can be used to test the fidelity of interactions between the physical climate system and terrestrial ecosystems within ESMs, which is especially important since the short-term relationship between ecosystem fluxes and temperature stress may be related to the long-term feedbacks between ecosystems and climate. If the interannual temperature sensitivity is used to constrain long-term temperature responses, the inferred sensitivity may be biased by 20%, unless the seasonality of the relationship between the observed CO2 growth rate and tropical fluxes is taken into account. Lastly, these results suggest that atmospheric data can be used directly to evaluate regional land fluxes from ESMs, but underscore that the interaction between the time scales for land surface processes and those for atmospheric processes must be considered.
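A sketch of how such an interannual temperature sensitivity could be computed as a regression slope; the series below are invented for illustration and carry none of the averaging or seasonality considerations discussed above.

```python
import numpy as np

# Hypothetical annual series: NH-winter CO2 growth-rate anomalies (ppm/yr)
# and tropical temperature anomalies (K); values are illustrative only.
temp = np.array([-0.2, 0.1, 0.3, -0.1, 0.4, 0.0, -0.3, 0.2])
co2_gr = np.array([-0.5, 0.3, 0.9, -0.2, 1.1, 0.1, -0.8, 0.5])

# Interannual temperature sensitivity (gamma) via least squares:
# the slope of CO2 growth rate regressed on tropical temperature.
gamma, intercept = np.polyfit(temp, co2_gr, 1)
print(f"gamma = {gamma:.2f} ppm/yr per K")
```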
Wilbaux, Mélanie; Fuchs, Aline; Samardzic, Janko; Rodieux, Frédérique; Csajka, Chantal; Allegaert, Karel; van den Anker, Johannes N; Pfister, Marc
2016-08-01
Sepsis remains a major cause of mortality and morbidity in neonates, and, as a consequence, antibiotics are the most frequently prescribed drugs in this vulnerable patient population. Growth and dynamic maturation processes during the first weeks of life result in large inter- and intrasubject variability in the pharmacokinetics (PK) and pharmacodynamics (PD) of antibiotics. In this review we (1) summarize the available population PK data and models for primarily renally eliminated antibiotics, (2) discuss quantitative approaches to account for effects of growth and maturation processes on drug exposure and response, (3) evaluate current dose recommendations, and (4) identify opportunities to further optimize and personalize dosing strategies of these antibiotics in preterm and term neonates. Although population PK models have been developed for several of these drugs, exposure-response relationships of primarily renally eliminated antibiotics in these fragile infants are not well understood, monitoring strategies remain inconsistent, and consensus on optimal, personalized dosing of these drugs in these patients is absent. Tailored PK/PD studies and models are useful to better understand relationships between drug exposures and microbiological or clinical outcomes. Pharmacometric modeling and simulation approaches facilitate quantitative evaluation and optimization of treatment strategies. National and international collaborations and platforms are essential to standardize and harmonize not only studies and models but also monitoring and dosing strategies. Simple bedside decision tools assist clinical pharmacologists and neonatologists in their efforts to fine-tune and personalize the use of primarily renally eliminated antibiotics in term and preterm neonates. © 2016, The American College of Clinical Pharmacology.
An epidemic model to evaluate the homogeneous mixing assumption
NASA Astrophysics Data System (ADS)
Turnes, P. P.; Monteiro, L. H. A.
2014-11-01
Many epidemic models are written in terms of ordinary differential equations (ODE). This approach relies on the homogeneous mixing assumption; that is, the topological structure of the contact network established by the individuals of the host population is not relevant to predict the spread of a pathogen in this population. Here, we propose an epidemic model based on ODE to study the propagation of contagious diseases conferring no immunity. The state variables of this model are the percentages of susceptible individuals, infectious individuals and empty space. We show that this dynamical system can experience transcritical and Hopf bifurcations. Then, we employ this model to evaluate the validity of the homogeneous mixing assumption by using real data related to the transmission of gonorrhea, hepatitis C virus, human immunodeficiency virus, and obesity.
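The abstract does not reproduce the paper's exact equations; the sketch below integrates a generic SIS-type system with an explicit empty-space fraction to show the structure such a model can take, with transmission, recovery back to the susceptible class (no immunity), and births that require empty space. All functional forms and parameter values are assumptions.

```python
from scipy.integrate import solve_ivp

# Illustrative SIS-type model on limited space; E = 1 - S - I is the
# fraction of empty space. These equations are not the authors' exact ones.
r, beta, gamma, mu, delta = 1.0, 4.0, 0.5, 0.2, 0.4

def rhs(t, x):
    S, I = x
    E = 1.0 - S - I
    dS = r * S * E - beta * S * I + gamma * I - mu * S   # births need empty space
    dI = beta * S * I - gamma * I - (mu + delta) * I     # recovery returns to S
    return [dS, dI]

sol = solve_ivp(rhs, (0, 100), [0.5, 0.01])
S_end, I_end = sol.y[:, -1]
print(f"long-run state: S = {S_end:.3f}, I = {I_end:.3f}")
```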
NASA Technical Reports Server (NTRS)
Benton, E. R.
1986-01-01
A spherical harmonic representation of the geomagnetic field and its secular variation for epoch 1980, designated GSFC(9/84), is derived and evaluated. At three epochs (1977.5, 1980.0, 1982.5) this model incorporates conservation of magnetic flux through five selected patches of area on the core/mantle boundary bounded by the zero contours of vertical magnetic field. These fifteen nonlinear constraints are included like data in an iterative least squares parameter estimation procedure that starts with the recently derived unconstrained field model GSFC (12/83). Convergence is approached within three iterations. The constrained model is evaluated by comparing its predictive capability outside the time span of its data, in terms of residuals at magnetic observatories, with that for the unconstrained model.
Hayes, Holly; Parchman, Michael L.; Howard, Ray
2012-01-01
Evaluating effective growth and development of a Practice-Based Research Network (PBRN) can be challenging. The purpose of this article is to describe the development of a logic model and how the framework has been used for planning and evaluation in a primary care PBRN. An evaluation team was formed consisting of the PBRN directors, staff and its board members. After the mission and the target audience were determined, facilitated meetings and discussions were held with stakeholders to identify the assumptions, inputs, activities, outputs, outcomes and outcome indicators. The long-term outcomes outlined in the final logic model are two-fold: 1.) Improved health outcomes of patients served by PBRN community clinicians; and 2.) Community clinicians are recognized leaders of quality research projects. The Logic Model proved useful in identifying stakeholder interests and dissemination activities as an area that required more attention in the PBRN. The logic model approach is a useful planning tool and project management resource that increases the probability that the PBRN mission will be successfully implemented. PMID:21900441
Performance Evaluation Model for Application Layer Firewalls.
Xuan, Shichang; Yang, Wu; Dong, Hui; Zhang, Jiangchuan
2016-01-01
Application layer firewalls protect the trusted area network against information security risks. However, firewall performance may affect user experience. Therefore, performance analysis plays a significant role in the evaluation of application layer firewalls. This paper presents an analytic model of the application layer firewall, based on a system analysis to evaluate the capability of the firewall. In order to enable users to improve the performance of the application layer firewall with limited resources, resource allocation was evaluated to obtain the optimal resource allocation scheme in terms of throughput, delay, and packet loss rate. The proposed model employs the Erlangian queuing model to analyze the performance parameters of the system with regard to the three layers (network, transport, and application layers). Then, the analysis results of all the layers are combined to obtain the overall system performance indicators. A discrete event simulation method was used to evaluate the proposed model. Finally, limited service desk resources were allocated to obtain the values of the performance indicators under different resource allocation scenarios in order to determine the optimal allocation scheme. Under limited resource allocation, this scheme enables users to maximize the performance of the application layer firewall.
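As one concrete piece of such an analysis, the sketch below computes the classic Erlang C waiting probability and mean queue delay for a single M/M/c service layer, which is the style of calculation an Erlang queuing model of one firewall layer involves; it is not the paper's multi-layer model, and the rates used are hypothetical.

```python
import math

def erlang_c(lam, mu, c):
    """Erlang C probability that an arriving request must wait in an
    M/M/c queue (lam: arrival rate, mu: service rate per desk,
    c: number of service desks)."""
    a = lam / mu                        # offered load in Erlangs
    rho = a / c
    if rho >= 1.0:
        return 1.0                      # unstable system: everyone waits
    s = sum(a**k / math.factorial(k) for k in range(c))
    top = a**c / (math.factorial(c) * (1 - rho))
    return top / (s + top)

def mean_wait(lam, mu, c):
    """Mean waiting time in queue, from the Erlang C formula."""
    return erlang_c(lam, mu, c) / (c * mu - lam)

# Compare allocations of service desks to one layer (hypothetical rates)
for c in (4, 5, 6):
    print(c, f"P(wait)={erlang_c(100, 30, c):.3f}",
          f"Wq={mean_wait(100, 30, c) * 1000:.2f} ms")
```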
Cohen, Trevor; Schvaneveldt, Roger; Widdows, Dominic
2010-04-01
The discovery of implicit connections between terms that do not occur together in any scientific document underlies the model of literature-based knowledge discovery first proposed by Swanson. Corpus-derived statistical models of semantic distance such as Latent Semantic Analysis (LSA) have been evaluated previously as methods for the discovery of such implicit connections. However, LSA in particular is dependent on a computationally demanding method of dimension reduction as a means to obtain meaningful indirect inference, limiting its ability to scale to large text corpora. In this paper, we evaluate the ability of Random Indexing (RI), a scalable distributional model of word associations, to draw meaningful implicit relationships between terms in general and biomedical language. Proponents of this method have achieved comparable performance to LSA on several cognitive tasks while using a simpler and less computationally demanding method of dimension reduction than LSA employs. In this paper, we demonstrate that the original implementation of RI is ineffective at inferring meaningful indirect connections, and evaluate Reflective Random Indexing (RRI), an iterative variant of the method that is better able to perform indirect inference. RRI is shown to lead to more clearly related indirect connections and to outperform existing RI implementations in the prediction of future direct co-occurrence in the MEDLINE corpus. 2009 Elsevier Inc. All rights reserved.
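A minimal sketch of basic (non-reflective) random indexing on tokenized documents: each term receives a sparse ternary index vector, and its context vector accumulates the index vectors of co-occurring terms. Dimensionality, sparsity, and the document-level context window are arbitrary illustrative choices; reflective RI would re-iterate the accumulation step using the learned context vectors as new index vectors.

```python
import numpy as np

def random_indexing(docs, dim=2000, nnz=10, seed=0):
    """Non-reflective random indexing over tokenized documents
    (each doc is a list of terms). Returns term context vectors."""
    rng = np.random.default_rng(seed)
    vocab = sorted({w for d in docs for w in d})
    index, context = {}, {}
    for w in vocab:
        v = np.zeros(dim)
        pos = rng.choice(dim, size=nnz, replace=False)
        v[pos] = rng.choice([-1.0, 1.0], size=nnz)   # sparse ternary index vector
        index[w] = v
        context[w] = np.zeros(dim)
    for d in docs:
        for w in d:
            for other in d:
                if other != w:
                    context[w] += index[other]       # accumulate co-occurrences
    return context

# Cosine similarity between context vectors then reflects *direct*
# co-occurrence, which is why plain RI struggles with indirect inference.
```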
NASA Astrophysics Data System (ADS)
Shevnina, Elena; Kourzeneva, Ekaterina; Kovalenko, Viktor; Vihma, Timo
2017-05-01
Climate warming has been more acute in the Arctic than at lower latitudes, and this tendency is expected to continue. This generates major challenges for economic activity in the region. Among other issues is the long-term planning and development of socio-economic infrastructure (dams, bridges, roads, etc.), which requires climate-based forecasts of the frequency and magnitude of detrimental flood events. To estimate the cost of infrastructure and operational risk, a probabilistic form of long-term forecasting is preferable. In this study, a probabilistic model to simulate the parameters of the probability density function (PDF) for multi-year runoff based on a projected climatology is applied to evaluate changes in extreme floods for the territory of the Russian Arctic. The model is validated by cross-comparison of the modelled and empirical PDFs using observations from 23 sites located in northern Russia. The mean values and coefficients of variation (CVs) of the spring flood depth of runoff are evaluated under four climate scenarios, using simulations of six climate models for the period 2010-2039. Regions with substantial expected changes in the means and CVs of spring flood depth of runoff are outlined. For sites located within such regions, it is suggested that future climate change be accounted for when calculating maximal discharges of rare occurrence. An example of engineering calculations for maximal discharges with 1% exceedance probability is provided for the Nadym River at Nadym.
NASA Technical Reports Server (NTRS)
Verigo, V. V.
1979-01-01
Simulation models were used to study theoretical problems of space biology and medicine. The reaction and adaptation of the main physiological systems to the complex effects of space flight were investigated. Mathematical models were discussed in terms of their significance in the selection of the structure and design of biological life support systems.
New Therapies for Fibrofatty Infiltration
2017-08-01
The goal of this project is to test three classes of compounds in animal models of muscular dystrophy, and evaluate their therapeutic... inhibitor compound to be tested in animal models of disease, as a more efficacious drug was identified with similar substrate specificity. Subject terms: fibrofatty infiltration, drug testing, muscular dystrophy, fibrosis.
ERIC Educational Resources Information Center
Gökoglu, Seyfullah; Çakiroglu, Ünal
2017-01-01
The aim of this case study is to evaluate the effect of mentors on teachers' process of integrating technology into their classrooms. In the integration process, interactions between the mentors and the teachers were implemented in terms of the Systems-Based Mentoring Model (SBMM). Mentors' leadership roles were determined and changes in teachers' technology…
ERIC Educational Resources Information Center
Gowindasamy, Maniyarasi
2017-01-01
This study was conducted to evaluate the implementation of a reflective development model in improving intercultural competence among business students at Stamford College. The study focuses on local and international students' cultural competencies in globalization subjects. An embedded design of mixed…
NASA Astrophysics Data System (ADS)
Frost, Andrew J.; Thyer, Mark A.; Srikanthan, R.; Kuczera, George
2007-07-01
Multi-site simulation of hydrological data is required for drought risk assessment of large multi-reservoir water supply systems. In this paper, a general Bayesian framework is presented for the calibration and evaluation of multi-site hydrological data at annual timescales. Models included within this framework are the hidden Markov model (HMM) and the widely used lag-1 autoregressive (AR(1)) model. These models are extended by the inclusion of a Box-Cox transformation and a spatial correlation function in a multi-site setting. Parameter uncertainty is evaluated using Markov chain Monte Carlo techniques. Models are evaluated by their ability to reproduce a range of important extreme statistics and compared using Bayesian model selection techniques that evaluate model probabilities. The case study, using multi-site annual rainfall data from catchments that contribute to Sydney's main water supply, provided the following results. Firstly, in terms of model probabilities and diagnostics, the inclusion of the Box-Cox transformation was preferred. Secondly, the AR(1) and HMM performed similarly, while some other proposed AR(1)/HMM models with regionally pooled parameters had greater posterior probability than these two models. The practical significance of parameter and model uncertainty was illustrated using a case study involving drought security analysis for urban water supply. It was shown that ignoring parameter uncertainty resulted in a significant overestimate of reservoir yield and an underestimation of system vulnerability to severe drought.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moslehi, Salim; Reddy, T. Agami; Katipamula, Srinivas
This research was undertaken to evaluate different inverse models for predicting the power output of solar photovoltaic (PV) systems under different practical scenarios. In particular, we investigated whether PV power output prediction accuracy can be improved if module/cell temperature is measured in addition to climatic variables, and the extent to which prediction accuracy degrades if solar irradiation is not measured on the plane of array but only on a horizontal surface. We also investigated the significance of different independent or regressor variables, such as wind velocity and incident angle modifier, in predicting PV power output and cell temperature. The inverse regression model forms have been evaluated both in terms of their goodness-of-fit and in terms of their accuracy and robustness in predictive performance. Given the accuracy of the measurements, the expected CV-RMSE of hourly power output prediction over the year varies between 3.2% and 8.6% when only climatic data are used. Depending on what type of measured climatic and PV performance data are available, different scenarios have been identified and the corresponding appropriate modeling pathways have been proposed. The corresponding models are to be implemented on a controller platform for optimum operational planning of microgrids and integrated energy systems.
ERIC Educational Resources Information Center
Varrella, Gary F.; Luckey, Brian P.; Baca, Jacqueline S.; Peters, Curt
2016-01-01
We present the results of a longitudinal evaluation of the Western Region 4-H Institute, a 5-day training program designed to enhance the skill sets of early-career Extension professionals organized around the 4-H professional research, knowledge, and competencies model. Programs such as this often are assessed for their short-term relevance and…
The dynamics of fidelity over the time course of long-term memory.
Persaud, Kimele; Hemmer, Pernille
2016-08-01
Bayesian models of cognition assume that prior knowledge about the world influences judgments. Recent approaches have suggested that the loss of fidelity from working to long-term (LT) memory is simply due to an increased rate of guessing (e.g., Brady, Konkle, Gill, Oliva, & Alvarez, 2013). That is, recall is the result of either remembering (with some noise) or guessing. This stands in contrast to Bayesian models of cognition, which assume that prior knowledge about the world influences judgments and that recall is a combination of expectations learned from the environment and noisy memory representations. Here, we evaluate the time course of fidelity in LT episodic memory, and the relative contribution of prior category knowledge and guessing, using a continuous recall paradigm. At an aggregate level, performance reflects a high rate of guessing. However, when the aggregate data are partitioned by lag (i.e., the number of presentations from study to test), or are un-aggregated, performance appears to be more complex than just remembering with some noise and guessing. We implemented three models: the standard remember-guess model, a three-component remember-guess model, and a Bayesian mixture model, and evaluated these models against the data. The results emphasize the importance of taking into account the influence of prior category knowledge on memory. Copyright © 2016 Elsevier Inc. All rights reserved.
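A sketch of the standard remember-guess mixture on simulated continuous-recall errors: a uniform guessing component plus a von Mises memory component, fitted by maximum likelihood. The paper's three-component and Bayesian mixture models additionally bring in category knowledge, which is not modeled here; all data and starting values are invented.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import vonmises

rng = np.random.default_rng(3)
# Simulated recall errors (radians): 70% noisy memory, 30% random guesses.
n = 500
is_guess = rng.random(n) < 0.3
err = np.where(is_guess,
               rng.uniform(-np.pi, np.pi, n),
               vonmises.rvs(8.0, size=n, random_state=rng))

def nll(params):
    # Negative log-likelihood of the standard remember-guess mixture:
    # guesses are uniform on the circle, memories are von Mises around 0.
    g, kappa = params
    like = g / (2 * np.pi) + (1 - g) * vonmises.pdf(err, kappa)
    return -np.sum(np.log(like))

fit = minimize(nll, x0=[0.5, 2.0], bounds=[(1e-4, 1 - 1e-4), (0.01, 100)])
g_hat, kappa_hat = fit.x
print(f"guess rate = {g_hat:.2f}, memory precision kappa = {kappa_hat:.1f}")
```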
Zhao, Huifen; Humphries, Keith; Persons, Derek A.
2016-01-01
Techniques to expand human hematopoietic stem cells ex vivo could be beneficial to the fields of clinical hematopoietic stem cell transplantation and gene therapy targeted at hematopoietic stem cells. NUP98-HOXA10HD is a relatively newly discovered fusion gene that has been shown in mouse transplant experiments to increase numbers of hematopoietic stem cells. We evaluated whether this fusion gene could be used to expand engrafting human primitive CD34+ cells in an immunodeficient mouse model. Gene transfer was achieved using a lentiviral-based vector. The engraftment of mobilized peripheral blood human CD34+ cells grown in culture for one week after gene transfer was evaluated 3–4 months after transplant and found to be 2–3 fold higher in the NUP98-HOXA10HD groups compared to controls. These data suggest an expansive effect at least at the short-term human repopulating cell level. Further evaluation in long-term repopulating models and investment in a NUP98-HOXA10HD protein seem worthy of consideration. Additionally, the results here provide strong impetus to utilize NUP98-HOXA10HD as a tool to search for underlying genes and pathways involved in hematopoietic stem cell expansion that can be enhanced to produce an even more potent expansive effect. PMID:26761813
NASA Technical Reports Server (NTRS)
Baker, Donald J.
1989-01-01
Part of the results of a U.S. Army/NASA-Langley sponsored research program to establish the long-term effects of realistic ground-based exposure on advanced composite materials is presented. Residual strengths and moisture absorption as a function of exposure time and exposure location are reported for four different composite material systems that were exposed for five years on the North American continent.
K. Johnson; F. N. Scatena; Y. Pan
2010-01-01
The long-term response of total soil organic carbon pools ("total SOC", i.e., soil and dead wood) to different harvesting scenarios in even-aged northern hardwood forest stands was evaluated using two soil carbon models, CENTURY and YASSO, that were calibrated with forest plot empirical data in the Green Mountains of Vermont. Overall, 13 different harvesting scenarios...
Towards SDS (Strategic Defense System) Testing and Evaluation: A collection of Relevant Topics
1989-07-01
the proof of the next. The Piton project is the first instance of stacking two verified components. In 1985 Warren... Accelerated? In the long term, a vast amount of work needs to be done. Below are some miscellaneous, fairly near-term projects which would seem to provide... and predictions for the current project. It provides a quantitative analysis of the environment and a model of the
NASA Astrophysics Data System (ADS)
Lohani, A. K.; Kumar, Rakesh; Singh, R. D.
2012-06-01
Time series modeling is necessary for the planning and management of reservoirs. More recently, soft computing techniques have been used in hydrological modeling and forecasting. In this study, the potential of artificial neural networks and neuro-fuzzy systems in monthly reservoir inflow forecasting is examined by developing and comparing monthly reservoir inflow prediction models based on autoregressive (AR) models, artificial neural networks (ANNs) and the adaptive neural-based fuzzy inference system (ANFIS). To account for the effect of monthly periodicity in the flow data, cyclic terms are also included in the ANN and ANFIS models. Working with time series flow data of the Sutlej River at Bhakra Dam, India, several ANN and adaptive neuro-fuzzy models are trained with different input vectors. To evaluate the performance of the selected ANN and ANFIS models, a comparison is made with the autoregressive (AR) models. The ANFIS model trained with an input data vector including previous inflows and cyclic terms of monthly periodicity shows a significant improvement in forecast accuracy in comparison with the ANFIS models trained with input vectors considering only previous inflows. In all cases ANFIS gives more accurate forecasts than the AR and ANN models. The proposed ANFIS model coupled with the cyclic terms is shown to provide a better representation of monthly inflow forecasting for planning and operation of the reservoir.
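One plausible reading of "cyclic terms of monthly periodicity" is a sine/cosine encoding of the month appended to the lagged inflows; the helper below builds such an input matrix. The paper does not spell out its exact formulation, so treat this encoding as an assumption.

```python
import numpy as np

def make_inputs(flow, month, lags=3):
    """Build an input matrix of lagged inflows plus cyclic terms.

    flow:  array of monthly inflows
    month: array of month numbers (1-12) aligned with `flow`
    A sketch of one possible encoding of monthly periodicity, not
    necessarily the formulation used in the paper.
    """
    X, y = [], []
    for t in range(lags, len(flow)):
        cyc = [np.sin(2 * np.pi * month[t] / 12),
               np.cos(2 * np.pi * month[t] / 12)]
        X.append(list(flow[t - lags:t]) + cyc)   # lagged inflows + cyclic terms
        y.append(flow[t])
    return np.array(X), np.array(y)
```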
NASA Astrophysics Data System (ADS)
Zavodsky, B.; Le Roy, A.; Smith, M. R.; Case, J.
2016-12-01
In support of NASA's recently launched GPM 'core' satellite, the NASA SPoRT project is leveraging its experience in research-to-operations transitions and training to provide feedback on the operational utility of GPM products. Thus far, SPoRT has focused on evaluating the Level 2 GPROF passive microwave and IMERG rain rate estimates. Formal evaluations with end users have occurred, as well as internal evaluations of the datasets. One set of end users for these products is National Weather Service Forecast Offices (WFOs) and National Weather Service River Forecast Centers (RFCs), comprising forecasters and hydrologists. SPoRT has hosted a series of formal assessments to determine the uses and utility of these datasets for NWS operations at specific offices. Forecasters have primarily used Level 2 swath rain rates to observe rainfall in otherwise data-void regions and to confirm model QPF for their nowcasting or short-term forecasting. Hydrologists have been evaluating both the Level 2 rain rates and the IMERG rain rates, including rain rate accumulations derived from IMERG; hydrologists have used these data to supplement gauge data for post-event analysis as well as for longer-term forecasting. Results from specific evaluations will be presented. Another evaluation of the GPM passive microwave rain rates has been in using the data within other products that are currently transitioned to end users, rather than as stand-alone observations. For example, IMERG Early data are being used as a forcing mechanism in the NASA Land Information System (LIS) for a real-time soil moisture product over eastern Africa. IMERG is providing valuable precipitation information to LIS in an otherwise data-void region. Results and caveats will briefly be discussed. A third application of GPM data is using the IMERG Late and Final products for model verification in remote regions where high-quality gridded precipitation fields are not readily available. These datasets can now be used to verify NWP model forecasts over eastern Africa using the SPoRT-MET scripts verification package, a wrapper around the NCAR Model Evaluation Toolkit (MET) verification software.
Chagas, T R; Borges, D S; de Oliveira, P F; Mocellin, M C; Barbosa, A M; Camargo, C Q; Del Moral, J Â G; Poli, A; Calder, P C; Trindade, E B S M; Nunes, E A
2017-12-01
Studies suggest that the ingestion of fish oil (FO), a source of the omega-3 polyunsaturated fatty acids docosahexaenoic acid (DHA) and eicosapentaenoic acid (EPA), can reduce the deleterious side-effects of chemotherapy. The aim of this randomised clinical trial was to evaluate the effect of supplementation with oral FO for 9 weeks on nutritional parameters and inflammatory nutritional risk in patients with haematological malignancies during the beginning of chemotherapy. Twenty-two patients with leukaemia or lymphoma were randomised to the unsupplemented group (UG) (n = 13) or supplemented group (SG) (n = 9). SG received 2 g/day of fish oil for 9 weeks. Nutritional status, serum acute-phase proteins and plasma fatty acids were evaluated before (T0) and after (T1) the intervention period. Data were analysed using two models; model 1, comprising data from all patients included in the study, and model 2, comprising data from UG patients with no increase in the proportions of EPA and DHA in plasma and data from SG patients showing an at least 100% increase in plasma EPA and DHA. SG showed an increased plasma proportion of EPA and DHA in both models. In model 2, C-reactive protein (CRP) and CRP/albumin ratio showed larger reductions in the SG. Overall long-term survival in both models (465 days after the start of the chemotherapy) was higher in the group ingesting fish oil (P < 0.05). These findings indicate an improved nutritional-inflammatory risk and potential effects on long-term survival in patients with haematological malignancies supplemented with FO during the beginning of chemotherapy. © 2017 The British Dietetic Association Ltd.
Credit Risk Evaluation Using a C-Variable Least Squares Support Vector Classification Model
NASA Astrophysics Data System (ADS)
Yu, Lean; Wang, Shouyang; Lai, K. K.
Credit risk evaluation is one of the most important issues in financial risk management. In this paper, a C-variable least squares support vector classification (C-VLSSVC) model is proposed for credit risk analysis. The main idea of this model is based on the prior knowledge that different classes may have different importance for modeling and that more weight should be given to classes with more importance. The C-VLSSVC model can be constructed by a simple modification of the regularization parameter in LSSVC, whereby larger weights are given to the least squares classification errors of important classes than to those of unimportant classes, while keeping the regularized terms in their original form. For illustration purposes, a real-world credit dataset is used to test the effectiveness of the C-VLSSVC model.
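The abstract does not include code, but the C-VLSSVC idea, replacing the single LS-SVC regularization constant with a per-sample value that depends on class importance, can be sketched as follows. All data, kernel choices, and C values here are illustrative assumptions, not the authors' settings.

    import numpy as np

    def rbf_kernel(A, B, sigma=1.0):
        """Gaussian RBF kernel matrix between the rows of A and B."""
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * sigma ** 2))

    def fit_cvlssvc(X, y, C_pos=10.0, C_neg=1.0, sigma=1.0):
        """Train an LS-SVM classifier with class-dependent regularization.

        Solves the usual LS-SVM linear KKT system, except that the diagonal
        term 1/C_i varies with the class of sample i, so errors on the
        'important' class are penalized more heavily.
        """
        n = len(y)
        C = np.where(y > 0, C_pos, C_neg)      # per-sample regularization
        Omega = (y[:, None] * y[None, :]) * rbf_kernel(X, X, sigma)
        A = np.zeros((n + 1, n + 1))
        A[0, 1:] = y
        A[1:, 0] = y
        A[1:, 1:] = Omega + np.diag(1.0 / C)
        sol = np.linalg.solve(A, np.r_[0.0, np.ones(n)])
        b, alpha = sol[0], sol[1:]
        return lambda Xnew: np.sign(rbf_kernel(Xnew, X, sigma) @ (alpha * y) + b)

    # toy usage: the rare class +1 (e.g. defaults) gets the larger C
    rng = np.random.default_rng(0)
    X = np.r_[rng.normal(0, 1, (40, 2)), rng.normal(2, 1, (10, 2))]
    y = np.r_[-np.ones(40), np.ones(10)]
    predict = fit_cvlssvc(X, y)
    print((predict(X) == y).mean())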
Strengths and weaknesses of McNamara's evolutionary psychological model of dreaming.
Olliges, Sandra
2010-10-07
This article includes a brief overview of McNamara's (2004) evolutionary model of dreaming. The strengths and weaknesses of this model are then evaluated in terms of its consonance with measurable neurological and biological properties of dreaming, its fit within the tenets of evolutionary theories of dreams, and its alignment with evolutionary concepts of cooperation and spirituality. McNamara's model focuses primarily on dreaming that occurs during rapid eye movement (REM) sleep; therefore this article also focuses on REM dreaming.
Optimization of a reversible hood for protecting a pedestrian's head during car collisions.
Huang, Sunan; Yang, Jikuang
2010-07-01
This study evaluated and optimized the performance of a reversible hood (RH) for preventing head injuries to an adult pedestrian in car collisions. The FE model of a production car front was introduced and validated. The baseline RH was developed from the original hood in the validated car front model. In order to evaluate the protective performance of the baseline RH, FE models of an adult headform and a 50th percentile human head were used in parallel to impact the baseline RH. Based on this evaluation, the response surface method was applied to optimize the RH in terms of material stiffness, lifting speed, and lifted height. Finally, the headform model and the human head model were again used to evaluate the protective performance of the optimized RH. Compared with the retracted and still-lifting baseline RH, the fully lifted baseline RH clearly reduced the impact responses of both the headform model and the human head model. When the optimized RH was lifted, the HIC values of the headform model and the human head model were further reduced to well below 1000, lowering the risk of pedestrian head injury as required by EEVC WG17. Copyright 2009 Elsevier Ltd. All rights reserved.
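A minimal sketch of the response surface step, assuming hypothetical design points and HIC responses (the paper's actual design variables, units, and values are not given in the abstract): fit a full quadratic surface by least squares, then minimize it within the design bounds.

    import numpy as np
    from scipy.optimize import minimize

    # Hypothetical design-of-experiments results: each row is
    # (stiffness scale, lifting speed [m/s], lifted height [mm]) -> HIC
    X = np.array([[0.8, 0.4, 60], [1.0, 0.4, 80], [1.2, 0.6, 60],
                  [0.8, 0.6, 100], [1.0, 0.5, 80], [1.2, 0.4, 100],
                  [0.8, 0.5, 80], [1.0, 0.6, 100], [1.2, 0.5, 60],
                  [1.0, 0.4, 60]])
    hic = np.array([1150., 980., 1040., 870., 905., 990., 1010., 860., 1100., 1025.])

    def quad_features(X):
        """Full quadratic response-surface basis: 1, x_i, x_i*x_j."""
        cols = [np.ones(len(X))]
        n = X.shape[1]
        for i in range(n):
            cols.append(X[:, i])
        for i in range(n):
            for j in range(i, n):
                cols.append(X[:, i] * X[:, j])
        return np.column_stack(cols)

    beta, *_ = np.linalg.lstsq(quad_features(X), hic, rcond=None)
    surface = lambda x: float(quad_features(np.atleast_2d(x)) @ beta)

    res = minimize(surface, x0=[1.0, 0.5, 80],
                   bounds=[(0.8, 1.2), (0.4, 0.6), (60, 100)])
    print("optimal design:", res.x, "predicted HIC:", res.fun)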
Uncertainty quantification and optimal decisions
2017-01-01
A mathematical model can be analysed to construct policies for action that are close to optimal for the model. If the model is accurate, such policies will be close to optimal when implemented in the real world. In this paper, the different aspects of an ideal workflow are reviewed: modelling, forecasting, evaluating forecasts, data assimilation and constructing control policies for decision-making. The example of the oil industry is used to motivate the discussion, and other examples, such as weather forecasting and precision agriculture, are used to argue that the same mathematical ideas apply in different contexts. Particular emphasis is placed on (i) uncertainty quantification in forecasting and (ii) how decisions are optimized and made robust to uncertainty in models and judgements. This necessitates full use of the relevant data and, by balancing costs and benefits into the long term, may suggest policies quite different from those relevant to the short term. PMID:28484343
Long-term Kinetics of Uranyl Desorption from Sediments Under Advective Conditions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shang, Jianying; Liu, Chongxuan; Wang, Zheming
2014-02-15
Long-term (> 4 months) column experiments were performed to investigate the kinetics of uranyl (U(VI)) desorption in sediments collected from the Integrated Field Research Challenge (IFRC) site at the US Department of Energy (DOE) Hanford 300 Area. The experimental results were used to evaluate alternative multi-rate surface complexation reaction (SCR) approaches to describe the short- and long-term kinetics of U(VI) desorption under flow conditions. The SCR stoichiometry, equilibrium constants, and multi-rate parameters were independently characterized in batch and stirred flow-cell reactors. Multi-rate SCR models that were either additively constructed using the SCRs for individual size fractions (e.g., Shang et al., 2011) or composite in nature could effectively describe short-term U(VI) desorption under flow conditions. The long-term desorption results, however, revealed that using a labile U concentration measured by carbonate extraction underestimated desorbable U(VI) and the long-term rate of U(VI) desorption. An alternative modeling approach using total U as the desorbable U(VI) concentration was proposed to overcome this difficulty. This study also found that the gravel size fraction (2-8 mm), which is typically treated as non-reactive in modeling U(VI) reactive transport because of low external surface area, can have an important effect on U(VI) desorption in the sediment. This study demonstrates an approach to effectively extrapolate U(VI) desorption kinetics for field-scale application, and identifies important parameters and uncertainties affecting model predictions.
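As a hedged illustration of what "multi-rate" means here (not the authors' calibrated SCR model), the sketch below evaluates a simple multi-rate first-order desorption model in which the sorbed inventory is split across domains with log-normally distributed rate constants.

    import numpy as np

    def multirate_release(t, fractions, rates):
        """Fraction of initially sorbed U(VI) remaining at time t for a
        multi-rate first-order model: sum_j f_j * exp(-k_j * t)."""
        t = np.atleast_1d(t)[:, None]
        return (fractions[None, :] * np.exp(-rates[None, :] * t)).sum(axis=1)

    # hypothetical lognormal rate distribution discretized into 10 domains
    rng = np.random.default_rng(1)
    rates = np.sort(np.exp(rng.normal(loc=-4.0, scale=1.5, size=10)))  # 1/day
    fractions = np.full(10, 0.1)                                       # equal mass
    t = np.array([1., 10., 30., 120.])                                 # days
    print(multirate_release(t, fractions, rates))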
NASA Astrophysics Data System (ADS)
Kapanen, Mika; Tenhunen, Mikko; Hämäläinen, Tuomo; Sipilä, Petri; Parkkinen, Ritva; Järvinen, Hannu
2006-07-01
Quality control (QC) data of radiotherapy linear accelerators, collected by Helsinki University Central Hospital between the years 2000 and 2004, were analysed. The goal was to provide information for the evaluation and elaboration of QC of accelerator outputs and to propose a method for QC data analysis. Short- and long-term drifts in outputs were quantified by fitting empirical mathematical models to the QC measurements. Normally, long-term drifts were well (≤1%) modelled by either a straight line or a single-exponential function. A drift of 2% occurred in 18 ± 12 months. The shortest drift times of only 2-3 months were observed for some new accelerators just after the commissioning but they stabilized during the first 2-3 years. The short-term reproducibility and the long-term stability of local constancy checks, carried out with a sealed plane parallel ion chamber, were also estimated by fitting empirical models to the QC measurements. The reproducibility was 0.2-0.5% depending on the positioning practice of a device. Long-term instabilities of about 0.3%/month were observed for some checking devices. The reproducibility of local absorbed dose measurements was estimated to be about 0.5%. The proposed empirical model fitting of QC data facilitates the recognition of erroneous QC measurements and abnormal output behaviour, caused by malfunctions, offering a tool to improve dose control.
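The model fitting described above can be reproduced in outline with standard least-squares tools. The sketch below fits both candidate drift models to synthetic monthly output checks and solves the fitted exponential for the time to reach a 2% drift; all data values are invented.

    import numpy as np
    from scipy.optimize import curve_fit

    linear = lambda t, a, b: a + b * t
    single_exp = lambda t, a, b, tau: a + b * (1.0 - np.exp(-t / tau))

    # hypothetical monthly output checks (% deviation from baseline)
    t = np.arange(0, 36)                        # months since commissioning
    rng = np.random.default_rng(2)
    output = single_exp(t, 0.0, 2.5, 12.0) + rng.normal(0, 0.2, t.size)

    p_lin, _ = curve_fit(linear, t, output)
    p_exp, _ = curve_fit(single_exp, t, output, p0=(0.0, 2.0, 10.0))

    # time for the fitted exponential drift to reach 2%
    a, b, tau = p_exp
    t2 = -tau * np.log(1 - (2.0 - a) / b)
    print(f"exponential fit {p_exp}, 2% drift reached at {t2:.1f} months")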
NASA Astrophysics Data System (ADS)
Tang, Jing; Schurgers, Guy; Valolahti, Hanna; Faubert, Patrick; Tiiva, Päivi; Michelsen, Anders; Rinnan, Riikka
2016-12-01
The Arctic is warming at twice the global average speed, and the warming-induced increases in biogenic volatile organic compound (BVOC) emissions from Arctic plants are expected to be drastic. Current global-model estimates of minimal BVOC emissions from the Arctic are based on very few observations and have been increasingly challenged by field data. This study applied a dynamic ecosystem model, LPJ-GUESS, as a platform to investigate short-term and long-term BVOC emission responses to Arctic climate warming. Field observations in a subarctic tundra heath with long-term (13-year) warming treatments were extensively used for parameterizing and evaluating BVOC-related processes (photosynthesis, emission responses to temperature and vegetation composition). We propose an adjusted temperature (T) response curve for Arctic plants with much stronger T sensitivity than the commonly used algorithms for large-scale modelling. The simulated emission responses to 2 °C warming between the adjusted and original T response curves were evaluated against the observed warming responses (WRs) at short-term scales. Moreover, the model responses to warming by 4 and 8 °C were also investigated as a sensitivity test. The model showed reasonable agreement with the observed vegetation CO2 fluxes in the main growing season as well as with the day-to-day variability of isoprene and monoterpene emissions. The observed relatively high WRs were better captured by the adjusted T response curve than by the common one. During 1999-2012, the modelled annual mean isoprene and monoterpene emissions were 20 and 8 mg C m-2 yr-1, with increases of 55 and 57 % for 2 °C summertime warming, respectively. Warming by 4 and 8 °C for the same period further elevated isoprene emission for all years, but the impacts on monoterpene emissions levelled off during the last few years. At the hour-to-day scale, the WRs appear to be strongly impacted by canopy air T, while at the day-to-year scale, the WRs are a combined effect of plant functional type (PFT) dynamics and instantaneous BVOC responses to warming. The identified challenges in estimating Arctic BVOC emissions are (1) correct leaf T estimation, (2) PFT parameterization accounting for plant emission features as well as physiological responses to warming, and (3) representation of long-term vegetation changes in the past and the future.
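For readers unfamiliar with the emission algorithms involved, a Guenther-type exponential temperature response illustrates how a stronger temperature sensitivity inflates the simulated warming response (WR). The beta = 0.19 value below is a placeholder for "much stronger sensitivity", not the study's fitted parameter.

    import numpy as np

    def mt_emission(T, E_s=1.0, beta=0.09, T_s=303.15):
        """Guenther-type monoterpene response: E = E_s * exp(beta * (T - T_s))."""
        return E_s * np.exp(beta * (T - T_s))

    T = np.array([283.15, 285.15])   # 10 C, then a 2 C warming
    for beta in (0.09, 0.19):        # standard vs hypothetical Arctic sensitivity
        e0, e1 = mt_emission(T, beta=beta)
        print(f"beta={beta}: warming response (WR) = {e1 / e0:.2f}")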
Thorn, Annabel S C; Gathercole, Susan E; Frankish, Clive R
2005-03-01
The impact of four long-term knowledge variables on serial recall accuracy was investigated. Serial recall was tested for high and low frequency words and high and low phonotactic frequency nonwords in 2 groups: monolingual English speakers and French-English bilinguals. For both groups, the recall advantage for words over nonwords reflected more fully correct recalls with fewer recall attempts that consisted of fragments of the target memory items (one or two of the three target phonemes recalled correctly); completely incorrect recalls were equivalent for the 2 list types. However, word frequency (for both groups), nonword phonotactic frequency (for the monolingual group), and language familiarity all influenced the proportions of completely incorrect recalls that were made. These results are not consistent with the view that long-term knowledge influences on immediate recall accuracy can be exclusively attributed to a redintegration process of the type specified in the multinomial processing tree model of immediate recall. The finding of a differential influence on completely incorrect recalls of these four long-term knowledge variables suggests instead that the beneficial effects of long-term knowledge on short-term recall accuracy are mediated by more than one mechanism.
Anomaa Senaviratne, G M M M; Udawatta, Ranjith P; Baffaut, Claire; Anderson, Stephen H
2013-01-01
The Agricultural Policy Environmental Extender (APEX) model is used to evaluate best management practices on pollutant loading in whole farms or small watersheds. The objectives of this study were to conduct a sensitivity analysis to determine the effect of model parameters on APEX output and to use the parameterized, calibrated, and validated model to evaluate the long-term benefits of grass waterways. The APEX model was used to model three (East, Center, and West) adjacent field-size watersheds with claypan soils under a no-till corn (Zea mays L.)/soybean [Glycine max (L.) Merr.] rotation. Twenty-seven parameters were sensitive for crop yield, runoff, sediment, nitrogen (dissolved and total), and phosphorus (dissolved and total) simulations. The model was calibrated using measured event-based data from the Center watershed from 1993 to 1997 and validated with data from the West and East watersheds. Simulated crop yields were within ±13% of the measured yield. The model performance for event-based runoff was excellent, with calibration and validation r² > 0.9 and Nash-Sutcliffe coefficients (NSC) > 0.8, respectively. Sediment and total nitrogen calibration results were satisfactory for larger rainfall events (>50 mm), with r² > 0.5 and NSC > 0.4, but validation results remained poor, with NSC between 0.18 and 0.3. Total phosphorus was well calibrated and validated, with r² > 0.8 and NSC > 0.7, respectively. The presence of grass waterways reduced annual total phosphorus loadings by 13 to 25%. The replicated study indicates that APEX provides a convenient and efficient tool to evaluate long-term benefits of conservation practices. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
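The two skill scores used throughout this abstract are easy to compute; a minimal sketch (with invented runoff values) follows.

    import numpy as np

    def nash_sutcliffe(obs, sim):
        """Nash-Sutcliffe coefficient: 1 - SSE / variance of observations."""
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

    def r_squared(obs, sim):
        """Squared Pearson correlation between observed and simulated series."""
        return np.corrcoef(obs, sim)[0, 1] ** 2

    obs = [12.0, 3.5, 40.2, 8.8, 22.1]   # e.g. event runoff (mm), invented
    sim = [10.5, 4.1, 37.9, 10.2, 25.0]
    print(nash_sutcliffe(obs, sim), r_squared(obs, sim))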
Cerebellarlike corrective model inference engine for manipulation tasks.
Luque, Niceto Rafael; Garrido, Jesús Alberto; Carrillo, Richard Rafael; Coenen, Olivier J-M D; Ros, Eduardo
2011-10-01
This paper presents how a simple cerebellumlike architecture can infer corrective models in the framework of a control task when manipulating objects that significantly affect the dynamics model of the system. The main motivation of this paper is to evaluate a simplified bio-mimetic approach in the framework of a manipulation task. More concretely, the paper focuses on how the model inference process takes place within a feedforward control loop based on the cerebellar structure and on how these internal models are built up by means of biologically plausible synaptic adaptation mechanisms. This kind of investigation may provide clues on how biology achieves accurate control of non-stiff-joint robots with low-power actuators, which involves controlling systems with large inertial components. This paper studies how a basic temporal-correlation kernel including long-term depression (LTD) and a constant long-term potentiation (LTP) at parallel fiber-Purkinje cell synapses can effectively infer corrective models. We evaluate how this spike-timing-dependent plasticity correlates sensorimotor activity arriving through the parallel fibers with teaching signals (dependent on error estimates) arriving through the climbing fibers from the inferior olive. This paper addresses the study of how these LTD and LTP components need to be well balanced with each other to achieve accurate learning. This is of interest for evaluating the relevant role of homeostatic mechanisms in biological systems where adaptation occurs in a distributed manner. Furthermore, we illustrate how the temporal-correlation kernel can also work in the presence of transmission delays in sensorimotor pathways. We use a cerebellumlike spiking neural network which stores the corrective models as well-structured weight patterns distributed among the parallel fiber-to-Purkinje cell connections.
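A schematic of the plasticity rule described, constant LTP per parallel-fiber spike plus LTD proportional to a delayed eligibility trace sampled at climbing-fiber (error) spikes, might look like the following. Time constants and learning rates are illustrative, not the paper's tuned values.

    import numpy as np

    def update_pf_pc_weights(w, pf_spikes, cf_spikes, trace_tau=50.0,
                             ltp=1e-4, ltd=5e-3, dt=1.0):
        """One trial of spike-timing-dependent plasticity at PF-Purkinje synapses.

        pf_spikes: (T, n_pf) binary parallel-fiber activity
        cf_spikes: (T,) binary climbing-fiber (error) teaching signal
        LTP: small constant increase for each PF spike.
        LTD: depression proportional to the PF eligibility trace at CF spikes.
        """
        trace = np.zeros(w.shape)
        for t in range(pf_spikes.shape[0]):
            trace += dt * (-trace / trace_tau + pf_spikes[t])  # eligibility trace
            w += ltp * pf_spikes[t]                            # constant LTP
            if cf_spikes[t]:
                w -= ltd * trace                               # correlation-driven LTD
        return np.clip(w, 0.0, None)                           # non-negative weights

    rng = np.random.default_rng(3)
    w = np.full(100, 0.5)
    pf = (rng.random((500, 100)) < 0.02).astype(float)
    cf = (rng.random(500) < 0.01).astype(float)
    w = update_pf_pc_weights(w, pf, cf)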
NASA Technical Reports Server (NTRS)
Murray, Lee T.; Fiore, Arlene M.
2014-01-01
Over four decades of measurements exist that sample the 3-D composition of reactive trace gases in the troposphere from approximately weekly ozone sondes, instrumentation on civil aircraft, and individual comprehensive aircraft field campaigns. An obstacle to using these data to evaluate coupled chemistry-climate models (CCMs), the models used to project future changes in atmospheric composition and climate, is that exact space-time matching between model fields and observations cannot be done, as CCMs generate their own meteorology. Evaluation typically involves averaging over large spatiotemporal regions, which may not reflect a true average due to limited or biased sampling. This averaging approach generally loses information regarding specific processes. Here we aim to identify where discrete sampling may be indicative of long-term mean conditions, using the GEOS-Chem global chemical-transport model (CTM) driven by the MERRA reanalysis to reflect historical meteorology from 2003 to 2012 at 2° by 2.5° resolution. The model has been sampled at the time and location of every ozone sonde profile available from the World Ozone and Ultraviolet Radiation Data Centre (WOUDC), along the flight tracks of the IAGOS-MOZAIC-CARIBIC civil aircraft campaigns, as well as those from over 20 individual field campaigns performed by NASA, NOAA, DOE, NSF, NERC (UK), and DLR (Germany) during the simulation period. Focusing on ozone, carbon monoxide and reactive nitrogen species, we assess where aggregates of the in situ data are representative of the decadal mean vertical, spatial and temporal distributions that would be appropriate for evaluating CCMs. Next, we identically sample a series of parallel sensitivity simulations in which individual emission sources (e.g., lightning, biogenic VOCs, wildfires, US anthropogenic) have been removed one by one, to assess where and when the aggregated observations may offer constraints on these processes within CCMs. Lastly, we show results of an additional 31-year simulation from 1980-2010 of GEOS-Chem driven by the MACCity emissions inventory and MERRA reanalysis at 4° by 5°. We sample the model at every WOUDC sonde and flight track from MOZAIC and NASA field campaigns to evaluate which aggregate observations are statistically reflective of long-term trends over the period.
Li, Xiang; Peng, Ling; Yao, Xiaojing; Cui, Shaolong; Hu, Yuan; You, Chengzeng; Chi, Tianhe
2017-12-01
Air pollutant concentration forecasting is an effective method of protecting public health by providing an early warning against harmful air pollutants. However, existing methods of air pollutant concentration prediction fail to effectively model long-term dependencies, and most neglect spatial correlations. In this paper, a novel long short-term memory neural network extended (LSTME) model that inherently considers spatiotemporal correlations is proposed for air pollutant concentration prediction. Long short-term memory (LSTM) layers were used to automatically extract inherent useful features from historical air pollutant data, and auxiliary data, including meteorological data and time stamp data, were merged into the proposed model to enhance the performance. Hourly PM2.5 (particulate matter with an aerodynamic diameter less than or equal to 2.5 μm) concentration data collected at 12 air quality monitoring stations in Beijing City from Jan/01/2014 to May/28/2016 were used to validate the effectiveness of the proposed LSTME model. Experiments were performed using the spatiotemporal deep learning (STDL) model, the time delay neural network (TDNN) model, the autoregressive moving average (ARMA) model, the support vector regression (SVR) model, and the traditional LSTM NN model, and a comparison of the results demonstrated that the LSTME model is superior to the other statistics-based models. Additionally, the use of auxiliary data improved model performance. For the one-hour prediction tasks, the proposed model performed well and exhibited a mean absolute percentage error (MAPE) of 11.93%. In addition, we conducted multiscale predictions over different time spans and achieved satisfactory performance, even for 13-24 h prediction tasks (MAPE = 31.47%). Copyright © 2017 Elsevier Ltd. All rights reserved.
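A minimal stacked-LSTM regressor in tf.keras conveys the flavor of this kind of model, though the real LSTME merges auxiliary meteorological and time-stamp inputs and spatial station features; all shapes and data here are placeholders.

    import numpy as np
    import tensorflow as tf

    # hypothetical tensors: 24 past hours of features (PM2.5 plus auxiliaries
    # for 12 stations flattened into the feature axis) -> next-hour PM2.5
    n_samples, lookback, n_features = 1000, 24, 36
    X = np.random.rand(n_samples, lookback, n_features).astype("float32")
    y = np.random.rand(n_samples, 1).astype("float32")

    model = tf.keras.Sequential([
        tf.keras.layers.LSTM(64, return_sequences=True,
                             input_shape=(lookback, n_features)),
        tf.keras.layers.LSTM(32),        # extract temporal features
        tf.keras.layers.Dense(1),        # next-hour concentration
    ])
    model.compile(optimizer="adam", loss="mae")
    model.fit(X, y, epochs=2, batch_size=32, verbose=0)

    pred = model.predict(X, verbose=0)
    mape = np.mean(np.abs((y - pred) / np.clip(y, 1e-3, None))) * 100
    print(f"in-sample MAPE: {mape:.1f}%")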
Ryberg, Karen R.; Vecchia, Aldo V.; Akyüz, F. Adnan; Lin, Wei
2016-01-01
Historically unprecedented flooding occurred in the Souris River Basin of Saskatchewan, North Dakota and Manitoba in 2011, during a longer term period of wet conditions in the basin. In order to develop a model of future flows, there is a need to evaluate effects of past multidecadal climate variability and/or possible climate change on precipitation. In this study, tree-ring chronologies and historical precipitation data in a four-degree buffer around the Souris River Basin were analyzed to develop regression models that can be used for predicting long-term variations of precipitation. To focus on longer term variability, 12-year moving average precipitation was modeled in five subregions (determined through cluster analysis of measures of precipitation) of the study area over three seasons (November–February, March–June and July–October). The models used multiresolution decomposition (an additive decomposition based on powers of two using a discrete wavelet transform) of tree-ring chronologies from Canada and the US and seasonal 12-year moving average precipitation based on Adjusted and Homogenized Canadian Climate Data and US Historical Climatology Network data. Results show that precipitation varies on long-term (multidecadal) time scales of 16, 32 and 64 years. Past extended pluvial and drought events, which can vary greatly with season and subregion, were highlighted by the models. Results suggest that the recent wet period may be a part of natural variability on a very long time scale.
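Multiresolution decomposition of the kind used here, an additive wavelet decomposition whose components sum back to the original series, can be sketched with PyWavelets; the series below is synthetic and the wavelet choice is an assumption.

    import numpy as np
    import pywt

    def multiresolution(x, wavelet="db4", level=5):
        """Additive multiresolution decomposition via the discrete wavelet
        transform: returns components (one per scale) that sum back to x."""
        coeffs = pywt.wavedec(x, wavelet, level=level)
        parts = []
        for i in range(len(coeffs)):
            sel = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
            parts.append(pywt.waverec(sel, wavelet)[: len(x)])
        return parts  # parts[0] is the smooth; parts[1:] are detail scales

    # hypothetical smoothed precipitation series (annual steps, 64-yr cycle)
    rng = np.random.default_rng(4)
    t = np.arange(256)
    precip = 450 + 40 * np.sin(2 * np.pi * t / 64) + rng.normal(0, 15, t.size)
    components = multiresolution(precip)
    print(np.allclose(sum(components), precip))  # additivity check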
Multisite evaluation of APEX for water quality: II. Regional parameterization
USDA-ARS's Scientific Manuscript database
Phosphorus (P) index assessment requires independent estimates of long-term average annual P loss from multiple locations, management practices, soils, and landscape positions. Because currently available measured data are insufficient, calibrated and validated process-based models have been propos...
Monte Carlo simulations for generic granite repository studies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chu, Shaoping; Lee, Joon H; Wang, Yifeng
In a collaborative study between Los Alamos National Laboratory (LANL) and Sandia National Laboratories (SNL) for the DOE-NE Office of Fuel Cycle Technologies Used Fuel Disposition (UFD) Campaign project, we have conducted preliminary system-level analyses to support the development of a long-term strategy for geologic disposal of high-level radioactive waste. A general modeling framework consisting of a near- and a far-field submodel for a granite GDSE was developed. A representative far-field transport model for a generic granite repository was merged with an integrated systems (GoldSim) near-field model. Integrated Monte Carlo model runs with the combined near- and far-field transport models were performed, and the parameter sensitivities were evaluated for the combined system. In addition, a subset of radionuclides that are potentially important to repository performance were identified and evaluated for a series of model runs. The analyses were conducted with different waste inventory scenarios. Analyses were also conducted for different repository radionuclide release scenarios. While the results to date are for a generic granite repository, the work establishes the method to be used in the future to provide guidance on the development of a strategy for long-term disposal of high-level radioactive waste in a granite repository.
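The integrated Monte Carlo / sensitivity workflow can be caricatured in a few lines: sample uncertain transport parameters, push them through a (here, drastically simplified one-dimensional) far-field model, and rank parameter importance by correlation with the output. This is a generic sketch, not the LANL/SNL GoldSim model.

    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(5)
    n = 10_000
    velocity = rng.lognormal(mean=np.log(0.5), sigma=0.5, size=n)  # m/yr
    path_len = rng.uniform(400, 600, n)                            # m
    retard = rng.lognormal(mean=np.log(50), sigma=1.0, size=n)     # retardation

    travel_time = retard * path_len / velocity                     # years

    # crude decay screening for a nuclide with half-life t_half (years)
    t_half = 2.1e5   # e.g. Tc-99, order of magnitude only
    surviving = np.exp(-np.log(2) * travel_time / t_half)

    for name, p in [("velocity", velocity), ("path length", path_len),
                    ("retardation", retard)]:
        rho, _ = spearmanr(p, surviving)
        print(f"{name:12s} rank correlation with release: {rho:+.2f}")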
Hargreaves, James; Hatcher, Abigail; Strange, Vicki; Phetla, Godfrey; Busza, Joanna; Kim, Julia; Watts, Charlotte; Morison, Linda; Porter, John; Pronyk, Paul; Bonell, Christopher
2010-02-01
The Intervention with Microfinance for AIDS and Gender Equity (IMAGE) combines microfinance, gender/HIV training and community mobilization (CM) in South Africa. A trial found reduced intimate partner violence among clients but less evidence for impact on sexual behaviour among clients' households or communities. This process evaluation examined how feasible IMAGE was to deliver and how accessible and acceptable it was to intended beneficiaries during a trial and subsequent scale-up. Data came from attendance registers, financial records, observations, structured questionnaires (378) and focus group discussions and interviews (128) with clients and staff. Gender/HIV training and CM were managed initially by an academic unit ('linked' model) and later by the microfinance institution (MFI) ('parallel' model). Microfinance and gender/HIV training were feasible to deliver and accessible and acceptable to most clients. Though participation in CM was high for some clients, others experienced barriers to collective action, a finding which may help explain lack of intervention effects among household/community members. Delivery was feasible in the short term but both models were considered unsustainable in the longer term. A linked model involving a MFI and a non-academic partner agency may be more sustainable and is being tried. Feasible models for delivering microfinance and health promotion require further investigation.
Vogelgesang, Felicitas; Schlattmann, Peter; Dewey, Marc
2018-05-01
Meta-analyses require a thoroughly planned procedure to obtain unbiased overall estimates. From a statistical point of view, not only model selection but also model implementation in the software affects the results. The present simulation study investigates the accuracy of different implementations of general and generalized bivariate mixed models in SAS (using proc mixed, proc glimmix and proc nlmixed), Stata (using gllamm, xtmelogit and midas) and R (using reitsma from package mada and glmer from package lme4). Both models incorporate the relationship between sensitivity and specificity - the two outcomes of interest in meta-analyses of diagnostic accuracy studies - utilizing random effects. Model performance is compared in nine meta-analytic scenarios reflecting the combination of three sizes for meta-analyses (89, 30 and 10 studies) with three pairs of sensitivity/specificity values (97%/87%; 85%/75%; 90%/93%). The evaluation of accuracy in terms of bias, standard error and mean squared error reveals that all implementations of the generalized bivariate model calculate sensitivity and specificity estimates with deviations of less than two percentage points. By contrast, proc mixed, which together with reitsma implements the general bivariate mixed model proposed by Reitsma, shows convergence problems. The random effect parameters are in general underestimated. This study shows that flexibility and simplicity of model specification together with convergence robustness should influence implementation recommendations, as the accuracy in terms of bias was acceptable in all implementations using the generalized approach. Schattauer GmbH.
Pollock, Richard F; Chubb, Barrie; Valentine, William J; Heller, Simon
2018-01-01
To estimate the short-term cost-effectiveness of insulin detemir (IDet) versus neutral protamine Hagedorn (NPH) insulin based on the incidence of non-severe hypoglycemia and changes in body weight in subjects with type 1 diabetes (T1D) or type 2 diabetes (T2D) in the UK. A model was developed to evaluate cost-effectiveness based on non-severe hypoglycemia, body mass index, and pharmacy costs over 1 year. Published rates of non-severe hypoglycemia were employed in the T1D and T2D analyses, while reduced weight gain with IDet was modeled in the T2D analysis only. Effectiveness was calculated in terms of quality-adjusted life expectancy using published utility scores. Pharmacy costs were captured using published prices and defined daily doses. Costs were expressed in 2016 pounds sterling (GBP). Sensitivity analyses were performed (including probabilistic sensitivity analysis). In T1D, IDet was associated with fewer non-severe hypoglycemic events than NPH insulin (126.7 versus 150.8 events per person-year), leading to an improvement of 0.099 quality-adjusted life years (QALYs). Costs with IDet were GBP 60 higher, yielding an incremental cost-effectiveness ratio (ICER) of GBP 610 per QALY gained. In T2D, mean non-severe hypoglycemic event rates and body weight were lower with IDet than NPH insulin, leading to a total incremental utility of 0.120, accompanied by an annual cost increase of GBP 171, yielding an ICER of GBP 1,422 per QALY gained for IDet versus NPH insulin. Short-term health economic evaluation showed IDet to be a cost-effective alternative to NPH insulin in the UK due to lower rates of non-severe hypoglycemia (T1D and T2D) and reduced weight gain (T2D only).
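The headline numbers can be checked directly from the incremental cost and QALY figures quoted in the abstract (small differences are rounding):

    def icer(delta_cost, delta_qaly):
        """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
        return delta_cost / delta_qaly

    # T1D figures from the abstract: IDet costs GBP 60 more, adds 0.099 QALYs
    print(round(icer(60.0, 0.099)))   # -> 606, i.e. roughly the reported GBP 610
    # T2D figures: GBP 171 more, 0.120 incremental QALYs
    print(round(icer(171.0, 0.120)))  # -> 1425, vs the reported GBP 1,422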
NASA Astrophysics Data System (ADS)
Glasscoe, Margaret T.; Wang, Jun; Pierce, Marlon E.; Yoder, Mark R.; Parker, Jay W.; Burl, Michael C.; Stough, Timothy M.; Granat, Robert A.; Donnellan, Andrea; Rundle, John B.; Ma, Yu; Bawden, Gerald W.; Yuen, Karen
2015-08-01
Earthquake Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response (E-DECIDER) is a NASA-funded project developing new capabilities for decision making utilizing remote sensing data and modeling software to provide decision support for earthquake disaster management and response. E-DECIDER incorporates the earthquake forecasting methodology and geophysical modeling tools developed through NASA's QuakeSim project. Remote sensing and geodetic data, in conjunction with modeling and forecasting tools allows us to provide both long-term planning information for disaster management decision makers as well as short-term information following earthquake events (i.e. identifying areas where the greatest deformation and damage has occurred and emergency services may need to be focused). This in turn is delivered through standards-compliant web services for desktop and hand-held devices.
Spontaneously hypertensive rat (SHR) as an animal model for ADHD: a short overview.
Meneses, Alfredo; Perez-Garcia, Georgina; Ponce-Lopez, Teresa; Tellez, Ruth; Gallegos-Cari, Andrea; Castillo, Carlos
2011-01-01
Diverse studies indicate that attention-deficit hyperactivity disorder (ADHD) is associated with alterations in encoding processes, including working or short-term memory. Some ADHD dysfunctional domains are reflected in the spontaneously hypertensive rat (SHR). Because ADHD, its drug treatments, and animal models are attracting growing interest, the aim of this work is to present a brief overview with a focus on the SHR as an animal model for ADHD and memory deficits. Thus, this paper reviews the concept of the SHR as a model system for ADHD, comparing SHR, Wistar-Kyoto and Sprague-Dawley rats with a focus on hypertension level and on working/short-term memory and attention in different behavioral tasks, such as open field, five-choice serial reaction time, water maze, passive avoidance, and autoshaping. In addition, drug treatments (d-amphetamine and methylphenidate) are evaluated.
NASA Astrophysics Data System (ADS)
Jeong, Chan-Yong; Kim, Hee-Joong; Hong, Sae-Young; Song, Sang-Hun; Kwon, Hyuck-In
2017-08-01
In this study, we show that the two-stage unified stretched-exponential model can more exactly describe the time-dependence of threshold voltage shift (ΔV TH) under long-term positive-bias-stresses compared to the traditional stretched-exponential model in amorphous indium-gallium-zinc oxide (a-IGZO) thin-film transistors (TFTs). ΔV TH is mainly dominated by electron trapping at short stress times, and the contribution of trap state generation becomes significant with an increase in the stress time. The two-stage unified stretched-exponential model can provide useful information not only for evaluating the long-term electrical stability and lifetime of the a-IGZO TFT but also for understanding the stress-induced degradation mechanism in a-IGZO TFTs.
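For reference, the traditional stretched-exponential bias-stress model referred to above is commonly written as

    \Delta V_{TH}(t) = \Delta V_{0}\,\left\{ 1 - \exp\left[ -\left( t/\tau \right)^{\beta} \right] \right\}

where \tau is the characteristic charge-trapping time and \beta the stretching exponent. The abstract does not give the exact two-stage expression; one plausible unified form, stated here as an assumption, adds a second stretched-exponential term for stress-induced trap-state generation:

    \Delta V_{TH}(t) = \Delta V_{1}\left\{ 1 - \exp\left[ -\left( t/\tau_{1} \right)^{\beta_{1}} \right] \right\} + \Delta V_{2}\left\{ 1 - \exp\left[ -\left( t/\tau_{2} \right)^{\beta_{2}} \right] \right\}

with the first term dominating at short stress times (electron trapping) and the second becoming significant at long times.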
Use of Climatic Information In Regional Water Resources Assessment
NASA Astrophysics Data System (ADS)
Claps, P.
Relations between climatic parameters and hydrological variables at the basin scale are investigated, with the aim of evaluating, in a parsimonious way, physical parameters useful both for the climatic classification of an area and for supporting statistical models of water resources assessment. With reference to the first point, literature methods for the distributed evaluation of parameters such as temperature, global and net solar radiation, and precipitation have been considered at the annual scale, from the viewpoint of robust parameter evaluation based on a few basic physical variables that are simple to determine. Elevation, latitude and the average annual number of sunny days proved to be the essential parameters for evaluating climatic indices related to the soil water deficit and to the radiative balance. The latter term was evaluated at the monthly scale and validated (in the 'global' term) with measured data, in this case referring to the annual-scale water balance. Budyko, Thornthwaite and Emberger climatic indices were evaluated on the 10,000 km2 territory of the Basilicata region (southern Italy) based on a 1.1 km grid. They were compared in terms of spatial variability and sensitivity to the variation of the basic variables in humid and semi-arid areas. The use of the climatic index data with respect to statistical parameters of the runoff series at some gauging stations of the region demonstrated the possibility of supporting regionalisation of annual runoff using climatic information, with a clear distinction of the variability of the coefficient of variation in terms of the humidity or aridity of the basin.
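The Budyko index mentioned above has a closed form; a small sketch of the Budyko (1974) curve relating the evaporation ratio E/P to the aridity index PET/P:

    import numpy as np

    def budyko_evaporation_ratio(pet_over_p):
        """Budyko (1974) curve: E/P as a function of the aridity index PET/P."""
        phi = np.asarray(pet_over_p, float)
        return np.sqrt(phi * np.tanh(1.0 / phi) * (1.0 - np.exp(-phi)))

    for phi in (0.5, 1.0, 2.0):   # humid -> semi-arid
        print(f"PET/P = {phi}: E/P = {budyko_evaporation_ratio(phi):.2f}")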
The data quality analyzer: a quality control program for seismic data
Ringler, Adam; Hagerty, M.T.; Holland, James F.; Gonzales, A.; Gee, Lind S.; Edwards, J.D.; Wilson, David; Baker, Adam
2015-01-01
The quantification of data quality is based on the evaluation of various metrics (e.g., timing quality, daily noise levels relative to long-term noise models, and comparisons between broadband data and event synthetics). Users may select which metrics contribute to the assessment and those metrics are aggregated into a “grade” for each station. The DQA is being actively used for station diagnostics and evaluation based on the completed metrics (availability, gap count, timing quality, deviation from a global noise model, deviation from a station noise model, coherence between co-located sensors, and comparison between broadband data and synthetics for earthquakes) on stations in the Global Seismographic Network and Advanced National Seismic System.
Operational Earthquake Forecasting: Proposed Guidelines for Implementation (Invited)
NASA Astrophysics Data System (ADS)
Jordan, T. H.
2010-12-01
The goal of operational earthquake forecasting (OEF) is to provide the public with authoritative information about how seismic hazards are changing with time. During periods of high seismic activity, short-term earthquake forecasts based on empirical statistical models can attain nominal probability gains in excess of 100 relative to the long-term forecasts used in probabilistic seismic hazard analysis (PSHA). Prospective experiments are underway by the Collaboratory for the Study of Earthquake Predictability (CSEP) to evaluate the reliability and skill of these seismicity-based forecasts in a variety of tectonic environments. How such information should be used for civil protection is by no means clear, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing formal procedures for OEF in this sort of “low-probability environment.” Nevertheless, the need to move more quickly towards OEF has been underscored by recent experiences, such as the 2009 L’Aquila earthquake sequence and other seismic crises in which an anxious public has been confused by informal, inconsistent earthquake forecasts. Whether scientists like it or not, rising public expectations for real-time information, accelerated by the use of social media, will require civil protection agencies to develop sources of authoritative information about the short-term earthquake probabilities. In this presentation, I will discuss guidelines for the implementation of OEF informed by my experience on the California Earthquake Prediction Evaluation Council, convened by CalEMA, and the International Commission on Earthquake Forecasting, convened by the Italian government following the L’Aquila disaster. (a) Public sources of information on short-term probabilities should be authoritative, scientific, open, and timely, and they need to convey the epistemic uncertainties in the operational forecasts. (b) Earthquake probabilities should be based on operationally qualified, regularly updated forecasting systems. All operational procedures should be rigorously reviewed by experts in the creation, delivery, and utility of earthquake forecasts. (c) The quality of all operational models should be evaluated for reliability and skill by retrospective testing, and the models should be under continuous prospective testing in a CSEP-type environment against established long-term forecasts and a wide variety of alternative, time-dependent models. (d) Short-term models used in operational forecasting should be consistent with the long-term forecasts used in PSHA. (e) Alert procedures should be standardized to facilitate decisions at different levels of government and among the public, based in part on objective analysis of costs and benefits. (f) In establishing alert procedures, consideration should also be made of the less tangible aspects of value-of-information, such as gains in psychological preparedness and resilience. Authoritative statements of increased risk, even when the absolute probability is low, can provide a psychological benefit to the public by filling information vacuums that can lead to informal predictions and misinformation.
Backcasting long-term climate data: evaluation of hypothesis
NASA Astrophysics Data System (ADS)
Saghafian, Bahram; Aghbalaghi, Sara Ghasemi; Nasseri, Mohsen
2018-05-01
More often than not, incomplete datasets or short-term records in vast regions impede reliable climate and water studies. Various methods, such as simple correlation with stations having long-term time series, are used to infill or extend the period of observation at stations with missing or short-term data. In the current paper, and for the first time, the hypothesis that the downscaling concept can be extended to backcast local observation records using large-scale atmospheric predictors is examined. Backcasting is coined here in contrast to forecasting/projection: the former reconstructs the past, while the latter projects into the future. To assess this hypothesis, daily and monthly statistical downscaling models were employed to reconstruct past precipitation data and lengthen the data period. The Urmia and Tabriz synoptic stations, located in northwestern Iran, constituted the two case study stations. The SDSM and data-mining downscaling model (DMDM) daily models, as well as the group method of data handling (GMDH) and model tree (Mp5) monthly downscaling models, were trained with National Centers for Environmental Prediction (NCEP) data. After training, reconstructed precipitation data of the past were validated against observed data. Then, the data were extended over the full 1948 to 2009 period, corresponding to the available NCEP data period. The results showed that DMDM was superior in generating monthly average precipitation compared with the SDSM, Mp5, and GMDH models, although none of the models could preserve the monthly variance. This overall confirms the practical value of the proposed approach for extending past historic data, particularly for long-term climatological and water budget studies.
A predictive framework for evaluating models of semantic organization in free recall
Morton, Neal W; Polyn, Sean M.
2016-01-01
Research in free recall has demonstrated that semantic associations reliably influence the organization of search through episodic memory. However, the specific structure of these associations and the mechanisms by which they influence memory search remain unclear. We introduce a likelihood-based model-comparison technique, which embeds a model of semantic structure within the context maintenance and retrieval (CMR) model of human memory search. Within this framework, model variants are evaluated in terms of their ability to predict the specific sequence in which items are recalled. We compare three models of semantic structure, latent semantic analysis (LSA), global vectors (GloVe), and word association spaces (WAS), and find that models using WAS have the greatest predictive power. Furthermore, we find evidence that semantic and temporal organization is driven by distinct item and context cues, rather than a single context cue. This finding provides an important constraint for theories of memory search. PMID:28331243
Kentel, Behzat B; King, Mark A; Mitchell, Sean R
2011-11-01
A torque-driven, subject-specific 3-D computer simulation model of the impact phase of one-handed tennis backhand strokes was evaluated by comparing performance and simulation results. Backhand strokes of an elite subject were recorded on an artificial tennis court. Over the 50-ms period after impact, good agreement was found with an overall RMS difference of 3.3° between matching simulation and performance in terms of joint and racket angles. Consistent with previous experimental research, the evaluation process showed that grip tightness and ball impact location are important factors that affect postimpact racket and arm kinematics. Associated with these factors, the model can be used for a better understanding of the eccentric contraction of the wrist extensors during one-handed backhand ground strokes, a hypothesized mechanism of tennis elbow.
NASA Astrophysics Data System (ADS)
Adams, Matthew P.; Collier, Catherine J.; Uthicke, Sven; Ow, Yan X.; Langlois, Lucas; O'Brien, Katherine R.
2017-01-01
When several models can describe a biological process, the equation that best fits the data is typically considered the best. However, models are most useful when they also possess biologically-meaningful parameters. In particular, model parameters should be stable, physically interpretable, and transferable to other contexts, e.g. for direct indication of system state, or usage in other model types. As an example of implementing these recommended requirements for model parameters, we evaluated twelve published empirical models for temperature-dependent tropical seagrass photosynthesis, based on two criteria: (1) goodness of fit, and (2) how easily biologically-meaningful parameters can be obtained. All models were formulated in terms of parameters characterising the thermal optimum (Topt) for maximum photosynthetic rate (Pmax). These parameters indicate the upper thermal limits of seagrass photosynthetic capacity, and hence can be used to assess the vulnerability of seagrass to temperature change. Our study exemplifies an approach to model selection which optimises the usefulness of empirical models for both modellers and ecologists alike.
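As a concrete example of a model "formulated in terms of parameters characterising the thermal optimum", the sketch below fits a simple parabolic P-T curve, one of the simpler candidate forms, directly parameterised by Pmax and Topt; the measurements are invented.

    import numpy as np
    from scipy.optimize import curve_fit

    def parabolic_pt(T, p_max, t_opt, width):
        """Peaked photosynthesis-temperature model parameterised directly
        in terms of P_max and T_opt (width sets the breadth of the curve)."""
        return p_max * (1.0 - ((T - t_opt) / width) ** 2)

    # hypothetical P-T measurements for a tropical seagrass
    T = np.array([15., 20., 25., 30., 33., 36., 40.])
    P = np.array([2.1, 4.0, 5.6, 6.3, 6.2, 5.1, 2.4])

    (p_max, t_opt, width), _ = curve_fit(parabolic_pt, T, P, p0=(6.0, 30.0, 15.0))
    print(f"P_max = {p_max:.2f}, T_opt = {t_opt:.1f} C")

Fitting directly in terms of Pmax and Topt, rather than deriving them afterwards from other coefficients, is what makes the parameters stable and physically interpretable in the sense the abstract recommends.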
Understanding and quantifying foliar temperature acclimation for Earth System Models
NASA Astrophysics Data System (ADS)
Smith, N. G.; Dukes, J.
2015-12-01
Photosynthesis and respiration on land are the two largest carbon fluxes between the atmosphere and Earth's surface. The parameterization of these processes represents a major uncertainty in the terrestrial component of the Earth System Models used to project future climate change. Research has shown that much of this uncertainty is due to the parameterization of the temperature responses of leaf photosynthesis and autotrophic respiration, which are typically based on short-term empirical responses. Here, we show that including longer-term responses to temperature, such as temperature acclimation, can help to reduce this uncertainty and improve model performance, leading to drastic changes in future land-atmosphere carbon feedbacks across multiple models. However, these acclimation formulations have many flaws, including an underrepresentation of many important global flora. In addition, these parameterizations were derived from multiple studies that employed differing methodologies. We therefore used a consistent methodology to quantify the short- and long-term temperature responses of maximum Rubisco carboxylation (Vcmax), the maximum rate of Ribulose-1,5-bisphosphate regeneration (Jmax), and dark respiration (Rd) in multiple species representing each of the plant functional types used in global-scale land surface models. Short-term temperature responses of each process were measured in individuals acclimated for 7 days at one of 5 temperatures (15-35°C). The comparison of short-term curves in plants acclimated to different temperatures was used to evaluate long-term responses. Our analyses indicated that the instantaneous response of each parameter was highly sensitive to the temperature at which the plants were acclimated. However, we found that this sensitivity was larger in species whose leaves typically experience a greater range of temperatures over the course of their lifespan. These data indicate that models using previous acclimation formulations are likely incorrectly simulating leaf carbon exchange responses to future warming. Therefore, our data, if used to parameterize large-scale models, are likely to provide an even greater improvement in model performance, resulting in more reliable projections of future carbon-climate feedbacks.
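One standard way to encode this kind of acclimation, stated here as an illustration rather than the authors' parameterization, is a peaked Arrhenius response whose entropy term shifts with growth temperature (coefficients in the style of Kattge & Knorr 2007):

    import numpy as np

    R = 8.314  # J mol-1 K-1

    def vcmax_peaked(T_leaf_C, T_growth_C, v25=50.0, Ha=71.5e3, Hd=200e3):
        """Peaked Arrhenius response of Vcmax, with the entropy term
        acclimating linearly to growth temperature."""
        dS = 668.39 - 1.07 * T_growth_C        # J mol-1 K-1, acclimating
        Tk, T25 = T_leaf_C + 273.15, 298.15
        arrh = np.exp(Ha * (Tk - T25) / (T25 * R * Tk))
        peak = ((1 + np.exp((T25 * dS - Hd) / (T25 * R)))
                / (1 + np.exp((Tk * dS - Hd) / (Tk * R))))
        return v25 * arrh * peak

    for tg in (15.0, 25.0, 35.0):              # growth temperatures
        T = np.linspace(10, 45, 351)
        t_opt = T[np.argmax(vcmax_peaked(T, tg))]
        print(f"growth T = {tg:.0f} C -> photosynthetic T_opt = {t_opt:.1f} C")

Warmer growth temperatures lower the entropy term and push the thermal optimum upward, which is the qualitative behaviour the acclimation data described above are meant to constrain.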
Dynamic Evaluation of Two Decades of CMAQ Simulations ...
This presentation focuses on the dynamic evaluation of the CMAQ model over the continental United States using multi-decadal simulations for the period from 1990 to 2010 to examine how well the changes in observed ozone air quality induced by variations in meteorology and/or emissions are simulated by the model. We applied spectral decomposition of the ozone time series using the KZ (Kolmogorov-Zurbenko) filter to assess the variations in the strengths of synoptic (weather-induced variations) and baseline (long-term variation) forcings embedded in the simulated and observed concentrations. The results reveal that CMAQ captured the year-to-year variability (more so in the later years than in the earlier years) and the synoptic forcing in accordance with what the observations show. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.
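The KZ filter at the heart of this decomposition is just an iterated moving average; a minimal sketch follows (window/iteration choices echo common usage in the ozone literature, and the series is synthetic).

    import numpy as np

    def kz_filter(x, window, iterations):
        """Kolmogorov-Zurbenko filter: 'iterations' passes of a centred
        moving average of odd length 'window'."""
        kernel = np.ones(window) / window
        for _ in range(iterations):
            x = np.convolve(x, kernel, mode="same")
        return x

    # daily max ozone stand-in: KZ(15,5) isolates the baseline (seasonal +
    # long-term); the residual of a short KZ gives the synoptic component
    rng = np.random.default_rng(6)
    days = np.arange(365 * 3)
    ozone = 45 + 15 * np.sin(2 * np.pi * days / 365) + rng.normal(0, 8, days.size)
    baseline = kz_filter(ozone, 15, 5)
    synoptic = kz_filter(ozone, 5, 3) - baseline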
Recovery as a model of care? Insights from an Australian case study.
Hungerford, Catherine
2014-03-01
The terms "model of health care," "service model." and "nursing model of practice" are often used interchangeably in practice, policy, and research, despite differences in definitions. This article considers these terms in the context of consumer-centred recovery and its implementation into a publicly-funded health service organization in Australia. Findings of a case study analysis are used to inform the discussion, which considers the diverse models of health care employed by health professionals; together with the implications for organizations worldwide that are responsible for operationalizing recovery approaches to health care. As part of the discussion, it is suggested that the advent of recovery-oriented services, rather than recovery models of health care, presents challenges for the evaluation of the outcomes of these services. At the same time, this situation provides opportunities for mental health nurses to lead the way, by developing rigorous models of practice that support consumers who have acute, chronic, or severe mental illness on their recovery journey; and generate positive, measureable outcomes.
Evaluation and prediction of long-term environmental effects of nonmetallic materials
NASA Technical Reports Server (NTRS)
Papazian, H.
1985-01-01
The properties of a number of nonmetallic materials were evaluated experimentally in simulated space environments in order to develop models for accelerated test methods useful for predicting such behavioral changes. Graphite-epoxy composites were exposed to thermal cycling. Adhesive foam tapes were subjected to a vacuum environment. Metal-matrix composites were tested for baseline data. Predictive modeling designed to include strength and aging effects on composites, polymeric films, and metals under such space conditions (including the atomic oxygen environment) is discussed. The Korel 8031-00 high strength adhesive foam tape was shown to be superior to the other two tested.
Cost-effectiveness of enhanced recovery in hip and knee replacement: a systematic review protocol.
Murphy, Jacqueline; Pritchard, Mark G; Cheng, Lok Yin; Janarthanan, Roshni; Leal, José
2018-03-14
Hip and knee replacement represents a significant burden to the UK healthcare system. 'Enhanced recovery' pathways have been introduced in the National Health Service (NHS) for patients undergoing hip and knee replacement, with the aim of improving outcomes and timely recovery after surgery. To support policymaking, there is a need to evaluate the cost-effectiveness of enhanced recovery pathways across jurisdictions. Our aim is to systematically summarise the published cost-effectiveness evidence on enhanced recovery in hip and knee replacement, both as a whole and for each of the various components of enhanced recovery pathways. A systematic review will be conducted using MEDLINE, EMBASE, Econlit and the National Health Service Economic Evaluations Database. Separate search strategies were developed for each database including terms relating to hip and knee replacement/arthroplasty, economic evaluations, decision modelling and quality of life measures.We will extract peer-reviewed studies published between 2000 and 2017 reporting economic evaluations of preoperative, perioperative or postoperative enhanced recovery interventions within hip or knee replacement. Economic evaluations alongside cohort studies or based on decision models will be included. Only studies with patients undergoing elective replacement surgery of the hip or knee will be included. Data will be extracted using a predefined pro forma following best practice guidelines for economic evaluation, decision modelling and model validation.Our primary outcome will be the cost-effectiveness of enhanced recovery (entire pathway and individual components) in terms of incremental cost per quality-adjusted life year. A narrative synthesis of all studies will be presented, focussing on cost-effectiveness results, study design, quality and validation status. This systematic review is exempted from ethics approval because the work is carried out on published documents. The results of the review will be disseminated in a peer-reviewed academic journal and at conferences. CRD42017059473. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Moraghebi, Roksana; Kirkeby, Agnete; Chaves, Patricia; Rönn, Roger E; Sitnicka, Ewa; Parmar, Malin; Larsson, Marcus; Herbst, Andreas; Woods, Niels-Bjarne
2017-08-25
Mesenchymal stromal cells (MSCs) are currently being evaluated in numerous pre-clinical and clinical cell-based therapy studies. Furthermore, there is an increasing interest in exploring alternative uses of these cells in disease modelling, pharmaceutical screening, and regenerative medicine by applying reprogramming technologies. However, the limited availability of MSCs from various sources restricts their use. Term amniotic fluid has been proposed as an alternative source of MSCs. Previously, only low volumes of term fluid and its cellular constituents have been collected, and current knowledge of the MSCs derived from this fluid is limited. In this study, we collected amniotic fluid at term using a novel collection system and evaluated amniotic fluid MSC content and their characteristics, including their feasibility to undergo cellular reprogramming. Amniotic fluid was collected at term caesarean section deliveries using a closed catheter-based system. Following fluid processing, amniotic fluid was assessed for cellularity, MSC frequency, in-vitro proliferation, surface phenotype, differentiation, and gene expression characteristics. Cells were also reprogrammed to the pluripotent stem cell state and differentiated towards neural and haematopoietic lineages. The average volume of term amniotic fluid collected was approximately 0.4 litres per donor, containing an average of 7 million viable mononuclear cells per litre, and a CFU-F content of 15 per 100,000 MNCs. Expanded CFU-F cultures showed similar surface phenotype, differentiation potential, and gene expression characteristics to MSCs isolated from traditional sources, and showed extensive expansion potential and rapid doubling times. Given the high proliferation rates of these neonatal source cells, we assessed them in a reprogramming application, where the derived induced pluripotent stem cells showed multigerm layer lineage differentiation potential. The potentially large donor base from caesarean section deliveries, the high yield of term amniotic fluid MSCs obtainable, the properties of the MSCs identified, and the suitability of the cells to be reprogrammed into the pluripotent state demonstrated these cells to be a promising and plentiful resource for further evaluation in bio-banking, cell therapy, disease modelling, and regenerative medicine applications.
Automatic anatomy recognition via multiobject oriented active shape models.
Chen, Xinjian; Udupa, Jayaram K; Alavi, Abass; Torigian, Drew A
2010-12-01
This paper studies the feasibility of developing an automatic anatomy recognition (AAR) system in clinical radiology and demonstrates its operation on clinical 2D images. The anatomy recognition method described here consists of two main components: (a) a multiobject generalization of the oriented active shape model (OASM) and (b) object recognition strategies. The OASM algorithm is generalized to multiple objects by including a model for each object and assigning a cost structure specific to each object in the spirit of live wire. The delineation of multiobject boundaries is done in MOASM via a three-level dynamic programming algorithm, wherein the first level operates at the pixel level to find optimal oriented boundary segments between successive landmarks, the second at the landmark level to find optimal landmark locations, and the third at the object level to find the optimal arrangement of object boundaries over all objects. The object recognition strategy attempts to find the pose vector (consisting of translation, rotation, and scale components) for the multiobject model that yields the smallest total boundary cost over all objects. The delineation and recognition accuracies were evaluated separately utilizing routine clinical chest CT, abdominal CT, and foot MRI data sets. The delineation accuracy was evaluated in terms of true and false positive volume fractions (TPVF and FPVF). The recognition accuracy was assessed (1) in terms of the size of the space of pose vectors for the model assembly that yielded high delineation accuracy, (2) as a function of the number, distribution, and size of objects in the model, (3) in terms of the interdependence between delineation and recognition, and (4) in terms of the closeness of the optimum recognition result to the global optimum. When multiple objects are included in the model, the delineation accuracy in terms of TPVF can be improved to 97%-98% with a low FPVF of 0.1%-0.2%. Typically, a recognition accuracy of ≥90% yielded a TPVF ≥95% and an FPVF ≤0.5%. Over the three data sets and over all tested objects, in 97% of the cases the optimal solutions found by the proposed method constituted the true global optimum. The experimental results showed the feasibility and efficacy of the proposed automatic anatomy recognition system. Increasing the number of objects in the model can significantly improve both recognition and delineation accuracy. A more spread-out arrangement of objects in the model can lead to improved recognition and delineation accuracy. Including larger objects in the model also improved recognition and delineation. The proposed method almost always finds globally optimum solutions.
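The recognition step is, at its core, a search over the pose space for the model assembly that minimises total boundary cost. The following Python sketch illustrates that idea only; the toy quadratic cost, the grid ranges, and the "true" pose are illustrative stand-ins, not the paper's OASM cost function or search strategy.

```python
import itertools
import numpy as np

# Toy stand-in for the OASM boundary cost: in the real system this would be
# the summed oriented-boundary-segment cost over all objects (hypothetical).
TRUE_POSE = np.array([4.0, -6.0, 0.05, 1.02])  # tx, ty, rotation (rad), scale

def total_boundary_cost(pose):
    pose = np.asarray(pose, dtype=float)
    return float(np.sum((pose - TRUE_POSE) ** 2))

def recognize_pose():
    # Exhaustive search over a coarse grid of pose parameters:
    # translation (tx, ty), rotation angle, and isotropic scale.
    translations = np.arange(-20, 21, 2)            # pixels
    rotations = np.deg2rad(np.arange(-10, 11, 2))   # radians
    scales = np.linspace(0.9, 1.1, 11)
    best_pose, best_cost = None, np.inf
    for tx, ty, rot, s in itertools.product(translations, translations,
                                            rotations, scales):
        c = total_boundary_cost((tx, ty, rot, s))
        if c < best_cost:
            best_pose, best_cost = (tx, ty, rot, s), c
    return best_pose, best_cost

print(recognize_pose())  # pose closest to TRUE_POSE on the search grid
```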
NASA Astrophysics Data System (ADS)
Verfaillie, Deborah; Déqué, Michel; Morin, Samuel; Lafaysse, Matthieu
2017-11-01
We introduce the method ADAMONT v1.0 to adjust and disaggregate daily climate projections from a regional climate model (RCM) using an observational dataset at hourly time resolution. The method uses a refined quantile mapping approach for statistical adjustment and an analogue-based method for sub-daily disaggregation. It ultimately produces adjusted hourly time series of temperature, precipitation, wind speed, humidity, and short- and longwave radiation, which can in turn be used to force any energy balance land surface model. While the method is generic and can be employed for any appropriate observation time series, here we focus on its description and evaluation in the French mountainous regions. The observational dataset used here is the SAFRAN meteorological reanalysis, which covers the entire French Alps, split into 23 massifs within which meteorological conditions are provided for several 300 m elevation bands. In order to evaluate the skill of the method itself, it is applied to the ALADIN-Climate v5 RCM using the ERA-Interim reanalysis as boundary conditions, for the time period from 1980 to 2010. Results of the ADAMONT method are compared to the SAFRAN reanalysis itself. Various evaluation criteria are used for temperature and precipitation but also for snow depth, which is computed by the SURFEX/ISBA-Crocus model using meteorological driving data from either the adjusted RCM data or the SAFRAN reanalysis itself. The evaluation addresses in particular the time transferability of the method (using various learning/application time periods), the impact of the RCM grid-point selection procedure for each massif/altitude-band configuration, and the intervariable consistency of the adjusted meteorological data generated by the method. Results show that the performance of the method is satisfactory, with evaluation metrics similar to or even better than those of alternative methods. However, results for air temperature are generally better than for precipitation. Results in terms of snow depth are satisfactory, which can be viewed as indicating a reasonably good intervariable consistency of the meteorological data produced by the method. In terms of temporal transferability (evaluated over time periods of 15 years only), results depend on the learning period. In terms of grid-point selection, the use of a complex technique that takes into account altitudinal as well as horizontal proximity to SAFRAN massif-centre/altitude couples generally degrades evaluation metrics at high altitudes compared to a simpler selection based on horizontal distance alone.
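For readers unfamiliar with the adjustment step, empirical quantile mapping can be sketched in a few lines of Python. This is a generic illustration only: ADAMONT's refined variant (stratification by season and weather regime, plus analogue-based hourly disaggregation) is not reproduced here, and the gamma-distributed data are synthetic.

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_new):
    """Empirical quantile mapping: transform model values so that their
    distribution over the learning period matches the observations."""
    q = np.linspace(0.01, 0.99, 99)
    mq = np.quantile(model_hist, q)   # model quantiles (learning period)
    oq = np.quantile(obs_hist, q)     # observed quantiles (learning period)
    # Map each new model value through the model -> obs quantile relation.
    return np.interp(model_new, mq, oq)

rng = np.random.default_rng(0)
obs = rng.gamma(2.0, 2.0, 5000)   # e.g. daily precipitation (mm), synthetic
mod = rng.gamma(2.0, 3.0, 5000)   # biased model climate, synthetic
adjusted = quantile_map(mod, obs, mod)
print(round(obs.mean(), 2), round(mod.mean(), 2), round(adjusted.mean(), 2))
```

After mapping, the adjusted series reproduces the observed distribution over the learning period while preserving the model's day-to-day ranking.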
Sugimoto, Masanori; Toda, Yoshihisa; Hori, Miyuki; Mitani, Akiko; Ichihara, Takahiro; Sekine, Shingo; Kaku, Shinsuke; Otsuka, Noboru; Matsumoto, Hideo
2016-06-01
Preclinical Research. The aim of this study was to evaluate the efficacy of multiple applications of S(+)-flurbiprofen plaster (SFPP), a novel nonsteroidal anti-inflammatory drug (NSAID) patch, for the alleviation of inflammatory pain and edema in a rat adjuvant-induced arthritis (AIA) model, as compared to other NSAID patches. The AIA model was induced by the injection of Mycobacterium butyricum, and rats were treated with a patch (1.0 cm × 0.88 cm) containing each NSAID (SFP, ketoprofen, loxoprofen, diclofenac, felbinac, flurbiprofen, or indomethacin) applied to the paw for 6 h per day for 5 days. The pain threshold was evaluated using a flexion test of the ankle joint, and the inflamed paw edema was evaluated using a plethysmometer. Cyclooxygenase (COX)-1 and COX-2 inhibition was evaluated using human recombinant proteins. Multiple applications of SFPP exerted a significant analgesic effect from the first day of application as compared to the other NSAID patches. In terms of paw edema, SFPP decreased edema from the second day after application. Multiple applications of SFPP were thus superior to those of the other NSAID patches in terms of analgesic effect. These results suggest that SFPP may be a beneficial patch for providing analgesic and anti-inflammatory effects clinically. Drug Dev Res 77:206-211, 2016. © 2016 The Authors. Drug Development Research published by Wiley Periodicals, Inc.
Application of long-term simulation programs for analysis of system islanding
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sancha, J.L.; Llorens, M.L.; Moreno, J.M.
1997-02-01
This paper describes the main results and conclusions from the application of two different long-term stability programs to the analysis of a system islanding scenario for a study case developed by Red Electrica de Espana (REE), based on the Spanish system. The two main goals were to evaluate the performance of both programs and the influence of some important control and protection elements (tie-line loss-of-synchronism relays, underfrequency load-shedding, load-frequency control, and power plant dynamics). Conclusions about modeling and computational requirements for system islanding (frequency) scenarios and the use of long-term stability programs are presented.
Creating and indexing teaching files from free-text patient reports.
Johnson, D. B.; Chu, W. W.; Dionisio, J. D.; Taira, R. K.; Kangarloo, H.
1999-01-01
Teaching files based on real patient data can enhance the education of students, staff, and other colleagues. Although information retrieval systems can index free-text documents using keywords, these systems do not work well where content-bearing terms (e.g., anatomy descriptions) frequently appear. This paper describes a system that uses multi-word indexing terms to provide access to free-text patient reports. The utilization of multi-word indexing allows better modeling of the content of medical reports, thus improving retrieval performance. The method used to select indexing terms, as well as an early evaluation of retrieval performance, is discussed. PMID:10566473
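A toy sketch of the multi-word indexing idea: extract adjacent-word n-grams free of stopwords from a report and count them as candidate index terms. The stopword list, tokenisation, and selection rule here are illustrative assumptions, not the system's actual method.

```python
import re
from collections import Counter

# Minimal illustrative stopword list (an assumption, not the system's list).
STOPWORDS = {"the", "of", "a", "an", "is", "in", "with", "and", "no", "or"}

def multiword_terms(report, n=2):
    """Return candidate multi-word index terms: adjacent-word n-grams
    containing no stopword, with their frequencies."""
    tokens = re.findall(r"[a-z]+", report.lower())
    grams = zip(*(tokens[i:] for i in range(n)))  # sliding n-gram windows
    return Counter(" ".join(g) for g in grams
                   if not any(w in STOPWORDS for w in g))

report = ("Mild opacity in the right lower lobe. "
          "No pleural effusion. Right lower lobe opacity is unchanged.")
print(multiword_terms(report).most_common(3))
# e.g. [('right lower', 2), ('lower lobe', 2), ('mild opacity', 1)]
```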
NASA Astrophysics Data System (ADS)
Steeneveld, G. J.; Tolk, L. F.; Moene, A. F.; Hartogensis, O. K.; Peters, W.; Holtslag, A. A. M.
2011-12-01
The Weather Research and Forecasting Model (WRF) and the Regional Atmospheric Mesoscale Model System (RAMS) are frequently used for (regional) weather, climate, and air quality studies. This paper covers an evaluation of these models for a windy and a calm episode against Cabauw tower observations (Netherlands), with a special focus on the representation of the physical processes in the atmospheric boundary layer (ABL). In addition, area-averaged sensible heat flux observations by scintillometry are utilized, which enables evaluation of grid-scale model fluxes and flux observations at the same horizontal scale. Novel ABL height observations by ceilometry, and observations of the near-surface longwave radiation divergence, are also utilized. It appears that WRF in its basic set-up shows satisfactory model results for nearly all atmospheric near-surface variables compared to the field observations, while RAMS required refinement of its ABL scheme. An important inconsistency was found regarding the ABL daytime heat budget: both model versions are only able to correctly forecast the ABL thermodynamic structure when the modeled surface sensible heat flux is much larger than both the eddy-covariance and the scintillometer observations indicate. In order to clarify this discrepancy, model results for each term of the heat budget equation are evaluated against field observations. Sensitivity studies and evaluation of radiative tendencies and entrainment reveal that possible errors in these variables cannot explain the overestimation of the sensible heat flux within the current model infrastructure.
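For reference, the ABL heat budget evaluated term by term above can be written in a commonly used horizontally averaged form (an assumption about notation, not a quotation from the paper):

```latex
\frac{\partial \overline{\theta}}{\partial t} =
  \underbrace{-\,\overline{u}\,\frac{\partial \overline{\theta}}{\partial x}
              -\,\overline{v}\,\frac{\partial \overline{\theta}}{\partial y}}_{\text{advection}}
  \;\underbrace{-\,\frac{\partial \overline{w'\theta'}}{\partial z}}_{\text{turbulent flux divergence}}
  \;\underbrace{-\,\frac{1}{\rho c_p}\,\frac{\partial Q^{*}}{\partial z}}_{\text{radiative flux divergence}}
```

Here θ is potential temperature, w'θ' the turbulent heat flux (whose surface value is the sensible heat flux at issue), and Q* the net radiative flux; closing this budget consistently is what forces the models to overestimate the surface flux.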
Sustainability and durability analysis of reinforced concrete structures
NASA Astrophysics Data System (ADS)
Horáková, A.; Broukalová, I.; Kohoutková, A.; Vašková, J.
2017-09-01
The article describes an assessment of reinforced concrete structures in terms of durability and sustainable development. It briefly summarises findings from the literature on methods for evaluating environmental impacts, on corrosive influences acting on reinforced concrete structures, on factors influencing the durability of these structures, and on mathematical models describing corrosion effects. Variant designs of a reinforced concrete structure were produced and assessed in terms of durability and sustainability. The analysed structure was the concrete ceiling structure of a multi-storey car park. The variants differed in the strength class of the concrete and the thickness of the slab. It was found that, in terms of durability and sustainable development, it is significantly preferable to use a higher strength class of concrete. Durability results for the concrete structures differ significantly between mathematical models of corrosive influences.
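As an illustration of the kind of mathematical corrosion model referred to above (the article's specific models are not reproduced here), carbonation depth is often described by a square-root-of-time law, and chloride ingress by the error-function solution of Fick's second law:

```latex
x_c(t) = K\,\sqrt{t}, \qquad
C(x,t) = C_s\left[1 - \operatorname{erf}\!\left(\frac{x}{2\sqrt{D_c\,t}}\right)\right]
```

Here x_c is the carbonation depth, K a coefficient depending on concrete quality and exposure, C_s the surface chloride content, and D_c the chloride diffusion coefficient; the service-life criterion is typically the time at which x_c, or the critical chloride content, reaches the reinforcement cover. Both K and D_c fall sharply with higher concrete strength class, which is consistent with the article's conclusion.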
Male sexual strategies modify ratings of female models with specific waist-to-hip ratios.
Brase, Gary L; Walker, Gary
2004-06-01
Female waist-to-hip ratio (WHR) has generally been an important predictor of ratings of physical attractiveness and related characteristics. Individual differences in ratings do exist, however, and may be related to differences in the reproductive tactics of the male raters, such as pursuit of short-term or long-term relationships and adjustments based on perceptions of one's own quality as a mate. Forty males, categorized according to sociosexual orientation and physical qualities (WHR, body mass index, and self-rated desirability), rated female models on both attractiveness and the likelihood that they would approach them. Sociosexually restricted males were less likely to approach the females rated as most attractive (WHR of 0.68-0.72), as compared with unrestricted males. Males with lower scores in terms of physical qualities gave more favorable evaluations of female models with lower WHRs. The results indicate that attractiveness and willingness to approach are overlapping but distinguishable constructs, both of which are influenced by variations in characteristics of the raters.
Race, Amos; Miller, Mark A; Mann, Kenneth A
2008-10-20
Pre-clinical screening of cemented implant systems could be improved by modeling the longer-term response of the implant/cement/bone construct to cyclic loading. We formulated a bone cement with degraded fatigue fracture properties (Sub-cement) such that long-term fatigue could be simulated in short-term cadaver tests. Sub-cement was made by adding a chain-transfer agent to standard polymethylmethacrylate (PMMA) cement. This reduced the molecular weight of the inter-bead matrix without changing the reaction rate or handling characteristics. Static mechanical properties were approximately equivalent to those of normal cement. Over a physiologically reasonable range of stress-intensity factor, fatigue crack propagation rates for Sub-cement were higher by a factor of 25 ± 19. When tested in a simplified 2½-D physical model of a stem-cement-bone system, crack growth from the stem was accelerated by a factor of 100. Sub-cement accelerated both crack initiation and growth rate. Sub-cement is now being evaluated in full stem/cement/femur models.
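The reported rate increase can be read through the standard Paris law for fatigue crack growth (an interpretive aid; the paper's exact fitting form is not quoted):

```latex
\frac{da}{dN} = C\,(\Delta K)^{m}
```

where da/dN is the crack extension per load cycle, ΔK the stress-intensity factor range, and C and m material constants. On this reading, the chain-transfer agent effectively raises C by roughly a factor of 25 over the tested ΔK range, compressing decades of in-vivo cyclic loading into a short-term cadaver test.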
NASA Astrophysics Data System (ADS)
Brook, Anna; Wittenberg, Lea
2015-04-01
Long-term environmental monitoring aims to identify physical and biological changes and processes taking place in an ecosystem. This basic activity of landscape monitoring is an essential part of systematic long-term surveillance, aiming to evaluate, assess, and predict spatial changes and processes. It thereby provides context for a wide range of studies and research frameworks at regional to global scales. Establishing spatio-temporal trends and changes at various scales, whether pronounced or less certain, requires consistent baseline data over time. One particular case of landscape monitoring concerns soil formation and pedological processes. It is well established that changes in soil affect the functioning of the environment, so monitoring such changes has become important and now draws considerable resources in areas such as environmental management, sustainability services, and environmental protection. Monitoring change is thus a basis for sustainable development. The hydrological response of bare soils and watersheds in semiarid regions to intense rainfall events is known to be complex owing to multiple physical and structural impacts and feedbacks. As a result, comprehensive evaluation of mathematical models, including detailed consideration of uncertainties in the modeling of hydrological and environmental systems, is of increasing importance. The presented method combines remote sensing data with hydrological and climate data and applies the Monte Carlo Analysis Toolbox (MCAT) to a semiarid region. The complexity of practical models representing spatial systems requires an extensive understanding of the spatial phenomena while providing a realistic balance between sensitivity and the corresponding uncertainty levels. A large number of dedicated mathematical models are now applied to assess environmental and hydrological processes. Among the most promising is the MCAT, a MATLAB library of visual and numerical analysis tools for the evaluation of hydrological and environmental models. The model applied in this paper presents an innovative infrastructural system for predicting soil stability and erosion impacts. This integrated model is applicable to mixed areas with spatially varying soil properties, landscape, and land-cover characteristics. Data from a semiarid site in southern Israel were used to evaluate the model and analyze fundamental erosion mechanisms. The findings estimate the sensitivity of the suggested model to the physical parameters and encourage the use of hyperspectral remote sensing imagery (HSI). The proposed model is integrated according to the following stages: 1. Soil texture, aggregation, and soil moisture are estimated via airborne HSI data, including soil surface clay and calcium carbonate; 2. The mechanical stability of the soil is assessed via pedo-transfer functions relating load-dependent changes in soil physical properties to pre-compression stress (a set of equations for shear-strength parameters that takes into account soil texture, aggregation, soil moisture, and ecological soil variables); 3. The precipitation-related runoff model program (RMP) satisfactorily reproduces the observed seasonal mean and variation of surface runoff for the current climate simulation; 4. The Monte Carlo Analysis Toolbox (MCAT), a library of visual and numerical analysis tools for the evaluation of hydrological and environmental models, is used to integrate all of these approaches into an applicable model. The presented model overcomes the limitations of existing modeling methods by integrating physical data produced via HSI while remaining generic in terms of independence from specific locations and times.
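MCAT itself is a MATLAB library, and its actual API is not reproduced here. The Python sketch below only illustrates the generic Monte Carlo workflow such a toolbox supports: sample the parameter space, score each run against observations, and inspect the "behavioural" subset (GLUE-style) to judge parameter sensitivity and identifiability. The toy runoff model, priors, and thresholds are assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

def runoff_model(rain, k, c):
    """Toy event-runoff model (hypothetical stand-in for the RMP):
    runoff = max(c * rain - k, 0)."""
    return np.clip(c * rain - k, 0.0, None)

# Synthetic "observations" from a known parameter set plus noise.
rain = rng.gamma(2.0, 5.0, 200)
obs = runoff_model(rain, k=3.0, c=0.6) + rng.normal(0.0, 0.5, rain.size)

# Monte Carlo sampling of the parameter space (core of an MCAT-style analysis).
n = 10_000
k_s = rng.uniform(0.0, 10.0, n)
c_s = rng.uniform(0.0, 1.0, n)
rmse = np.array([np.sqrt(np.mean((runoff_model(rain, k, c) - obs) ** 2))
                 for k, c in zip(k_s, c_s)])

# GLUE-style split: keep the best 10% of samples as "behavioural" and
# compare their marginal parameter distributions with the priors.
behavioural = rmse < np.quantile(rmse, 0.10)
for name, s in (("k", k_s), ("c", c_s)):
    print(name, round(s[behavioural].mean(), 2), "+/-",
          round(s[behavioural].std(), 2))
```

A tightly clustered behavioural distribution indicates an identifiable, sensitive parameter; a flat one indicates that the data do not constrain it, which is exactly the kind of diagnostic the text calls for.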
Reliability of Modern Scores to Predict Long-Term Mortality After Isolated Aortic Valve Operations.
Barili, Fabio; Pacini, Davide; D'Ovidio, Mariangela; Ventura, Martina; Alamanni, Francesco; Di Bartolomeo, Roberto; Grossi, Claudio; Davoli, Marina; Fusco, Danilo; Perucci, Carlo; Parolari, Alessandro
2016-02-01
Contemporary scores for estimating perioperative death have been proposed to also predict long-term death. The aim of this study was to evaluate the performance of the updated European System for Cardiac Operative Risk Evaluation (EuroSCORE) II, The Society of Thoracic Surgeons (STS) Predicted Risk of Mortality score, and the Age, Creatinine, Left Ventricular Ejection Fraction (ACEF) score for predicting long-term mortality in a contemporary cohort of isolated aortic valve replacement (AVR). We also sought to develop, for each score, a simple algorithm based on predicted perioperative risk to predict long-term survival. Complete data on 1,444 patients who underwent isolated AVR over a 7-year period were retrieved from three prospective institutional databases and linked with the Italian Tax Register Information System. Data were evaluated with performance analyses and time-to-event semiparametric regression. Survival was 83.0% ± 1.1% at 5 years and 67.8% ± 1.9% at 8 years. The discrimination and calibration of all three scores worsened for prediction of death at 1 year and 5 years. Nonetheless, a significant relationship was found between long-term survival and quartiles of the scores (p < 0.0001). The perioperative risk estimated by each model was used to develop an algorithm to predict long-term death. The hazard ratios for death were 1.10 (95% confidence interval [CI], 1.07 to 1.12) for EuroSCORE II, 1.34 (95% CI, 1.28 to 1.40) for the STS score, and 1.08 (95% CI, 1.06 to 1.10) for the ACEF score. The predicted risk generated by the EuroSCORE II, STS, and ACEF scores cannot be considered a direct estimate of the long-term risk of death. Nonetheless, the three scores can be used to derive an estimate of the long-term risk of death in patients who undergo isolated AVR with the use of a simple algorithm. Copyright © 2016 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.
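The "simple algorithm" is not spelled out in the abstract, but under proportional hazards a perioperative score x can be extrapolated to long-term survival via S(t | x) = S0(t)^(HR^x). The sketch below is an interpretive illustration only: the unit of x (assumed here to be points of predicted perioperative risk), the use of the cohort's 5-year survival as the baseline curve, and the functional form are all assumptions, not the paper's published algorithm.

```python
def long_term_survival(baseline_survival, hr_per_point, predicted_risk):
    """Proportional-hazards extrapolation: S(t | x) = S0(t) ** (HR ** x).
    HR is the hazard ratio per point of predicted perioperative risk;
    under the Cox model exp(beta * x) = (e**beta)**x = HR**x."""
    return baseline_survival ** (hr_per_point ** predicted_risk)

# Illustrative: 5-year baseline survival of 0.83 (cohort average from the
# abstract, used here only as a placeholder S0) and the reported
# EuroSCORE II hazard ratio of 1.10 per point.
for risk in (1.0, 3.0, 5.0):
    print(risk, round(long_term_survival(0.83, 1.10, risk), 3))
```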
Strategies of Intervention with Public Offenders.
ERIC Educational Resources Information Center
Chaneles, Sol, Ed.
1981-01-01
Reviews intervention strategies with public offenders, including learning therapy, education, group assertive training, and the use of volunteers. The 10 articles deal with inmates' rights in terms of health care and psychotherapy, evaluation of social programs, and a psychodrama program description/model. (JAC)
Thiel, Rainer; Viceconti, Marco; Stroetmann, Karl
2011-01-01
Biocomputational modelling, as developed by the European Virtual Physiological Human (VPH) Initiative, is the area of ICT most likely to revolutionise the practice of medicine in the longer term. Using the example of osteoporosis management, a socio-economic assessment framework is presented that captures how the transformation of clinical guidelines through VPH models can be evaluated. Applied to the Osteoporotic Virtual Physiological Human Project, the resulting benefit-cost analysis delivers promising results, both methodologically and substantively.
Strategic planning: a biomedical communications model.
Barrett, J E
1991-01-01
This article describes a biomedical communications approach to strategic planning. The model produces a short-term plan that allows a department to gain a competitive advantage, react to technological change, and make timely decisions on new courses of action. The model calls for self-study, involving staff in brainstorming sessions where options are identified and ideas are prioritized into possible strategies for success. The article recommends that an evaluation and monitoring schedule be implemented after decisions have been made.
Modeling Composite Assessment Data Using Item Response Theory
Ueckert, Sebastian
2018-01-01
Composite assessments aim to combine different aspects of a disease in a single score and are utilized in a variety of therapeutic areas. The data arising from these evaluations are inherently discrete with distinct statistical properties. This tutorial presents the framework of the item response theory (IRT) for the analysis of this data type in a pharmacometric context. The article considers both conceptual (terms and assumptions) and practical questions (modeling software, data requirements, and model building). PMID:29493119
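As a concrete instance of the IRT building block such a tutorial describes, the two-parameter logistic (2PL) model gives the probability of endorsing an item as a function of a latent trait. A minimal sketch with illustrative parameter values (the tutorial's own examples and software are not reproduced here):

```python
import numpy as np

def p_correct(theta, a, b):
    """2PL IRT model: probability that a subject with latent trait theta
    endorses an item with discrimination a and difficulty b."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def log_likelihood(theta, responses, a, b):
    """Log-likelihood of one subject's binary item responses given theta."""
    p = p_correct(theta, a, b)
    return float(np.sum(responses * np.log(p) + (1 - responses) * np.log1p(-p)))

a = np.array([1.2, 0.8, 2.0])   # item discriminations (illustrative)
b = np.array([-0.5, 0.0, 1.0])  # item difficulties (illustrative)
resp = np.array([1, 1, 0])      # one subject's observed item responses

# Grid-based maximum-likelihood estimate of the latent trait.
thetas = np.linspace(-3, 3, 121)
ll = [log_likelihood(t, resp, a, b) for t in thetas]
print("ML estimate of theta:", round(thetas[int(np.argmax(ll))], 2))
```

In a pharmacometric setting the same likelihood is embedded in a population model, with theta driven by disease progression and drug effect.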
Fabian Uzoh; William W. Oliver
2006-01-01
A height increment model is developed and evaluated for individual trees of ponderosa pine throughout the species' range in the western United States. The data set used in this study came from long-term permanent research plots in even-aged, pure stands, both planted and of natural origin. The database consists of six levels-of-growing-stock studies supplemented by initial...
Science and Technology Investment Strategy for Squadron Level Training
1993-05-01
be derived from empirically sound and theory-based instructional models. Comment: The automation of instructional design could favorably impact the...require a significant amount of time to develop and where the underlying theory and/or applications hardware and software is in flux. Long-term efforts...training or training courses. It does not refer to the initial evaluation of individuals entering Upgrade Training (UGT). It does refer to the evaluation of
ERIC Educational Resources Information Center
Wang, Yan; Rodríguez de Gil, Patricia; Chen, Yi-Hsin; Kromrey, Jeffrey D.; Kim, Eun Sook; Pham, Thanh; Nguyen, Diep; Romano, Jeanine L.
2017-01-01
Various tests to check the homogeneity of variance assumption have been proposed in the literature, yet there is no consensus as to their robustness when the assumption of normality does not hold. This simulation study evaluated the performance of 14 tests for the homogeneity of variance assumption in one-way ANOVA models in terms of Type I error…
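Two of the best-known tests in this family are easy to demonstrate: the classical Levene test (mean-centred) and its Brown-Forsythe variant (median-centred), the latter typically more robust to non-normality in terms of Type I error control. The snippet uses scipy.stats.levene; whether these specific tests were among the 14 evaluated is not stated in the abstract.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Three groups with equal variances but skewed (non-normal) distributions:
# exactly the situation in which these tests' robustness is at issue.
groups = [rng.exponential(2.0, 30) for _ in range(3)]

w_mean = stats.levene(*groups, center='mean')      # classical Levene
w_median = stats.levene(*groups, center='median')  # Brown-Forsythe variant
print("Levene (mean-centred):   p =", round(w_mean.pvalue, 3))
print("Brown-Forsythe (median): p =", round(w_median.pvalue, 3))
```

Repeating this over many simulated data sets and counting rejections at alpha = 0.05 gives the empirical Type I error rate the study reports on.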
Samuel A. Cushman; Kevin S. McKelvey
2006-01-01
The primary weakness in our current ability to evaluate future landscapes in terms of wildlife lies in the lack of quantitative models linking wildlife to forest stand conditions, including fuels treatments. This project focuses on 1) developing statistical wildlife habitat relationship (WHR) models utilizing Forest Inventory and Analysis (FIA) and National Vegetation...
ERIC Educational Resources Information Center
Ozdemir, Oguzhan; Erdemci, Husamettin
2017-01-01
The term mobile portfolio refers to creating, evaluating, and sharing portfolios in mobile environments. Many of the obstacles to portfolio usage are removed by mobile portfolios. The aim of this research is to determine the effect of a mobile-portfolio-supported mastery learning model on students' success and…
NASA Astrophysics Data System (ADS)
Pemberton, Per; Löptien, Ulrike; Hordoir, Robinson; Höglund, Anders; Schimanke, Semjon; Axell, Lars; Haapala, Jari
2017-08-01
The Baltic Sea is a seasonally ice-covered marginal sea in northern Europe with intense wintertime ship traffic and a sensitive ecosystem. Understanding and modeling the evolution of the sea-ice pack is important for climate effect studies and forecasting purposes. Here we present and evaluate the sea-ice component of a new NEMO-LIM3.6-based ocean-sea-ice setup for the North Sea and Baltic Sea region (NEMO-Nordic). The setup includes a new depth-based fast-ice parametrization for the Baltic Sea. The evaluation focuses on long-term statistics from a 45-year-long hindcast, although short-term daily performance is also briefly evaluated. We show that NEMO-Nordic is well suited for simulating the mean sea-ice extent, concentration, and thickness as compared to the best available observational data set. The variability of the annual maximum Baltic Sea ice extent is well in line with the observations, but the 1961-2006 trend is underestimated. Capturing the correct ice thickness distribution is more challenging. Based on the simulated ice thickness distribution, we estimate the undeformed and deformed ice thickness and concentration in the Baltic Sea, which compare reasonably well with observations.
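A depth-based fast-ice parametrization can be caricatured as a mask that immobilises ice in shallow cells. The sketch below is schematic only; the thresholds and the exact criterion are illustrative assumptions, not NEMO-Nordic's implementation.

```python
import numpy as np

def fastice_mask(depth, ice_thick, crit_depth=10.0, crit_thick=0.1):
    """Schematic depth-based fast-ice criterion: grid cells shallower than
    crit_depth (m) carrying at least crit_thick (m) of ice are flagged as
    landfast. Threshold values are illustrative assumptions."""
    return (depth < crit_depth) & (ice_thick > crit_thick)

depth = np.array([[5.0, 8.0, 30.0],
                  [12.0, 4.0, 50.0]])   # bathymetry (m)
thick = np.array([[0.3, 0.05, 0.4],
                  [0.2, 0.5, 0.0]])     # ice thickness (m)
u_ice = np.full_like(depth, 0.1)        # some drift field (m/s)
u_ice[fastice_mask(depth, thick)] = 0.0 # hold fast-ice cells motionless
print(u_ice)
```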