Sample records for model evaluation process

  1. Study on process evaluation model of students' learning in practical course

    NASA Astrophysics Data System (ADS)

    Huang, Jie; Liang, Pei; Shen, Wei-min; Ye, Youxiang

    2017-08-01

    In practical course teaching based on the project object method, traditional evaluation methods such as class attendance, assignments, and exams fail to give undergraduate students incentives to learn innovatively and autonomously. In this paper, elements such as creative innovation, teamwork, and documentation and reporting were incorporated into the process evaluation method, and a process evaluation model was set up. Educational practice shows that this model makes process evaluation of students' learning more comprehensive, accurate, and fair.

  2. The Context, Process, and Outcome Evaluation Model for Organisational Health Interventions

    PubMed Central

    Fridrich, Annemarie; Jenny, Gregor J.; Bauer, Georg F.

    2015-01-01

    To facilitate evaluation of complex, organisational health interventions (OHIs), this paper aims at developing a context, process, and outcome (CPO) evaluation model. It builds on previous model developments in the field and advances them by clearly defining and relating generic evaluation categories for OHIs. Context is defined as the underlying frame that influences and is influenced by an OHI. It is further differentiated into the omnibus and discrete contexts. Process is differentiated into the implementation process, as the time-limited enactment of the original intervention plan, and the change process of individual and collective dynamics triggered by the implementation process. These processes lead to proximate, intermediate, and distal outcomes, as all results of the change process that are meaningful for various stakeholders. Research questions that might guide the evaluation of an OHI according to the CPO categories and a list of concrete themes/indicators and methods/sources applied within the evaluation of an OHI project at a hospital in Switzerland illustrate the model's applicability in structuring evaluations of complex OHIs. In conclusion, the model supplies a common language and a shared mental model for improving communication between researchers and company members and will improve the comparability and aggregation of evaluation study results. PMID:26557665

  4. Evaluation of Models of the Reading Process.

    ERIC Educational Resources Information Center

    Balajthy, Ernest

    A variety of reading process models have been proposed and evaluated in reading research. Traditional approaches to model evaluation specify the workings of a system in a simplified fashion to enable organized, systematic study of the system's components. Following are several statistical methods of model evaluation: (1) empirical research on…

  5. Modeling the dynamics of evaluation: a multilevel neural network implementation of the iterative reprocessing model.

    PubMed

    Ehret, Phillip J; Monroe, Brian M; Read, Stephen J

    2015-05-01

    We present a neural network implementation of central components of the iterative reprocessing (IR) model. The IR model argues that the evaluation of social stimuli (attitudes, stereotypes) is the result of the IR of stimuli in a hierarchy of neural systems: The evaluation of social stimuli develops and changes over processing. The network has a multilevel, bidirectional feedback evaluation system that integrates initial perceptual processing and later developing semantic processing. The network processes stimuli (e.g., an individual's appearance) over repeated iterations, with increasingly higher levels of semantic processing over time. As a result, the network's evaluations of stimuli evolve. We discuss the implications of the network for a number of different issues involved in attitudes and social evaluation. The success of the network supports the IR model framework and provides new insights into attitude theory. © 2014 by the Society for Personality and Social Psychology, Inc.

  6. Using a Systematic Conceptual Model for a Process Evaluation of a Middle School Obesity Risk-Reduction Nutrition Curriculum Intervention: "Choice, Control & Change"

    ERIC Educational Resources Information Center

    Lee, Heewon; Contento, Isobel R.; Koch, Pamela

    2013-01-01

    Objective: To use and review a conceptual model of process evaluation and to examine the implementation of a nutrition education curriculum, "Choice, Control & Change", designed to promote dietary and physical activity behaviors that reduce obesity risk. Design: A process evaluation study based on a systematic conceptual model. Setting: Five…

  7. The Strengths and Limitations of Satellite Data for Evaluating Tropospheric Processes in Chemistry-Climate Models

    NASA Technical Reports Server (NTRS)

    Duncan, Bryan

    2012-01-01

    There is now a wealth of satellite data products available with which to evaluate a model's simulation of tropospheric composition and other model processes. All of these data products have their strengths and limitations that need to be considered for this purpose. For example, uncertainties are introduced into a data product when 1) converting a slant column to a vertical column and 2) estimating the amount of a total column of a trace gas (e.g., ozone, nitrogen dioxide) that resides in the troposphere. Oftentimes, these uncertainties are not well quantified and the satellite data products are not well evaluated against in situ observations. However, these limitations do not preclude us from using these data products to evaluate our model processes if we understand these strengths and limitations when developing diagnostics. I will show several examples of how satellite data products are being used to evaluate particular model processes with a focus on the strengths and limitations of these data products. In addition, I will introduce the goals of a newly formed team to address issues on the topic of "satellite data for improved model evaluation and process studies" that is established in support of the IGAC/SPARC Global Chemistry-Climate Modeling and Evaluation Workshop.

  8. An Analytical Hierarchy Process Model for the Evaluation of College Experimental Teaching Quality

    ERIC Educational Resources Information Center

    Yin, Qingli

    2013-01-01

    Taking into account the characteristics of college experimental teaching, through investigation and analysis, evaluation indices and an Analytical Hierarchy Process (AHP) model of experimental teaching quality have been established following the analytical hierarchy process method, and the evaluation indices have been given reasonable weights. An…
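    The AHP technique named in this record derives criterion weights from pairwise comparison judgments via the principal eigenvector of the comparison matrix. As a minimal sketch (the judgment values below are invented for illustration, not taken from the study), the eigenvector can be approximated by power iteration:

```python
# Hedged sketch: AHP criterion weights as the principal eigenvector of a
# pairwise comparison matrix, approximated by power iteration.
def ahp_weights(matrix, iterations=100):
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iterations):
        # Multiply the comparison matrix by the current weight vector.
        v = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        total = sum(v)
        w = [x / total for x in v]  # renormalize so the weights sum to 1
    return w

# Hypothetical judgments: criterion A is 3x as important as B, 5x as C.
comparisons = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]
weights = ahp_weights(comparisons)
```

    A real AHP application would also compute the consistency ratio of the judgments before trusting the resulting weights.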

  9. A Model for Evaluating Development Programs. Miscellaneous Report.

    ERIC Educational Resources Information Center

    Burton, John E., Jr.; Rogers, David L.

    Taking the position that the Classical Experimental Evaluation (CEE) Model does not do justice to the process of acquiring information necessary for decision making regarding planning, programming, implementing, and recycling program activities, this paper presents the Inductive, System-Process (ISP) evaluation model as an alternative to be used in…

  10. Evaluating Process Improvement Courses of Action Through Modeling and Simulation

    DTIC Science & Technology

    2017-09-16

    changes to a process is time consuming and has potential to overlook stochastic effects. By modeling a process as a Numerical Design Structure Matrix…

  11. Using RUFDATA to guide a logic model for a quality assurance process in an undergraduate university program.

    PubMed

    Sherman, Paul David

    2016-04-01

    This article presents a framework to identify key mechanisms for developing a logic model blueprint that can be used for an impending comprehensive evaluation of an undergraduate degree program in a Canadian university. The evaluation is a requirement of a comprehensive quality assurance process mandated by the university. A modified RUFDATA (Saunders, 2000) evaluation model is applied as an initiating framework to assist in decision making and to provide a guide for conceptualizing a logic model for the quality assurance process. This article will show how an educational evaluation is strengthened by employing a RUFDATA reflective process in exploring key elements of the evaluation process, and then translating this information into a logic model format that could serve to offer a more focussed pathway for the quality assurance activities. Using preliminary program evaluation data from two key stakeholders of the undergraduate program as well as an audit of the curriculum's course syllabi, a case is made for (1) the importance of including key stakeholders' participation in the design of the evaluation process to enrich the authenticity and accuracy of program participants' feedback, and (2) the diversification of data collection methods to ensure that stakeholders' narrative feedback is given ample exposure. It is suggested that the modified RUFDATA/logic model framework be applied to all academic programs at the university undergoing the quality assurance process at the same time so that economies of scale may be realized. Copyright © 2015 Elsevier Ltd. All rights reserved.

  12. An Information System Development Method Connecting Business Process Modeling and its Experimental Evaluation

    NASA Astrophysics Data System (ADS)

    Okawa, Tsutomu; Kaminishi, Tsukasa; Kojima, Yoshiyuki; Hirabayashi, Syuichi; Koizumi, Hisao

    Business process modeling (BPM) is gaining attention as a means of analyzing and improving the business process. BPM analyses the current business process as an AS-IS model and solves problems to improve the current business; moreover, it aims to create a business process that produces value, as a TO-BE model. However, research on techniques that seamlessly connect the business process improvements obtained through BPM to the implementation of the information system is rarely reported. If the business model obtained by BPM is converted into UML, and the implementation is carried out using UML techniques, we can expect improved efficiency in information system implementation. In this paper, we describe a system development method that converts the process model obtained by BPM into UML; the method is evaluated by modeling a prototype of a parts procurement system. In the evaluation, the method is compared with the case where the system is implemented using the conventional UML technique alone, without going through BPM.

  13. Model performance evaluation (validation and calibration) in model-based studies of therapeutic interventions for cardiovascular diseases : a review and suggested reporting framework.

    PubMed

    Haji Ali Afzali, Hossein; Gray, Jodi; Karnon, Jonathan

    2013-04-01

    Decision analytic models play an increasingly important role in the economic evaluation of health technologies. Given uncertainties around the assumptions used to develop such models, several guidelines have been published to identify and assess 'best practice' in the model development process, including general modelling approach (e.g., time horizon), model structure, input data and model performance evaluation. This paper focuses on model performance evaluation. In the absence of a sufficient level of detail around model performance evaluation, concerns regarding the accuracy of model outputs, and hence the credibility of such models, are frequently raised. Following presentation of its components, a review of the application and reporting of model performance evaluation is presented. Taking cardiovascular disease as an illustrative example, the review investigates the use of face validity, internal validity, external validity, and cross model validity. As a part of the performance evaluation process, model calibration is also discussed and its use in applied studies investigated. The review found that the application and reporting of model performance evaluation across 81 studies of treatment for cardiovascular disease was variable. Cross-model validation was reported in 55 % of the reviewed studies, though the level of detail provided varied considerably. We found that very few studies documented other types of validity, and only 6 % of the reviewed articles reported a calibration process. Considering the above findings, we propose a comprehensive model performance evaluation framework (checklist), informed by a review of best-practice guidelines. This framework provides a basis for more accurate and consistent documentation of model performance evaluation. This will improve the peer review process and the comparability of modelling studies. 
Recognising the fundamental role of decision analytic models in informing public funding decisions, the proposed framework should usefully inform guidelines for preparing submissions to reimbursement bodies.
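    The calibration step discussed in this record can be pictured as tuning an unobserved model input until a model output reproduces an observed calibration target. A minimal one-parameter sketch, assuming a hypothetical exponential time-to-event model (not one of the reviewed cardiovascular models):

```python
import math

# Hedged sketch: calibrate a hypothetical one-parameter exponential
# time-to-event model so its predicted 5-year event probability matches
# an observed target (all names and numbers are illustrative only).
def predicted_event_prob(rate, years=5.0):
    return 1.0 - math.exp(-rate * years)

def calibrate(target, lo=0.0, hi=1.0, tol=1e-9):
    # Bisection works because predicted_event_prob is increasing in `rate`.
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if predicted_event_prob(mid) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

rate = calibrate(0.20)  # observed 20% five-year event probability
```

    Real decision-analytic calibrations fit many parameters against several targets at once, which is one reason the review argues their documentation deserves a reporting framework.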

  14. Evaluation of animal models of neurobehavioral disorders

    PubMed Central

    van der Staay, F Josef; Arndt, Saskia S; Nordquist, Rebecca E

    2009-01-01

    Animal models play a central role in all areas of biomedical research. The process of animal model building, development and evaluation has rarely been addressed systematically, despite the long history of using animal models in the investigation of neuropsychiatric disorders and behavioral dysfunctions. An iterative, multi-stage trajectory for developing animal models and assessing their quality is proposed. The process starts with defining the purpose(s) of the model, preferentially based on hypotheses about brain-behavior relationships. Then, the model is developed and tested. The evaluation of the model takes scientific and ethical criteria into consideration. Model development requires a multidisciplinary approach. Preclinical and clinical experts should establish a set of scientific criteria, which a model must meet. The scientific evaluation consists of assessing the replicability/reliability, predictive, construct and external validity/generalizability, and relevance of the model. We emphasize the role of (systematic and extended) replications in the course of the validation process. One may apply a multiple-tiered 'replication battery' to estimate the reliability/replicability, validity, and generalizability of results. Compromised welfare is inherent in many deficiency models in animals. Unfortunately, 'animal welfare' is a vaguely defined concept, making it difficult to establish exact evaluation criteria. Weighing the animal's welfare and considering whether action is indicated to reduce its discomfort must accompany the scientific evaluation at every stage of the model building and evaluation process. Animal model building should be discontinued if the model does not meet the preset scientific criteria, or when animal welfare is severely compromised. The application of the evaluation procedure is exemplified using the rat with neonatal hippocampal lesion as a proposed model of schizophrenia. Just as animal models can be improved using the procedure expounded upon in this paper, the development and evaluation procedure itself may be improved by careful definition of the purpose(s) of a model and by defining better evaluation criteria, based on the proposed use of the model. PMID:19243583

  15. Towards Systematic Benchmarking of Climate Model Performance

    NASA Astrophysics Data System (ADS)

    Gleckler, P. J.

    2014-12-01

    The process by which climate models are evaluated has evolved substantially over the past decade, with the Coupled Model Intercomparison Project (CMIP) serving as a centralizing activity for coordinating model experimentation and enabling research. Scientists with a broad spectrum of expertise have contributed to the CMIP model evaluation process, resulting in many hundreds of publications that have served as a key resource for the IPCC process. For several reasons, efforts are now underway to further systematize some aspects of the model evaluation process. First, some model evaluation can now be considered routine and should not require "re-inventing the wheel" or a journal publication simply to update results with newer models. Second, the benefit of CMIP research to model development has not been optimal because the publication of results generally takes several years and is usually not reproducible for benchmarking newer model versions. And third, there are now hundreds of model versions and many thousands of simulations, but there is no community-based mechanism for routinely monitoring model performance changes. An important change in the design of CMIP6 can help address these limitations. CMIP6 will include a small set of standardized experiments as an ongoing exercise (CMIP "DECK": ongoing Diagnostic, Evaluation and Characterization of Klima), so that modeling groups can submit them at any time and not be overly constrained by deadlines. In this presentation, efforts to establish routine benchmarking of existing and future CMIP simulations will be described. To date, some benchmarking tools have been made available to all CMIP modeling groups to enable them to readily compare with CMIP5 simulations during the model development process. A natural extension of this effort is to make results from all CMIP simulations widely available, including the results from newer models as soon as the simulations become available for research. 
Making the results from routine performance tests readily accessible will help advance a more transparent model evaluation process.

  16. Modeling of ETL-Processes and Processed Information in Clinical Data Warehousing.

    PubMed

    Tute, Erik; Steiner, Jochen

    2018-01-01

    Literature describes a big potential for reuse of clinical patient data, and a clinical data warehouse (CDWH) is a means to that end. This work aims to support the management and maintenance of the processes that extract, transform and load (ETL) data into CDWHs, and to ease the reuse of metadata between regular IT-management, the CDWH and secondary data users, by providing a modeling approach. An expert survey and a literature review were conducted to identify requirements and existing modeling techniques. An ETL-modeling technique was then developed by extending existing techniques, and evaluated by modeling an existing ETL-process and conducting a second expert survey. Nine experts participated in the first survey; the literature review yielded 15 included publications, from which six existing modeling techniques were identified. A modeling technique extending 3LGM2 and combining it with openEHR information models was developed and evaluated; seven experts participated in the evaluation. The developed approach can help in the management and maintenance of ETL-processes and could serve as an interface between regular IT-management, the CDWH and secondary data users.

  17. Multicriteria framework for selecting a process modelling language

    NASA Astrophysics Data System (ADS)

    Scanavachi Moreira Campos, Ana Carolina; Teixeira de Almeida, Adiel

    2016-01-01

    The choice of process modelling language can affect business process management (BPM) since each modelling language shows different features of a given process and may limit the ways in which a process can be described and analysed. However, choosing the appropriate modelling language for process modelling has become a difficult task because of the availability of a large number of modelling languages and the lack of guidelines for evaluating and comparing languages so as to assist in selecting the most appropriate one. This paper proposes a framework for selecting a modelling language in accordance with the purposes of modelling. This framework is based on the semiotic quality framework (SEQUAL) for evaluating process modelling languages and a multicriteria decision aid (MCDA) approach in order to select the most appropriate language for BPM. This study does not attempt to set out new forms of assessment and evaluation criteria, but does attempt to demonstrate how two existing approaches can be combined so as to solve the problem of selection of modelling language. The framework is described in this paper and then demonstrated by means of an example. Finally, the advantages and disadvantages of using SEQUAL and MCDA in an integrated manner are discussed.

  18. Development of evaluation models of manpower needs for dismantling the dry conversion process-related equipment in uranium refining and conversion plant (URCP)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sari Izumo; Hideo Usui; Mitsuo Tachibana

    Evaluation models for determining the manpower needs for dismantling various types of equipment in uranium refining and conversion plant (URCP) have been developed. The models are widely applicable to other uranium handling facilities. Additionally, a simplified model was developed for easily and accurately calculating the manpower needs for dismantling dry conversion process-related equipment (DP equipment). It is important to evaluate beforehand project management data such as manpower needs to prepare an optimized decommissioning plan and implement effective dismantling activity. The Japan Atomic Energy Agency (JAEA) has developed the project management data evaluation system for dismantling activities (PRODIA code), which can generate project management data using evaluation models. For preparing an optimized decommissioning plan, these evaluation models should be established based on the type of nuclear facility and actual dismantling data. In URCP, the dry conversion process of reprocessed uranium and others was operated until 1999, and the equipment related to the main process was dismantled from 2008 to 2011. Actual data such as manpower for dismantling were collected during the dismantling activities, and evaluation models were developed using the collected actual data on the basis of equipment classification considering the characteristics of uranium handling facility. (authors)

  19. Semi-Markov adjunction to the Computer-Aided Markov Evaluator (CAME)

    NASA Technical Reports Server (NTRS)

    Rosch, Gene; Hutchins, Monica A.; Leong, Frank J.; Babcock, Philip S., IV

    1988-01-01

    The rule-based Computer-Aided Markov Evaluator (CAME) program was expanded in its ability to incorporate the effect of fault-handling processes into the construction of a reliability model. The fault-handling processes are modeled as semi-Markov events, and CAME constructs an appropriate semi-Markov model. To solve the model, the program outputs it in a form which can be directly solved with the Semi-Markov Unreliability Range Evaluator (SURE) program. As a means of evaluating the alterations made to the CAME program, the program is used to model the reliability of portions of the Integrated Airframe/Propulsion Control System Architecture (IAPSA 2) reference configuration. The reliability predictions are compared with a previous analysis. The results bear out the feasibility of utilizing CAME to generate appropriate semi-Markov models to model fault-handling processes.
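    The Markov machinery underlying tools like CAME and SURE can be illustrated with a toy discrete-time chain; the three-state structure and transition probabilities below are invented for illustration and do not reproduce the CAME/SURE formulations (which, among other things, handle semi-Markov fault-handling distributions):

```python
# Hedged sketch: a toy discrete-time Markov reliability model (invented
# transition probabilities, not the CAME/SURE formulation).
# States: 0 = operational, 1 = fault-handling in progress, 2 = failed.
P = [
    [0.990, 0.009, 0.001],  # operational: stay, detect fault, fail
    [0.900, 0.050, 0.050],  # fault-handling: recover, continue, fail
    [0.000, 0.000, 1.000],  # failed is absorbing
]

def unreliability(steps):
    """Probability of being in the failed state after `steps` transitions,
    starting from the operational state."""
    state = [1.0, 0.0, 0.0]
    for _ in range(steps):
        state = [sum(state[i] * P[i][j] for i in range(3)) for j in range(3)]
    return state[2]
```

    Because the failed state is absorbing, unreliability grows monotonically with mission length, which is the quantity a tool like SURE bounds for the full semi-Markov case.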

  20. Evaluate styrene production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hagopian, C.R.; Lewis, P.J.; McDonald, J.J.

    1983-02-01

    Improvements and innovations in styrene production since 1966 are outlined. Rigorous process models are attributed to the changes. Such models are used to evaluate the effects of changing raw material costs, utility costs, and available catalyst choices. The process model can also evaluate the best operating configuration and catalyst choice for a plant. All specified innovations are incorporated in the Mobil/Badger ethylbenzene and the Cosden/Badger styrene processes (both of which are schematicized). Badger's training programs are reviewed. Badger's Styrenics Business Team converts information into plant design basis. A reaction model with input derived from isothermal and adiabatic pilot plant units is at the heart of complete computer simulation of ethylbenzene and styrene processes.

  1. The Spiral-Interactive Program Evaluation Model.

    ERIC Educational Resources Information Center

    Khaleel, Ibrahim Adamu

    1988-01-01

    Describes the spiral interactive program evaluation model, which is designed to evaluate vocational-technical education programs in secondary schools in Nigeria. Program evaluation is defined; utility oriented and process oriented models for evaluation are described; and internal and external evaluative factors and variables that define each…

  2. Validating the ACE Model for Evaluating Student Performance Using a Teaching-Learning Process Based on Computational Modeling Systems

    ERIC Educational Resources Information Center

    Louzada, Alexandre Neves; Elia, Marcos da Fonseca; Sampaio, Fábio Ferrentini; Vidal, Andre Luiz Pestana

    2014-01-01

    The aim of this work is to adapt and test, in a Brazilian public school, the ACE model proposed by Borkulo for evaluating student performance as a teaching-learning process based on computational modeling systems. The ACE model is based on different types of reasoning involving three dimensions. In addition to adapting the model and introducing…

  3. A Participatory Action Research Approach To Evaluating Inclusive School Programs.

    ERIC Educational Resources Information Center

    Dymond, Stacy K.

    2001-01-01

    This article proposes a model for evaluating inclusive schools. Key elements of the model are inclusion of stakeholders in the evaluation process through a participatory action research approach, analysis of program processes and outcomes, use of multiple methods and measures, and obtaining perceptions from diverse stakeholder groups. (Contains…

  4. Tools for evaluating Veterinary Services: an external auditing model for the quality assurance process.

    PubMed

    Melo, E Correa

    2003-08-01

    The author describes the reasons why evaluation processes should be applied to the Veterinary Services of Member Countries, either for trade in animals and animal products and by-products between two countries, or for establishing essential measures to improve the Veterinary Service concerned. The author also describes the basic elements involved in conducting an evaluation process, including the instruments for doing so. These basic elements centre on the following:
    - designing a model, or desirable image, against which a comparison can be made
    - establishing a list of processes to be analysed and defining the qualitative and quantitative mechanisms for this analysis
    - establishing a multidisciplinary evaluation team and developing a process for standardising the evaluation criteria.

  5. Online Deviation Detection for Medical Processes

    PubMed Central

    Christov, Stefan C.; Avrunin, George S.; Clarke, Lori A.

    2014-01-01

    Human errors are a major concern in many medical processes. To help address this problem, we are investigating an approach for automatically detecting when performers of a medical process deviate from the acceptable ways of performing that process as specified by a detailed process model. Such deviations could represent errors and, thus, detecting and reporting deviations as they occur could help catch errors before harm is done. In this paper, we identify important issues related to the feasibility of the proposed approach and empirically evaluate the approach for two medical procedures, chemotherapy and blood transfusion. For the evaluation, we use the process models to generate sample process executions that we then seed with synthetic errors. The process models describe the coordination of activities of different process performers in normal, as well as in exceptional situations. The evaluation results suggest that the proposed approach could be applied in clinical settings to help catch errors before harm is done. PMID:25954343
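    The core idea of checking an executing process against a process model can be sketched with a deliberately tiny model of allowed step transitions (a hypothetical blood-transfusion fragment; the actual work uses far richer process models covering exceptional situations):

```python
# Hedged sketch: online deviation detection against a toy process model
# (a hypothetical blood-transfusion fragment; real models are far richer).
ALLOWED = {
    "start": {"verify_patient_id"},
    "verify_patient_id": {"check_blood_product"},
    "check_blood_product": {"administer"},
    "administer": {"monitor"},
}

def first_deviation(events):
    """Return the index of the first event deviating from the model,
    or None if the whole trace conforms."""
    current = "start"
    for i, event in enumerate(events):
        if event not in ALLOWED.get(current, set()):
            return i  # report immediately, before harm is done
        current = event
    return None
```

    A trace that skips a required step, such as administering before checking the blood product, is flagged at the offending event, which is the "catch errors before harm is done" behavior the record describes.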

  6. A Java-based fMRI processing pipeline evaluation system for assessment of univariate general linear model and multivariate canonical variate analysis-based pipelines.

    PubMed

    Zhang, Jing; Liang, Lichen; Anderson, Jon R; Gatewood, Lael; Rottenberg, David A; Strother, Stephen C

    2008-01-01

    As functional magnetic resonance imaging (fMRI) becomes widely used, the demands for evaluation of fMRI processing pipelines and validation of fMRI analysis results are increasing rapidly. The current NPAIRS package, an IDL-based fMRI processing pipeline evaluation framework, lacks system interoperability and the ability to evaluate general linear model (GLM)-based pipelines using prediction metrics. Thus, it cannot fully evaluate fMRI analytical software modules such as FSL.FEAT and NPAIRS.GLM. In order to overcome these limitations, a Java-based fMRI processing pipeline evaluation system was developed. It integrated YALE (a machine learning environment) into Fiswidgets (an fMRI software environment) to obtain system interoperability and applied an algorithm to measure GLM prediction accuracy. The results demonstrated that the system can evaluate fMRI processing pipelines with univariate GLM and multivariate canonical variates analysis (CVA)-based models on real fMRI data based on prediction accuracy (classification accuracy) and statistical parametric image (SPI) reproducibility. In addition, a preliminary study was performed where four fMRI processing pipelines with GLM and CVA modules such as FSL.FEAT and NPAIRS.CVA were evaluated with the system. The results indicated that (1) the system can compare different fMRI processing pipelines with heterogeneous models (NPAIRS.GLM, NPAIRS.CVA and FSL.FEAT) and rank their performance by automatic performance scoring, and (2) the rank of pipeline performance is highly dependent on the preprocessing operations. These results suggest that the system will be of value for the comparison, validation, standardization and optimization of functional neuroimaging software packages and fMRI processing pipelines.
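    The pairing of prediction accuracy with split-half SPI reproducibility can be sketched as follows; the scoring rule and numbers are simplified illustrations, not the actual NPAIRS metrics:

```python
# Hedged sketch: rank pipelines by prediction accuracy times split-half
# SPI reproducibility (a simplification; not the actual NPAIRS metrics).
def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

def pipeline_score(accuracy, spi_half1, spi_half2):
    """Reproducibility is the correlation between statistical maps
    computed on independent halves of the data."""
    return accuracy * pearson(spi_half1, spi_half2)

# Hypothetical: equal accuracy, but pipeline A's maps replicate better.
score_a = pipeline_score(0.80, [1.0, 2.0, 3.0, 4.0], [1.1, 2.2, 2.9, 4.2])
score_b = pipeline_score(0.80, [1.0, 2.0, 3.0, 4.0], [4.0, 1.0, 3.5, 0.5])
```

    The point of combining the two metrics is that a pipeline can predict well while producing unstable activation maps, or vice versa, so neither metric alone suffices for ranking.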

  7. CALIBRATION OF SUBSURFACE BATCH AND REACTIVE-TRANSPORT MODELS INVOLVING COMPLEX BIOGEOCHEMICAL PROCESSES

    EPA Science Inventory

    In this study, the calibration of subsurface batch and reactive-transport models involving complex biogeochemical processes was systematically evaluated. Two hypothetical nitrate biodegradation scenarios were developed and simulated in numerical experiments to evaluate the perfor...

  8. Prioritization of engineering support requests and advanced technology projects using decision support and industrial engineering models

    NASA Technical Reports Server (NTRS)

    Tavana, Madjid

    1995-01-01

    The evaluation and prioritization of Engineering Support Requests (ESRs) is a particularly difficult task at the Kennedy Space Center (KSC) Shuttle Project Engineering Office. This difficulty is due to the complexities inherent in the evaluation process and the lack of structured information. The evaluation process must consider a multitude of relevant pieces of information concerning Safety, Supportability, O&M Cost Savings, Process Enhancement, Reliability, and Implementation. Various analytical and normative models developed in the past have helped decision makers at KSC utilize large volumes of information in the evaluation of ESRs. The purpose of this project is to build on the existing methodologies and develop a multiple criteria decision support system that captures the decision maker's beliefs through a series of sequential, rational, and analytical processes. The model utilizes the Analytic Hierarchy Process (AHP), subjective probabilities, the entropy concept, and the Maximize Agreement Heuristic (MAH) to enhance the decision maker's intuition in evaluating a set of ESRs.
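    A core step of the AHP mentioned above is deriving criterion weights from a pairwise comparison matrix. The sketch below uses the geometric-mean approximation of the principal eigenvector; the criteria names are taken from the abstract, but the pairwise judgments are hypothetical, not KSC's.

```python
# Sketch of AHP criterion weighting from a pairwise comparison matrix.
# The comparison values below are invented for illustration (Saaty 1-9 scale).
import numpy as np

criteria = ["Safety", "Supportability", "Cost Savings"]
# A[i][j] = how much more important criterion i is than criterion j
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Geometric-mean approximation of the principal eigenvector yields the weights
gm = A.prod(axis=1) ** (1.0 / A.shape[0])
weights = gm / gm.sum()

for name, w in zip(criteria, weights):
    print(f"{name}: {w:.3f}")
```

    Alternatives (here, ESRs) would then be scored against each criterion and ranked by their weighted sums.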

  9. A Model for the Evaluation of Educational Products.

    ERIC Educational Resources Information Center

    Bertram, Charles L.

    A model for the evaluation of educational products based on experience with development of three such products is described. The purpose of the evaluation model is to indicate the flow of evaluation activity as products undergo development. Evaluation is given Stufflebeam's definition as the process of delineating, obtaining, and providing useful…

  10. Comparative evaluation of urban storm water quality models

    NASA Astrophysics Data System (ADS)

    Vaze, J.; Chiew, Francis H. S.

    2003-10-01

    The estimation of urban storm water pollutant loads is required for the development of mitigation and management strategies to minimize impacts to receiving environments. Event pollutant loads are typically estimated using either regression equations or "process-based" water quality models. The relative merit of using regression models compared to process-based models is not clear. A modeling study is carried out here to evaluate the comparative ability of the regression equations and process-based water quality models to estimate event diffuse pollutant loads from impervious surfaces. The results indicate that, once calibrated, both the regression equations and the process-based model can estimate event pollutant loads satisfactorily. In fact, the loads estimated using the regression equation as a function of rainfall intensity and runoff rate are better than the loads estimated using the process-based model. Therefore, if only estimates of event loads are required, regression models should be used because they are simpler and require less data compared to process-based models.
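    The regression approach described above, estimating event load as a function of rainfall intensity and runoff, can be sketched as a power-law regression fitted in log space. The data, exponents, and units below are synthetic, for illustration only.

```python
# Minimal sketch of a regression model for event pollutant load:
# load = a * intensity^b * runoff^c, fitted by ordinary least squares in log space.
# All data below are synthetic.
import numpy as np

rng = np.random.default_rng(0)
intensity = rng.uniform(2, 30, 50)     # rainfall intensity, mm/h (synthetic)
runoff = rng.uniform(0.5, 10, 50)      # event runoff, mm (synthetic)
true_load = 0.8 * intensity**0.6 * runoff**1.1
load = true_load * rng.lognormal(0, 0.1, 50)   # add multiplicative noise

# log L = log a + b*log I + c*log Q
X = np.column_stack([np.ones(50), np.log(intensity), np.log(runoff)])
coef, *_ = np.linalg.lstsq(X, np.log(load), rcond=None)
a, b, c = np.exp(coef[0]), coef[1], coef[2]
print(f"load ~ {a:.2f} * I^{b:.2f} * Q^{c:.2f}")
```

    Once calibrated on observed events, such an equation needs only rainfall and runoff inputs, which is why the study found it simpler to apply than a process-based model.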

  11. Using a Systematic Conceptual Model for a Process Evaluation of a Middle School Obesity Risk-Reduction Nutrition Curriculum Intervention: Choice, Control & Change

    PubMed Central

    Lee, Heewon; Contento, Isobel R.; Koch, Pamela

    2012-01-01

    Objective To use and review a conceptual model of process evaluation and to examine the implementation of a nutrition education curriculum, Choice, Control & Change, designed to promote dietary and physical activity behaviors that reduce obesity risk. Design A process evaluation study based on a systematic conceptual model. Setting Five middle schools in New York City. Participants 562 students in 20 classes and their science teachers (n=8). Main Outcome Measures Based on the model, teacher professional development, teacher implementation, and student reception were evaluated. Also measured were teacher characteristics, teachers’ curriculum evaluation, and satisfaction with teaching the curriculum. Analysis Descriptive statistics and Spearman’s Rho Correlation for quantitative analysis and content analysis for qualitative data were used. Results Mean score of the teacher professional development evaluation was 4.75 on a 5-point scale. Average teacher implementation rate was 73%, and student reception rate was 69%. Ongoing teacher support was highly valued by teachers. Teachers’ satisfaction with teaching the curriculum was highly correlated with students’ satisfaction (p <.05). Teachers’ perception of amount of student work was negatively correlated with implementation and with student satisfaction (p<.05). Conclusions and implications Use of a systematic conceptual model and comprehensive process measures improves understanding of the implementation process and helps educators to better implement interventions as designed. PMID:23321021

  12. Using a systematic conceptual model for a process evaluation of a middle school obesity risk-reduction nutrition curriculum intervention: choice, control & change.

    PubMed

    Lee, Heewon; Contento, Isobel R; Koch, Pamela

    2013-03-01

    To use and review a conceptual model of process evaluation and to examine the implementation of a nutrition education curriculum, Choice, Control & Change, designed to promote dietary and physical activity behaviors that reduce obesity risk. A process evaluation study based on a systematic conceptual model. Five middle schools in New York City. Five hundred sixty-two students in 20 classes and their science teachers (n = 8). Based on the model, teacher professional development, teacher implementation, and student reception were evaluated. Also measured were teacher characteristics, teachers' curriculum evaluation, and satisfaction with teaching the curriculum. Descriptive statistics and Spearman ρ correlation for quantitative analysis and content analysis for qualitative data were used. Mean score of the teacher professional development evaluation was 4.75 on a 5-point scale. Average teacher implementation rate was 73%, and the student reception rate was 69%. Ongoing teacher support was highly valued by teachers. Teacher satisfaction with teaching the curriculum was highly correlated with student satisfaction (P < .05). Teacher perception of amount of student work was negatively correlated with implementation and with student satisfaction (P < .05). Use of a systematic conceptual model and comprehensive process measures improves understanding of the implementation process and helps educators to better implement interventions as designed. Copyright © 2013 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.
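    The analysis above pairs descriptive statistics with Spearman rank correlations between teacher and student measures. A small self-contained sketch of Spearman's rho (computed from ranks, with tie handling) is shown below; the satisfaction scores are invented, not the study's data.

```python
# Sketch of Spearman's rank correlation as used in the process evaluation.
# Scores below are hypothetical teacher/student satisfaction ratings.

def ranks(xs):
    """Return average ranks (1-based), handling ties."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Pearson correlation of the rank vectors = Spearman's rho."""
    rx, ry = ranks(x), ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

teacher_sat = [4.5, 3.0, 4.8, 3.5, 4.0, 2.8, 4.9, 3.2]
student_sat = [4.2, 3.1, 4.6, 3.3, 3.9, 3.0, 4.7, 3.4]
rho = spearman(teacher_sat, student_sat)
print(round(rho, 3))
```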

  13. Decisionmaking Context Model for Enhancing Evaluation Utilization.

    ERIC Educational Resources Information Center

    Brown, Robert D.; And Others

    1984-01-01

    This paper discusses two models that hold promise for helping evaluators understand and cope with different decision contexts: (1) the Conflict Model (Janis and Mann, 1977) and (2) the Social Process Model (Vroom and Yago, 1974). Implications and guidelines for using decisionmaking models in evaluation settings are presented. (BS)

  14. Undergraduate medical education programme renewal: a longitudinal context, input, process and product evaluation study.

    PubMed

    Mirzazadeh, Azim; Gandomkar, Roghayeh; Hejri, Sara Mortaz; Hassanzadeh, Gholamreza; Koochak, Hamid Emadi; Golestani, Abolfazl; Jafarian, Ali; Jalili, Mohammad; Nayeri, Fatemeh; Saleh, Narges; Shahi, Farhad; Razavi, Seyed Hasan Emami

    2016-02-01

    The purpose of this study was to utilize the Context, Input, Process and Product (CIPP) evaluation model as a comprehensive framework to guide initiating, planning, implementing and evaluating a revised undergraduate medical education programme. The eight-year longitudinal evaluation study consisted of four phases compatible with the four components of the CIPP model. In the first phase, we explored the strengths and weaknesses of the traditional programme as well as contextual needs, assets, and resources. For the second phase, we proposed a model for the programme considering contextual features. During the process phase, we provided formative information for revisions and adjustments. Finally, in the fourth phase, we evaluated the outcomes of the new undergraduate medical education programme in the basic sciences phase. Information was collected from different sources such as medical students, faculty members, administrators, and graduates, using various qualitative and quantitative methods including focus groups, questionnaires, and performance measures. The CIPP model has the potential to guide policy makers to systematically collect evaluation data and to manage stakeholders' reactions at each stage of the reform in order to make informed decisions. However, the model may result in evaluation burden and fail to address some unplanned evaluation questions.

  15. 'Healthy Eating and Lifestyle in Pregnancy (HELP)' trial: Process evaluation framework.

    PubMed

    Simpson, Sharon A; Cassidy, Dunla; John, Elinor

    2014-07-01

    We developed and tested, in a cluster RCT, a theory-driven group-based intervention for obese pregnant women. It was designed to support women to moderate weight gain during pregnancy and reduce BMI one year after birth, in addition to targeting secondary health and wellbeing outcomes. In line with MRC guidance on developing and evaluating complex interventions in health, we conducted a process evaluation alongside the trial. This paper describes the development of the process evaluation framework. The cluster RCT recruited 598 pregnant women. Women in the intervention group were invited to attend a weekly weight-management group. Following a review of relevant literature, we developed a process evaluation framework which outlined the key process indicators that we wanted to address and how we would measure them. Central to the process evaluation was understanding the mechanism of effect of the intervention. We utilised a logic-modelling approach to describe the intervention, which helped us focus on which potential mediators of intervention effect to measure, and how. The resulting process evaluation framework was designed to address the core elements of context, reach, exposure, recruitment, fidelity, retention, contamination and theory-testing. These were assessed using a variety of qualitative and quantitative approaches. The logic model explained the processes by which intervention components bring about change in target outcomes through various mediators and theoretical pathways, including self-efficacy, social support, self-regulation and motivation. Process evaluation is a key element in assessing the effect of any RCT. We developed a process evaluation framework and logic model, and the results of analyses using these will offer insights into why the intervention is or is not effective.

  16. Pitfalls in the Evaluation of Teachers by Principals.

    ERIC Educational Resources Information Center

    Natriello, Gary; Dornbusch, Sanford M.

    1980-01-01

    Presents the findings of several studies of evaluation processes and identifies a model that helps to make explicit the components of the evaluation process. Suggests rules of thumb for conducting successful evaluations. (Author/JM)

  17. Graphical Technique to Support the Teaching/Learning Process of Software Process Reference Models

    NASA Astrophysics Data System (ADS)

    Espinosa-Curiel, Ismael Edrein; Rodríguez-Jacobo, Josefina; Fernández-Zepeda, José Alberto

    In this paper, we propose a set of diagrams to visualize software process reference models (PRMs). The diagrams, called dimods, are a combination of visual and process modeling techniques such as rich pictures, mind maps, IDEF and RAD diagrams. We show the use of this technique by designing a set of dimods for the Mexican Software Industry Process Model (MoProSoft). Additionally, we perform an evaluation of the usefulness of dimods. The result of the evaluation shows that dimods may be a support tool that facilitates the understanding, memorization, and learning of software PRMs in both software development organizations and universities. The results also show that dimods may have advantages over the traditional description methods for these types of models.

  18. Histopathological Evaluation of Skeletal Muscle with Specific Reference to Mouse Models of Muscular Dystrophy.

    PubMed

    Terry, Rebecca L; Wells, Dominic J

    2016-12-01

    The muscular dystrophies are a diverse group of degenerative diseases for which many mouse models are available. These models are frequently used to assess potential therapeutic interventions, and histological evaluation of multiple muscles is an important part of this assessment. Histological evaluation is especially useful when combined with tests of muscle function. This unit describes a protocol for necropsy, processing, cryosectioning, and histopathological evaluation of murine skeletal muscles, which is applicable to both models of muscular dystrophy and other neuromuscular conditions. Key histopathological features of dystrophic muscle are discussed using the mdx mouse (a model of Duchenne muscular dystrophy) as an example. Optimal handling during dissection, processing and sectioning is vital to avoid artifacts that can confound or prevent future analyses. Muscles carefully processed using this protocol are suitable for further evaluation using immunohistochemistry, immunofluorescence, special histochemical stains, and immunoblotting. Copyright © 2016 John Wiley & Sons, Inc.

  19. An Evaluation System for the Online Training Programs in Meteorology and Hydrology

    ERIC Educational Resources Information Center

    Wang, Yong; Zhi, Xiefei

    2009-01-01

    This paper studies the current evaluation system for online training programs in meteorology and hydrology. The CIPP model, which includes context evaluation, input evaluation, process evaluation and product evaluation, differs from the Kirkpatrick model, which includes reactions evaluation, learning evaluation, transfer evaluation and results evaluation, in…

  20. Toward a Model Framework of Generalized Parallel Componential Processing of Multi-Symbol Numbers

    ERIC Educational Resources Information Center

    Huber, Stefan; Cornelsen, Sonja; Moeller, Korbinian; Nuerk, Hans-Christoph

    2015-01-01

    In this article, we propose and evaluate a new model framework of parallel componential multi-symbol number processing, generalizing the idea of parallel componential processing of multi-digit numbers to the case of negative numbers by considering the polarity signs similar to single digits. In a first step, we evaluated this account by defining…

  1. Evaluating the compatibility of multi-functional and intensive urban land uses

    NASA Astrophysics Data System (ADS)

    Taleai, M.; Sharifi, A.; Sliuzas, R.; Mesgari, M.

    2007-12-01

    This research is aimed at developing a model for assessing land use compatibility in densely built-up urban areas. In this process, a new model was developed through the combination of a suite of existing methods and tools: a geographical information system, Delphi methods, and spatial decision support tools, namely multi-criteria evaluation analysis, the analytical hierarchy process, and the ordered weighted average method. The developed model has the potential to calculate land use compatibility in both horizontal and vertical directions. Furthermore, the compatibility between the use of each floor in a building and its neighboring land uses can be evaluated. The method was tested in a built-up urban area located in Tehran, the capital city of Iran. The results show that the model is robust in clarifying different levels of physical compatibility between neighboring land uses. This paper describes the various steps and processes of developing the proposed land use compatibility evaluation model (CEM).
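    The horizontal/vertical compatibility idea can be sketched with a pairwise compatibility matrix applied both to floors stacked within a building and to neighbouring uses. The land uses, scores, and aggregation rule below are invented for illustration; the paper's CEM combines these judgments through MCE, AHP, and OWA rather than a plain average.

```python
# Toy sketch of land-use compatibility scoring in two directions:
# vertically between floors of one building, horizontally against neighbours.
# Uses and scores are hypothetical (1.0 = fully compatible, 0.0 = incompatible).
uses = ["residential", "retail", "workshop"]
score = {
    ("residential", "residential"): 1.0,
    ("residential", "retail"): 0.7,
    ("residential", "workshop"): 0.2,
    ("retail", "retail"): 1.0,
    ("retail", "workshop"): 0.6,
    ("workshop", "workshop"): 1.0,
}

def compat(a, b):
    """Symmetric lookup in the compatibility matrix."""
    return score.get((a, b), score.get((b, a)))

def building_compatibility(floors, neighbours):
    """Average floor-to-floor scores with ground-floor scores against neighbours."""
    vertical = [compat(f1, f2) for f1, f2 in zip(floors, floors[1:])]
    horizontal = [compat(floors[0], n) for n in neighbours]  # ground floor faces the street
    vals = vertical + horizontal
    return sum(vals) / len(vals)

mixed = building_compatibility(["retail", "residential", "residential"], ["residential"])
harsh = building_compatibility(["workshop", "residential", "residential"], ["residential"])
print(mixed, harsh)  # the workshop ground floor scores lower against housing
```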

  2. Results from the VALUE perfect predictor experiment: process-based evaluation

    NASA Astrophysics Data System (ADS)

    Maraun, Douglas; Soares, Pedro; Hertig, Elke; Brands, Swen; Huth, Radan; Cardoso, Rita; Kotlarski, Sven; Casado, Maria; Pongracz, Rita; Bartholy, Judit

    2016-04-01

    Until recently, the evaluation of downscaled climate model simulations has typically been limited to surface climatologies, including long-term means, spatial variability and extremes. But these aspects are often, at least partly, tuned in regional climate models to match observed climate. The tuning issue is of course particularly relevant for bias-corrected regional climate models. In general, a good performance of a model for these aspects in present climate does therefore not imply a good performance in simulating climate change. It is now widely accepted that, to increase our confidence in climate change simulations, it is necessary to evaluate how climate models simulate the relevant underlying processes. In other words, it is important to assess whether downscaling does the right thing for the right reason. Therefore, VALUE has carried out a broad process-based evaluation study based on its perfect predictor experiment simulations: the downscaling methods are driven by ERA-Interim data over the period 1979-2008, and reference observations are given by a network of 85 meteorological stations covering all European climates. More than 30 methods participated in the evaluation. In order to compare statistical and dynamical methods, only variables provided by both types of approaches could be considered. This limited the analysis to conditioning local surface variables on variables from driving processes that are simulated by ERA-Interim. We considered the following types of processes: at the continental scale, we evaluated the performance of downscaling methods for positive and negative North Atlantic Oscillation, Atlantic ridge and blocking situations. At synoptic scales, we considered Lamb weather types for selected European regions such as Scandinavia, the United Kingdom, the Iberian Peninsula or the Alps. At regional scales we considered phenomena such as the Mistral, the Bora or the Iberian coastal jet. Such process-based evaluation helps to attribute biases in surface variables to underlying processes and ultimately to improve climate models.

  3. Are there two processes in reasoning? The dimensionality of inductive and deductive inferences.

    PubMed

    Stephens, Rachel G; Dunn, John C; Hayes, Brett K

    2018-03-01

    Single-process accounts of reasoning propose that the same cognitive mechanisms underlie inductive and deductive inferences. In contrast, dual-process accounts propose that these inferences depend upon 2 qualitatively different mechanisms. To distinguish between these accounts, we derived a set of single-process and dual-process models based on an overarching signal detection framework. We then used signed difference analysis to test each model against data from an argument evaluation task, in which induction and deduction judgments are elicited for sets of valid and invalid arguments. Three data sets were analyzed: data from Singmann and Klauer (2011), a database of argument evaluation studies, and the results of an experiment designed to test model predictions. Of the large set of testable models, we found that almost all could be rejected, including all 2-dimensional models. The only testable model able to account for all 3 data sets was a model with 1 dimension of argument strength and independent decision criteria for induction and deduction judgments. We conclude that despite the popularity of dual-process accounts, current results from the argument evaluation task are best explained by a single-process account that incorporates separate decision thresholds for inductive and deductive inferences. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
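    The winning single-process account can be sketched as an equal-variance signal detection model: one latent argument-strength dimension with two decision criteria, a lenient one for induction ("is this strong?") and a stricter one for deduction ("is this valid?"). All parameter values below are illustrative, not fitted to the cited data sets.

```python
# Sketch of a one-dimensional signal detection model of argument evaluation
# with separate criteria for induction and deduction judgments.
# Criterion and distribution parameters are hypothetical.
import random

random.seed(1)
C_INDUCTION = 0.4   # lenient criterion for "strong?" judgments
C_DEDUCTION = 1.2   # stricter criterion for "valid?" judgments

def judge(strength, task):
    """Endorse the argument if its strength exceeds the task's criterion."""
    crit = C_DEDUCTION if task == "deduction" else C_INDUCTION
    return strength > crit

def sample_strength(valid):
    """Valid arguments draw higher strengths (equal-variance Gaussians)."""
    return random.gauss(1.5 if valid else 0.0, 1.0)

args = [sample_strength(valid) for valid in [True] * 1000 + [False] * 1000]
ind_yes = sum(judge(s, "induction") for s in args) / len(args)
ded_yes = sum(judge(s, "deduction") for s in args) / len(args)
print(ind_yes, ded_yes)  # the stricter deduction criterion yields fewer endorsements
```

    Because both judgments read the same strength value, the model predicts systematic ordering relations between induction and deduction endorsement rates, which is what signed difference analysis tests.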

  4. Business process architectures: overview, comparison and framework

    NASA Astrophysics Data System (ADS)

    Dijkman, Remco; Vanderfeesten, Irene; Reijers, Hajo A.

    2016-02-01

    With the uptake of business process modelling in practice, the demand grows for guidelines that lead to consistent and integrated collections of process models. The notion of a business process architecture has been explicitly proposed to address this. This paper provides an overview of the prevailing approaches to design a business process architecture. Furthermore, it includes evaluations of the usability and use of the identified approaches. Finally, it presents a framework for business process architecture design that can be used to develop a concrete architecture. The use and usability were evaluated in two ways. First, a survey was conducted among 39 practitioners, in which the opinion of the practitioners on the use and usefulness of the approaches was evaluated. Second, four case studies were conducted, in which process architectures from practice were analysed to determine the approaches or elements of approaches that were used in their design. Both evaluations showed that practitioners have a preference for using approaches that are based on reference models and approaches that are based on the identification of business functions or business objects. At the same time, the evaluations showed that practitioners use these approaches in combination, rather than selecting a single approach.

  5. Global Sensitivity Analysis for Process Identification under Model Uncertainty

    NASA Astrophysics Data System (ADS)

    Ye, M.; Dai, H.; Walker, A. P.; Shi, L.; Yang, J.

    2015-12-01

    The environmental system consists of various physical, chemical, and biological processes, and environmental models are built to simulate these processes and their interactions. For model building, improvement, and validation, it is necessary to identify important processes so that limited resources can be used to better characterize them. While global sensitivity analysis has been widely used to identify important processes, the identification has typically been based on deterministic process conceptualization that uses a single model to represent a process. However, environmental systems are complex, and a single process can often be simulated by multiple alternative models. Ignoring this model uncertainty may bias the identification, in that processes identified as important may not be so in the real world. This study addresses this problem by developing a new method of global sensitivity analysis for process identification. The new method is based on the concept of Sobol sensitivity analysis and model averaging. Similar to the Sobol sensitivity analysis used to identify important parameters, our new method evaluates the variance change when a process is fixed at its different conceptualizations. The variance accounts for both parametric and model uncertainty using the method of model averaging. The method is demonstrated using a synthetic groundwater modeling study that considers a recharge process and a parameterization process, each with two alternative models. Important processes of groundwater flow and transport are evaluated using our new method. The method is mathematically general and can be applied to a wide range of environmental problems.
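    The variance-change idea can be sketched on a toy version of the synthetic study: two processes, each with two alternative conceptualizations and equal model probabilities. A process is "important" if fixing its conceptualization removes most of the output variance. All model outputs below are invented.

```python
# Toy sketch of process sensitivity under model uncertainty: compare the
# output variance remaining when each process's conceptualization is fixed,
# averaging over the alternatives of the other process. Numbers are invented.
import statistics

# Output of the groundwater model under each (recharge model, parameterization model)
output = {("R1", "P1"): 2.0, ("R1", "P2"): 2.4,
          ("R2", "P1"): 5.0, ("R2", "P2"): 5.6}
prior = 0.25  # equal model probabilities

total_mean = sum(output.values()) * prior
total_var = sum((v - total_mean) ** 2 for v in output.values()) * prior

def var_given_fixed(process_index):
    """Expected remaining variance when the given process is fixed to one model."""
    groups = {}
    for key, v in output.items():
        groups.setdefault(key[process_index], []).append(v)
    return statistics.mean(statistics.pvariance(g) for g in groups.values())

# High sensitivity = fixing the process removes most of the variance
s_recharge = 1 - var_given_fixed(0) / total_var
s_param = 1 - var_given_fixed(1) / total_var
print(s_recharge, s_param)  # recharge conceptualization dominates in this toy case
```

    In the full method, each entry would itself be a model-averaged quantity over parameter uncertainty rather than a single number.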

  6. Applying Constructivist and Objectivist Learning Theories in the Design of a Web-based Course: Implications for Practice.

    ERIC Educational Resources Information Center

    Moallem, Mahnaz

    2001-01-01

    Provides an overview of the process of designing and developing a Web-based course using instructional design principles and models, including constructivist and objectivist theories. Explains the process of implementing an instructional design model in designing a Web-based undergraduate course and evaluates the model based on course evaluations.…

  7. A Conceptual Framework Curriculum Evaluation Electrical Engineering Education

    ERIC Educational Resources Information Center

    Imansari, Nurulita; Sutadji, Eddy

    2017-01-01

    This evaluation is a conceptual framework that has been analyzed in the hope that can help research related an evaluation of the curriculum. The Model of evaluation used was CIPPO model. CIPPO Model consists of "context," "input," "process," "product," and "outcomes." On the dimension of the…

  8. An automated process for building reliable and optimal in vitro/in vivo correlation models based on Monte Carlo simulations.

    PubMed

    Sutton, Steven C; Hu, Mingxiu

    2006-05-05

    Many mathematical models have been proposed for establishing an in vitro/in vivo correlation (IVIVC). The traditional IVIVC model building process consists of 5 steps: deconvolution, model fitting, convolution, prediction error evaluation, and cross-validation. This is a time-consuming process, and typically only a few models are tested for any given data set. The objectives of this work were to (1) propose a statistical tool to screen models for further development of an IVIVC, (2) evaluate the performance of each model under different circumstances, and (3) investigate the effectiveness of common statistical model selection criteria for choosing IVIVC models. A computer program was developed to explore which model(s) would be most likely to work well with a random variation from the original formulation. The process used Monte Carlo simulation techniques to build IVIVC models. Data-based model selection criteria (Akaike Information Criterion [AIC], R2) and the probability of passing the Food and Drug Administration "prediction error" requirement were calculated. Several real data sets representing a broad range of release profiles are used to illustrate the process and demonstrate the advantages of this automated approach over the traditional one. The Hixson-Crowell and Weibull models were often preferred over the linear model. When evaluating whether a Level A IVIVC model was possible, the AIC model selection criterion generally selected the best model. We believe that the approach we proposed may be a rapid tool to determine which IVIVC model (if any) is the most applicable.
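    The data-based screening step can be sketched by fitting two candidate dissolution models and comparing them with AIC, here in the common least-squares form n·ln(RSS/n) + 2k. The dissolution data, the grid-search fitting, and the candidate pair (linear vs. Weibull) below are illustrative assumptions, not the paper's procedure.

```python
# Sketch of AIC-based screening of candidate dissolution models for an IVIVC.
# Data are synthetic; the Weibull fit uses a crude grid search for simplicity.
import math
import numpy as np

t = np.array([0.5, 1, 2, 4, 8, 12, 24])            # time, hours
f = np.array([0.18, 0.32, 0.52, 0.74, 0.90, 0.95, 0.99])  # fraction dissolved

def rss(pred):
    return float(((pred - f) ** 2).sum())

def aic(rss_val, k, n):
    """Least-squares AIC: n*ln(RSS/n) + 2k."""
    return n * math.log(rss_val / n) + 2 * k

n = len(t)

# Candidate 1: linear through the origin, f = a*t (1 parameter)
a = float((t * f).sum() / (t * t).sum())
aic_linear = aic(rss(a * t), 1, n)

# Candidate 2: Weibull, f = 1 - exp(-(t/td)^b) (2 parameters, grid search)
best = min(((rss(1 - np.exp(-(t / td) ** b)), td, b)
            for td in np.linspace(0.5, 6, 56)
            for b in np.linspace(0.3, 2, 35)), key=lambda x: x[0])
aic_weibull = aic(best[0], 2, n)
print(aic_linear, aic_weibull)  # lower AIC wins
```

    Wrapped in a Monte Carlo loop over perturbed release profiles, this comparison yields the kind of screening probability the automated process computes.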

  9. A combined disease management and process modeling approach for assessing and improving care processes: a fall management case-study.

    PubMed

    Askari, Marjan; Westerhof, Richard; Eslami, Saied; Medlock, Stephanie; de Rooij, Sophia E; Abu-Hanna, Ameen

    2013-10-01

    To propose a combined disease management and process modeling approach for evaluating and improving care processes, and demonstrate its usability and usefulness in a real-world fall management case study. We identified essential disease management related concepts and mapped them into explicit questions meant to expose areas for improvement in the respective care processes. We applied the disease management oriented questions to a process model of a comprehensive real world fall prevention and treatment program covering primary and secondary care. We relied on interviews and observations to complete the process models, which were captured in UML activity diagrams. We conducted a preliminary evaluation of the usability of our approach by gauging the experience of the modeler and an external validator, and we evaluated the usefulness of the method by gathering feedback from stakeholders at an invitational conference of 75 attendees. The process model of the fall management program was organized around the clinical tasks of case finding, risk profiling, decision making, coordination and interventions. Applying the disease management questions to the process models exposed weaknesses in the process including: absence of program ownership, under-detection of falls in primary care, and lack of efficient communication among stakeholders due to missing awareness about other stakeholders' workflow. The modelers experienced the approach as usable and the attendees of the invitational conference found the analysis results to be valid. The proposed disease management view of process modeling was usable and useful for systematically identifying areas of improvement in a fall management program. Although specifically applied to fall management, we believe our case study is characteristic of various disease management settings, suggesting the wider applicability of the approach. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  10. Green Pea and Garlic Puree Model Food Development for Thermal Pasteurization Process Quality Evaluation.

    PubMed

    Bornhorst, Ellen R; Tang, Juming; Sablani, Shyam S; Barbosa-Cánovas, Gustavo V; Liu, Fang

    2017-07-01

    Development and selection of model foods is a critical part of microwave thermal process development, simulation validation, and optimization. Previously developed model foods for pasteurization process evaluation utilized Maillard reaction products as the time-temperature integrators, which resulted in similar temperature sensitivity among the models. The aim of this research was to develop additional model foods based on different time-temperature integrators, determine their dielectric properties and color change kinetics, and validate the optimal model food in hot water and microwave-assisted pasteurization processes. Color, quantified using the a * value, was selected as the time-temperature indicator for the green pea and garlic puree model foods. Results showed 915 MHz microwaves had a greater penetration depth into the green pea model food than the garlic. a * value reaction rates for the green pea model were approximately 4 times slower than in the garlic model food; slower reaction rates were preferred for the application of model food in this study, that is, quality evaluation for a target process of 90 °C for 10 min at the cold spot. Pasteurization validation used the green pea model food, and results showed quantifiable differences in color between the unheated control, hot water pasteurization, and the microwave-assisted thermal pasteurization system. Both model foods developed in this research could be utilized for quality assessment and optimization of various thermal pasteurization processes. © 2017 Institute of Food Technologists®.
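    The way a color-based time-temperature integrator works can be sketched as first-order change of the a* value with a temperature-dependent (Arrhenius) rate, integrated over a process temperature profile. The rate constant, activation energy, and a* endpoints below are invented, not the paper's fitted kinetics.

```python
# Sketch of a color time-temperature integrator: first-order relaxation of the
# a* value toward an asymptote, with an Arrhenius rate. Parameters hypothetical.
import math

k_ref = 0.05      # rate at the reference temperature, 1/min (hypothetical)
T_ref = 363.15    # 90 degrees C, in kelvin
Ea = 90e3         # activation energy, J/mol (hypothetical)
R = 8.314         # gas constant, J/(mol K)

def k(T_celsius):
    """Arrhenius rate constant at the given temperature."""
    T = T_celsius + 273.15
    return k_ref * math.exp(-Ea / R * (1 / T - 1 / T_ref))

def a_star_after(profile, a0=-10.0, a_inf=2.0, dt=1.0):
    """Integrate first-order a* change over a minute-by-minute temperature profile."""
    a = a0
    for T in profile:
        a += (a_inf - a) * (1 - math.exp(-k(T) * dt))  # exact per constant-T step
    return a

mild = a_star_after([85.0] * 10)     # 10 min at 85 degrees C
target = a_star_after([90.0] * 10)   # 10 min at 90 degrees C, the target process
print(mild, target)  # the more severe process drives a* further toward a_inf
```

    Comparing the measured a* of a processed sample against such a curve is what lets color substitute for a direct time-temperature record at the cold spot.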

  11. Modeling biogeochemical reactive transport in a fracture zone

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Molinero, Jorge; Samper, Javier; Yang, Chan Bing; Zhang, Guoxiang

    2005-01-14

    A coupled model of groundwater flow, reactive solute transport and microbial processes for a fracture zone of the Aspo site in Sweden is presented. This is the model of the so-called Redox Zone Experiment, aimed at evaluating the effects of tunnel construction on the geochemical conditions prevailing in a fractured granite. It is found that a model accounting for microbially mediated geochemical processes is able to reproduce the unexpected measured increasing trends of dissolved sulfate and bicarbonate. The model is also useful for testing hypotheses regarding the role of microbial processes and evaluating the sensitivity of model results to changes in biochemical parameters.

  12. A framework for human-hydrologic system model development integrating hydrology and water management: application to the Cutzamala water system in Mexico

    NASA Astrophysics Data System (ADS)

    Wi, S.; Freeman, S.; Brown, C.

    2017-12-01

    This study presents a general approach to developing computational models of human-hydrologic systems where human modification of hydrologic surface processes is significant or dominant. A river basin system is represented by a network of human-hydrologic response units (HHRUs) identified based on locations where river regulation occurs (e.g., reservoir operation and diversions). Natural and human processes in HHRUs are simulated in a holistic framework that integrates component models representing rainfall-runoff, river routing, reservoir operation, flow diversion, and water use processes. We illustrate the approach in a case study of the Cutzamala water system (CWS) in Mexico, a complex inter-basin water transfer system supplying the Mexico City Metropolitan Area (MCMA). The human-hydrologic system model for the CWS (CUTZSIM) is evaluated against streamflow and reservoir storage measured across the CWS and against water supplied to the MCMA. The CUTZSIM improves the representation of hydrology and river-operation interaction and, in so doing, advances evaluation of system-wide water management consequences under altered climatic and demand regimes. The integrated modeling framework enables evaluation and simulation of model errors throughout the river basin, including errors in representation of the human component processes. Heretofore, model error evaluation, predictive error intervals, and the resultant improved understanding have been limited to hydrologic processes. The general framework represents an initial step towards fuller understanding and prediction of the many and varied processes that determine the hydrologic fluxes and state variables in real river basins.
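
    The HHRU idea can be sketched as a chain of units, each applying its own rule to the flow passing through it. The operating rules and numbers below are invented for illustration and are not the CUTZSIM formulation:

```python
# Toy HHRU chain: runoff passes through a reservoir unit (target release,
# spill above capacity) and then a diversion unit (water-supply demand).
# All rules and values are assumed for illustration.

def reservoir(inflow, storage, capacity=100.0, target_release=8.0):
    storage += inflow
    release = min(target_release, storage)
    storage -= release
    spill = max(0.0, storage - capacity)
    storage -= spill
    return release + spill, storage

def diversion(inflow, demand=3.0):
    diverted = min(demand, inflow)
    return inflow - diverted, diverted

storage = 50.0
runoff = [10.0, 2.0, 15.0, 0.0, 7.0]  # upstream runoff per time step
outflows, supplied = [], []
for q in runoff:
    q, storage = reservoir(q, storage)  # HHRU 1: reservoir operation
    q, d = diversion(q)                 # HHRU 2: water-supply diversion
    outflows.append(q)
    supplied.append(d)

print(outflows, supplied, storage)
```

    Because each unit exposes its own state and fluxes, simulated and measured values can be compared at every HHRU boundary, which is the point the abstract makes about evaluating model errors throughout the basin.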

  13. Evaluating energy saving system of data centers based on AHP and fuzzy comprehensive evaluation model

    NASA Astrophysics Data System (ADS)

    Jiang, Yingni

    2018-03-01

    Due to the high energy consumption of communication, energy saving in data centers must be enforced. However, the lack of evaluation mechanisms has restrained progress on the energy-saving construction of data centers. In this paper, an energy saving evaluation index system for data centers was constructed on the basis of clarifying the influence factors. Based on this index system, the analytic hierarchy process (AHP) was used to determine the weights of the evaluation indexes. Subsequently, a three-grade fuzzy comprehensive evaluation model was constructed to evaluate the energy saving system of data centers.
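
    A minimal sketch of the two-step procedure the abstract describes, using an invented three-index example: AHP weights are taken from the principal eigenvector of a pairwise comparison matrix, and the fuzzy comprehensive evaluation is the weighted combination B = w · R of index-level grade memberships:

```python
import numpy as np

# Pairwise comparison matrix for three hypothetical indexes; the
# values are illustrative, not from the paper.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
])

# AHP weights: principal eigenvector, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
w = np.abs(eigvecs[:, np.argmax(eigvals.real)].real)
w /= w.sum()

# Fuzzy evaluation matrix R: membership of each index in the grades
# (good, fair, poor), e.g. from expert scoring; values assumed.
R = np.array([
    [0.6, 0.3, 0.1],
    [0.4, 0.4, 0.2],
    [0.2, 0.5, 0.3],
])

# Comprehensive evaluation vector (weighted-average operator).
B = w @ R
print(B)           # membership in each grade
print(B.argmax())  # index of the winning grade
```

    In a three-grade model as in the paper, the grade with the largest membership in B is reported as the overall energy-saving rating.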

  14. A multi-site cognitive task analysis for biomedical query mediation.

    PubMed

    Hruby, Gregory W; Rasmussen, Luke V; Hanauer, David; Patel, Vimla L; Cimino, James J; Weng, Chunhua

    2016-09-01

    To apply cognitive task analyses of the Biomedical query mediation (BQM) processes for EHR data retrieval at multiple sites towards the development of a generic BQM process model. We conducted semi-structured interviews with eleven data analysts from five academic institutions and one government agency, and performed cognitive task analyses on their BQM processes. A coding schema was developed through iterative refinement and used to annotate the interview transcripts. The annotated dataset was used to reconstruct and verify each BQM process and to develop a harmonized BQM process model. A survey was conducted to evaluate the face and content validity of this harmonized model. The harmonized process model is hierarchical, encompassing tasks, activities, and steps. The face validity evaluation concluded the model to be representative of the BQM process. In the content validity evaluation, out of the 27 tasks for BQM, 19 meet the threshold for semi-valid, including 3 fully valid: "Identify potential index phenotype," "If needed, request EHR database access rights," and "Perform query and present output to medical researcher", and 8 are invalid. We aligned the goals of the tasks within the BQM model with the five components of the reference interview. The similarity between the process of BQM and the reference interview is promising and suggests the BQM tasks are powerful for eliciting implicit information needs. We contribute a BQM process model based on a multi-site study. This model promises to inform the standardization of the BQM process towards improved communication efficiency and accuracy. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  15. A Multi-Site Cognitive Task Analysis for Biomedical Query Mediation

    PubMed Central

    Hruby, Gregory W.; Rasmussen, Luke V.; Hanauer, David; Patel, Vimla; Cimino, James J.; Weng, Chunhua

    2016-01-01

    Objective To apply cognitive task analyses of the Biomedical query mediation (BQM) processes for EHR data retrieval at multiple sites towards the development of a generic BQM process model. Materials and Methods We conducted semi-structured interviews with eleven data analysts from five academic institutions and one government agency, and performed cognitive task analyses on their BQM processes. A coding schema was developed through iterative refinement and used to annotate the interview transcripts. The annotated dataset was used to reconstruct and verify each BQM process and to develop a harmonized BQM process model. A survey was conducted to evaluate the face and content validity of this harmonized model. Results The harmonized process model is hierarchical, encompassing tasks, activities, and steps. The face validity evaluation concluded the model to be representative of the BQM process. In the content validity evaluation, out of the 27 tasks for BQM, 19 meet the threshold for semi-valid, including 3 fully valid: “Identify potential index phenotype,” “If needed, request EHR database access rights,” and “Perform query and present output to medical researcher”, and 8 are invalid. Discussion We aligned the goals of the tasks within the BQM model with the five components of the reference interview. The similarity between the process of BQM and the reference interview is promising and suggests the BQM tasks are powerful for eliciting implicit information needs. Conclusions We contribute a BQM process model based on a multi-site study. This model promises to inform the standardization of the BQM process towards improved communication efficiency and accuracy. PMID:27435950

  16. A dual-process perspective on fluency-based aesthetics: the pleasure-interest model of aesthetic liking.

    PubMed

    Graf, Laura K M; Landwehr, Jan R

    2015-11-01

    In this article, we develop an account of how aesthetic preferences can be formed as a result of two hierarchical, fluency-based processes. Our model suggests that processing performed immediately upon encountering an aesthetic object is stimulus driven, and aesthetic preferences that accrue from this processing reflect aesthetic evaluations of pleasure or displeasure. When sufficient processing motivation is provided by a perceiver's need for cognitive enrichment and/or the stimulus' processing affordance, elaborate perceiver-driven processing can emerge, which gives rise to fluency-based aesthetic evaluations of interest, boredom, or confusion. Because the positive outcomes in our model are pleasure and interest, we call it the Pleasure-Interest Model of Aesthetic Liking (PIA Model). Theoretically, this model integrates a dual-process perspective and ideas from lay epistemology into processing fluency theory, and it provides a parsimonious framework to embed and unite a wealth of aesthetic phenomena, including contradictory preference patterns for easy versus difficult-to-process aesthetic stimuli. © 2015 by the Society for Personality and Social Psychology, Inc.

  17. Comprehensive system models: Strategies for evaluation

    NASA Technical Reports Server (NTRS)

    Field, Christopher; Kutzbach, John E.; Ramanathan, V.; Maccracken, Michael C.

    1992-01-01

    The task of evaluating comprehensive earth system models is vast, involving validation of every model component at every scale of organization, as well as tests of all the individual linkages. Even the most detailed evaluation of each of the component processes and the individual links among them should not, however, engender confidence in the performance of the whole. The integrated earth system is so rich with complex feedback loops, often involving components of the atmosphere, oceans, biosphere, and cryosphere, that it is certain to exhibit emergent properties that are very difficult to predict from the perspective of a narrow focus on any individual component of the system. Therefore, a substantial share of the task of evaluating comprehensive earth system models must reside at the level of whole-system evaluations. Since complete, integrated atmosphere/ocean/biosphere/hydrology models are not yet operational, questions of evaluation must be addressed at the level of the kinds of earth system processes that the models should be competent to simulate, rather than at the level of specific performance criteria. Here, we have tried to identify examples of earth system processes that are difficult to simulate with existing models and that involve a rich enough suite of feedbacks that they are unlikely to be satisfactorily described by highly simplified or toy models. Our purpose is not to specify a checklist of evaluation criteria but to introduce characteristics of the earth system that may present useful opportunities for model testing and, of course, improvement.

  18. Development of NASA's Sample Cartridge Assembly: Summary of GEDS Design, Development Testing, and Thermal Analyses

    NASA Technical Reports Server (NTRS)

    O'Connor, Brian; Hernandez, Deborah; Hornsby, Linda; Brown, Maria; Horton-Mullins, Kathryn

    2017-01-01

    Outline: Background of ISS (International Space Station) Material Science Research Rack; NASA SCA (Sample Cartridge Assembly) Design; GEDS (Gravitational Effects in Distortion in Sintering) Experiment Ampoule Design; Development Testing Summary; Thermal Modeling and Analysis. Summary: GEDS design development challenging (GEDS Ampoule design developed through MUGS (Microgravity) testing; Short duration transient sample processing; Unable to measure sample temperatures); MUGS Development testing used to gather data (Actual LGF (Low Gradient Furnace)-like furnace response; Provided sample for sintering evaluation); Transient thermal model integral to successful GEDS experiment (Development testing provided furnace response; PI (Performance Indicator) evaluation of sintering anchored model evaluation of processing durations; Thermal transient model used to determine flight SCA sample processing profiles).

  19. Evaluating the Risks: A Bernoulli Process Model of HIV Infection and Risk Reduction.

    ERIC Educational Resources Information Center

    Pinkerton, Steven D.; Abramson, Paul R.

    1993-01-01

    A Bernoulli process model of human immunodeficiency virus (HIV) infection is used to evaluate the infection risks associated with various sexual behaviors (condom use, abstinence, or monogamy). Results suggest that infection risk is best mitigated through measures that decrease infectivity, such as condom use. (SLD)
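
    In a Bernoulli process model, each exposure is an independent trial with a fixed per-act transmission probability, so cumulative risk over n exposures is 1 - (1 - alpha)^n. The numbers below are illustrative assumptions, not clinical estimates:

```python
# Cumulative infection risk over n independent exposures with per-act
# transmission probability alpha (Bernoulli process). Condom use is
# modeled as scaling alpha by (1 - e), where e is condom efficacy.

def cumulative_risk(alpha: float, n: int) -> float:
    return 1.0 - (1.0 - alpha) ** n

alpha = 0.001     # hypothetical per-act infectivity
n = 500           # number of exposures
efficacy = 0.9    # hypothetical condom efficacy

risk_no_condom = cumulative_risk(alpha, n)
risk_condom = cumulative_risk(alpha * (1 - efficacy), n)
print(risk_no_condom, risk_condom)
```

    The sketch shows the abstract's point: reducing the per-act infectivity alpha compounds over repeated exposures, so it cuts cumulative risk far more than the single-act numbers suggest.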

  20. A participatory evaluation model for Healthier Communities: developing indicators for New Mexico.

    PubMed Central

    Wallerstein, N

    2000-01-01

    Participatory evaluation models that invite community coalitions to take an active role in developing evaluations of their programs are a natural fit with Healthy Communities initiatives. The author describes the development of a participatory evaluation model for New Mexico's Healthier Communities program. She describes evaluation principles, research questions, and baseline findings. The evaluation model shows the links between process, community-level system impacts, and population health changes. PMID:10968754

  1. Disability Policy Evaluation: Combining Logic Models and Systems Thinking.

    PubMed

    Claes, Claudia; Ferket, Neelke; Vandevelde, Stijn; Verlet, Dries; De Maeyer, Jessica

    2017-07-01

    Policy evaluation focuses on the assessment of policy-related personal, family, and societal changes or benefits that follow as a result of the interventions, services, and supports provided to those persons to whom the policy is directed. This article describes a systematic approach to policy evaluation based on an evaluation framework and an evaluation process that combine the use of logic models and systems thinking. The article also includes an example of how the framework and process have recently been used in policy development and evaluation in Flanders (Belgium), as well as four policy evaluation guidelines based on relevant published literature.

  2. The Use of AMET & Automated Scripts for Model Evaluation

    EPA Science Inventory

    Brief overview of EPA’s new CMAQ website, to be launched publicly in June 2017. Details on the upcoming release of the Atmospheric Model Evaluation Tool (AMET) and the creation of automated scripts for post-processing and evaluating air quality model data.

  3. Development and Implementation of a Telecommuting Evaluation Framework, and Modeling the Executive Telecommuting Adoption Process

    NASA Astrophysics Data System (ADS)

    Vora, V. P.; Mahmassani, H. S.

    2002-02-01

    This work proposes and implements a comprehensive evaluation framework to document the telecommuter, organizational, and societal impacts of telecommuting through telecommuting programs. Evaluation processes and materials within the outlined framework are also proposed and implemented. As the first component of the evaluation process, the executive survey is administered within a public sector agency. The survey data is examined through exploratory analysis and is compared to a previous survey of private sector executives. The ordinal probit, dynamic probit, and dynamic generalized ordinal probit (DGOP) models of telecommuting adoption are calibrated to identify factors which significantly influence executive adoption preferences and to test the robustness of such factors. The public sector DGOP model of executive willingness to support telecommuting under different program scenarios is compared with an equivalent private sector DGOP model. Through the telecommuting program, a case study of telecommuting travel impacts is performed to further substantiate research.

  4. Evaluation of massless-spring modeling of suspension-line elasticity during the parachute unfurling process

    NASA Technical Reports Server (NTRS)

    Poole, L. R.; Huckins, E. K., III

    1972-01-01

    A general theory on mathematical modeling of elastic parachute suspension lines during the unfurling process was developed. Massless-spring modeling of suspension-line elasticity was evaluated in detail. For this simple model, equations which govern the motion were developed and numerically integrated. The results were compared with flight test data. In most regions, agreement was satisfactory. However, poor agreement was obtained during periods of rapid fluctuations in line tension.

  5. Institution Building and Evaluation.

    ERIC Educational Resources Information Center

    Wedemeyer, Charles A.

    Institutional modeling and program evaluation in relation to a correspondence program are discussed. The evaluation process is first considered from the viewpoint that it is an add-on activity, which is largely summative, and is the least desirable type of evaluation. Formative evaluation is next considered as a part of the process of institution…

  6. Maximizing the Impact of Program Evaluation: A Discrepancy-Based Process for Educational Program Evaluation.

    ERIC Educational Resources Information Center

    Cantor, Jeffrey A.

    This paper describes a formative/summative process for educational program evaluation, which is appropriate for higher education programs and is based on M. Provus' Discrepancy Evaluation Model and the principles of instructional design. The Discrepancy Based Methodology for Educational Program Evaluation facilitates systematic and detailed…

  7. Development of an estimation model for the evaluation of the energy requirement of dilute acid pretreatments of biomass.

    PubMed

    Mafe, Oluwakemi A T; Davies, Scott M; Hancock, John; Du, Chenyu

    2015-01-01

    This study aims to develop a mathematical model to evaluate the energy required by pretreatment processes used in the production of second-generation ethanol. A dilute acid pretreatment process reported by the National Renewable Energy Laboratory (NREL) was selected as an example for the model's development. The energy demand of the pretreatment process was evaluated by considering the change in internal energy of the substances, the reaction energy, the heat lost, and the work done to/by the system, based on a number of simplifying assumptions. Sensitivity analyses were performed on the solids loading rate, temperature, acid concentration, and water evaporation rate. The results from the sensitivity analyses established that the solids loading rate had the most significant impact on the energy demand. The model was then verified with data from the NREL benchmark process. Application of this model to other dilute acid pretreatment processes reported in the literature illustrated that although similar sugar yields were reported by several studies, the energy required by the different pretreatments varied significantly.
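
    The model itself is not reproduced in the abstract; as a rough sketch of why solids loading dominates the energy demand, consider only the sensible heat needed to bring a slurry to pretreatment temperature. Property values and temperatures below are assumptions for illustration, not NREL benchmark figures:

```python
# Sensible heat to bring a biomass slurry from t0 to the pretreatment
# temperature, expressed per kg of dry biomass. At low solids loading,
# far more water must be heated per kg of solids.

CP_WATER = 4.18     # kJ/(kg.K)
CP_BIOMASS = 1.5    # kJ/(kg.K), assumed

def heating_energy_per_kg_biomass(solids_loading, t0=25.0, t1=160.0):
    """solids_loading: mass fraction of dry solids in the slurry."""
    water_per_kg = (1.0 - solids_loading) / solids_loading  # kg water / kg solids
    dT = t1 - t0
    return (CP_BIOMASS + water_per_kg * CP_WATER) * dT      # kJ per kg dry biomass

for loading in (0.1, 0.2, 0.3):
    print(loading, round(heating_energy_per_kg_biomass(loading)))
```

    Doubling the solids loading roughly halves the water heated per kg of biomass, which is consistent with the sensitivity result that solids loading has the largest impact on energy demand.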

  8. Evaluation of the energy efficiency of enzyme fermentation by mechanistic modeling.

    PubMed

    Albaek, Mads O; Gernaey, Krist V; Hansen, Morten S; Stocks, Stuart M

    2012-04-01

    Modeling biotechnological processes is key to obtaining increased productivity and efficiency. Particularly crucial to successful modeling of such systems is the coupling of the physical transport phenomena and the biological activity in one model. We have applied a model for the expression of cellulosic enzymes by the filamentous fungus Trichoderma reesei and found excellent agreement with experimental data. The most influential factor was demonstrated to be viscosity and its influence on mass transfer. Not surprisingly, the biological model is also shown to have high influence on the model prediction. At different rates of agitation and aeration as well as headspace pressure, we can predict the energy efficiency of oxygen transfer, a key process parameter for economical production of industrial enzymes. An inverse relationship between the productivity and energy efficiency of the process was found. This modeling approach can be used by manufacturers to evaluate the enzyme fermentation process for a range of different process conditions with regard to energy efficiency. Copyright © 2011 Wiley Periodicals, Inc.

  9. A Methodology for Evaluating Artifacts Produced by a Formal Verification Process

    NASA Technical Reports Server (NTRS)

    Siminiceanu, Radu I.; Miner, Paul S.; Person, Suzette

    2011-01-01

    The goal of this study is to produce a methodology for evaluating the claims and arguments employed in, and the evidence produced by, formal verification activities. To illustrate the process, we conduct a full assessment of a representative case study for the Enabling Technology Development and Demonstration (ETDD) program. We assess the model checking and satisfiability solving techniques as applied to a suite of abstract models of fault tolerant algorithms which were selected to be deployed in Orion, namely the TTEthernet startup services specified and verified in the Symbolic Analysis Laboratory (SAL) by TTTech. To this end, we introduce the Modeling and Verification Evaluation Score (MVES), a metric that is intended to estimate the amount of trust that can be placed on the evidence that is obtained. The results of the evaluation process and the MVES can then be used by non-experts and evaluators in assessing the credibility of the verification results.

  10. Benchmark simulation Model no 2 in Matlab-simulink: towards plant-wide WWTP control strategy evaluation.

    PubMed

    Vreck, D; Gernaey, K V; Rosen, C; Jeppsson, U

    2006-01-01

    In this paper, the implementation of the Benchmark Simulation Model No 2 (BSM2) within Matlab-Simulink is presented. The BSM2 is developed for plant-wide WWTP control strategy evaluation on a long-term basis. It consists of a pre-treatment process, an activated sludge process, and sludge treatment processes. Extended evaluation criteria are proposed for plant-wide control strategy assessment. Default open-loop and closed-loop strategies are also proposed to be used as references with which to compare other control strategies. Simulations indicate that the BSM2 is an appropriate tool for plant-wide control strategy evaluation.

  11. Application of agent-based system for bioprocess description and process improvement.

    PubMed

    Gao, Ying; Kipling, Katie; Glassey, Jarka; Willis, Mark; Montague, Gary; Zhou, Yuhong; Titchener-Hooker, Nigel J

    2010-01-01

    Modeling plays an important role in bioprocess development for design and scale-up. Predictive models can also be used in biopharmaceutical manufacturing to assist decision-making either to maintain process consistency or to identify optimal operating conditions. To predict the whole bioprocess performance, the strong interactions present in a processing sequence must be adequately modeled. Traditionally, bioprocess modeling considers process units separately, which makes it difficult to capture the interactions between units. In this work, a systematic framework is developed to analyze the bioprocesses based on a whole process understanding and considering the interactions between process operations. An agent-based approach is adopted to provide a flexible infrastructure for the necessary integration of process models. This enables the prediction of overall process behavior, which can then be applied during process development or once manufacturing has commenced, in both cases leading to the capacity for fast evaluation of process improvement options. The multi-agent system comprises a process knowledge base, process models, and a group of functional agents. In this system, agent components co-operate with each other in performing their tasks. These include the description of the whole process behavior, evaluating process operating conditions, monitoring of the operating processes, predicting critical process performance, and providing guidance to decision-making when coping with process deviations. During process development, the system can be used to evaluate the design space for process operation. During manufacture, the system can be applied to identify abnormal process operation events and then to provide suggestions as to how best to cope with the deviations. In all cases, the function of the system is to ensure an efficient manufacturing process. 
The implementation of the agent-based approach is illustrated via selected application scenarios, which demonstrate how such a framework may enable the better integration of process operations by providing a plant-wide process description to facilitate process improvement. Copyright 2009 American Institute of Chemical Engineers

  12. A merged model of quality improvement and evaluation: maximizing return on investment.

    PubMed

    Woodhouse, Lynn D; Toal, Russ; Nguyen, Trang; Keene, DeAnna; Gunn, Laura; Kellum, Andrea; Nelson, Gary; Charles, Simone; Tedders, Stuart; Williams, Natalie; Livingood, William C

    2013-11-01

    Quality improvement (QI) and evaluation are frequently considered to be alternative approaches for monitoring and assessing program implementation and impact. The emphasis on third-party evaluation, particularly associated with summative evaluation, and the grounding of evaluation in the social and behavioral science contrast with an emphasis on the integration of QI process within programs or organizations and its origins in management science and industrial engineering. Working with a major philanthropic organization in Georgia, we illustrate how a QI model is integrated with evaluation for five asthma prevention and control sites serving poor and underserved communities in rural and urban Georgia. A primary foundation of this merged model of QI and evaluation is a refocusing of the evaluation from an intimidating report card summative evaluation by external evaluators to an internally engaged program focus on developmental evaluation. The benefits of the merged model to both QI and evaluation are discussed. The use of evaluation based logic models can help anchor a QI program in evidence-based practice and provide linkage between process and outputs with the longer term distal outcomes. Merging the QI approach with evaluation has major advantages, particularly related to enhancing the funder's return on investment. We illustrate how a Plan-Do-Study-Act model of QI can (a) be integrated with evaluation based logic models, (b) help refocus emphasis from summative to developmental evaluation, (c) enhance program ownership and engagement in evaluation activities, and (d) increase the role of evaluators in providing technical assistance and support.

  13. Development of a Nonlinear Soft-Sensor Using a GMDH Network for a Refinery Crude Distillation Tower

    NASA Astrophysics Data System (ADS)

    Fujii, Kenzo; Yamamoto, Toru

    In atmospheric distillation processes, stabilization of the process is required in order to optimize the crude-oil composition according to product market conditions. However, the process control systems sometimes fall into unstable states when unexpected disturbances are introduced, and these unusual phenomena have had an undesirable effect on certain products. Furthermore, a useful chemical engineering model has not yet been established for these phenomena, which remains a serious problem in the atmospheric distillation process. This paper describes a new modeling scheme to predict unusual phenomena in the atmospheric distillation process using the GMDH (Group Method of Data Handling) network, one type of network model. With the GMDH network, the model structure can be determined systematically; however, the least squares method has commonly been utilized in determining the weight coefficients (model parameters). Sufficient estimation accuracy cannot always be expected, because the sum of squared errors between the measured values and the estimates is evaluated. Therefore, instead of evaluating the sum of squared errors, the sum of the absolute values of the errors is introduced, and the Levenberg-Marquardt method is employed to determine the model parameters. The effectiveness of the proposed method is evaluated by foaming prediction in the crude oil switching operation in the atmospheric distillation process.
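
    The key modeling choice here, replacing squared error with absolute error to reduce sensitivity to unusual data points, can be sketched as follows. This is not the paper's GMDH implementation; it fits a simple line, and because SciPy's Levenberg-Marquardt solver supports only squared loss, the smooth `soft_l1` loss with the default `trf` solver stands in for the absolute-error objective:

```python
import numpy as np
from scipy.optimize import least_squares

# Toy data: a line y = 2x + 1 with small noise and one large outlier.
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 30)
y = 2.0 * x + 1.0 + rng.normal(0, 0.02, x.size)
y[5] += 5.0  # a single large outlier

def residuals(p):
    return p[0] * x + p[1] - y

fit_l2 = least_squares(residuals, x0=[0.0, 0.0])                  # squared error
fit_l1 = least_squares(residuals, x0=[0.0, 0.0], loss="soft_l1")  # ~absolute error

print(fit_l2.x, fit_l1.x)  # the robust fit stays near slope 2, intercept 1
```

    The squared-error fit is pulled toward the outlier, while the absolute-error-like fit recovers the underlying parameters, which is the motivation for switching objectives in the paper.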

  14. An interdisciplinary framework for participatory modeling design and evaluation—What makes models effective participatory decision tools?

    NASA Astrophysics Data System (ADS)

    Falconi, Stefanie M.; Palmer, Richard N.

    2017-02-01

    Increased requirements for public involvement in water resources management (WRM) over the past century have stimulated the development of more collaborative decision-making methods. Participatory modeling (PM) uses computer models to inform and engage stakeholders in the planning process in order to influence collaborative decisions in WRM. Past evaluations of participatory models focused on process and final outcomes, yet, were hindered by diversity of purpose and inconsistent documentation. This paper presents a two-stage framework for evaluating PM based on mechanisms for improving model effectiveness as participatory tools. The five dimensions characterize the "who, when, how, and why" of each participatory effort (stage 1). Models are evaluated as "boundary objects," a concept used to describe tools that bridge understanding and translate different bodies of knowledge to improve credibility, salience, and legitimacy (stage 2). This evaluation framework is applied to five existing case studies from the literature. Though the goals of participation can be diverse, the novel contribution of the two-stage proposed framework is the flexibility it has to evaluate a wide range of cases that differ in scope, modeling approach, and participatory context. Also, the evaluation criteria provide a structured vocabulary based on clear mechanisms that extend beyond previous process-based and outcome-based evaluations. Effective models are those that take advantage of mechanisms that facilitate dialogue and resolution and improve the accessibility and applicability of technical knowledge. Furthermore, the framework can help build more complete records and systematic documentation of evidence to help standardize the field of PM.

  15. Computational Evaluation of the Traceback Method

    ERIC Educational Resources Information Center

    Kol, Sheli; Nir, Bracha; Wintner, Shuly

    2014-01-01

    Several models of language acquisition have emerged in recent years that rely on computational algorithms for simulation and evaluation. Computational models are formal and precise, and can thus provide mathematically well-motivated insights into the process of language acquisition. Such models are amenable to robust computational evaluation,…

  16. Multi-Hypothesis Modelling Capabilities for Robust Data-Model Integration

    NASA Astrophysics Data System (ADS)

    Walker, A. P.; De Kauwe, M. G.; Lu, D.; Medlyn, B.; Norby, R. J.; Ricciuto, D. M.; Rogers, A.; Serbin, S.; Weston, D. J.; Ye, M.; Zaehle, S.

    2017-12-01

    Large uncertainty is often inherent in model predictions due to imperfect knowledge of how to describe the mechanistic processes (hypotheses) that a model is intended to represent. Yet this model hypothesis uncertainty (MHU) is often overlooked or informally evaluated, as methods to quantify and evaluate MHU are limited. MHU increases as models become more complex, because each additional process added to a model comes with inherent MHU as well as parametric uncertainty. With the current trend of adding more processes to Earth System Models (ESMs), we are adding uncertainty, which can be quantified for parameters but not for MHU. Model inter-comparison projects do allow for some consideration of hypothesis uncertainty, but in an ad hoc and non-independent fashion. This has stymied efforts to evaluate ecosystem models against data and interpret the results mechanistically, because it is not simple to determine exactly why a model produces the results it does, or to identify which model assumptions are key, since models combine many sub-systems and processes, each of which may be conceptualised and represented mathematically in various ways. We present a novel modelling framework, the multi-assumption architecture and testbed (MAAT), that automates the combination, generation, and execution of a model ensemble built with different representations of process. We will present the argument that multi-hypothesis modelling needs to be considered in conjunction with other capabilities (e.g. the Predictive Ecosystem Analyser; PecAn) and statistical methods (e.g. sensitivity analysis, data assimilation) to aid efforts in robust data-model integration to enhance our predictive understanding of biological systems.
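
    The ensemble-generation idea behind a multi-assumption architecture can be sketched in a few lines: enumerate every combination of alternative process representations and run each combination as one ensemble member. The process functions and parameter values here are invented for illustration and are not MAAT's actual API:

```python
from itertools import product

# Two alternative representations (hypotheses) for each of two processes.
def photosynthesis_a(par): return 0.05 * par                # linear, assumed
def photosynthesis_b(par): return 12 * par / (par + 200)    # saturating, assumed

def respiration_a(temp): return 0.5 * 2.0 ** ((temp - 25) / 10)  # Q10, assumed
def respiration_b(temp): return 0.1 + 0.02 * temp                # linear, assumed

hypotheses = {
    "photosynthesis": [photosynthesis_a, photosynthesis_b],
    "respiration": [respiration_a, respiration_b],
}

def run_member(photo, resp, par=800.0, temp=20.0):
    return photo(par) - resp(temp)  # net carbon flux for one ensemble member

# One member per combination of process representations.
ensemble = [run_member(p, r) for p, r in product(*hypotheses.values())]
print(len(ensemble), ensemble)
```

    The spread across the four members is a direct, if crude, measure of the hypothesis uncertainty the abstract argues is usually left unquantified.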

  17. Column Testing and 1D Reactive Transport Modeling to Evaluate Uranium Plume Persistence Processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Raymond H.; Morrison, Stan; Morris, Sarah

    Motivation for Study: Natural flushing of contaminants at various U.S. Department of Energy Office of Legacy Management sites is not proceeding as quickly as predicted (plume persistence). Objectives: Help determine natural flushing rates using column tests, and use 1D reactive transport modeling to better understand the major processes that are creating plume persistence. Approach: Core samples were taken from under a former mill tailings area (the tailings have been removed); column leaching was performed using lab-prepared water similar to nearby Gunnison River water; and 1D reactive transport modeling was used to evaluate the controlling processes.

  18. The Iterative Research Cycle: Process-Based Model Evaluation

    NASA Astrophysics Data System (ADS)

    Vrugt, J. A.

    2014-12-01

    The ever-increasing pace of computational power, along with continued advances in measurement technologies and improvements in process understanding, has stimulated the development of increasingly complex physics-based models that simulate a myriad of processes at different spatial and temporal scales. Reconciling these high-order system models with perpetually larger volumes of field data is becoming more and more difficult, particularly because classical likelihood-based fitting methods lack the power to detect and pinpoint deficiencies in the model structure. In this talk I will give an overview of our latest research on process-based model calibration and evaluation. This approach, rooted in Bayesian theory, uses summary metrics of the calibration data rather than the data themselves to help detect which component(s) of the model is (are) malfunctioning and in need of improvement. A few case studies involving hydrologic and geophysical models will be used to demonstrate the proposed methodology.
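The flavor of summary-metric (signature-based) diagnosis, as opposed to point-by-point likelihood fitting, can be illustrated with a toy example; the signatures, data, and tolerances below are hypothetical and far simpler than the Bayesian machinery described in the talk.

```python
# Diagnostic evaluation via summary metrics: instead of fitting every data
# point, compare hydrologic signatures and flag the one the model misses.
def summary_metrics(flows, rains):
    """Two toy signatures: runoff ratio (water balance) and peak flow."""
    return {"runoff_ratio": sum(flows) / sum(rains), "qmax": max(flows)}

def diagnose(observed, simulated, rains, tolerances):
    """Return, per signature, whether the model is within tolerance."""
    obs = summary_metrics(observed, rains)
    sim = summary_metrics(simulated, rains)
    return {k: abs(sim[k] - obs[k]) <= tolerances[k] for k in obs}

rain  = [10.0, 12.0, 8.0, 20.0, 5.0]   # hypothetical rainfall series
obs_q = [3.0, 4.0, 2.5, 7.0, 1.5]      # observed flows
sim_q = [3.2, 3.9, 2.4, 5.0, 1.6]      # simulated flows: misses the peak event

report = diagnose(obs_q, sim_q, rain, {"runoff_ratio": 0.05, "qmax": 0.5})
print(report)  # water balance acceptable, peak flow flagged
```

A failed `qmax` with an acceptable `runoff_ratio` points toward the storm-response component of the model rather than its overall water balance, which is exactly the kind of pinpointing a lumped likelihood cannot provide.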

  19. Two-structured solid particle model for predicting and analyzing supercritical extraction performance.

    PubMed

    Samadi, Sara; Vaziri, Behrooz Mahmoodzadeh

    2017-07-14

    Solid extraction using supercritical fluids is a modern technology that has come into vogue owing to its considerable advantages. In the present article, a new and comprehensive model is presented for predicting the performance and separation yield of the supercritical extraction process. The process model is based on partial differential mass balances. In the proposed model, the solid particles are treated as two populations: (a) particles with intact structure and (b) particles with destructed structure. A distinct mass transfer coefficient is used for extraction from each population, to express the different extraction regimes and to evaluate the process accurately (an internal mass transfer coefficient for the intact-structure particles and an external mass transfer coefficient for the destructed-structure particles). To evaluate and validate the proposed model, simulation results were compared with two series of available experimental data for the extraction of chamomile extract with supercritical carbon dioxide, showing excellent agreement, which indicates the model's strong capability for precise prediction of the extraction process. The effects of major process parameters, such as pressure, temperature, supercritical fluid flow rate, and solid particle size, on the supercritical extraction process were then evaluated. The model can serve as a strong starting point for scientific and practical applications. Copyright © 2017 Elsevier B.V. All rights reserved.
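A drastically simplified sketch of the two-population idea, with lumped first-order kinetics standing in for the paper's partial differential mass balances; the fractions and rate coefficients are hypothetical.

```python
# Two-population extraction: destructed-structure (surface) material leaves
# quickly via an external-film coefficient, intact material slowly via an
# internal-diffusion coefficient. All values are illustrative.
import math

def extraction_yield(t, broken_frac=0.4, k_fast=0.20, k_slow=0.02):
    """Total fraction of solute extracted at time t (two lumped populations)."""
    fast = broken_frac * (1.0 - math.exp(-k_fast * t))          # destructed structure
    slow = (1.0 - broken_frac) * (1.0 - math.exp(-k_slow * t))  # intact structure
    return fast + slow

# Early times are dominated by the destructed-structure particles,
# late times by slow diffusion out of the intact particles.
for t in (0, 10, 60, 240):
    print(t, round(extraction_yield(t), 3))
```

The characteristic two-regime extraction curve (fast initial rise, long slow tail) emerges directly from the two coefficients.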

  20. An investigative model evaluating how consumers process pictorial information on nonprescription medication labels.

    PubMed

    Sansgiry, S S; Cady, P S

    1997-01-01

    Currently marketed over-the-counter (OTC) medication labels were simulated and tested in a controlled environment to understand consumer evaluation of OTC label information. Two factors, consumers' age (younger and older adults) and label design (picture-only, verbal-only, congruent picture-verbal, and noncongruent picture-verbal), were controlled and tested to evaluate consumer information processing. The effects of comprehension of label information (understanding) and of product evaluations (satisfaction, certainty, and perceived confusion) on the dependent variable, purchase intention, were evaluated. Intention, measured as purchase recommendation, was significantly related to product evaluations and affected by label design. Participants' level of perceived confusion was more important than their actual understanding of the information on OTC medication labels. A Label Evaluation Process Model was developed that could be used for future testing of OTC medication labels.

  1. Equivalent model construction for a non-linear dynamic system based on an element-wise stiffness evaluation procedure and reduced analysis of the equivalent system

    NASA Astrophysics Data System (ADS)

    Kim, Euiyoung; Cho, Maenghyo

    2017-11-01

    In most non-linear analyses, the construction of a system matrix uses a large amount of computation time, comparable to the computation time required by the solving process. If the process for computing non-linear internal force matrices is substituted with an effective equivalent model that enables the bypass of numerical integrations and assembly processes used in matrix construction, efficiency can be greatly enhanced. A stiffness evaluation procedure (STEP) establishes non-linear internal force models using polynomial formulations of displacements. To efficiently identify an equivalent model, the method has evolved such that it is based on a reduced-order system. The reduction process, however, makes the equivalent model difficult to parameterize, which significantly affects the efficiency of the optimization process. In this paper, therefore, a new STEP, E-STEP, is proposed. Based on the element-wise nature of the finite element model, the stiffness evaluation is carried out element-by-element in the full domain. Since the unit of computation for the stiffness evaluation is restricted by element size, and since the computation is independent, the equivalent model can be constructed efficiently in parallel, even in the full domain. Due to the element-wise nature of the construction procedure, the equivalent E-STEP model is easily characterized by design parameters. Various reduced-order modeling techniques can be applied to the equivalent system in a manner similar to how they are applied in the original system. The reduced-order model based on E-STEP is successfully demonstrated for the dynamic analyses of non-linear structural finite element systems under varying design parameters.
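As a toy illustration of the stiffness-evaluation idea, the polynomial coefficients of a one-element non-linear internal force model can be identified exactly from a few prescribed displacement states. This is a minimal sketch under assumed values, not the paper's E-STEP implementation.

```python
# Identify a cubic internal-force model f(u) = k1*u + k2*u^2 + k3*u^3 for a
# single element from three prescribed displacement states, the core idea
# behind a stiffness evaluation procedure (STEP). Values are hypothetical.
def solve3(A, b):
    """Solve a 3x3 linear system by Gaussian elimination with partial pivoting."""
    n = 3
    M = [row[:] + [rhs] for row, rhs in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in reversed(range(n)):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def true_force(u):
    """'Black-box' element response: k1=2.0, k2=-0.5, k3=0.1."""
    return 2.0 * u - 0.5 * u ** 2 + 0.1 * u ** 3

# Prescribe three displacement states and record the internal forces.
states = [0.5, 1.0, 2.0]
A = [[u, u ** 2, u ** 3] for u in states]
b = [true_force(u) for u in states]
k1, k2, k3 = solve3(A, b)
print(round(k1, 6), round(k2, 6), round(k3, 6))  # recovers 2.0, -0.5, 0.1
```

Because each element's identification problem is this small and independent, the element-by-element scheme parallelizes naturally, which is the efficiency argument made above.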

  2. Development of a program logic model and evaluation plan for a participatory ergonomics intervention in construction.

    PubMed

    Jaegers, Lisa; Dale, Ann Marie; Weaver, Nancy; Buchholz, Bryan; Welch, Laura; Evanoff, Bradley

    2014-03-01

    Intervention studies in participatory ergonomics (PE) are often difficult to interpret due to limited descriptions of program planning and evaluation. In an ongoing PE program with floor layers, we developed a logic model to describe our program plan, and process and summative evaluations designed to describe the efficacy of the program. The logic model was a useful tool for describing the program elements and subsequent modifications. The process evaluation measured how well the program was delivered as intended, and revealed the need for program modifications. The summative evaluation provided early measures of the efficacy of the program as delivered. Inadequate information on program delivery may lead to erroneous conclusions about intervention efficacy due to Type III error. A logic model guided the delivery and evaluation of our intervention and provides useful information to aid interpretation of results. © 2013 Wiley Periodicals, Inc.

  3. Development of a Program Logic Model and Evaluation Plan for a Participatory Ergonomics Intervention in Construction

    PubMed Central

    Jaegers, Lisa; Dale, Ann Marie; Weaver, Nancy; Buchholz, Bryan; Welch, Laura; Evanoff, Bradley

    2013-01-01

    Background Intervention studies in participatory ergonomics (PE) are often difficult to interpret due to limited descriptions of program planning and evaluation. Methods In an ongoing PE program with floor layers, we developed a logic model to describe our program plan, and process and summative evaluations designed to describe the efficacy of the program. Results The logic model was a useful tool for describing the program elements and subsequent modifications. The process evaluation measured how well the program was delivered as intended, and revealed the need for program modifications. The summative evaluation provided early measures of the efficacy of the program as delivered. Conclusions Inadequate information on program delivery may lead to erroneous conclusions about intervention efficacy due to Type III error. A logic model guided the delivery and evaluation of our intervention and provides useful information to aid interpretation of results. PMID:24006097

  4. How Do You Evaluate Everyone Who Isn't a Teacher?

    ERIC Educational Resources Information Center

    Tucker, Pamela D.; Stronge, James H.

    1994-01-01

    Most states mandate evaluation of all certified employees, but most school systems lack a prescribed evaluation process for counselors, nurses, librarians, media specialists, and school psychologists. The Professional Support Personnel Evaluation Model defines a prescriptive, yet flexible seven-step process based on identifying system needs and…

  5. ARM - Midlatitude Continental Convective Clouds

    DOE Data Explorer

    Jensen, Mike; Bartholomew, Mary Jane; Genio, Anthony Del; Giangrande, Scott; Kollias, Pavlos

    2012-01-19

    Convective processes play a critical role in the Earth's energy balance through the redistribution of heat and moisture in the atmosphere and their link to the hydrological cycle. Accurate representation of convective processes in numerical models is vital to improving current and future simulations of Earth's climate system. Despite improvements in computing power, current operational weather and global climate models are unable to resolve the natural temporal and spatial scales important to convective processes and therefore must turn to parameterization schemes to represent these processes. In turn, parameterization schemes in cloud-resolving models need to be evaluated for their generality and application to a variety of atmospheric conditions. Data from field campaigns with appropriate forcing descriptors have traditionally been used by modelers for evaluating and improving parameterization schemes.

  6. ARM - Midlatitude Continental Convective Clouds (comstock-hvps)

    DOE Data Explorer

    Jensen, Mike; Comstock, Jennifer; Genio, Anthony Del; Giangrande, Scott; Kollias, Pavlos

    2012-01-06

    Convective processes play a critical role in the Earth's energy balance through the redistribution of heat and moisture in the atmosphere and their link to the hydrological cycle. Accurate representation of convective processes in numerical models is vital to improving current and future simulations of Earth's climate system. Despite improvements in computing power, current operational weather and global climate models are unable to resolve the natural temporal and spatial scales important to convective processes and therefore must turn to parameterization schemes to represent these processes. In turn, parameterization schemes in cloud-resolving models need to be evaluated for their generality and application to a variety of atmospheric conditions. Data from field campaigns with appropriate forcing descriptors have traditionally been used by modelers for evaluating and improving parameterization schemes.

  7. Systematic evaluation of atmospheric chemistry-transport model CHIMERE

    NASA Astrophysics Data System (ADS)

    Khvorostyanov, Dmitry; Menut, Laurent; Mailler, Sylvain; Siour, Guillaume; Couvidat, Florian; Bessagnet, Bertrand; Turquety, Solene

    2017-04-01

    Regional-scale atmospheric chemistry-transport models (CTMs) are used to develop air quality regulatory measures, to support environmentally sensitive decisions in industry, and to address a variety of scientific questions involving the atmospheric composition. Evaluating model performance against measurement data is critical to understanding a model's limits and the degree of confidence in its results. The CHIMERE CTM (http://www.lmd.polytechnique.fr/chimere/) is a French national tool for operational forecasting and decision support and is widely used in the international research community in various areas of atmospheric chemistry and physics, climate, and environment (http://www.lmd.polytechnique.fr/chimere/CW-articles.php). This work presents the model evaluation framework applied systematically to new CHIMERE versions in the course of continuous model development. The framework uses three of the four CTM evaluation types identified by the Environmental Protection Agency (EPA) and the American Meteorological Society (AMS): operational, diagnostic, and dynamic. It makes it possible to compare overall model performance across subsequent model versions (operational evaluation), to identify specific processes and/or model inputs that could be improved (diagnostic evaluation), and to test the model's sensitivity to changes in air quality drivers, such as emission reductions and meteorological events (dynamic evaluation). The observation datasets currently used for the evaluation are EMEP (surface concentrations), AERONET (optical depths), and WOUDC (ozone sounding profiles). The framework is implemented as an automated processing chain and allows interactive exploration of the results via a web interface.
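Operational evaluation of this kind ultimately reduces to paired model-observation statistics such as bias, RMSE, and correlation; a minimal sketch with made-up data:

```python
# Illustrative operational-evaluation statistics for paired model/observation
# series, the kind an automated evaluation chain computes per station and
# species. The numbers are fabricated for demonstration.
import math

def operational_stats(model, obs):
    """Mean bias, RMSE, and Pearson correlation for paired series."""
    n = len(model)
    bias = sum(m - o for m, o in zip(model, obs)) / n
    rmse = math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / n)
    mm, mo = sum(model) / n, sum(obs) / n
    cov = sum((m - mm) * (o - mo) for m, o in zip(model, obs))
    sm = math.sqrt(sum((m - mm) ** 2 for m in model))
    so = math.sqrt(sum((o - mo) ** 2 for o in obs))
    return {"bias": bias, "rmse": rmse, "corr": cov / (sm * so)}

obs = [30.0, 42.0, 55.0, 38.0, 60.0]   # e.g. hourly surface ozone, ppb (made up)
sim = [28.0, 45.0, 50.0, 40.0, 57.0]

stats = operational_stats(sim, obs)
print(stats)
```

Tracking these few numbers across model versions is what makes a regression in any one component immediately visible.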

  8. Evaluating cloud processes in large-scale models: Of idealized case studies, parameterization testbeds and single-column modelling on climate time-scales

    NASA Astrophysics Data System (ADS)

    Neggers, Roel

    2016-04-01

    Boundary-layer schemes have always formed an integral part of General Circulation Models (GCMs) used for numerical weather and climate prediction. The spatial and temporal scales associated with boundary-layer processes and clouds are typically much smaller than those at which GCMs are discretized, which makes their representation through parameterization a necessity. The need for generally applicable boundary-layer parameterizations has motivated many scientific studies, which in effect has created its own active research field in the atmospheric sciences. Of particular interest has been the evaluation of boundary-layer schemes at "process-level". This means that parameterized physics are studied in isolation from the larger-scale circulation, using prescribed forcings and excluding any upscale interaction. Although feedbacks are thus prevented, the benefit is an enhanced model transparency, which might aid an investigator in identifying model errors and understanding model behavior. The popularity and success of the process-level approach is demonstrated by the many past and ongoing model inter-comparison studies that have been organized by initiatives such as GCSS/GASS. A common thread in the results of these studies is that although most schemes somehow manage to capture first-order aspects of boundary layer cloud fields, there certainly remains room for improvement in many areas. Only too often are boundary layer parameterizations still found to be at the heart of problems in large-scale models, negatively affecting forecast skills of NWP models or causing uncertainty in numerical predictions of future climate. How to break this parameterization "deadlock" remains an open problem. This presentation attempts to give an overview of the various existing methods for the process-level evaluation of boundary-layer physics in large-scale models. This includes i) idealized case studies, ii) longer-term evaluation at permanent meteorological sites (the testbed approach), and iii) process-level evaluation at climate time-scales. The advantages and disadvantages of each approach will be identified and discussed, and some thoughts about possible future developments will be given.

  9. HESS Opinions: The need for process-based evaluation of large-domain hyper-resolution models

    NASA Astrophysics Data System (ADS)

    Melsen, Lieke A.; Teuling, Adriaan J.; Torfs, Paul J. J. F.; Uijlenhoet, Remko; Mizukami, Naoki; Clark, Martyn P.

    2016-03-01

    A meta-analysis on 192 peer-reviewed articles reporting on applications of the variable infiltration capacity (VIC) model in a distributed way reveals that the spatial resolution at which the model is applied has increased over the years, while the calibration and validation time interval has remained unchanged. We argue that the calibration and validation time interval should keep pace with the increase in spatial resolution in order to resolve the processes that are relevant at the applied spatial resolution. We identified six time concepts in hydrological models, which all impact the model results and conclusions. Process-based model evaluation is particularly relevant when models are applied at hyper-resolution, where stakeholders expect credible results both at a high spatial and temporal resolution.

  10. HESS Opinions: The need for process-based evaluation of large-domain hyper-resolution models

    NASA Astrophysics Data System (ADS)

    Melsen, L. A.; Teuling, A. J.; Torfs, P. J. J. F.; Uijlenhoet, R.; Mizukami, N.; Clark, M. P.

    2015-12-01

    A meta-analysis on 192 peer-reviewed articles reporting applications of the Variable Infiltration Capacity (VIC) model in a distributed way reveals that the spatial resolution at which the model is applied has increased over the years, while the calibration and validation time interval has remained unchanged. We argue that the calibration and validation time interval should keep pace with the increase in spatial resolution in order to resolve the processes that are relevant at the applied spatial resolution. We identified six time concepts in hydrological models, which all impact the model results and conclusions. Process-based model evaluation is particularly relevant when models are applied at hyper-resolution, where stakeholders expect credible results both at a high spatial and temporal resolution.

  11. A comprehensive model to evaluate implementation of the world health organization framework convention of tobacco control

    PubMed Central

    Sarrafzadegan, Nizal; Kelishad, Roya; Rabiei, Katayoun; Abedi, Heidarali; Mohaseli, Khadijeh Fereydoun; Masooleh, Hasan Azaripour; Alavi, Mousa; Heidari, Gholamreza; Ghaffari, Mostafa; O’Loughlin, Jennifer

    2012-01-01

    Background: Iran is one of the countries that have ratified the World Health Organization Framework Convention on Tobacco Control (WHO-FCTC), and it has implemented a series of tobacco control interventions including the Comprehensive Tobacco Control Law. Enforcement of this legislation and assessment of its outcomes require a dedicated evaluation system. This study aimed to develop a generic model, based on the WHO-FCTC articles, to evaluate the implementation of the Comprehensive Tobacco Control Law in Iran. Materials and Methods: Using a grounded theory approach, qualitative data were collected from 265 subjects in individual interviews and focus group discussions with policymakers who designed the legislation, key stakeholders, and members of the target community. In addition, field observation data were collected in supermarkets/shops, restaurants, teahouses, and coffee shops. Data were analyzed in two stages through conceptual theoretical coding. Findings: Overall, 617 open codes were extracted from the data into tables; 72 level-3 codes were retained from the level-2 code series. Using a paradigm model, the relationships between the components of each paradigm were depicted graphically. The evaluation model entailed three levels, namely short-term results, process evaluation, and long-term results. Conclusions: The central concept of the evaluation process is that enforcing the law influences a variety of internal and environmental factors, including legislative changes. These factors are examined during the process evaluation and context evaluation. The current model can be applied to provide FCTC evaluation tools across other jurisdictions. PMID:23833621

  12. VPPA weld model evaluation

    NASA Technical Reports Server (NTRS)

    Mccutcheon, Kimble D.; Gordon, Stephen S.; Thompson, Paul A.

    1992-01-01

    NASA uses the Variable Polarity Plasma Arc Welding (VPPAW) process extensively for fabrication of Space Shuttle External Tanks. This welding process has been in use at NASA since the late 1970s, but the physics of the process have never been satisfactorily modeled and understood. In an attempt to advance the level of understanding of VPPAW, Dr. Arthur C. Nunes, Jr. (NASA) has developed a mathematical model of the process. The work described in this report evaluated and used two versions (level-0 and level-1) of Dr. Nunes' model, and a model derived by the University of Alabama at Huntsville (UAH) from Dr. Nunes' level-1 model. Two series of VPPAW experiments were done, using over 400 different combinations of welding parameters. Observations were made of VPPAW process behavior as a function of specific welding parameter changes. Data from these weld experiments were used to evaluate and suggest improvements to Dr. Nunes' model. Experimental data and correlations with the model were used to develop a multi-variable control algorithm for use with a future VPPAW controller. This algorithm is designed to control weld widths (both on the crown and root of the weld) based upon the weld parameters, base metal properties, and real-time observation of the crown width. The algorithm exhibited accuracy comparable to that of the weld width measurements for both aluminum and mild steel welds.

  13. Nuclear structure for SNe r- and neutrino processes

    NASA Astrophysics Data System (ADS)

    Suzuki, Toshio

    2014-09-01

    SNe r- and neutrino-processes are investigated based on recent advances in the studies of spin responses in nuclei. New shell-model Hamiltonians, which can well describe spin responses in nuclei with proper tensor components, are used to make accurate evaluations of reaction cross sections and rates in astrophysical processes. Nucleosynthesis in SNe r- and ν-processes as well as rp-processes is discussed with these new reaction rates with improved accuracies. (1) Beta-decay rates for N = 126 isotones are evaluated by shell-model calculations, and the new rates are applied to study r-process nucleosynthesis in SNe around the third peak as well as beyond the peak region up to uranium. (2) ν-processes for light-element synthesis in core-collapse SNe are studied with a new shell-model Hamiltonian in the p-shell, SFO. Effects of MSW ν-oscillations on the production yields of 7Li and 11B, and the sensitivity of the yield ratio to the ν-oscillation parameters, are discussed. ν-induced reactions on 16O are also studied. (3) A new shell-model Hamiltonian in the pf-shell, GXPF1J, is used to evaluate electron-capture rates in pf-shell nuclei in stellar environments. The new e-capture rates are applied to study nucleosynthesis in type-Ia supernova explosions, the rp-process, and X-ray bursts.

  14. GANViz: A Visual Analytics Approach to Understand the Adversarial Game.

    PubMed

    Wang, Junpeng; Gou, Liang; Yang, Hao; Shen, Han-Wei

    2018-06-01

    Generative models hold promise for learning data representations in an unsupervised fashion with deep learning. Generative Adversarial Nets (GANs) are one of the most popular frameworks in this arena. Despite the promising results from different types of GANs, in-depth understanding of the adversarial training process of these models remains a challenge to domain experts. The complexity and the potentially long training process of the models make them hard to evaluate, interpret, and optimize. In this work, guided by practical needs from domain experts, we design and develop a visual analytics system, GANViz, aiming to help experts understand the adversarial process of GANs in depth. Specifically, GANViz evaluates the model performance of the two subnetworks of GANs, provides evidence and interpretations of the models' performance, and empowers comparative analysis with the evidence. Through our case studies with two real-world datasets, we demonstrate that GANViz can provide useful insight into helping domain experts understand, interpret, evaluate, and potentially improve GAN models.

  15. A Comprehensive Model for Developing and Evaluating Study Abroad Programs in Counselor Education

    ERIC Educational Resources Information Center

    Santos, Syntia Dinora

    2014-01-01

    This paper introduces a model to guide the process of designing and evaluating study abroad programs, addressing particular stages and influential factors. The main purpose of the model is to serve as a basic structure for those who want to develop their own program or evaluate previous cultural immersion experiences. The model is based on the…

  16. Exploring Secondary Students' Epistemological Features Depending on the Evaluation Levels of the Group Model on Blood Circulation

    ERIC Educational Resources Information Center

    Lee, Shinyoung; Kim, Heui-Baik

    2014-01-01

    The purpose of this study is to identify the epistemological features and model qualities depending on model evaluation levels and to explore the reasoning process behind high-level evaluation through small group interaction about blood circulation. Nine groups of three to four students in the eighth grade participated in the modeling practice.…

  17. Effects of children's working memory capacity and processing speed on their sentence imitation performance.

    PubMed

    Poll, Gerard H; Miller, Carol A; Mainela-Arnold, Elina; Adams, Katharine Donnelly; Misra, Maya; Park, Ji Sook

    2013-01-01

    More limited working memory capacity and slower processing for language and cognitive tasks are characteristics of many children with language difficulties. Individual differences in processing speed have not consistently been found to predict language ability or severity of language impairment. There are conflicting views on whether working memory and processing speed are integrated or separable abilities. This study evaluated four models of the relations between individual differences in children's processing speed and working memory capacity in sentence imitation. The models considered whether working memory and processing speed are integrated or separable, as well as the effect of the number of operations required per sentence. The role of working memory as a mediator of the effect of processing speed on sentence imitation was also evaluated. Forty-six children with varied language and reading abilities imitated sentences. Working memory was measured with the Competing Language Processing Task (CLPT), and processing speed was measured with a composite of truth-value judgment and rapid automatized naming tasks. Mixed-effects ordinal regression models evaluated the CLPT and processing speed as predictors of sentence imitation item scores. A single-mediator model evaluated working memory as a mediator of the effect of processing speed on sentence imitation total scores. Working memory was a reliable predictor of sentence imitation accuracy, but processing speed predicted sentence imitation only as a component of a processing speed by number of operations interaction. Processing speed predicted working memory capacity, and there was evidence that working memory acted as a mediator of the effect of processing speed on sentence imitation accuracy. The findings support a refined view of working memory and processing speed as separable factors in children's sentence imitation performance.
Processing speed does not independently explain sentence imitation accuracy for all sentence types, but contributes when the task requires more mental operations. Processing speed also has an indirect effect on sentence imitation by contributing to working memory capacity. © 2013 Royal College of Speech and Language Therapists.
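The single-mediator logic (processing speed to working memory to sentence imitation) reduces to a product of regression slopes. The sketch below uses fabricated data and, for brevity, does not adjust the b-path for processing speed as a full mediation analysis would.

```python
# Sketch of a single-mediator model: the indirect effect of speed on
# imitation through working memory is the product of the a-path slope
# (speed -> WM) and the b-path slope (WM -> imitation). Data are invented.
def slope(x, y):
    """Ordinary least-squares slope of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    return num / den

speed = [1.0, 1.2, 0.8, 1.5, 0.9, 1.1]       # processing-speed scores
wm    = [2.0, 2.3, 1.7, 2.9, 1.9, 2.2]       # working-memory scores
imit  = [10.0, 11.5, 8.4, 14.6, 9.3, 11.0]   # sentence-imitation scores

a = slope(speed, wm)   # a-path: speed predicts working memory
b = slope(wm, imit)    # b-path: working memory predicts imitation
indirect = a * b       # mediated portion of the speed effect
print(round(indirect, 2))
```

A nonzero `indirect` with a weak direct path is the pattern the abstract describes: speed matters mainly through its contribution to working memory.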

  18. Evaluation of shrinking core model in leaching process of Pomalaa nickel laterite using citric acid as leachant at atmospheric conditions

    NASA Astrophysics Data System (ADS)

    Wanta, K. C.; Perdana, I.; Petrus, H. T. B. M.

    2016-11-01

    Most kinetics studies of leaching processes use the shrinking core model to describe the physical phenomena of the process. Generally, the model is developed in connection with transport and/or reaction of the reactant components. In this study, the commonly used internal-diffusion-controlled shrinking core model was evaluated for the leaching of Pomalaa nickel laterite using citric acid as leachant. Particle size was varied over 60-70, 100-120, and -200 mesh, while the operating temperature was kept constant at 358 K, the citric acid concentration at 0.1 M, and the pulp density at 20% w/v; the leaching time was 120 minutes. Simulation results showed that the shrinking core model could not closely reproduce the experimental data. Rather, the experimental data indicated that the leaching process was determined by the mobility of product molecules in the ash layer pores. For leaching processes that yield large product molecules, a mathematical model involving both reaction and product diffusion steps may therefore be more appropriate.
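For reference, the internal-diffusion-controlled (ash-layer diffusion) form of the shrinking core model evaluated here is 1 - 3(1-X)^(2/3) + 2(1-X) = k*t, where X is the fraction leached. A small sketch with a hypothetical rate constant shows how the conversion X(t) is obtained from it:

```python
# Internal-diffusion-controlled shrinking core model:
#   g(X) = 1 - 3(1-X)^(2/3) + 2(1-X) = k*t
# Invert g numerically to get conversion versus time. k is hypothetical.
def g_diffusion(X):
    """Conversion function for ash-layer diffusion control (monotone on [0, 1])."""
    return 1.0 - 3.0 * (1.0 - X) ** (2.0 / 3.0) + 2.0 * (1.0 - X)

def conversion_at(t, k, tol=1e-10):
    """Solve g(X) = k*t for X by bisection."""
    lo, hi = 0.0, 1.0
    target = min(k * t, g_diffusion(1.0))
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if g_diffusion(mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

k = 1.0e-3  # per minute, hypothetical rate constant
for t in (0, 30, 60, 120):
    print(t, round(conversion_at(t, k), 3))
```

Plotting g(X) computed from measured conversions against time, and checking for linearity, is the standard test of whether ash-layer diffusion controls; the study found it did not hold for this system.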

  19. Dynamic modeling and validation of a lignocellulosic enzymatic hydrolysis process--a demonstration scale study.

    PubMed

    Prunescu, Remus Mihail; Sin, Gürkan

    2013-12-01

    The enzymatic hydrolysis process is one of the key steps in second-generation biofuel production. After being thermally pretreated, the lignocellulosic material is liquefied by enzymes prior to fermentation. The scope of this paper is to evaluate a dynamic model of the hydrolysis process on a demonstration-scale reactor. The following novel features are included: the application of the Convection-Diffusion-Reaction equation to a hydrolysis reactor to assess transport and mixing effects; the extension of a competitive kinetic model with enzymatic pH dependency and hemicellulose hydrolysis; a comprehensive pH model; and viscosity estimations during the course of reaction. The model is evaluated against real data extracted from a demonstration-scale biorefinery throughout several days of operation. All measurements are within the prediction uncertainty, and the model therefore constitutes a valuable tool to support process optimization, performance monitoring, diagnosis, and process control in full-scale studies. Copyright © 2013 Elsevier Ltd. All rights reserved.
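The transport-plus-kinetics structure of such a reactor model can be illustrated with a toy explicit finite-difference solution of the 1D convection-diffusion-reaction equation dc/dt = -v dc/dx + D d2c/dx2 - k c. All parameters are hypothetical and chosen only for numerical stability; the paper's actual model is far more detailed.

```python
# Toy explicit finite-difference solution of the 1D convection-diffusion-
# reaction equation, illustrating transport plus first-order consumption
# along a reactor. All parameters are hypothetical.
nx, dx, dt = 50, 0.1, 0.01
v, D, k = 0.2, 0.05, 0.1            # velocity, diffusivity, first-order rate
c = [0.0] * nx
c[0] = 1.0                          # constant feed concentration at the inlet

for _ in range(500):                # march 5 time units
    new = c[:]
    for i in range(1, nx - 1):
        adv = -v * (c[i] - c[i - 1]) / dx                      # upwind convection
        dif = D * (c[i + 1] - 2 * c[i] + c[i - 1]) / dx ** 2   # diffusion
        new[i] = c[i] + dt * (adv + dif - k * c[i])            # reaction sink
    new[0], new[-1] = 1.0, new[-2]  # inlet Dirichlet, outlet zero-gradient
    c = new

print(round(c[10], 3), round(c[40], 3))  # concentration decays downstream
```

The stability constraint dt*(v/dx + 2D/dx^2 + k) < 1 is satisfied by these values (0.121); violating it makes the explicit scheme blow up.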

  20. Revitalizing Adversary Evaluation: Deep Dark Deficits or Muddled Mistaken Musings

    ERIC Educational Resources Information Center

    Thurston, Paul

    1978-01-01

    The adversary evaluation model consists of utilizing the judicial process as a metaphor for educational evaluation. In this article, previous criticism of the model is addressed and its fundamental problems are detailed. It is speculated that the model could be improved by borrowing ideas from other legal forms of inquiry. (Author/GC)

  1. An evaluation of Dynamic TOPMODEL for low flow simulation

    NASA Astrophysics Data System (ADS)

    Coxon, G.; Freer, J. E.; Quinn, N.; Woods, R. A.; Wagener, T.; Howden, N. J. K.

    2015-12-01

    Hydrological models are essential tools for drought risk management, often providing input to water resource system models, aiding our understanding of low flow processes within catchments, and providing low flow predictions. However, simulating low flows and droughts is challenging, as hydrological systems often demonstrate threshold effects in connectivity, non-linear groundwater contributions, and a greater influence of water resource system elements during low flow periods. These dynamic processes are typically not well represented in commonly used hydrological models due to data and model limitations. Furthermore, calibrated or behavioural models may not be effectively evaluated during more extreme drought periods. A better understanding of the processes that occur during low flows, and of how these are represented within models, is thus required if we want to provide robust and reliable predictions of future drought events. In this study, we assess the performance of Dynamic TOPMODEL for low flow simulation. Dynamic TOPMODEL was applied to a number of UK catchments in the Thames region using time series of observed rainfall and potential evapotranspiration data that captured multiple historic droughts over a period of several years. The model performance was assessed against the observed discharge time series using a limits-of-acceptability framework, which included uncertainty in the discharge time series. We evaluated the models against multiple signatures of catchment low-flow behaviour and investigated differences in model performance between catchments, model diagnostics, and low flow periods. We also considered the impact of surface water and groundwater abstractions and discharges on the observed discharge time series and how this affected the model evaluation. From this analysis of model performance, we suggest future improvements to Dynamic TOPMODEL to better represent low flow processes within the model structure.

  2. Evaluation of parameters of color profile models of LCD and LED screens

    NASA Astrophysics Data System (ADS)

    Zharinov, I. O.; Zharinov, O. O.

    2017-12-01

    The research addresses the problem of parametric identification of the color profile model of LCD (liquid crystal display) and LED (light-emitting diode) screens. The color profile model of a screen is based on Grassmann's law of additive color mixture. Mathematically, the problem is to evaluate the unknown parameters (numerical coefficients) of the matrix transformation between different color spaces. Several methods for evaluating these screen profile coefficients were developed, based either on processing colorimetric measurements or on processing technical documentation data.
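    Under Grassmann's law the screen model is linear, so a minimal identification sketch is possible: if the XYZ tristimulus values of the pure primaries at full drive are measured, they directly form the columns of the RGB-to-XYZ matrix. The readings below are invented, not real colorimeter data.

```python
# Sketch: identifying a screen's RGB -> XYZ profile matrix under Grassmann's law.
# The "measurements" below are illustrative, not real colorimeter output.

# Hypothetical colorimeter readings for the pure primaries at full drive:
primaries_xyz = {
    "R": (41.2, 21.3, 1.9),   # XYZ of pure red
    "G": (35.8, 71.5, 11.9),  # XYZ of pure green
    "B": (18.0, 7.2, 95.0),   # XYZ of pure blue
}

# By additivity, each primary's XYZ vector is a column of the 3x3 matrix M.
M = [[primaries_xyz[c][row] for c in ("R", "G", "B")] for row in range(3)]

def rgb_to_xyz(rgb):
    """Apply the identified linear model: XYZ = M * (linear RGB in [0, 1])."""
    return tuple(sum(M[row][col] * rgb[col] for col in range(3)) for row in range(3))

# White (1,1,1) reproduces the sum of the three primaries:
white = rgb_to_xyz((1.0, 1.0, 1.0))
print(white)  # approximately (95.0, 100.0, 108.8)
```

    The documentation-based methods mentioned in the abstract would instead derive these columns from published primary chromaticities and white-point luminance rather than from measurements.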

  3. The evolution of an evaluation: a case study using the tribal participatory research model.

    PubMed

    Richmond, Lucinda S; Peterson, Donna J; Betts, Sherry C

    2008-10-01

    This article presents a case study of how the evaluation design for a dating violence prevention and/or youth development program for American Indian youth in Arizona evolved throughout the project. Particular attention is given to how the evaluation design was guided by the tribal participatory research model. A brief rationale for the project is presented, along with literature on culturally competent evaluation and research with American Indians. A description of the project and the unique communities in which it was implemented is provided. The focus of the article is the process by which the evaluation plan changed and how various factors influenced this process (e.g., feedback from community stakeholders, conversations with the funder, results of the process evaluation, suggestions from the literature, the authors' experience working in American Indian communities). The authors conclude with lessons learned for others to consider as they develop working relationships and evaluation plans in similar communities.

  4. Application of a responsive evaluation approach in medical education.

    PubMed

    Curran, Vernon; Christopher, Jeanette; Lemire, Francine; Collins, Alice; Barrett, Brendan

    2003-03-01

    This paper reports on the usefulness of a responsive evaluation model in evaluating the clinical skills assessment and training (CSAT) programme at the Faculty of Medicine, Memorial University of Newfoundland, Canada. The purpose of this paper is to introduce the responsive evaluation approach, ascertain its utility, feasibility, propriety and accuracy in a medical education context, and discuss its applicability as a model for medical education programme evaluation. Robert Stake's original 12-step responsive evaluation model was modified and reduced to five steps, including: (1) stakeholder audience identification, consultation and issues exploration; (2) stakeholder concerns and issues analysis; (3) identification of evaluative standards and criteria; (4) design and implementation of evaluation methodology; and (5) data analysis and reporting. This modified responsive evaluation process was applied to the CSAT programme and a meta-evaluation was conducted to evaluate the effectiveness of the approach. The responsive evaluation approach was useful in identifying the concerns and issues of programme stakeholders, solidifying the standards and criteria for measuring the success of the CSAT programme, and gathering rich and descriptive evaluative information about educational processes. The evaluation was perceived to be human resource dependent in nature, yet was deemed to have been practical, efficient and effective in uncovering meaningful and useful information for stakeholder decision-making. Responsive evaluation is derived from the naturalistic paradigm and concentrates on examining the educational process rather than predefined outcomes of the process. Responsive evaluation results are perceived as having more relevance to stakeholder concerns and issues, and therefore more likely to be acted upon. 
Conducting an evaluation that is responsive to the needs of these groups will ensure that evaluative information is meaningful and more likely to be used for programme enhancement and improvement.

  5. Applying the Quadruple Process Model to Evaluate Change in Implicit Attitudinal Responses During Therapy for Panic Disorder

    PubMed Central

    Clerkin, Elise M.; Fisher, Christopher R.; Sherman, Jeffrey W.; Teachman, Bethany A.

    2013-01-01

    Objective This study explored the automatic and controlled processes that may influence performance on an implicit measure across cognitive-behavioral group therapy for panic disorder. Method The Quadruple Process model was applied to error scores from an Implicit Association Test evaluating associations between the concepts Me (vs. Not Me) + Calm (vs. Panicked) to evaluate four distinct processes: Association Activation, Detection, Guessing, and Overcoming Bias. Parameter estimates were calculated in the panic group (n=28) across each treatment session where the IAT was administered, and at matched times when the IAT was completed in the healthy control group (n=31). Results Association Activation for Me + Calm became stronger over treatment for participants in the panic group, demonstrating that it is possible to change automatically activated associations in memory (vs. simply overriding those associations) in a clinical sample via therapy. As well, the Guessing bias toward the calm category increased over treatment for participants in the panic group. Conclusions This research evaluates key tenets about the role of automatic processing in cognitive models of anxiety, and emphasizes the viability of changing the actual activation of automatic associations in the context of treatment, versus only changing a person’s ability to use reflective processing to overcome biased automatic processing. PMID:24275066

  6. Enhanced modeling and simulation of EO/IR sensor systems

    NASA Astrophysics Data System (ADS)

    Hixson, Jonathan G.; Miller, Brian; May, Christopher

    2015-05-01

    The testing and evaluation process developed by the Night Vision and Electronic Sensors Directorate (NVESD) Modeling and Simulation Division (MSD) provides end-to-end systems evaluation, testing, and training of EO/IR sensors. By combining NV-LabCap, the Night Vision Integrated Performance Model (NV-IPM), One Semi-Automated Forces (OneSAF) input sensor file generation, and the Night Vision Image Generator (NVIG) capabilities, NVESD provides confidence to the M&S community that EO/IR sensor developmental and operational testing and evaluation are accurately represented throughout the lifecycle of an EO/IR system. This new process allows for both theoretical and actual sensor testing. A sensor can be theoretically designed and modeled in NV-IPM, and then seamlessly input into wargames for operational analysis. After theoretical design, prototype sensors can be measured using NV-LabCap, then modeled in NV-IPM and input into wargames for further evaluation. The process, from measurement to high-fidelity modeling and simulation, can then be repeated throughout the entire life cycle of an EO/IR sensor as needed, to include LRIP, full-rate production, and even after depot-level maintenance. This is a prototypical example of how an engineering-level model and higher-level simulations can share models to mutual benefit.

  7. Error Analysis Of Students Working About Word Problem Of Linear Program With NEA Procedure

    NASA Astrophysics Data System (ADS)

    Santoso, D. A.; Farid, A.; Ulum, B.

    2017-06-01

    Evaluation and assessment are an important part of learning. In the evaluation of learning, written tests are still commonly used. However, such tests are usually not followed up by further evaluation: the process stops at the grading stage and does not examine the solution process or the errors students make. Yet if a student shows a recurring error pattern or process error, remedial action can be focused on the specific fault and its cause. The NEA procedure provides a way for educators to evaluate student progress more comprehensively. In this study, students' mistakes in solving word problems on linear programming were analyzed. The results show that mistakes most often occur in the modeling (transformation) phase and in process skills, with overall percentage distributions of 20% and 15%, respectively. According to the observations, these errors arise mainly from a lack of precision in modeling and from hasty calculation. Through this error analysis, educators are expected to be able to choose appropriate remedies in subsequent lessons.

  8. Modeling and evaluating user behavior in exploratory visual analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reda, Khairi; Johnson, Andrew E.; Papka, Michael E.

    Empirical evaluation methods for visualizations have traditionally focused on assessing the outcome of the visual analytic process as opposed to characterizing how that process unfolds. There are only a handful of methods that can be used to systematically study how people use visualizations, making it difficult for researchers to capture and characterize the subtlety of cognitive and interaction behaviors users exhibit during visual analysis. To validate and improve visualization design, however, it is important for researchers to be able to assess and understand how users interact with visualization systems under realistic scenarios. This paper presents a methodology for modeling and evaluating the behavior of users in exploratory visual analysis. We model visual exploration using a Markov chain process comprising transitions between mental, interaction, and computational states. These states and the transitions between them can be deduced from a variety of sources, including verbal transcripts, videos and audio recordings, and log files. This model enables the evaluator to characterize the cognitive and computational processes that are essential to insight acquisition in exploratory visual analysis, and reconstruct the dynamics of interaction between the user and the visualization system. We illustrate this model with two exemplar user studies, and demonstrate the qualitative and quantitative analytical tools it affords.
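    The Markov-chain formulation described above can be sketched briefly: count transitions between coded states and normalize each row into probabilities. The state labels and observed sequence below are invented stand-ins for states coded from transcripts or log files.

```python
# Illustrative sketch: estimating a first-order Markov chain over coded
# analysis states from one observed state sequence. The sequence is invented.
from collections import defaultdict

observed = ["mental", "interaction", "computational", "mental",
            "interaction", "interaction", "computational", "mental"]

# Count transitions between consecutive states.
counts = defaultdict(lambda: defaultdict(int))
for a, b in zip(observed, observed[1:]):
    counts[a][b] += 1

# Normalize each row of counts into transition probabilities.
transitions = {
    state: {nxt: c / sum(row.values()) for nxt, c in row.items()}
    for state, row in counts.items()
}
print(transitions["interaction"])  # e.g. computational 2/3, interaction 1/3
```

    With transitions estimated per participant or per condition, the evaluator can compare exploration dynamics quantitatively, e.g. how often users move from an interaction state into a computational one.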

  9. EVALUATION OF MULTIPLE PHARMACOKINETIC MODELING STRUCTURES FOR TRICHLOROETHYLENE

    EPA Science Inventory

    A series of PBPK models were developed for trichloroethylene (TCE) to evaluate biological processes that may affect the absorption, distribution, metabolism and excretion (ADME) of TCE and its metabolites.

  10. IT vendor selection model by using structural equation model & analytical hierarchy process

    NASA Astrophysics Data System (ADS)

    Maitra, Sarit; Dominic, P. D. D.

    2012-11-01

    Selecting and evaluating the right vendors is imperative for an organization's competitiveness in the global marketplace, and improper selection and evaluation of potential vendors can degrade an organization's supply chain performance. Numerous studies have demonstrated that firms consider multiple criteria when selecting key vendors. This research develops a new hybrid model for the vendor selection process to support better decision making. The proposed model provides a suitable tool for assisting decision makers and managers in making the right decisions and selecting the most suitable vendor. This paper proposes a hybrid model based on the Structural Equation Model (SEM) and the Analytical Hierarchy Process (AHP) for long-term strategic vendor selection problems. The model's five-step framework was designed after a thorough literature study. The proposed hybrid model will be applied to a real-life case study to assess its effectiveness. In addition, what-if analysis will be used for model validation purposes.

  11. Application of Context Input Process and Product Model in Curriculum Evaluation: Case Study of a Call Centre

    ERIC Educational Resources Information Center

    Kavgaoglu, Derya; Alci, Bülent

    2016-01-01

    The goal of this research, carried out in reputable dedicated call centres within the Turkish telecommunication sector, is to evaluate competence-based curriculums designed by means of internal funding through Stufflebeam's context, input, process, product (CIPP) model. In the research, a general scanning pattern in the scope of…

  12. High Tech Educators Network Evaluation.

    ERIC Educational Resources Information Center

    O'Shea, Dan

    A process evaluation was conducted to assess the High Tech Educators Network's (HTEN's) activities. Four basic components to the evaluation approach were documentation review, program logic model, written survey, and participant interviews. The model mapped the basic goals and objectives, assumptions, activities, outcome expectations, and…

  13. Models of Evaluation Utilization: A Meta-Modeling Synthesis of the Literature.

    ERIC Educational Resources Information Center

    Johnson, R. Burke

    An integrative causal process model of evaluation utilization variables is presented. The model was developed through a traditional approach to literature review that lists results from published studies and relates these to the research topic, and through an approach that tries to integrate the models found in the literature search. Meta-modeling…

  14. A Comparative of business process modelling techniques

    NASA Astrophysics Data System (ADS)

    Tangkawarow, I. R. H. T.; Waworuntu, J.

    2016-04-01

    Many business process modelling techniques are now available. This article investigates the differences among them, explaining the definition and structure of each technique and presenting a comparative analysis of some popular ones. The comparative framework is based on two criteria: notation, and how each technique works when applied to Somerleyton Animal Park. The advantages and disadvantages of each technique are discussed. The conclusion recommends business process modelling techniques that are easy to use and serves as a basis for evaluating further modelling techniques.

  15. Model-based evaluation of two BNR processes--UCT and A2N.

    PubMed

    Hao, X; Van Loosdrecht, M C; Meijer, S C; Qian, Y

    2001-08-01

    The activity of denitrifying P-accumulating bacteria (DPB) has been verified in most WWTPs with biological nutrient removal (BNR). The modified UCT process has a high content of DPB. A new two-sludge BNR process named A2N was developed specifically to exploit denitrifying dephosphatation. With identical inflow and effluent standards, an existing full-scale UCT-type WWTP and a designed A2N process were evaluated by simulation. The model used is based on the Delft metabolic model for bio-P removal and the ASM2d model for COD and N removal. Both processes accommodate denitrifying dephosphatation, but the A2N process shows more stable N-removal performance. Although excess sludge production increases by 6%, the A2N process saves 35%, 85% and 30% in aeration energy, mixed-liquor internal recirculation and land occupation, respectively, compared to the UCT process. Low temperature has a negative effect on the growth of poly-P bacteria, which is especially apparent in the A2N process.

  16. The comparative evaluation of expanded national immunization policies in Korea using an analytic hierarchy process.

    PubMed

    Shin, Taeksoo; Kim, Chun-Bae; Ahn, Yang-Heui; Kim, Hyo-Youl; Cha, Byung Ho; Uh, Young; Lee, Joo-Heon; Hyun, Sook-Jung; Lee, Dong-Han; Go, Un-Yeong

    2009-01-29

    The purpose of this paper is to propose new evaluation criteria and an analytic hierarchy process (AHP) model to assess the expanded national immunization programs (ENIPs) and to evaluate two alternative health care policies: under one, private clinics and hospitals would offer free vaccination services to children; under the other, public health centers would offer these services. Our model for evaluating ENIPs was developed using brainstorming, Delphi techniques, and the AHP. We first used brainstorming and Delphi techniques, as well as literature reviews, to determine 25 criteria with which to evaluate the national immunization policy; we then proposed a hierarchical structure of the AHP model to assess ENIPs. By applying the proposed AHP model to the assessment of ENIPs for Korean immunization policies, we show that free vaccination services should be provided by private clinics and hospitals rather than by public health centers.
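    The weighting step at the heart of any AHP model can be illustrated with a small sketch: pairwise judgments on Saaty's 1-9 scale are collected into a reciprocal matrix and converted into priority weights. The three criteria and the judgments below are hypothetical, not the 25 criteria from the study.

```python
# Hedged sketch of the AHP priority-weighting step. The pairwise comparison
# values are invented examples on Saaty's 1-9 scale, for three hypothetical
# criteria: coverage vs. cost vs. accessibility.
import math

A = [
    [1.0, 3.0, 5.0],   # coverage vs. (coverage, cost, accessibility)
    [1/3, 1.0, 3.0],   # cost
    [1/5, 1/3, 1.0],   # accessibility
]

# Geometric-mean method: a standard approximation to the principal
# eigenvector that yields the criterion priority weights.
n = len(A)
geo = [math.prod(row) ** (1 / n) for row in A]
weights = [g / sum(geo) for g in geo]
print([round(w, 3) for w in weights])  # [0.637, 0.258, 0.105]
```

    In a full AHP study each policy alternative is also scored against every criterion the same way, and a consistency ratio is checked before the weights are trusted.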

  17. EVALUATION OF PHYSIOLOGY COMPUTER MODELS, AND THE FEASIBILITY OF THEIR USE IN RISK ASSESSMENT.

    EPA Science Inventory

    This project will evaluate the current state of quantitative models that simulate physiological processes, and how these models might be used in conjunction with the current use of PBPK and BBDR models in risk assessment. The work will include a literature search to identify...

  18. Social Information Processing Mediates the Intergenerational Transmission of Aggressiveness in Romantic Relationships

    PubMed Central

    Fite, Jennifer E.; Bates, John E.; Holtzworth-Munroe, Amy; Dodge, Kenneth A.; Nay, Sandra Y.; Pettit, Gregory S.

    2012-01-01

    This study explored the K. A. Dodge (1986) model of social information processing as a mediator of the association between interparental relationship conflict and subsequent offspring romantic relationship conflict in young adulthood. The authors tested 4 social information processing stages (encoding, hostile attributions, generation of aggressive responses, and positive evaluation of aggressive responses) in separate models to explore their independent effects as potential mediators. There was no evidence of mediation for encoding and attributions. However, there was evidence of significant mediation for both the response generation and response evaluation stages of the model. Results suggest that the ability of offspring to generate varied social responses and effectively evaluate the potential outcome of their responses at least partially mediates the intergenerational transmission of relationship conflict. PMID:18540765

  19. An Evaluation of Understandability of Patient Journey Models in Mental Health.

    PubMed

    Percival, Jennifer; McGregor, Carolyn

    2016-07-28

    There is a significant trend toward implementing health information technology to reduce administrative costs and improve patient care. Unfortunately, little awareness exists of the challenges of integrating information systems with existing clinical practice. The systematic integration of clinical processes with information system and health information technology can benefit the patients, staff, and the delivery of care. This paper presents a comparison of the degree of understandability of patient journey models. In particular, the authors demonstrate the value of a relatively new patient journey modeling technique called the Patient Journey Modeling Architecture (PaJMa) when compared with traditional manufacturing based process modeling tools. The paper also presents results from a small pilot case study that compared the usability of 5 modeling approaches in a mental health care environment. Five business process modeling techniques were used to represent a selected patient journey. A mix of both qualitative and quantitative methods was used to evaluate these models. Techniques included a focus group and survey to measure usability of the various models. The preliminary evaluation of the usability of the 5 modeling techniques has shown increased staff understanding of the representation of their processes and activities when presented with the models. Improved individual role identification throughout the models was also observed. The extended version of the PaJMa methodology provided the most clarity of information flows for clinicians. The extended version of PaJMa provided a significant improvement in the ease of interpretation for clinicians and increased the engagement with the modeling process. The use of color and its effectiveness in distinguishing the representation of roles was a key feature of the framework not present in other modeling approaches. 
Future research should focus on extending the pilot case study to a more diversified group of clinicians and health care support workers.

  20. Evaluation of Boreal Summer Monsoon Intraseasonal Variability in the GASS-YOTC Multi-Model Physical Processes Experiment

    NASA Astrophysics Data System (ADS)

    Mani, N. J.; Waliser, D. E.; Jiang, X.

    2014-12-01

    While the boreal summer monsoon intraseasonal variability (BSISV) exerts a profound influence on the south Asian monsoon, the capability of present-day dynamical models to simulate and predict the BSISV is still limited. The global model evaluation project on the vertical structure and diabatic processes of the Madden-Julian Oscillation (MJO) is a joint venture, coordinated by the Working Group on Numerical Experimentation (WGNE) MJO Task Force and the GEWEX Atmospheric System Study (GASS) program, for assessing model deficiencies in simulating the ISV and for improving our understanding of the underlying processes. In this study, the simulation of the northward-propagating BSISV is investigated in 26 climate models, with special focus on the vertical diabatic heating structure and clouds. Following parallel lines of inquiry to those the MJO Task Force has pursued for the eastward-propagating MJO, we utilize previously proposed and newly developed model performance metrics and process diagnostics and apply them to the global climate model simulations of BSISV.

  1. Peer Evaluation of Teaching in an Online Information Literacy Course

    ERIC Educational Resources Information Center

    Vega García, Susan A.; Stacy-Bates, Kristine K.; Alger, Jeff; Marupova, Rano

    2017-01-01

    This paper reports on the development and implementation of a process of peer evaluation of teaching to assess librarian instruction in a high-enrollment online information literacy course for undergraduates. This paper also traces a shift within libraries from peer coaching to peer evaluation models. One common model for peer evaluation, using…

  2. Dig into Learning: A Program Evaluation of an Agricultural Literacy Innovation

    ERIC Educational Resources Information Center

    Edwards, Erica Brown

    2016-01-01

    This study is a mixed-methods program evaluation of an agricultural literacy innovation in a local school district in rural eastern North Carolina. This evaluation describes the use of a theory-based framework, the Concerns-Based Adoption Model (CBAM), in accordance with Stufflebeam's Context, Input, Process, Product (CIPP) model by evaluating the…

  3. Evaluation of Turkish and Mathematics Curricula According to Value-Based Evaluation Model

    ERIC Educational Resources Information Center

    Duman, Serap Nur; Akbas, Oktay

    2017-01-01

    This study evaluated secondary school seventh-grade Turkish and mathematics programs using the Context-Input-Process-Product Evaluation Model based on student, teacher, and inspector views. The convergent parallel mixed method design was used in the study. Student values were identified using the scales for socio-level identification, traditional…

  4. Evaluation of Career Guidance Programs: Models, Methods, and Microcomputers. Information Series No. 317.

    ERIC Educational Resources Information Center

    Crites, John O.

    Evaluating the effectiveness of career guidance programs is a complex process, and few comprehensive models for evaluating such programs exist. Evaluation of career guidance programs has been hampered by the myth that program outcomes are uniform and monolithic. Findings from studies of attribute treatment interactions have revealed only a few…

  5. A trust region approach with multivariate Padé model for optimal circuit design

    NASA Astrophysics Data System (ADS)

    Abdel-Malek, Hany L.; Ebid, Shaimaa E. K.; Mohamed, Ahmed S. A.

    2017-11-01

    Since the optimization process requires a significant number of consecutive function evaluations, it is recommended to replace the function by an easily evaluated approximation model during the optimization process. The model suggested in this article is based on a multivariate Padé approximation. This model is constructed using data points of ?, where ? is the number of parameters. The model is updated over a sequence of trust regions. This model avoids the slow convergence of linear models of ? and has features of quadratic models that need interpolation data points of ?. The proposed approach is tested by applying it to several benchmark problems. Yield optimization using such a direct method is applied to some practical circuit examples. Minimax solution leads to a suitable initial point to carry out the yield optimization process. The yield is optimized by the proposed derivative-free method for active and passive filter examples.
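    As a much-simplified illustration of why rational (Padé-type) surrogates can outperform polynomial models of the same order, here is a univariate [1/1] Padé approximant built from Taylor coefficients. The paper's model is multivariate and constructed from sample points over trust regions, which this sketch does not attempt.

```python
# Hedged illustration: a univariate [1/1] Pade approximant of exp(x),
# compared against the second-order Taylor polynomial it is built from.
import math

# Taylor coefficients of exp(x) about 0: c0 + c1*x + c2*x^2
c0, c1, c2 = 1.0, 1.0, 0.5

# [1/1] Pade: (a0 + a1*x) / (1 + b1*x), matching the series through x^2.
b1 = -c2 / c1
a0 = c0
a1 = c1 + b1 * c0

def pade(x):
    return (a0 + a1 * x) / (1.0 + b1 * x)

def taylor2(x):
    return c0 + c1 * x + c2 * x * x

# At x = 0.5 the rational model is closer to exp(x) than the polynomial:
x = 0.5
print(abs(pade(x) - math.exp(x)) < abs(taylor2(x) - math.exp(x)))  # True
```

    The same accuracy-per-evaluation advantage is what motivates replacing the expensive circuit response with a Padé surrogate inside the trust-region loop.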

  6. A participatory evaluation framework in the establishment and implementation of transdisciplinary collaborative centers for health disparities research.

    PubMed

    Scarinci, Isabel C; Moore, Artisha; Benjamin, Regina; Vickers, Selwyn; Shikany, James; Fouad, Mona

    2017-02-01

    We describe the formulation and implementation of a participatory evaluation plan for three Transdisciplinary Collaborative Centers for Health Disparities Research funded by the National Institute on Minority Health and Health Disparities. Although different in scope of work, all three centers share a common goal of establishing sustainable centers of health disparities science in three priority areas - social determinants of health, men's health research, and health policy research. The logic model guides the process, impact, and outcome evaluation. Emphasis is placed on process evaluation in order to establish a "blueprint" that can guide other efforts as well as assure that activities are being implemented as planned. We have learned three major lessons in this process: (1) Significant engagement, participation, and commitment of all involved is critical for the evaluation process; (2) Having a "roadmap" (logic model) and "directions" (evaluation worksheets) is instrumental in getting members from different backgrounds to follow the same path; and (3) Participation of the evaluator in the leadership and core meetings facilitates continuous feedback. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. Modeling the effect of land use change on hydrology of a forested watershed in coastal South Carolina.

    Treesearch

    Zhaohua Dai; Devendra M. Amatya; Ge Sun; Changsheng Li; Carl C. Trettin; Harbin Li

    2009-01-01

    Since hydrology is one of the main factors controlling wetland functions, hydrologic models are useful for evaluating the effects of land use change on wetland ecosystems. We evaluated two process-based hydrologic models with...

  8. AQMEII: A New International Initiative on Air Quality Model Evaluation

    EPA Science Inventory

    We provide a conceptual view of the process of evaluating regional-scale three-dimensional numerical photochemical air quality modeling systems, based on an examination of existing approaches to the evaluation of such systems as they are currently used in a variety of applications....

  9. Process service quality evaluation based on Dempster-Shafer theory and support vector machine.

    PubMed

    Pei, Feng-Que; Li, Dong-Bo; Tong, Yi-Fei; He, Fei

    2017-01-01

    Human involvement influences traditional service quality evaluations, leading to low accuracy, poor reliability and weak predictability. This paper proposes a method, called SVMs-DS, that employs a support vector machine (SVM) and Dempster-Shafer evidence theory to evaluate the service quality of a production process, handling a high number of input features with a small sampling data set. Features that can affect production quality are extracted by a large number of sensors, and preprocessing steps such as feature simplification and normalization are reduced. Based on three individual SVM models, basic probability assignments (BPAs) are constructed, which support the evaluation in both qualitative and quantitative ways. The process service quality evaluation results are validated by Dempster's rule of combination; the decision threshold for resolving conflicting results is generated from the three SVM models. A case study is presented to demonstrate the effectiveness of the SVMs-DS method.
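    The fusion step that SVMs-DS relies on is Dempster's rule of combination, which can be sketched compactly. The two mass functions below are invented BPAs over hypothetical quality labels, standing in for the calibrated outputs of two of the individual SVM models.

```python
# Minimal sketch of Dempster's rule of combination over a two-label frame
# {"good", "poor"}. The BPAs are invented, not derived from real SVM outputs.
from itertools import product

FRAME = frozenset  # focal elements are sets of labels

m1 = {FRAME({"good"}): 0.6, FRAME({"poor"}): 0.3, FRAME({"good", "poor"}): 0.1}
m2 = {FRAME({"good"}): 0.5, FRAME({"poor"}): 0.2, FRAME({"good", "poor"}): 0.3}

def combine(m1, m2):
    """Dempster's rule: intersect focal elements, renormalize by 1 - conflict."""
    fused, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            fused[inter] = fused.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb  # mass assigned to the empty intersection
    return {k: v / (1.0 - conflict) for k, v in fused.items()}

fused = combine(m1, m2)
print(round(fused[FRAME({"good"})], 3))  # 0.726
```

    A third BPA would be folded in by calling combine again; a decision threshold on the fused masses then resolves conflicting single-model verdicts.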

  10. Evaluating a process-based model for use in streambank stabilization and stream restoration: insights on the bank stability and toe erosion model (BSTEM)

    USDA-ARS?s Scientific Manuscript database

    Streambank retreat is a complex cyclical process involving subaerial processes, fluvial erosion, seepage erosion, and geotechnical failures and is driven by several soil properties that themselves are temporally and spatially variable. Therefore, it can be extremely challenging to predict and model ...

  11. Brief Lags in Interrupted Sequential Performance: Evaluating a Model and Model Evaluation Method

    DTIC Science & Technology

    2015-01-05

    rehearsal mechanism in the model. To evaluate the model we developed a simple new goodness-of-fit test based on analysis of variance that offers an...repeated step). Sequential constraints are common in medicine, equipment maintenance, computer programming and technical support, data analysis ...legal analysis, accounting, and many other home and workplace environments. Sequential constraints also play a role in such basic cognitive processes

  12. Using the Context, Input, Process, and Product Evaluation Model (CIPP) as a Comprehensive Framework to Guide the Planning, Implementation, and Assessment of Service-Learning Programs

    ERIC Educational Resources Information Center

    Zhang, Guili; Zeller, Nancy; Griffith, Robin; Metcalf, Debbie; Williams, Jennifer; Shea, Christine; Misulis, Katherine

    2011-01-01

    Planning, implementing, and assessing a service-learning project can be a complex task because service-learning projects often involve multiple constituencies and aim to meet both the needs of service providers and community partners. In this article, Stufflebeam's Context, Input, Process, and Product (CIPP) evaluation model is recommended as a…

  13. Evaluation of English as a Foreign Language Program--Using CIPP (Context, Input, Process and Product) Model

    ERIC Educational Resources Information Center

    Ulum, Ömer Gökhan

    2016-01-01

    The aim of this study is to evaluate a state high school EFL Program through CIPP (context, input, process and product) model. The participants of the study include 504 students. The source of data has been obtained through a 46-itemed questionnaire and an interview for the students. In the study, the data has been analysed using statistical…

  14. Using the hydrologic model mike she to assess disturbance impacts on watershed process and responses across the Southeastern U.S.

    Treesearch

    Ge Sun; Jianbiao Lu; Steven G. McNulty; James M. Vose; Devendra M. Amayta

    2006-01-01

    A clear understanding of the basic hydrologic processes is needed to restore and manage watersheds across the diverse physiologic gradients in the Southeastern U.S. We evaluated a physically based, spatially distributed watershed hydrologic model called MIKE SHE/MIKE 11 to evaluate disturbance impacts on water use and yield across the region. Long-term forest...

  15. Evaluation on Social Internship Program of Iain Sultan Thaha Saifuddin Jambi Students: Using Context, Input, Process and Product Model (CIPP Model)

    ERIC Educational Resources Information Center

    Hurmaini, M.; Abdillah

    2015-01-01

    The purpose of the research is to evaluate the context, input, process and product of the Social Internship Program (Kukerta) of IAIN Sulthan Thaha Saifuddin Jambi students using a Participatory Action Research (PAR) approach. The research was conducted in four locations of the IAIN Sultan Thaha Saifuddin Jambi students' Kukerta in the first period…

  16. A Process Model of Principal Selection.

    ERIC Educational Resources Information Center

    Flanigan, J. L.; And Others

    A process model to assist school district superintendents in the selection of principals is presented in this paper. Components of the process are described, which include developing an action plan, formulating an explicit job description, advertising, assessing candidates' philosophy, conducting interview analyses, evaluating response to stress,…

  17. Semi-Automated Processing of Trajectory Simulator Output Files for Model Evaluation

    DTIC Science & Technology

    2018-01-01

    ARL-TR-8284 ● JAN 2018 ● US Army Research Laboratory. Semi-Automated Processing of Trajectory Simulator Output Files for Model Evaluation.

  18. A General Accelerated Degradation Model Based on the Wiener Process.

    PubMed

    Liu, Le; Li, Xiaoyang; Sun, Fuqiang; Wang, Ning

    2016-12-06

    Accelerated degradation testing (ADT) is an efficient tool for evaluating material service reliability and safety by analyzing performance degradation data. Traditional stochastic process models mainly address linear or linearizable degradation paths and are not applicable when the degradation process cannot be linearized. Hence, in this paper, a general ADT model based on the Wiener process is proposed for accelerated degradation data analysis. The general model accounts for the unit-to-unit variation and temporal variation of the degradation process, and is suitable for both linear and nonlinear ADT analyses with single or multiple acceleration variables. Statistical inference is given to estimate the unknown parameters in both constant-stress and step-stress ADT. A simulation example and two real applications demonstrate that the proposed method yields reliable lifetime evaluation results compared with the existing linear and time-scale-transformation Wiener processes in both linear and nonlinear ADT analyses.
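
    The increment structure of a Wiener-process degradation model can be sketched numerically. Assuming, purely for illustration, a power-law time scale Lambda(t) = t**b (the parameter values below are invented, not the paper's), increments are independent Gaussians, dX ~ N(mu*dLambda, sigma^2*dLambda), which gives closed-form maximum-likelihood estimates for drift and diffusion. This is a minimal sketch of the model class, not the paper's full ADT inference with acceleration variables:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters: drift mu, diffusion sigma, and a power-law
# time scale Lambda(t) = t**b that makes the mean path nonlinear in t.
mu, sigma, b = 2.0, 0.5, 0.8
t = np.linspace(0.5, 10.0, 20)                 # measurement times
lam = t ** b                                   # transformed time scale
dlam = np.diff(np.concatenate(([0.0], lam)))   # time-scale increments

# Simulate degradation increments for several units:
# dX ~ N(mu * dLambda, sigma**2 * dLambda), independent across increments.
n_units = 200
dX = rng.normal(mu * dlam, sigma * np.sqrt(dlam), size=(n_units, dlam.size))

# Closed-form MLEs for mu and sigma**2 given a known time scale
mu_hat = dX.sum() / (n_units * dlam.sum())
sigma2_hat = np.mean((dX - mu_hat * dlam) ** 2 / dlam)
```

Because the model is Gaussian in the transformed time scale, the same estimates work for a linear path (b = 1) or a nonlinear one, which is the flexibility the abstract emphasizes.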

  19. A General Accelerated Degradation Model Based on the Wiener Process

    PubMed Central

    Liu, Le; Li, Xiaoyang; Sun, Fuqiang; Wang, Ning

    2016-01-01

    Accelerated degradation testing (ADT) is an efficient tool for evaluating material service reliability and safety by analyzing performance degradation data. Traditional stochastic process models mainly address linear or linearizable degradation paths and are not applicable when the degradation process cannot be linearized. Hence, in this paper, a general ADT model based on the Wiener process is proposed for accelerated degradation data analysis. The general model accounts for the unit-to-unit variation and temporal variation of the degradation process, and is suitable for both linear and nonlinear ADT analyses with single or multiple acceleration variables. Statistical inference is given to estimate the unknown parameters in both constant-stress and step-stress ADT. A simulation example and two real applications demonstrate that the proposed method yields reliable lifetime evaluation results compared with the existing linear and time-scale-transformation Wiener processes in both linear and nonlinear ADT analyses. PMID:28774107

  20. New model framework and structure and the commonality evaluation model [concerning unmanned spacecraft projects]

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The development of a framework and structure for shuttle era unmanned spacecraft projects and the development of a commonality evaluation model are documented. The methodology developed for model utilization in performing cost trades and comparative evaluations for commonality studies is discussed. The model framework consists of categories of activities associated with the spacecraft system's development process. The model structure describes the physical elements to be treated as separate identifiable entities. Cost estimating relationships for subsystem and program-level components were calculated.

  1. Measuring the effect of attention on simple visual search.

    PubMed

    Palmer, J; Ames, C T; Lindsey, D T

    1993-02-01

    Set-size effects in visual search may be due to one or more of three factors: sensory processes such as lateral masking between stimuli, attentional processes limiting the perception of individual stimuli, or attentional processes affecting the decision rules for combining information from multiple stimuli. These possibilities were evaluated in tasks such as searching for a longer line among shorter lines. To evaluate sensory contributions, display set-size effects were compared with cuing conditions that held sensory phenomena constant. Similar effects for the display and cue manipulations suggested that sensory processes contributed little under the conditions of this experiment. To evaluate the contribution of decision processes, the set-size effects were modeled with signal detection theory. In these models, a decision effect alone was sufficient to predict the set-size effects without any attentional limitation on perception.
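
    The decision-level account can be illustrated with a small signal detection simulation under a max decision rule: discriminability d' is held fixed (no perceptual attention limit), yet accuracy still falls as set size grows, because more noise samples compete with the target. The d' value and trial count below are arbitrary choices for illustration, not the study's parameters:

```python
import numpy as np

rng = np.random.default_rng(1)
d_prime = 2.0        # assumed, fixed target-distractor discriminability
n_trials = 20000

acc = {}
for set_size in (1, 2, 4, 8):
    # Target present on every trial: one signal plus (set_size - 1)
    # distractors; the observer picks the item with the largest
    # noisy response (max decision rule).
    target = rng.normal(d_prime, 1.0, n_trials)
    distractors = rng.normal(0.0, 1.0, (n_trials, set_size - 1))
    correct = target > distractors.max(axis=1, initial=-np.inf)
    acc[set_size] = correct.mean()
```

With set size 1 there are no distractors, so accuracy is 1 by construction; as set size increases, accuracy declines even though d' never changes, which is the decision-only set-size effect the abstract describes.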

  2. Evaluation of the Combined AERCOARE/AERMOD Modeling Approach for Offshore Sources

    EPA Science Inventory

    ENVIRON conducted an evaluation of the combined AERCOARE/AERMOD (AERCOARE-MOD) modeling approach for offshore sources using tracer data from four field studies. AERCOARE processes overwater meteorological data for use by the AERMOD air quality dispersion model (EPA, 2004a). AERC...

  3. SENSITIVE PARAMETER EVALUATION FOR A VADOSE ZONE FATE AND TRANSPORT MODEL

    EPA Science Inventory

    This report presents information pertaining to quantitative evaluation of the potential impact of selected parameters on output of vadose zone transport and fate models used to describe the behavior of hazardous chemicals in soil. The Vadose Zone Interactive Processes (VIP) model...

  4. Evaluating the sources of water to wells: Three techniques for metamodeling of a groundwater flow model

    USGS Publications Warehouse

    Fienen, Michael N.; Nolan, Bernard T.; Feinstein, Daniel T.

    2016-01-01

    For decision support, the insights and predictive power of numerical process models can be hampered by the expertise and computational resources required to evaluate system response to new stresses. An alternative is to emulate the process model with a statistical “metamodel.” Built on a dataset of collocated numerical model input and output, a groundwater flow model was emulated using a Bayesian network, an artificial neural network, and a gradient boosted regression tree. The response of interest was surface water depletion, expressed as the source of water to wells. The results have application for managing allocation of groundwater. Each technique was tuned using cross-validation and further evaluated using a held-out dataset. A numerical MODFLOW-USG model of the Lake Michigan Basin, USA, was used for the evaluation. The performance and interpretability of the techniques were compared, pointing to the advantages of each. The metamodel can extend to unmodeled areas.
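
    The metamodeling workflow, fitting a statistical emulator to collocated model input/output and scoring it on held-out data, can be sketched with one of the three techniques, a gradient boosted regression tree. The response surface below is a synthetic stand-in for the numerical model's output, not the MODFLOW-USG Lake Michigan Basin data, and the input names are hypothetical:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)

# Stand-in for collocated numerical-model input/output: four made-up
# inputs (e.g. pumping rate, well depth, ...) and a smooth nonlinear
# response plus noise, replacing the groundwater model's output.
X = rng.uniform(0.0, 1.0, size=(2000, 4))
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2 + 0.5 * X[:, 2] \
    + rng.normal(0.0, 0.05, 2000)

# Tune/fit on one split, then evaluate emulator skill on held-out data.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          random_state=0)
meta = GradientBoostingRegressor(n_estimators=300, max_depth=3,
                                 learning_rate=0.05)
meta.fit(X_tr, y_tr)
r2 = meta.score(X_te, y_te)   # held-out R**2 of the emulator
```

Once fit, `meta.predict` answers new what-if queries in microseconds, which is the point of emulating an expensive process model for decision support.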

  5. A Conceptual Framework for Evaluating Higher Education Institutions

    ERIC Educational Resources Information Center

    Chinta, Ravi; Kebritchi, Mansureh; Ellias, Janelle

    2016-01-01

    Purpose: Performance evaluation is a topic that has been researched and practiced extensively in business organizations but has received scant attention in higher education institutions. A review of literature revealed that context, input, process, product (CIPP) model is an appropriate performance evaluation model for higher education…

  6. Extreme Environments Development of Decision Processes and Training Programs for Medical Policy Formulation

    NASA Technical Reports Server (NTRS)

    Stough, Roger

    2004-01-01

    The purpose of this workshop was to survey existing health and safety policies as well as processes and practices for various extreme environments; to identify strengths and shortcomings of these processes; and to recommend parameters for inclusion in a generic approach to policy formulation, applicable to the broadest categories of extreme environments. It was anticipated that two additional workshops would follow. The November 7, 2003 workshop would be devoted to the evaluation of different model(s) and a concluding expert evaluation of the usefulness of the model using a policy formulation example. The final workshop was planned for March 2004.

  7. Evaluating Aerosol Process Modules within the Framework of the Aerosol Modeling Testbed

    NASA Astrophysics Data System (ADS)

    Fast, J. D.; Velu, V.; Gustafson, W. I.; Chapman, E.; Easter, R. C.; Shrivastava, M.; Singh, B.

    2012-12-01

    Factors that influence predictions of aerosol direct and indirect forcing, such as aerosol mass, composition, size distribution, hygroscopicity, and optical properties, still contain large uncertainties in both regional and global models. New aerosol treatments are usually implemented into a 3-D atmospheric model and evaluated using a limited number of measurements from a specific case study. Under this modeling paradigm, the performance and computational efficiency of several treatments for a specific aerosol process cannot be adequately quantified because many other factors among the various modeling studies (e.g., grid configuration, meteorology, emission rates) differ as well. The scientific community needs to know the advantages and disadvantages of specific aerosol treatments when the meteorology, chemistry, and other aerosol processes are identical in order to reduce the uncertainties associated with aerosol predictions. To address these issues, an Aerosol Modeling Testbed (AMT) has been developed that systematically and objectively evaluates new aerosol treatments for use in regional and global models. The AMT consists of the modular Weather Research and Forecasting (WRF) model, a series of testbed cases for which extensive in situ and remote sensing measurements of meteorological, trace gas, and aerosol properties are available, and a suite of tools to evaluate the performance of meteorological, chemical, and aerosol process modules. WRF contains various parameterizations of meteorological, chemical, and aerosol processes and includes interactive aerosol-cloud-radiation treatments similar to those employed by climate models. In addition, the physics suite from the Community Atmosphere Model version 5 (CAM5) has also been ported to WRF so that it can be tested at various spatial scales and compared directly with field campaign data and other parameterizations commonly used by the mesoscale modeling community. 
Data from several campaigns, including the 2006 MILAGRO, 2008 ISDAC, 2008 VOCALS, 2010 CARES, and 2010 CalNex campaigns, have been incorporated into the AMT as testbed cases. Data from operational networks (e.g. air quality, meteorology, satellite) are also included in the testbed cases to supplement the field campaign data. The CARES and CalNex testbed cases are used to demonstrate how the AMT can be used to assess the strengths and weaknesses of simple and complex representations of aerosol processes in relation to computational cost. Anticipated enhancements to the AMT and how this type of testbed can be used by the scientific community to foster collaborations and coordinate aerosol modeling research will also be discussed.

  8. Dynamic Modeling of Yield and Particle Size Distribution in Continuous Bayer Precipitation

    NASA Astrophysics Data System (ADS)

    Stephenson, Jerry L.; Kapraun, Chris

    Process engineers at Alcoa's Point Comfort refinery are using a dynamic model of the Bayer precipitation area to evaluate options in operating strategies. The dynamic model, a joint development effort between Point Comfort and the Alcoa Technical Center, predicts process yields, particle size distributions and occluded soda levels for various flowsheet configurations of the precipitation and classification circuit. In addition to rigorous heat, material and particle population balances, the model includes mechanistic kinetic expressions for particle growth and agglomeration and semi-empirical kinetics for nucleation and attrition. The kinetic parameters have been tuned to Point Comfort's operating data, with excellent matches between the model results and plant data. The model is written for the ACSL dynamic simulation program with specifically developed input/output graphical user interfaces to provide a user-friendly tool. Features such as a seed charge controller enhance the model's usefulness for evaluating operating conditions and process control approaches.

  9. Donabedian's structure-process-outcome quality of care model: Validation in an integrated trauma system.

    PubMed

    Moore, Lynne; Lavoie, André; Bourgeois, Gilles; Lapointe, Jean

    2015-06-01

    According to Donabedian's health care quality model, improvements in the structure of care should lead to improvements in clinical processes that should in turn improve patient outcome. This model has been widely adopted by the trauma community but has not yet been validated in a trauma system. The objective of this study was to assess the performance of an integrated trauma system in terms of structure, process, and outcome and evaluate the correlation between quality domains. Quality of care was evaluated for patients treated in a Canadian provincial trauma system (2005-2010; 57 centers, n = 63,971) using quality indicators (QIs) developed and validated previously. Structural performance was measured by transposing on-site accreditation visit reports onto an evaluation grid according to American College of Surgeons criteria. The composite process QI was calculated as the average sum of proportions of conformity to 15 process QIs derived from literature review and expert opinion. Outcome performance was measured using risk-adjusted rates of mortality, complications, and readmission as well as hospital length of stay (LOS). Correlation was assessed with Pearson's correlation coefficients. Statistically significant correlations were observed between structure and process QIs (r = 0.33), and process and outcome QIs (r = -0.33 for readmission, r = -0.27 for LOS). Significant positive correlations were also observed between outcome QIs (r = 0.37 for mortality-readmission; r = 0.39 for mortality-LOS and readmission-LOS; r = 0.45 for mortality-complications; r = 0.34 for readmission-complications; r = 0.63 for complications-LOS). Significant correlations between quality domains observed in this study suggest that Donabedian's structure-process-outcome model is a valid model for evaluating trauma care. 
Trauma centers that perform well in terms of structure also tend to perform well in terms of clinical processes, which in turn has a favorable influence on patient outcomes. Prognostic study, level III.

  10. Descriptive and evaluative judgment processes: behavioral and electrophysiological indices of processing symmetry and aesthetics.

    PubMed

    Jacobsen, Thomas; Höfel, Lea

    2003-12-01

    Descriptive symmetry and evaluative aesthetic judgment processes were compared using identical stimuli in both judgment tasks. Electrophysiological activity was recorded while participants judged novel formal graphic patterns in a trial-by-trial cuing setting using binary responses (symmetric, not symmetric; beautiful, not beautiful). Judgment analyses of a Phase 1 test and main experiment performance resulted in individual models, as well as group models, of the participants' judgment systems. Symmetry showed a strong positive correlation with beautiful judgments and was the most important cue. Descriptive judgments were performed faster than evaluative judgments. The ERPs revealed a phasic, early frontal negativity for the not-beautiful judgments. A sustained posterior negativity was observed in the symmetric condition. All conditions showed late positive potentials (LPPs). Evaluative judgment LPPs revealed a more pronounced right lateralization. It is argued that the present aesthetic judgments engage a two-stage process consisting of early, anterior frontomedian impression formation after 300 msec and right-hemisphere evaluative categorization around 600 msec after onset of the graphic patterns.

  11. GEM-AQ, an On-line Global Multiscale Chemical Weather System: Model Description and Evaluation of Gas Phase Chemistry Processes

    NASA Astrophysics Data System (ADS)

    Neary, L.; Kaminski, J. W.; Struzewska, J.; Ainslie, B.; McConnell, J. C.

    2007-12-01

    Tropospheric chemistry and air quality processes were implemented on-line in the Global Environmental Multiscale model. The integrated model, GEM-AQ, has been developed as a platform to investigate chemical weather at scales from global to urban. On the global scale, the model was exercised for five years (2001-2005) to evaluate its ability to simulate seasonal variations and regional distributions of trace gases such as ozone, nitrogen dioxide and carbon monoxide. The model results are compared with observations from satellites, aircraft measurement campaigns and balloon sondes. The same model has also been evaluated on the regional (~15km resolution) and urban scale (~3km resolution). A simulation of the formation and transport of photooxidants during the European heat wave of 2006 was performed and compared with surface observations throughout central and eastern Europe. The complex topographic region of the Lower Fraser Valley in British Columbia was the focus of another model evaluation during the PACIFIC 2001 field campaign. Comparison of model results with observations during this period will be shown.

  12. Developing Statistical Models to Assess Transplant Outcomes Using National Registries: The Process in the United States.

    PubMed

    Snyder, Jon J; Salkowski, Nicholas; Kim, S Joseph; Zaun, David; Xiong, Hui; Israni, Ajay K; Kasiske, Bertram L

    2016-02-01

    Created by the US National Organ Transplant Act in 1984, the Scientific Registry of Transplant Recipients (SRTR) is obligated to publicly report data on transplant program and organ procurement organization performance in the United States. These reports include risk-adjusted assessments of graft and patient survival, and programs performing worse or better than expected are identified. The SRTR currently maintains 43 risk adjustment models for assessing posttransplant patient and graft survival and, in collaboration with the SRTR Technical Advisory Committee, has developed and implemented a new systematic process for model evaluation and revision. Patient cohorts for the risk adjustment models are identified, and single-organ and multiorgan transplants are defined, then each risk adjustment model is developed following a prespecified set of steps. Model performance is assessed, the model is refit to a more recent cohort before each evaluation cycle, and then it is applied to the evaluation cohort. The field of solid organ transplantation is unique in the breadth of the standardized data that are collected. These data allow for quality assessment across all transplant providers in the United States. A standardized process of risk model development using data from national registries may enhance the field.

  13. Switching and optimizing control for coal flotation process based on a hybrid model

    PubMed Central

    Dong, Zhiyong; Wang, Ranfeng; Fan, Minqiang; Fu, Xiang

    2017-01-01

    Flotation is an important part of coal preparation, and the flotation column is widely applied as efficient flotation equipment. The process is complex and affected by many factors, with the froth depth and reagent dosage being two of the most important and frequently manipulated variables. This paper proposes a new method of switching and optimizing control for the coal flotation process. A hybrid model is built and evaluated using industrial data. First, wavelet analysis and principal component analysis (PCA) are applied for signal pre-processing. Second, a control model for optimizing the set point of the froth depth is constructed based on fuzzy control, and a control model is designed to optimize the reagent dosages based on an expert system. Finally, the least squares-support vector machine (LS-SVM) is used to identify the operating conditions of the flotation process and to select one of the two models (froth depth or reagent dosage) for subsequent operation according to the condition parameters. The hybrid model is developed and evaluated on an industrial coal flotation column and exhibits satisfactory performance. PMID:29040305

  14. Method for automatically evaluating a transition from a batch manufacturing technique to a lean manufacturing technique

    DOEpatents

    Ivezic, Nenad; Potok, Thomas E.

    2003-09-30

    A method for automatically evaluating a manufacturing technique comprises the steps of: receiving from a user manufacturing process step parameters characterizing a manufacturing process; accepting from the user a selection for an analysis of a particular lean manufacturing technique; automatically compiling process step data for each process step in the manufacturing process; automatically calculating process metrics from a summation of the compiled process step data for each process step; and, presenting the automatically calculated process metrics to the user. A method for evaluating a transition from a batch manufacturing technique to a lean manufacturing technique can comprise the steps of: collecting manufacturing process step characterization parameters; selecting a lean manufacturing technique for analysis; communicating the selected lean manufacturing technique and the manufacturing process step characterization parameters to an automatic manufacturing technique evaluation engine having a mathematical model for generating manufacturing technique evaluation data; and, using the lean manufacturing technique evaluation data to determine whether to transition from an existing manufacturing technique to the selected lean manufacturing technique.

  15. Process-level model evaluation: a snow and heat transfer metric

    NASA Astrophysics Data System (ADS)

    Slater, Andrew G.; Lawrence, David M.; Koven, Charles D.

    2017-04-01

    Land models require evaluation in order to understand results and guide future development. Examining functional relationships between model variables can provide insight into the ability of models to capture fundamental processes and aid in minimizing uncertainties or deficiencies in model forcing. This study quantifies the proficiency of land models to appropriately transfer heat from the soil through a snowpack to the atmosphere during the cooling season (Northern Hemisphere: October-March). Using the basic physics of heat diffusion, we investigate the relationship between seasonal amplitudes of soil versus air temperatures due to insulation from seasonal snow. Observations demonstrate the anticipated exponential relationship of attenuated soil temperature amplitude with increasing snow depth and indicate that the marginal influence of snow insulation diminishes beyond an effective snow depth of about 50 cm. A snow and heat transfer metric (SHTM) is developed to quantify model skill compared to observations. Land models within the CMIP5 experiment vary widely in SHTM scores, and deficiencies can often be traced to model structural weaknesses. The SHTM value for individual models is stable over 150 years of climate, 1850-2005, indicating that the metric is insensitive to climate forcing and can be used to evaluate each model's representation of the insulation process.
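
    The exponential insulation relationship behind the SHTM can be sketched numerically: with synthetic site data (not the paper's observations) generated from the attenuation model A_soil / A_air = exp(-d / d_eff), the assumed effective snow depth d_eff = 50 cm is recovered by a least-squares fit in log space:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic sites: seasonal snow depth d (cm) and the ratio of soil to
# air temperature amplitude, decaying as exp(-d / d_eff) with assumed
# effective depth d_eff = 50 cm plus multiplicative noise.
d_eff_true = 50.0
depth = rng.uniform(5.0, 150.0, 300)
ratio = np.exp(-depth / d_eff_true) * np.exp(rng.normal(0.0, 0.05, 300))

# log(ratio) = -depth / d_eff + intercept, so a linear fit gives d_eff.
slope = np.polyfit(depth, np.log(ratio), 1)[0]
d_eff_hat = -1.0 / slope
```

The fitted curve also shows why the marginal influence of snow diminishes: the derivative of exp(-d / d_eff) shrinks by a factor e for every additional d_eff of snow, so beyond roughly 50 cm extra depth adds little insulation.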

  16. Process-level model evaluation: a snow and heat transfer metric

    DOE PAGES

    Slater, Andrew G.; Lawrence, David M.; Koven, Charles D.

    2017-04-20

    Land models require evaluation in order to understand results and guide future development. Examining functional relationships between model variables can provide insight into the ability of models to capture fundamental processes and aid in minimizing uncertainties or deficiencies in model forcing. This study quantifies the proficiency of land models to appropriately transfer heat from the soil through a snowpack to the atmosphere during the cooling season (Northern Hemisphere: October–March). Using the basic physics of heat diffusion, we investigate the relationship between seasonal amplitudes of soil versus air temperatures due to insulation from seasonal snow. Observations demonstrate the anticipated exponential relationship of attenuated soil temperature amplitude with increasing snow depth and indicate that the marginal influence of snow insulation diminishes beyond an effective snow depth of about 50 cm. A snow and heat transfer metric (SHTM) is developed to quantify model skill compared to observations. Land models within the CMIP5 experiment vary widely in SHTM scores, and deficiencies can often be traced to model structural weaknesses. The SHTM value for individual models is stable over 150 years of climate, 1850–2005, indicating that the metric is insensitive to climate forcing and can be used to evaluate each model's representation of the insulation process.

  17. Process-level model evaluation: a snow and heat transfer metric

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Slater, Andrew G.; Lawrence, David M.; Koven, Charles D.

    Land models require evaluation in order to understand results and guide future development. Examining functional relationships between model variables can provide insight into the ability of models to capture fundamental processes and aid in minimizing uncertainties or deficiencies in model forcing. This study quantifies the proficiency of land models to appropriately transfer heat from the soil through a snowpack to the atmosphere during the cooling season (Northern Hemisphere: October–March). Using the basic physics of heat diffusion, we investigate the relationship between seasonal amplitudes of soil versus air temperatures due to insulation from seasonal snow. Observations demonstrate the anticipated exponential relationship of attenuated soil temperature amplitude with increasing snow depth and indicate that the marginal influence of snow insulation diminishes beyond an effective snow depth of about 50 cm. A snow and heat transfer metric (SHTM) is developed to quantify model skill compared to observations. Land models within the CMIP5 experiment vary widely in SHTM scores, and deficiencies can often be traced to model structural weaknesses. The SHTM value for individual models is stable over 150 years of climate, 1850–2005, indicating that the metric is insensitive to climate forcing and can be used to evaluate each model's representation of the insulation process.

  18. Social network supported process recommender system.

    PubMed

    Ye, Yanming; Yin, Jianwei; Xu, Yueshen

    2014-01-01

    Process recommendation technologies have gained increasing attention in the field of intelligent business process modeling as aids to process modeling. However, most existing technologies rely only on process structure analysis and do not take the social features of processes into account, even though process modeling is complex and comprehensive in most situations. This paper studies the feasibility of applying social network research technologies to process recommendation and builds a social network system of processes based on feature similarities. Three process matching degree measurements are then presented, and the system implementation is discussed. Finally, experimental evaluations and future work are presented.

  19. Applying the Quadruple Process model to evaluate change in implicit attitudinal responses during therapy for panic disorder.

    PubMed

    Clerkin, Elise M; Fisher, Christopher R; Sherman, Jeffrey W; Teachman, Bethany A

    2014-01-01

    This study explored the automatic and controlled processes that may influence performance on an implicit measure across cognitive-behavioral group therapy for panic disorder. The Quadruple Process model was applied to error scores from an Implicit Association Test evaluating associations between the concepts Me (vs. Not Me) + Calm (vs. Panicked) to evaluate four distinct processes: Association Activation, Detection, Guessing, and Overcoming Bias. Parameter estimates were calculated in the panic group (n = 28) across each treatment session where the IAT was administered, and at matched times when the IAT was completed in the healthy control group (n = 31). Association Activation for Me + Calm became stronger over treatment for participants in the panic group, demonstrating that it is possible to change automatically activated associations in memory (vs. simply overriding those associations) in a clinical sample via therapy. As well, the Guessing bias toward the calm category increased over treatment for participants in the panic group. This research evaluates key tenets about the role of automatic processing in cognitive models of anxiety, and emphasizes the viability of changing the actual activation of automatic associations in the context of treatment, versus only changing a person's ability to use reflective processing to overcome biased automatic processing. Copyright © 2013 Elsevier Ltd. All rights reserved.

  20. Fluent, fast, and frugal? A formal model evaluation of the interplay between memory, fluency, and comparative judgments.

    PubMed

    Hilbig, Benjamin E; Erdfelder, Edgar; Pohl, Rüdiger F

    2011-07-01

    A new process model of the interplay between memory and judgment processes was recently suggested, assuming that retrieval fluency (that is, the speed with which objects are recognized) will determine inferences concerning such objects in a single-cue fashion. This aspect of the fluency heuristic, an extension of the recognition heuristic, has remained largely untested due to methodological difficulties. To overcome the latter, we propose a measurement model from the class of multinomial processing tree models that can estimate true single-cue reliance on recognition and retrieval fluency. We applied this model to aggregate and individual data from a probabilistic inference experiment and considered both goodness of fit and model complexity to evaluate different hypotheses. The results were relatively clear-cut, revealing that the fluency heuristic is an unlikely candidate for describing comparative judgments concerning recognized objects. These findings are discussed in light of a broader theoretical view on the interplay of memory and judgment processes.

  1. Investigation of Mediational Processes Using Parallel Process Latent Growth Curve Modeling.

    ERIC Educational Resources Information Center

    Cheong, JeeWon; MacKinnon, David P.; Khoo, Siek Toon

    2003-01-01

    Investigated a method to evaluate mediational processes using latent growth curve modeling and tested it with empirical data from a longitudinal steroid use prevention program focusing on 1,506 high school football players over 4 years. Findings suggest the usefulness of the approach. (SLD)

  2. The Evaluation of Hospital Performance in Iran: A Systematic Review Article

    PubMed Central

    BAHADORI, Mohammadkarim; IZADI, Ahmad Reza; GHARDASHI, Fatemeh; RAVANGARD, Ramin; HOSSEINI, Seyed Mojtaba

    2016-01-01

    Background: This research aimed to systematically study and outline the methods of hospital performance evaluation used in Iran. Methods: In this systematic review, all Persian- and English-language articles published in Iranian and non-Iranian scientific journals indexed from Sep 2004 to Sep 2014 were studied. To find related articles, the researchers searched the Iranian electronic databases, including SID, IranMedex, IranDoc, and Magiran, as well as the non-Iranian electronic databases, including Medline, Embase, Scopus, and Google Scholar. A data extraction form developed by the researchers was used to review the selected articles. Results: The review process led to the selection of 51 articles. The publication of articles on hospital performance evaluation in Iran has increased considerably in recent years. Of these 51 articles, 38 (74.51%) had been published in Persian and 13 (25.49%) in English. Eight models were identified as evaluation models for Iranian hospitals. In total, 15 studies used the data envelopment analysis model to evaluate hospital performance. Conclusion: Using a combination of models to integrate indicators in the hospital evaluation process is inevitable. Therefore, the Ministry of Health and Medical Education should use a set of indicators, such as the balanced scorecard, in the process of hospital evaluation and accreditation and encourage hospital managers to use them. PMID:27516991
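
    Since data envelopment analysis recurs in these records, here is a minimal sketch of how an input-oriented CCR efficiency score is computed as a linear program in the envelopment form. The three "hospitals" and their input/output numbers are entirely hypothetical, and scipy is assumed available.

```python
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(inputs, outputs, o):
    # Input-oriented CCR efficiency of unit o (0 < score <= 1):
    #   minimise theta  s.t.  sum_j lam_j * x_j <= theta * x_o,
    #                         sum_j lam_j * y_j >= y_o,  lam >= 0.
    # Decision vector is [theta, lam_1, ..., lam_n].
    m, n = inputs.shape
    s = outputs.shape[0]
    c = np.zeros(n + 1)
    c[0] = 1.0                                       # minimise theta
    A_in = np.hstack([-inputs[:, [o]], inputs])      # input constraints
    A_out = np.hstack([np.zeros((s, 1)), -outputs])  # output constraints
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.concatenate([np.zeros(m), -outputs[:, o]])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1))
    return res.x[0]

# Columns are units; rows are inputs (beds, staff) and outputs (patients)
x = np.array([[20., 40., 40.], [30., 30., 60.]])
y = np.array([[100., 100., 100.]])
scores = [dea_efficiency(x, y, o) for o in range(3)]
```

    The third unit produces the same output from dominated inputs, so its score drops to 0.5 while the two frontier units score 1.0.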

  3. The Evaluation of Hospital Performance in Iran: A Systematic Review Article.

    PubMed

    Bahadori, Mohammadkarim; Izadi, Ahmad Reza; Ghardashi, Fatemeh; Ravangard, Ramin; Hosseini, Seyed Mojtaba

    2016-07-01

    This research aimed to systematically study and outline the methods of hospital performance evaluation used in Iran. In this systematic review, all Persian- and English-language articles published in Iranian and non-Iranian scientific journals indexed from Sep 2004 to Sep 2014 were studied. To find related articles, the researchers searched the Iranian electronic databases, including SID, IranMedex, IranDoc, and Magiran, as well as the non-Iranian electronic databases, including Medline, Embase, Scopus, and Google Scholar. A data extraction form developed by the researchers was used to review the selected articles. The review process led to the selection of 51 articles. The publication of articles on hospital performance evaluation in Iran has increased considerably in recent years. Of these 51 articles, 38 (74.51%) had been published in Persian and 13 (25.49%) in English. Eight models were identified as evaluation models for Iranian hospitals. In total, 15 studies used the data envelopment analysis model to evaluate hospital performance. Using a combination of models to integrate indicators in the hospital evaluation process is inevitable. Therefore, the Ministry of Health and Medical Education should use a set of indicators, such as the balanced scorecard, in the process of hospital evaluation and accreditation and encourage hospital managers to use them.

  4. A modeling framework for evaluating streambank stabilization practices for reach-scale sediment reduction

    USDA-ARS?s Scientific Manuscript database

    Streambank stabilization techniques are often implemented to reduce sediment loads from unstable streambanks. Process-based models can predict sediment yields with stabilization scenarios prior to implementation. However, a framework does not exist on how to effectively utilize these models to evalu...

  5. BAYESIAN STATISTICAL APPROACHES FOR THE EVALUATION OF CMAQ

    EPA Science Inventory

    This research focuses on the application of spatial statistical techniques for the evaluation of the Community Multiscale Air Quality (CMAQ) model. The upcoming release version of the CMAQ model was run for the calendar year 2001 and is in the process of being evaluated by EPA an...

  6. Multi-Dimensional Planning/Evaluation Schema for Community Education.

    ERIC Educational Resources Information Center

    Merkel-Keller, Claudia; Herr, Audrey

    A model for planning and evaluating community education programs--Stufflebeam's context, input, process, product (CIPP) evaluation model--was described and field-tested with the community education programs in Lakewood, New Jersey. Community education was defined as a concern for everything that affects the well-being of all citizens within a…

  7. Embedding chiropractic in Indigenous Health Care Organisations: applying the normalisation process model.

    PubMed

    Polus, Barbara I; Paterson, Charlotte; van Rotterdam, Joan; Vindigni, Dein

    2012-11-26

    Improving the health of Indigenous Australians remains a major challenge. A chiropractic service was established to evaluate this treatment option for musculoskeletal illness in rural Indigenous communities, based on the philosophy of keeping the community involved in all phases of development, implementation, and evaluation. The development and integration of this service has experienced many difficulties with referrals, funding, and building sustainability. Evaluation of the program was a key aspect of its implementation, requiring an appropriate process to identify specific problems and formulate solutions to improve the service. We used the normalisation process model (May 2006) to order the data collected in consultation meetings and to inform our strategy and actions. The normalisation process model provided us with a structure for organising consultation meeting data and helped prioritise tasks. Our data were analysed as they applied to each dimension of the model, noting aspects that the model did not encompass. During this process we reworded the dimensions into more everyday terminology. The final analysis focused on the extent to which the model helped us to prioritise and systematise our tasks and plans. We used the model to consider ways to promote the chiropractic service, to enhance relationships and interactions between clinicians and procedures within the health service, and to avoid disruption of the existing service. We identified ways in which chiropractors can become trusted team members who have acceptable and recognised knowledge and skills. We also developed strategies that should result in chiropractic practitioners finding a place within a complex occupational web, by being seen as similar to well-known occupations such as physiotherapy. Interestingly, one dimension identified by our data, which we have labelled 'emancipatory', was absent from the model. The normalisation process model has resulted in a number of new insights and questions. 
We have now established thriving weekly chiropractic clinics staffed by a team of volunteer chiropractors. We identified an 'emancipatory' dimension that requires further study. We provide a worked example of using this model to establish, integrate and evaluate a chiropractic service in an Indigenous Australian community.

  8. Computer modeling of lung cancer diagnosis-to-treatment process

    PubMed Central

    Ju, Feng; Lee, Hyo Kyung; Osarogiagbon, Raymond U.; Yu, Xinhua; Faris, Nick

    2015-01-01

    We introduce an example of a rigorous, quantitative method for quality improvement in lung cancer care delivery. Computer process modeling methods are introduced for the lung cancer diagnosis, staging, and treatment selection process. Two types of process modeling techniques, discrete event simulation (DES) and analytical models, are briefly reviewed. Recent developments in DES are outlined, and the data and procedures necessary to develop a DES model of the lung cancer diagnosis process, leading up to surgical treatment, are summarized. The analytical models include both Markov chain models and closed formulas. Markov chain models and their applications in healthcare are introduced, and the approach used to derive a lung cancer diagnosis process model is presented. Similarly, the procedure used to derive closed formulas evaluating the performance of the diagnosis process is outlined. Finally, the pros and cons of these methods are discussed. PMID:26380181
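
    As an illustration of the analytical-model side, an absorbing Markov chain gives closed-form expected times to treatment via the fundamental matrix N = (I - Q)^-1. The states and weekly transition probabilities below are hypothetical stand-ins, not numbers from the paper.

```python
import numpy as np

# Hypothetical transient care states: 0 = referral, 1 = staging work-up,
# 2 = treatment selection; surgery is the absorbing state. Rows give weekly
# transition probabilities among transient states; the remainder of each row
# is the probability of moving on to surgery that week.
Q = np.array([
    [0.3, 0.7, 0.0],   # referral: repeat visit 0.3, advance to staging 0.7
    [0.0, 0.2, 0.8],   # staging: repeat 0.2, advance to selection 0.8
    [0.0, 0.0, 0.5],   # selection: repeat 0.5, proceed to surgery 0.5
])

# Fundamental matrix N = (I - Q)^-1; its row sums are the expected number of
# weeks spent before absorption when starting in each transient state.
N = np.linalg.inv(np.eye(3) - Q)
expected_weeks = N.sum(axis=1)
```

    With these numbers, a patient entering at referral waits about 4.68 weeks on average before surgery; changing any transition probability propagates through the closed formula immediately, which is the appeal of the analytical approach over re-running a simulation.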

  9. Evaluation of nursing practice: process and critique.

    PubMed

    Braunstein, M S

    1998-01-01

    This article describes the difficulties in conducting clinical trials to evaluate nursing practice models. Suggestions are offered for strengthening the process. A clinical trial of a nursing practice model based on a synthesis of Aristotelian theory with Rogers' science is described. The rationale for decisions regarding the research procedures used is presented. Methodological limitations of the study design and the specifications of the practice model are examined. It is concluded that clear specification of theoretical relationships within a practice model and clear identification of key intervening variables will enable researchers to better connect the treatment with the outcome.

  10. Evaluation of an urban canopy model in a tropical city: the role of tree evapotranspiration

    NASA Astrophysics Data System (ADS)

    Liu, Xuan; Li, Xian-Xiang; Harshan, Suraj; Roth, Matthias; Velasco, Erik

    2017-09-01

    A single-layer urban canopy model (SLUCM) with enhanced hydrologic processes is evaluated in a tropical city, Singapore. The evaluation was performed using an 11 month offline simulation with the coupled Noah land surface model/SLUCM over a compact low-rise residential area. Various hydrological processes are considered, including anthropogenic latent heat release and evaporation from impervious urban facets. Results show that the prediction of energy fluxes, in particular latent heat flux, is improved when these processes are included. However, the simulated latent heat flux is still underestimated by ∼40%. Considering Singapore’s high green cover ratio, the tree evapotranspiration process is introduced into the model, which significantly improves the simulated latent heat flux. In particular, the systematic error of the model is greatly reduced, and becomes lower than the unsystematic error in some seasons. The effect of tree evapotranspiration on the urban surface energy balance is further demonstrated during an unusual dry spell. The present study demonstrates that even at sites with relatively low (11%) tree coverage, ignoring evapotranspiration from trees may cause serious underestimation of the latent heat flux and atmospheric humidity. The improved model is also transferable to other tropical or temperate regions to study the impact of tree evapotranspiration on urban climate.

  11. Evaluating climate model performance in the tropics with retrievals of water isotopic composition from Aura TES

    NASA Astrophysics Data System (ADS)

    Field, Robert; Kim, Daehyun; Kelley, Max; LeGrande, Allegra; Worden, John; Schmidt, Gavin

    2014-05-01

    Observational and theoretical arguments suggest that satellite retrievals of the stable isotope composition of water vapor could be useful for climate model evaluation. The isotopic composition of water vapor is controlled by the same processes that control water vapor amount, but the observed distribution of isotopic composition is distinct from amount itself. This is due to the fractionation that occurs between the abundant H₂¹⁶O isotopologue and the rare, heavy H₂¹⁸O and HDO isotopologues during evaporation and condensation. The fractionation physics are much simpler than the underlying moist physics; discrepancies between observed and modeled isotopic fields are more likely due to problems in the latter. Isotopic measurements therefore have the potential for identifying problems that might not be apparent from more conventional measurements. Isotopic tracers have existed in climate models since the 1980s, but it is only since the mid-2000s that there have been enough data for meaningful model evaluation in this sense, in the troposphere at least. We have evaluated the NASA GISS ModelE2 general circulation model over the tropics against water isotope (HDO/H₂O) retrievals from the Aura Tropospheric Emission Spectrometer (TES), alongside more conventional measurements. A small ensemble of experiments was performed with physics perturbations to the cumulus and planetary boundary layer schemes, done in the context of the normal model development process. We examined the degree to which model-data agreement could be used to constrain a select group of internal processes in the model, namely condensate evaporation, entrainment strength, and moist convective air mass flux. All are difficult to parameterize, but exert strong influence over model performance. We found that the water isotope composition was significantly more sensitive to physics changes than precipitation, temperature or relative humidity through the depth of the tropical troposphere. 
Among the processes considered, this was most closely, and fairly exclusively, related to mid-tropospheric entrainment strength. This demonstrates that water isotope retrievals have considerable potential alongside more conventional measurements for climate model evaluation and development.

  12. Evaluating the Quality of the Learning Outcome in Healthcare Sector: The Expero4care Model

    ERIC Educational Resources Information Center

    Cervai, Sara; Polo, Federica

    2015-01-01

    Purpose: This paper aims to present the Expero4care model. Considering the growing need for a training evaluation model that does not simply fix processes, the Expero4care model represents the first attempt at a "quality model" dedicated to the learning outcomes of healthcare trainings. Design/Methodology/Approach: Created as development…

  13. Using logic models in a community-based agricultural injury prevention project.

    PubMed

    Helitzer, Deborah; Willging, Cathleen; Hathorn, Gary; Benally, Jeannie

    2009-01-01

    The National Institute for Occupational Safety and Health has long promoted the logic model as a useful tool in an evaluator's portfolio. Because a logic model supports a systematic approach to designing interventions, it is equally useful for program planners. Undertaken with community stakeholders, a logic model process articulates the underlying foundations of a particular programmatic effort and enhances program design and evaluation. Most often presented as sequenced diagrams or flow charts, logic models demonstrate relationships among the following components: statement of a problem, various causal and mitigating factors related to that problem, available resources to address the problem, theoretical foundations of the selected intervention, intervention goals and planned activities, and anticipated short- and long-term outcomes. This article describes a case example of how a logic model process was used to help community stakeholders on the Navajo Nation conceive, design, implement, and evaluate agricultural injury prevention projects.

  14. Identifying the Needs of Pre-Service Classroom Teachers about Science Teaching Methodology Course in Terms of Parlett's Illuminative Program Evaluation Model

    ERIC Educational Resources Information Center

    Çaliskan, Ilke

    2014-01-01

    The aim of this study was to identify the needs of third grade classroom teaching students about science teaching course in terms of Parlett's Illuminative program evaluation model. Phenomographic research design was used in this study. Illuminative program evaluation model was chosen for this study in terms of its eclectic and process-based…

  15. Identifying the Needs of Pre-Service Classroom Teachers about Science Teaching Methodology Courses in Terms of Parlett's Illuminative Program Evaluation Model

    ERIC Educational Resources Information Center

    Çaliskan, Ilke

    2014-01-01

    The aim of this study was to identify the needs of third grade classroom teaching students about science teaching course in terms of Parlett's Illuminative program evaluation model. Phenomographic research design was used in this study. Illuminative program evaluation model was chosen for this study in terms of its eclectic and process-based…

  16. Carbon Cycle Model Linkage Project (CCMLP): Evaluating Biogeochemical Process Models with Atmospheric Measurements and Field Experiments

    NASA Astrophysics Data System (ADS)

    Heimann, M.; Prentice, I. C.; Foley, J.; Hickler, T.; Kicklighter, D. W.; McGuire, A. D.; Melillo, J. M.; Ramankutty, N.; Sitch, S.

    2001-12-01

    Models of biophysical and biogeochemical processes are being used, either offline or in coupled climate-carbon cycle (C4) models, to assess climate- and CO2-induced feedbacks on atmospheric CO2. Observations of atmospheric CO2 concentration, and supplementary tracers including O2 concentrations and isotopes, offer unique opportunities to evaluate the large-scale behaviour of models. Global patterns, temporal trends, and interannual variability of the atmospheric CO2 concentration and its seasonal cycle provide crucial benchmarks for simulations of regionally-integrated net ecosystem exchange; flux measurements by eddy correlation allow a far more demanding model test at the ecosystem scale than conventional indicators, such as measurements of annual net primary production; and large-scale manipulations, such as the Duke Forest Free Air Carbon Enrichment (FACE) experiment, give a standard to evaluate modelled phenomena such as ecosystem-level CO2 fertilization. Model runs including historical changes of CO2, climate and land use allow comparison with regional-scale monthly CO2 balances as inferred from atmospheric measurements. Such comparisons are providing grounds for some confidence in current models, while pointing to processes that may still be inadequately treated. Current plans focus on (1) continued benchmarking of land process models against flux measurements across ecosystems and experimental findings on the ecosystem-level effects of enhanced CO2, reactive N inputs and temperature; (2) improved representation of land use, forest management and crop metabolism in models; and (3) a strategy for the evaluation of C4 models in a historical observational context.

  17. Detection and quantification of flow consistency in business process models.

    PubMed

    Burattin, Andrea; Bernstein, Vered; Neurauter, Manuel; Soffer, Pnina; Weber, Barbara

    2018-01-01

    Business process models abstract complex business processes by representing them as graphical models. Their layout, as determined by the modeler, may have an effect when these models are used. However, this effect is currently not fully understood. In order to systematically study this effect, a basic set of measurable key visual features is proposed, depicting the layout properties that are meaningful to the human user. The aim of this research is thus twofold: first, to empirically identify key visual features of business process models which are perceived as meaningful to the user and second, to show how such features can be quantified into computational metrics, which are applicable to business process models. We focus on one particular feature, consistency of flow direction, and show the challenges that arise when transforming it into a precise metric. We propose three different metrics addressing these challenges, each following a different view of flow consistency. We then report the results of an empirical evaluation, which indicates which metric is more effective in predicting the human perception of this feature. Moreover, two other automatic evaluations describing the performance and the computational capabilities of our metrics are reported as well.
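
    To make the idea of turning a perceived visual feature into a computational metric concrete, here is one possible flow-consistency measure. It is a hypothetical construction, not one of the paper's three metrics: bin each edge's drawing direction into the nearest compass quadrant and report the share of edges in the dominant quadrant.

```python
import math
from collections import Counter

def flow_consistency(edges):
    # edges: iterable of (x1, y1, x2, y2) layout coordinates.
    # Returns a score in (0.25, 1.0]: 1.0 means every edge flows the same
    # way; values near 0.25 mean no dominant direction among the four.
    def quadrant(x1, y1, x2, y2):
        angle = math.atan2(y2 - y1, x2 - x1)      # radians in (-pi, pi]
        return round(angle / (math.pi / 2)) % 4   # 0=E, 1=N, 2=W, 3=S
    bins = Counter(quadrant(*e) for e in edges)
    return max(bins.values()) / len(edges)

# A small model laid out mostly left-to-right, with one back edge
edges = [(0, 0, 2, 0), (2, 0, 4, 0), (4, 0, 6, 1), (6, 1, 1, 1)]
score = flow_consistency(edges)
```

    Even this toy version surfaces the design questions the paper raises, e.g. whether a slightly tilted edge should count toward the dominant direction (here it does, via quadrant rounding) and how back edges should be weighted.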

  18. Evaluating models of population process in a threatened population of Steller’s eiders: A retrospective approach

    USGS Publications Warehouse

    Dunham, Kylee; Grand, James B.

    2016-10-11

    The Alaskan breeding population of Steller’s eiders (Polysticta stelleri) was listed as threatened under the Endangered Species Act in 1997 in response to perceived declines in abundance throughout their breeding and nesting range. Aerial surveys suggest the breeding population is small and highly variable in number, with zero birds counted in 5 of the last 25 years. Research was conducted to evaluate competing population process models of Alaskan-breeding Steller’s eiders through comparison of model projections to aerial survey data. To evaluate model efficacy and estimate demographic parameters, a Bayesian state-space modeling framework was used, and each model was fit to counts from the annual aerial surveys using sequential importance sampling and resampling. The results strongly indicate that the Alaskan breeding population experiences population-level nonbreeding events and is open to exchange with the larger Russian-Pacific breeding population. Current recovery criteria for the Alaskan breeding population rely heavily on the ability to estimate population viability. The results of this investigation provide an informative model of the population process that can be used to examine future population states and assess the population in terms of the current recovery and reclassification criteria.
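
    The sequential importance sampling and resampling step used to fit such state-space models can be sketched as a bootstrap particle filter. The model below, a log-abundance random walk observed through Poisson counts, is a minimal stand-in for illustration only, not the study's actual population process model or data.

```python
import numpy as np

rng = np.random.default_rng(42)

def particle_filter(counts, n_particles=2000, growth=0.0, proc_sd=0.3):
    # Latent state: log-abundance follows a random walk with drift `growth`;
    # observation: aerial counts are Poisson around the latent abundance.
    log_n = rng.normal(np.log(counts[0] + 1.0), 1.0, n_particles)  # init particles
    estimates = []
    for y in counts:
        log_n = log_n + growth + rng.normal(0.0, proc_sd, n_particles)  # propagate
        log_w = y * log_n - np.exp(log_n)       # Poisson log-likelihood (no constant)
        w = np.exp(log_w - log_w.max())         # normalise in log space for stability
        w /= w.sum()
        idx = rng.choice(n_particles, size=n_particles, p=w)  # resample by weight
        log_n = log_n[idx]
        estimates.append(np.exp(log_n).mean())  # filtered abundance estimate
    return np.array(estimates)

counts = np.array([120, 95, 140, 0, 110, 130])  # a zero mimics a nonbreeding year
est = particle_filter(counts)
```

    Competing process models (with or without nonbreeding years, open or closed to immigration) would each be run through such a filter, and the resulting likelihoods compared to rank the models.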

  19. Exploring the Suitability of an English for Health Sciences Program: Model and Report of a Self-Evaluation Process

    ERIC Educational Resources Information Center

    Ferrer, Erica; Pérez, Yuddy

    2017-01-01

    Program evaluation is a process of carefully collecting information in order to make informed decisions to strengthen specific components of a given program. The type of evaluation an institution decides to undertake depends on the purpose as well as on the information the institution wants to find out about its program. Self-evaluation represents…

  20. Availability Control for Means of Transport in Decisive Semi-Markov Models of Exploitation Process

    NASA Astrophysics Data System (ADS)

    Migawa, Klaudiusz

    2012-12-01

    The issues presented in this paper concern the control of the exploitation (operation and maintenance) process implemented in complex systems of exploitation of technical objects. The article describes a method of controlling the availability of technical objects (means of transport), based on a mathematical model of the exploitation process formulated as a semi-Markov decision process. The method consists of building a semi-Markov decision model of the exploitation process and then selecting the best control strategy (the optimal strategy) from among the possible decision variants, according to the adopted criterion (or criteria) for evaluating the operation of the exploitation system. In this method, determining the optimal availability-control strategy means choosing a sequence of control decisions, made in the individual states of the modelled exploitation process, for which the function serving as the evaluation criterion reaches its extreme value. A genetic algorithm was chosen to search for the optimal control strategy. The method is illustrated with the exploitation process of means of transport in a real municipal bus transport system. The model of the exploitation process was built on the basis of data collected in that system, and it was formulated as a homogeneous semi-Markov process.
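
    The genetic-algorithm search over per-state decision variants can be sketched as follows. The per-state availability gains, costs, and budget-penalised criterion are entirely hypothetical stand-ins for the paper's semi-Markov criterion function; only the search mechanics (selection, crossover, mutation over strategies) mirror the described approach.

```python
import random

random.seed(1)

N_STATES, N_ACTIONS, BUDGET = 6, 3, 10

# Hypothetical effect of each maintenance decision on availability, and its
# cost, per exploitation state (illustrative numbers, not fleet data).
GAIN = [[0.70, 0.85, 0.95]] * N_STATES
COST = [[0, 2, 4]] * N_STATES

def fitness(strategy):
    # Criterion: mean availability, heavily penalised when over budget.
    total_cost = sum(COST[s][a] for s, a in enumerate(strategy))
    avail = sum(GAIN[s][a] for s, a in enumerate(strategy)) / N_STATES
    return avail - (1.0 if total_cost > BUDGET else 0.0)

def evolve(pop_size=60, generations=120, p_mut=0.1):
    # A strategy is one action index per state (a sequence of control decisions).
    pop = [[random.randrange(N_ACTIONS) for _ in range(N_STATES)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]             # truncation selection (elitist)
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_STATES)   # one-point crossover
            child = a[:cut] + b[cut:]
            for i in range(N_STATES):             # per-gene mutation
                if random.random() < p_mut:
                    child[i] = random.randrange(N_ACTIONS)
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

    In the paper's setting the fitness evaluation would instead simulate or solve the semi-Markov model under the candidate strategy; the surrounding evolutionary loop is unchanged.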

  1. UCODE_2005 and six other computer codes for universal sensitivity analysis, calibration, and uncertainty evaluation constructed using the JUPITER API

    USGS Publications Warehouse

    Poeter, Eileen E.; Hill, Mary C.; Banta, Edward R.; Mehl, Steffen; Christensen, Steen

    2006-01-01

    This report documents the computer codes UCODE_2005 and six post-processors. Together the codes can be used with existing process models to perform sensitivity analysis, data needs assessment, calibration, prediction, and uncertainty analysis. Any process model or set of models can be used; the only requirements are that models have numerical (ASCII or text only) input and output files, that the numbers in these files have sufficient significant digits, that all required models can be run from a single batch file or script, and that simulated values are continuous functions of the parameter values. Process models can include pre-processors and post-processors as well as one or more models related to the processes of interest (physical, chemical, and so on), making UCODE_2005 extremely powerful. An estimated parameter can be a quantity that appears in the input files of the process model(s), or a quantity used in an equation that produces a value that appears in the input files. In the latter situation, the equation is user-defined. UCODE_2005 can compare observations and simulated equivalents. The simulated equivalents can be any simulated value written in the process-model output files or can be calculated from simulated values with user-defined equations. The quantities can be model results, or dependent variables. For example, for ground-water models they can be heads, flows, concentrations, and so on. Prior, or direct, information on estimated parameters also can be considered. Statistics are calculated to quantify the comparison of observations and simulated equivalents, including a weighted least-squares objective function. In addition, data-exchange files are produced that facilitate graphical analysis. UCODE_2005 can be used fruitfully in model calibration through its sensitivity analysis capabilities and its ability to estimate parameter values that result in the best possible fit to the observations. 
Parameters are estimated using nonlinear regression: a weighted least-squares objective function is minimized with respect to the parameter values using a modified Gauss-Newton method or a double-dogleg technique. Sensitivities needed for the method can be read from files produced by process models that can calculate sensitivities, such as MODFLOW-2000, or can be calculated by UCODE_2005 using a more general, but less accurate, forward- or central-difference perturbation technique. Problems resulting from inaccurate sensitivities and solutions related to the perturbation techniques are discussed in the report. Statistics are calculated and printed for use in (1) diagnosing inadequate data and identifying parameters that probably cannot be estimated; (2) evaluating estimated parameter values; and (3) evaluating how well the model represents the simulated processes. Results from UCODE_2005 and codes RESIDUAL_ANALYSIS and RESIDUAL_ANALYSIS_ADV can be used to evaluate how accurately the model represents the processes it simulates. Results from LINEAR_UNCERTAINTY can be used to quantify the uncertainty of model simulated values if the model is sufficiently linear. Results from MODEL_LINEARITY and MODEL_LINEARITY_ADV can be used to evaluate model linearity and, thereby, the accuracy of the LINEAR_UNCERTAINTY results. UCODE_2005 can also be used to calculate nonlinear confidence and prediction intervals, which quantify the uncertainty of model simulated values when the model is not linear. CORFAC_PLUS can be used to produce factors that allow intervals to account for model intrinsic nonlinearity and small-scale variations in system characteristics that are not explicitly accounted for in the model or the observation weighting. The six post-processing programs are independent of UCODE_2005 and can use the results of other programs that produce the required data-exchange files. UCODE_2005 and the other six codes are intended for use on any computer operating system. 
The programs con
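
    The core estimation loop described above, weighted least squares minimised by a modified Gauss-Newton method with sensitivities from a forward-difference perturbation, can be sketched as follows. The toy process model, synthetic observations, and step-halving safeguard are illustrative assumptions, not UCODE_2005's actual implementation.

```python
import numpy as np

def gauss_newton(model, p0, obs, weights, tol=1e-8, max_iter=50, dp=1e-6):
    # Minimise S(p) = sum_i w_i * (obs_i - model(p)_i)^2, treating the
    # process model as a black box, as UCODE does when the model cannot
    # supply analytical sensitivities.
    p = np.asarray(p0, dtype=float)
    W = np.diag(weights)
    S = lambda q: float(np.sum(weights * (obs - model(q)) ** 2))
    for _ in range(max_iter):
        y0 = model(p)
        r = obs - y0
        J = np.empty((len(obs), len(p)))          # sensitivity (Jacobian) matrix
        for j in range(len(p)):
            pj = p.copy()
            pj[j] += dp
            J[:, j] = (model(pj) - y0) / dp       # forward-difference perturbation
        step = np.linalg.solve(J.T @ W @ J, J.T @ W @ r)  # normal equations
        damp = 1.0                                # step halving ("modified" GN)
        while S(p + damp * step) > S(p) and damp > 1e-8:
            damp *= 0.5
        p = p + damp * step
        if np.max(np.abs(damp * step)) < tol:
            break
    return p

# Hypothetical process model: response y(t) = a * (1 - exp(-b t))
t = np.array([1.0, 2.0, 4.0, 8.0])
model = lambda p: p[0] * (1.0 - np.exp(-p[1] * t))
obs = model(np.array([5.0, 0.5]))                 # noise-free synthetic observations
p_hat = gauss_newton(model, [1.0, 1.0], obs, np.ones(4))
```

    The step-halving line search stands in for the trust-region machinery of a production code; without some damping, the raw Gauss-Newton step can overshoot into nonphysical parameter values on problems like this one.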

  2. Reverse Osmosis Processing of Organic Model Compounds and Fermentation Broths

    DTIC Science & Technology

    2006-04-01

    AFRL-ML-TY-TP-2007-4545 (postprint): Reverse Osmosis Processing of Organic Model Compounds and Fermentation Broths. Robert Diltz ... Bioresource Technology 98 (2007) 686–695; ... December 2005; accepted 31 January 2006; available online 4 April 2006. Abstract: Post-treatment of an anaerobic fermentation broth was evaluated using a 150

  3. An Expedient Study on Back-Propagation (BPN) Neural Networks for Modeling Automated Evaluation of the Answers and Progress of Deaf Students' That Possess Basic Knowledge of the English Language and Computer Skills

    NASA Astrophysics Data System (ADS)

    Vrettaros, John; Vouros, George; Drigas, Athanasios S.

    This article studies the expediency of using neural network technology and the development of back-propagation network (BPN) models for the automated evaluation of the answers and progress of deaf students who possess basic knowledge of the English language and computer skills, within a virtual e-learning environment. The performance of the developed neural models is evaluated using the correlation factor between the networks' response values and the real data, as well as the percentage error between the networks' estimates and the real data, both during the training process and afterwards on unknown data not used in training.
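
    A back-propagation network of the kind described reduces to a forward pass, a backward error pass, and a gradient update. The sketch below trains a tiny BPN on XOR as a stand-in task and evaluates it with a correlation factor, mirroring the evaluation approach in the abstract; the architecture, data, and hyperparameters are illustrative, not the study's.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy stand-in data: XOR plays the role of "answer pattern -> assessment score"
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
t = np.array([[0.], [1.], [1.], [0.]])

W1 = rng.normal(0.0, 1.0, (2, 8)); b1 = np.zeros(8)   # 2 inputs -> 8 hidden units
W2 = rng.normal(0.0, 1.0, (8, 1)); b2 = np.zeros(1)   # hidden -> 1 output

lr = 0.5
for _ in range(20000):
    h = sigmoid(X @ W1 + b1)                  # forward pass
    y = sigmoid(h @ W2 + b2)
    d2 = (y - t) * y * (1.0 - y)              # output delta (squared-error loss)
    d1 = (d2 @ W2.T) * h * (1.0 - h)          # error back-propagated to hidden layer
    W2 -= lr * h.T @ d2; b2 -= lr * d2.sum(axis=0)
    W1 -= lr * X.T @ d1; b1 -= lr * d1.sum(axis=0)

# Evaluate as the study does: correlation between network responses and targets
corr = np.corrcoef(y.ravel(), t.ravel())[0, 1]
```

    A correlation factor near 1.0 on held-out answers is what would indicate the network has learned the tutors' scoring behaviour rather than memorised the training set.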

  4. EVALUATING REGIONAL PREDICTIVE CAPACITY OF A PROCESS-BASED MERCURY EXPOSURE MODEL, REGIONAL-MERCURY CYCLING MODEL (R-MCM), APPLIED TO 91 VERMONT AND NEW HAMPSHIRE LAKES AND PONDS, USA

    EPA Science Inventory

    Regulatory agencies must develop fish consumption advisories for many lakes and rivers with limited resources. Process-based mathematical models are potentially valuable tools for developing regional fish advisories. The Regional Mercury Cycling model (R-MCM) was specifically d...

  5. Engaging partners to initiate evaluation efforts: tactics used and lessons learned from the prevention research centers program.

    PubMed

    Wright, Demia Sundra; Anderson, Lynda A; Brownson, Ross C; Gwaltney, Margaret K; Scherer, Jennifer; Cross, Alan W; Goodman, Robert M; Schwartz, Randy; Sims, Tom; White, Carol R

    2008-01-01

    The Centers for Disease Control and Prevention's (CDC's) Prevention Research Centers (PRC) Program underwent a 2-year evaluation planning project using a participatory process that allowed perspectives from the national community of PRC partners to be expressed and reflected in a national logic model. The PRC Program recognized the challenge in developing a feasible, useable, and relevant evaluation process for a large, diverse program. To address the challenge, participatory and utilization-focused evaluation models were used. Four tactics guided the evaluation planning process: 1) assessing stakeholders' communication needs and existing communication mechanisms and infrastructure; 2) using existing mechanisms and establishing others as needed to inform, educate, and request feedback; 3) listening to and using feedback received; and 4) obtaining adequate resources and building flexibility into the project plan to support multifaceted mechanisms for data collection. Participatory methods resulted in buy-in from stakeholders and the development of a national logic model. Benefits included CDC's use of the logic model for program planning and development of a national evaluation protocol and increased expectations among PRC partners for involvement. Challenges included the time, effort, and investment of program resources required for the participatory approach and the identification of whom to engage and when to engage them for feedback on project decisions. By using a participatory and utilization-focused model, program partners positively influenced how CDC developed an evaluation plan. The tactics we used can guide the involvement of program stakeholders and help with decisions on appropriate methods and approaches for engaging partners.

  6. Evaluating Organic Aerosol Model Performance: Impact of two Embedded Assumptions

    NASA Astrophysics Data System (ADS)

    Jiang, W.; Giroux, E.; Roth, H.; Yin, D.

    2004-05-01

    Organic aerosols are important due to their abundance in the polluted lower atmosphere and their impact on human health and vegetation. However, modeling organic aerosols is a very challenging task because of the complexity of aerosol composition, structure, and formation processes. Assumptions and their associated uncertainties in both models and measurement data make model performance evaluation a truly demanding job. Although some assumptions are obvious, others are hidden and embedded, and can significantly impact modeling results, possibly even changing conclusions about model performance. This paper focuses on analyzing the impact of two embedded assumptions on evaluation of organic aerosol model performance. One assumption is about the enthalpy of vaporization widely used in various secondary organic aerosol (SOA) algorithms. The other is about the conversion factor used to obtain ambient organic aerosol concentrations from measured organic carbon. These two assumptions reflect uncertainties in the model and in the ambient measurement data, respectively. For illustration purposes, various choices of the assumed values are implemented in the evaluation process for an air quality model based on CMAQ (the Community Multiscale Air Quality Model). Model simulations are conducted for the Lower Fraser Valley covering Southwest British Columbia, Canada, and Northwest Washington, United States, for a historical pollution episode in 1993. To understand the impact of the assumed enthalpy of vaporization on modeling results, its impact on instantaneous organic aerosol yields (IAY) through partitioning coefficients is analysed first. The analysis shows that utilizing different enthalpy of vaporization values causes changes in the shapes of IAY curves and in the response of SOA formation capability of reactive organic gases to temperature variations. 
    These changes are then carried into the air quality model and cause substantial changes in the organic aerosol modeling results. Similarly, using different assumed factors to convert measured organic carbon to organic aerosol concentrations causes substantial variations in the processed ambient data themselves, which are normally used as performance targets for model evaluations. The combination of uncertainties in the modeling results and in the moving performance targets causes major uncertainties in the final conclusion about model performance. Without further information, the best a modeler can do is to choose a combination of assumed values from the sensible parameter ranges available in the literature, based on the best match of the modeling results with the processed measurement data. However, the best match of the modeling results with the processed measurement data does not necessarily guarantee that the model itself is rigorous or that the model performance is robust. Conclusions on the model performance can only be reached with sufficient understanding of the uncertainties and their impact.

  7. Chemical kinetic and photochemical data for use in stratospheric modeling evaluation number 4: NASA panel for data evaluation

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Evaluated sets of rate constants and photochemical cross sections compiled by the Panel are presented. The primary application of the data is in the modelling of stratospheric processes, with particular emphasis on the ozone layer and its possible perturbation by anthropogenic and natural phenomena.

  8. The Design and Evaluation of Teaching Experiments in Computer Science.

    ERIC Educational Resources Information Center

    Forcheri, Paola; Molfino, Maria Teresa

    1992-01-01

    Describes a relational model that was developed to provide a framework for the design and evaluation of teaching experiments for the introduction of computer science in secondary schools in Italy. Teacher training is discussed, instructional materials are considered, and use of the model for the evaluation process is described. (eight references)…

  9. Issues in Evaluation. Symposium 11. [AHRD Conference, 2001].

    ERIC Educational Resources Information Center

    2001

    This document contains three papers on issues in evaluation. "Evaluation of the Method of Modeling: A Case Study of the Finnish Steel Industry" (Ville Nurmi) describes the method of modeling as an educational strategy to support both specific goal-directed transformative learning focused on work process and learning in workplaces, and it…

  10. Social Network Supported Process Recommender System

    PubMed Central

    Ye, Yanming; Yin, Jianwei; Xu, Yueshen

    2014-01-01

    Process recommendation technologies have gained increasing attention in the field of intelligent business process modeling as aids to the modeling task. However, most existing technologies rely only on process structure analysis and do not take the social features of processes into account, even though process modeling is complex and comprehensive in most situations. This paper studies the feasibility of applying social network research technologies to process recommendation and builds a social network system of processes based on feature similarities. Three process matching degree measurements are then presented, and the system implementation is discussed. Finally, experimental evaluations and future work are introduced. PMID:24672309

  11. The computational modeling of supercritical carbon dioxide flow in solid wood material

    NASA Astrophysics Data System (ADS)

    Gething, Brad Allen

    The use of supercritical carbon dioxide (SC CO2) as a solvent to deliver chemicals to porous media has shown promise in various industries. Recently, the wood treating industry has made efforts to use SC CO2 as a replacement for more traditional methods of chemical preservative delivery. Previous studies have shown that the SC CO2 pressure treatment process is capable of impregnating solid wood materials with chemical preservatives, but concentration gradients of preservative often develop during treatment. Widespread application of the treatment process is unlikely unless the treatment inconsistencies can be reduced for greater overall treating homogeneity. The development of a computational flow model to accurately predict the internal pressure of CO2 during treatment is integral to a more consistent treatment process. While similar models that attempt to describe the flow process have been proposed by Ward (1989) and Sahle-Demessie (1994), neither has been evaluated for accuracy. The present study was an evaluation of those models. More specifically, it evaluated the performance of a computational flow model based on the viscous flow of compressible CO2 as a single phase through a porous medium at the macroscopic scale. Flow model performance was evaluated by comparing predicted pressures with internal pressure development measured with inserted sensor probes during treatment of specimens. Pressure measurements were made using a technique developed by Schneider (2000), which utilizes epoxy-sealed stainless steel tubes inserted into the wood as pressure probes. Two wood species, Douglas-fir and shortleaf pine, were investigated as treating specimens.
    Evaluations of the computational flow model revealed that it is sensitive to input parameters relating to both processing conditions and material properties, particularly treating temperature and wood permeability, respectively. This sensitivity requires that the input parameters, principally permeability, be relatively accurate in order to evaluate the appropriateness of the phenomenological relationships of the computational flow model. Given this stipulation, it was observed that below the region of transition from CO2 gas to supercritical fluid, the computational flow model has the potential to predict flow accurately. Above the transition region, however, the model does not fully account for the physics of the flow process, resulting in prediction inaccuracy. One potential cause for the loss of prediction accuracy in the supercritical region was attributed to a dynamic change in permeability, likely caused by an interaction between the flowing SC CO2 and the wood material. Furthermore, a hysteresis was observed between the pressurization and depressurization stages of treatment, which the current flow model cannot explain. If greater accuracy in the computational flow model is desired, a more complex approach is necessary, including non-constant input parameters of temperature and permeability. Finally, the implications of a multi-scale methodology for the flow model were explored from a qualitative standpoint.

  12. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Heng; Ye, Ming; Walker, Anthony P.

    Hydrological models are always composed of multiple components that represent processes key to intended model applications. When a process can be simulated by multiple conceptual-mathematical models (process models), model uncertainty in representing the process arises. While global sensitivity analysis methods have been widely used for identifying important processes in hydrologic modeling, the existing methods consider only parametric uncertainty and ignore the model uncertainty of the process representation. To address this problem, this study develops a new method to probe multimodel process sensitivity by integrating model averaging methods into the framework of variance-based global sensitivity analysis, given that the model averaging methods quantify both parametric and model uncertainty. A new process sensitivity index is derived as a metric of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and model parameters. For demonstration, the new index is used to evaluate the recharge and geology processes in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models with different parameterizations of hydraulic conductivity; each process model has its own random parameters. The new process sensitivity index is mathematically general and can be applied to a wide range of problems in hydrology and beyond.
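    The idea of a sensitivity index that pools variance over both process models and their parameters can be sketched with a toy Monte Carlo experiment. The two recharge models, the conductivity distribution, and the binning estimator below are illustrative assumptions of this sketch, not the paper's actual derivation:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 20000

def recharge_m1(P, a): return a * P           # linear precipitation-to-recharge model
def recharge_m2(P, b): return b * np.sqrt(P)  # alternative conversion model

def output(R, K):  # toy groundwater response to recharge R and conductivity K
    return R / K

# Sample the recharge process: model choice (equal prior weights) plus its parameter
P = 100.0  # fixed precipitation forcing (mm), an assumption
model = rng.integers(0, 2, N)
a = rng.uniform(0.1, 0.3, N)
b = rng.uniform(1.0, 3.0, N)
R = np.where(model == 0, recharge_m1(P, a), recharge_m2(P, b))
K = rng.lognormal(0.0, 0.5, N)   # geology process: random hydraulic conductivity

y = output(R, K)

# First-order process sensitivity: Var(E[y | recharge process]) / Var(y),
# estimated by conditioning on quantile bins of the recharge value
bins = np.quantile(R, np.linspace(0, 1, 51))
idx = np.clip(np.digitize(R, bins) - 1, 0, 49)
cond_means = np.array([y[idx == k].mean() for k in range(50)])
weights = np.array([(idx == k).mean() for k in range(50)])
ps_recharge = np.sum(weights * (cond_means - y.mean()) ** 2) / y.var()
print("process sensitivity of recharge:", round(ps_recharge, 2))
```

    By the law of total variance the index lies between 0 and 1, with larger values marking the recharge process (models plus parameters together) as more important.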

  13. Mathematical modelling of disintegration-limited co-digestion of OFMSW and sewage sludge.

    PubMed

    Esposito, G; Frunzo, L; Panico, A; d'Antonio, G

    2008-01-01

    This paper presents a mathematical model able to simulate, under dynamic conditions, the physical, chemical and biological processes prevailing in an OFMSW and sewage sludge anaerobic digestion system. The proposed model is based on differential mass balance equations for the substrates, products and bacterial groups involved in the co-digestion process and includes the biochemical reactions of substrate conversion and the kinetics of microbial growth and decay. The main peculiarity of the model is its surface-based kinetic description of the OFMSW disintegration process, whereas the pH determination is based on a ninth-order polynomial equation derived from acid-base equilibria. The model can be applied to simulate the co-digestion process for several purposes, such as the evaluation of the optimal process conditions in terms of OFMSW/sewage sludge ratio, temperature, OFMSW particle size, solid mixture retention time, reactor stirring rate, etc. Biogas production and composition can also be evaluated to estimate the potential energy production under different process conditions. In particular, model simulations reported in this paper show the model's capability to predict the amount of OFMSW that can be treated in the digester of an existing MWWTP and to assess the OFMSW particle size reduction pre-treatment required to increase the rate of the disintegration process, which can otherwise strongly limit the co-digestion system. Copyright IWA Publishing 2008.
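    The surface-based kinetics mentioned above (disintegration rate proportional to particle surface area) can be sketched for a single spherical particle; the rate constant, density, and particle size below are hypothetical values, not the study's:

```python
import numpy as np

# Surface-based disintegration kinetics: dM/dt = -K_sbk * A, with A the
# particle surface area. For a sphere of density rho this reduces to a
# linear shrinkage of the radius: dR/dt = -K_sbk / rho.
K_sbk = 2.0e-6   # kg m^-2 s^-1, hypothetical disintegration constant
rho = 1000.0     # kg m^-3, hypothetical particle density
R0 = 5.0e-3      # m, hypothetical initial particle radius

dt, t_end = 60.0, 30 * 86400.0  # explicit Euler over 30 days
R, masses, t = R0, [], 0.0
while t < t_end and R > 0:
    masses.append(rho * (4 / 3) * np.pi * R**3)  # current particle mass
    R -= (K_sbk / rho) * dt                      # dR/dt = -K_sbk / rho
    t += dt

# Analytic time for the particle mass to halve under these kinetics
halving = R0 * rho / K_sbk * (1 - 0.5 ** (1 / 3))
print(f"final radius: {max(R, 0) * 1e3:.2f} mm, mass-halving time: {halving / 86400:.1f} days")
```

    Because the radius shrinks at a constant rate, smaller particles disintegrate proportionally faster, which is the rationale for the particle-size reduction pre-treatment the abstract evaluates.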

  14. Modeling particulate matter emissions during mineral loading process under weak wind simulation.

    PubMed

    Zhang, Xiaochun; Chen, Weiping; Ma, Chun; Zhan, Shuifen

    2013-04-01

    Quantifying particulate matter emissions from mineral handling is an important problem for estimating global emissions from industrial sites. Mineral particulate matter emissions can adversely impact environmental quality in mining regions, transport regions, and even on a global scale. Mineral loading is an important process contributing to these emissions, especially under weak wind conditions. Mathematical models are effective tools for evaluating particulate matter emissions during the mineral loading process. The currently used empirical models, based on the form of a power function, do not predict particulate matter emissions accurately under weak wind conditions: they overestimate emission factors at low emission levels and underestimate them at high levels. We conducted wind tunnel experiments to evaluate the particulate matter emission factors for the mineral loading process. A new approach based on the mathematical form of a logistic function was developed and tested. It provided a realistic depiction of the particulate matter emissions during the mineral loading process, accounting for the fraction of fine mineral particles, the dropping height, and the wind velocity. Copyright © 2013 Elsevier B.V. All rights reserved.
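    The contrast the abstract draws is that a logistic curve saturates while a power law grows without bound. A minimal numerical sketch, with parameter values invented purely for illustration:

```python
import numpy as np

# Hypothetical logistic emission-factor model for weak-wind mineral loading,
# EF(u) = L / (1 + exp(-k * (u - u0))), versus a power-law fit EF = a * u**b.
L, k, u0 = 12.0, 1.8, 2.5      # assumed logistic parameters
a, b = 1.1, 1.6                # assumed power-law parameters

def ef_logistic(u): return L / (1 + np.exp(-k * (u - u0)))
def ef_power(u): return a * u ** b

u = np.linspace(0.5, 5.0, 10)  # weak-wind velocities (m/s)
for ui in u:
    print(f"u={ui:.1f} m/s  logistic={ef_logistic(ui):6.2f}  power={ef_power(ui):6.2f}")
```

    The logistic form stays small at low wind speeds and levels off toward L at higher speeds, whereas the power law keeps increasing, which mirrors the over/underestimation pattern described above.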

  15. Evaluation of the mathematical and economic basis for conversion processes in the LEAP energy-economy model

    NASA Astrophysics Data System (ADS)

    Oblow, E. M.

    1982-10-01

    An evaluation was made of the mathematical and economic basis for conversion processes in the Long-term Energy Analysis Program (LEAP) energy economy model. Conversion processes are the main modeling subunit in LEAP used to represent energy conversion industries and are supposedly based on the classical economic theory of the firm. Questions about uniqueness and existence of LEAP solutions and their relation to classical equilibrium economic theory prompted the study. An analysis of classical theory and LEAP model equations was made to determine their exact relationship. The conclusions drawn from this analysis were that LEAP theory is not consistent with the classical theory of the firm. Specifically, the capacity factor formalism used by LEAP does not support a classical interpretation in terms of a technological production function for energy conversion processes. The economic implications of this inconsistency are suboptimal process operation and short term negative profits in years where plant operation should be terminated. A new capacity factor formalism, which retains the behavioral features of the original model, is proposed to resolve these discrepancies.

  16. Phased models for evaluating the performability of computing systems

    NASA Technical Reports Server (NTRS)

    Wu, L. T.; Meyer, J. F.

    1979-01-01

    A phase-by-phase modelling technique is introduced to evaluate a fault tolerant system's ability to execute different sets of computational tasks during different phases of the control process. Intraphase processes are allowed to differ from phase to phase. The probabilities of interphase state transitions are specified by interphase transition matrices. Based on constraints imposed on the intraphase and interphase transition probabilities, various iterative solution methods are developed for calculating system performability.
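    A minimal numerical sketch of the phased technique described above: state probabilities evolve within each phase under an intraphase transition matrix and are mapped across phase boundaries by an interphase matrix. All matrices and phase counts here are invented for illustration:

```python
import numpy as np

# States: 0 = fully operational, 1 = degraded, 2 = failed (absorbing).
intra = np.array([[0.95, 0.04, 0.01],    # intraphase step transitions
                  [0.00, 0.90, 0.10],
                  [0.00, 0.00, 1.00]])
inter = np.array([[0.99, 0.01, 0.00],    # reconfiguration at phase boundary
                  [0.05, 0.90, 0.05],
                  [0.00, 0.00, 1.00]])

p = np.array([1.0, 0.0, 0.0])            # start fully operational
for phase in range(3):                   # three control phases
    p = p @ np.linalg.matrix_power(intra, 10)  # 10 intraphase steps
    p = p @ inter                        # interphase transition
print("P(not failed after 3 phases) =", round(1 - p[2], 3))
```

    Performability measures are then obtained by attaching a reward (the set of computational tasks accomplished) to each trajectory through these phase-wise state distributions.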

  17. Validation of X1 motorcycle model in industrial plant layout by using WITNESSTM simulation software

    NASA Astrophysics Data System (ADS)

    Hamzas, M. F. M. A.; Bareduan, S. A.; Zakaria, M. Z.; Tan, W. J.; Zairi, S.

    2017-09-01

    This paper presents a case study on the simulation, modelling, and analysis of the X1 motorcycle model. A motorcycle assembly plant was selected as the main site of the research. Simulation techniques using the WITNESS software were applied to evaluate the performance of the existing manufacturing system. The main objective is to validate the data and determine their significant impact on the overall performance of the system for future improvement. The validation process started once the layout of the assembly line had been identified. All components were evaluated to validate whether the data are significant for future improvement. Machine and labor statistics are among the parameters evaluated for process improvement. The average total cycle time for the given workstations is used as the criterion for comparing possible variants. The simulation process showed that the data used are appropriate and meet the criteria for two-sided assembly line problems.

  18. The challenges of modelling phosphorus in a headwater catchment: Applying a 'limits of acceptability' uncertainty framework to a water quality model

    NASA Astrophysics Data System (ADS)

    Hollaway, M. J.; Beven, K. J.; Benskin, C. McW. H.; Collins, A. L.; Evans, R.; Falloon, P. D.; Forber, K. J.; Hiscock, K. M.; Kahana, R.; Macleod, C. J. A.; Ockenden, M. C.; Villamizar, M. L.; Wearing, C.; Withers, P. J. A.; Zhou, J. G.; Barber, N. J.; Haygarth, P. M.

    2018-03-01

    There is a need to model and predict the transfer of phosphorus (P) from land to water, but this is challenging because of the large number of complex physical and biogeochemical processes involved. This study presents, for the first time, a 'limits of acceptability' approach within the Generalized Likelihood Uncertainty Estimation (GLUE) framework applied to the Soil and Water Assessment Tool (SWAT), for a water quality problem in the Newby Beck catchment (12.5 km2), Cumbria, United Kingdom (UK). Using high frequency outlet data (discharge and P), individual evaluation criteria (limits of acceptability) were assigned to observed discharge and P loads for all evaluation time steps, identifying where the model was performing well or poorly and which processes required improvement in the model structure. The initial limits of acceptability had to be relaxed substantially (by factors of between 5.3 and 6.7 on a normalized scale, depending on the evaluation criteria used) in order to gain a set of behavioral simulations (1001 and 1016, respectively, out of 5,000,000). Of the 39 model parameters tested, the representation of subsurface processes and their associated parameters was consistently shown to be critical to the model's failure to meet the evaluation criteria, irrespective of the chosen evaluation metric. It is therefore concluded that SWAT is not an appropriate model to guide P management in this catchment. This approach highlights the importance of high frequency monitoring data for setting robust model evaluation criteria. It also raises the question of whether sufficient input data can be made available to drive such models so that we can have confidence in their predictions and in their ability to inform catchment management strategies to tackle diffuse pollution from agriculture.
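    The screening step itself is simple to sketch: a run counts as behavioral only if it falls inside the limits at every evaluation time step. The synthetic observations, the ±20% limits, and the random stand-in "model runs" below are assumptions of this sketch, not SWAT output or the study's actual limits:

```python
import numpy as np

rng = np.random.default_rng(1)
T, N = 100, 2000

# Synthetic "observed" P load series and per-time-step limits of acceptability
obs = 5 + 2 * np.sin(np.linspace(0, 6, T)) + rng.normal(0, 0.2, T)
lower, upper = obs * 0.8, obs * 1.2        # assumed +/-20% acceptability limits

# Monte Carlo "model runs": a random bias plus noise stands in for model output
sims = obs * rng.uniform(0.6, 1.4, (N, 1)) + rng.normal(0, 0.5, (N, T))

inside = (sims >= lower) & (sims <= upper)
behavioral = inside.all(axis=1)            # must satisfy every time step
print("behavioral runs:", behavioral.sum(), "of", N)

# Relaxing the limits (as the study had to) can only admit more runs
relaxed = ((sims >= obs * 0.5) & (sims <= obs * 1.5)).all(axis=1)
print("after relaxation:", relaxed.sum(), "of", N)
```

    Requiring acceptability at every high-frequency time step is a demanding criterion, which is why only a tiny fraction of the study's 5,000,000 runs survived even after relaxation.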

  19. Notification: Evaluation of EPA’s Approval Process for Air Quality Dispersion Models

    EPA Pesticide Factsheets

    Project #OPE-FY17-0016, June 5, 2017. The EPA OIG plans to begin preliminary research to assess the effectiveness of EPA's process for reviewing and approving air quality dispersion models it recommends for use.

  20. A Regional Climate Model Evaluation System based on Satellite and other Observations

    NASA Astrophysics Data System (ADS)

    Lean, P.; Kim, J.; Waliser, D. E.; Hall, A. D.; Mattmann, C. A.; Granger, S. L.; Case, K.; Goodale, C.; Hart, A.; Zimdars, P.; Guan, B.; Molotch, N. P.; Kaki, S.

    2010-12-01

    Regional climate models are a fundamental tool for downscaling global climate simulations and projections, such as those contributing to the Coupled Model Intercomparison Projects (CMIPs) that form the basis of the IPCC Assessment Reports. The regional modeling process provides the means to accommodate higher resolution and a greater complexity of Earth System processes. Evaluation of both global and regional climate models against observations is essential to identify model weaknesses and to direct future model development efforts at reducing the uncertainty associated with climate projections. However, the lack of reliable observational data and of formal tools are among the serious limitations to addressing these objectives. Recent satellite observations are particularly useful as they provide a wealth of information on many different aspects of the climate system, but due to their large volume and the difficulties associated with accessing and using them, these datasets have been generally underutilized in model evaluation studies. Recognizing this problem, NASA JPL / UCLA is developing a model evaluation system to help make satellite observations, in conjunction with in-situ, assimilated, and reanalysis datasets, more readily accessible to the modeling community. The system includes a central database that stores multiple datasets in a common format, and codes for calculating predefined statistical metrics to assess model performance. This reduces the time taken to compare model simulations with satellite observations from weeks to days. Early results from the use of this new model evaluation system for evaluating regional climate simulations over the California/western US region will be presented.
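    A sketch of the kind of predefined statistical metrics such a system computes when comparing a model series against a satellite or reanalysis reference; the synthetic data and the particular metric set are illustrative assumptions, not the system's actual code:

```python
import numpy as np

def evaluate(model, obs):
    """Predefined metrics: mean bias, RMSE, and correlation."""
    bias = np.mean(model - obs)
    rmse = np.sqrt(np.mean((model - obs) ** 2))
    corr = np.corrcoef(model, obs)[0, 1]
    return {"bias": bias, "rmse": rmse, "corr": corr}

rng = np.random.default_rng(0)
obs = 15 + 8 * np.sin(np.linspace(0, 2 * np.pi, 365))  # synthetic annual cycle
model = obs + 1.0 + rng.normal(0, 2.0, 365)            # biased, noisy "model"
m = evaluate(model, obs)
print({k: round(v, 2) for k, v in m.items()})
```

    Storing reference datasets in a common format means one such function can be reused across observation sources, which is where the weeks-to-days speedup comes from.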

  1. Motivational Forces in a Growth-Centered Model of Teacher Evaluation

    ERIC Educational Resources Information Center

    Bruski, Nicholas Aron

    2012-01-01

    This paper presents the results of a study that explored the effects of using an action research process to examine and develop a system of teacher evaluation that leads to real changes in teacher behaviors. The study explored motivational forces and psychological processes related to the change process in adult behaviors. Data were collected by…

  2. A data collection and processing procedure for evaluating a research program

    Treesearch

    Giuseppe Rensi; H. Dean Claxton

    1972-01-01

    A set of computer programs compiled for the information processing requirements of a model for evaluating research proposals is described. The programs serve to assemble and store information, periodically update it, and convert it to a form usable for decision-making. Guides for collecting and coding data are explained. The data-processing options available and...

  3. Evaluation of soil erosion risk using Analytic Network Process and GIS: a case study from Spanish mountain olive plantations.

    PubMed

    Nekhay, Olexandr; Arriaza, Manuel; Boerboom, Luc

    2009-07-01

    The study presents an approach that combines objective information, such as sampling or experimental data, with subjective information, such as expert opinions. This combined approach is based on the Analytic Network Process (ANP) method. It was applied to evaluate soil erosion risk and overcomes one of the drawbacks of the USLE/RUSLE soil erosion models, namely that they do not consider interactions among soil erosion factors. Another advantage of the method is that it can be used when experimental data are insufficient; the lack of experimental data can be compensated for through expert evaluations. As an example of the proposed approach, the risk of soil erosion was evaluated in olive groves in Southern Spain, showing the potential of the ANP method for modelling a complex physical process like soil erosion.

  4. Testing process predictions of models of risky choice: a quantitative model comparison approach

    PubMed Central

    Pachur, Thorsten; Hertwig, Ralph; Gigerenzer, Gerd; Brandstätter, Eduard

    2013-01-01

    This article presents a quantitative model comparison contrasting the process predictions of two prominent views on risky choice. One view assumes a trade-off between probabilities and outcomes (or non-linear functions thereof) and the separate evaluation of risky options (expectation models). Another view assumes that risky choice is based on comparative evaluation, limited search, aspiration levels, and the forgoing of trade-offs (heuristic models). We derived quantitative process predictions for a generic expectation model and for a specific heuristic model, namely the priority heuristic (Brandstätter et al., 2006), and tested them in two experiments. The focus was on two key features of the cognitive process: acquisition frequencies (i.e., how frequently individual reasons are looked up) and direction of search (i.e., gamble-wise vs. reason-wise). In Experiment 1, the priority heuristic predicted direction of search better than the expectation model (although neither model predicted the acquisition process perfectly); acquisition frequencies, however, were inconsistent with both models. Additional analyses revealed that these frequencies were primarily a function of what Rubinstein (1988) called “similarity.” In Experiment 2, the quantitative model comparison approach showed that people seemed to rely more on the priority heuristic in difficult problems, but to make more trade-offs in easy problems. This finding suggests that risky choice may be based on a mental toolbox of strategies. PMID:24151472
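    The priority heuristic's lexicographic steps are easy to state in code. A minimal sketch for two-outcome gain gambles, following the published aspiration levels (one tenth of the maximum gain for outcomes, 0.1 for probabilities); the tuple encoding of gambles is an assumption of this sketch:

```python
# Gambles are encoded as (low_outcome, p_low, high_outcome, p_high).
def priority_heuristic(g1, g2):
    max_gain = max(g1[2], g2[2])
    # Reason 1: compare minimum gains; stop if the difference reaches
    # the aspiration level of 1/10 of the maximum gain.
    if abs(g1[0] - g2[0]) >= 0.1 * max_gain:
        return g1 if g1[0] > g2[0] else g2
    # Reason 2: compare probabilities of the minimum gains (aspiration 0.1);
    # prefer the gamble whose worst outcome is less likely.
    if abs(g1[1] - g2[1]) >= 0.1:
        return g1 if g1[1] < g2[1] else g2
    # Reason 3: the maximum gains decide.
    return g1 if g1[2] > g2[2] else g2

sure = (100, 1.0, 100, 0.0)    # 100 for sure
risky = (0, 0.2, 120, 0.8)     # 120 with p = .8, else 0
print(priority_heuristic(sure, risky))  # reason 1 decides: the sure thing
```

    Because search stops at the first decisive reason, the heuristic predicts reason-wise acquisition and limited search, which is exactly what the article's process measures (acquisition frequencies and direction of search) test.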

  5. Historical Climate Change Impacts on the Hydrological Processes of the Ponto-Caspian Basin

    NASA Astrophysics Data System (ADS)

    Koriche, Sifan A.; Singarayer, Joy S.; Coe, Michael T.; Nandini, Sri; Prange, Matthias; Cloke, Hannah; Lunt, Dan

    2017-04-01

    The Ponto-Caspian basin is one of the largest basins globally, composed of a closed basin (the Caspian Sea) and open basins connecting to the global ocean (the Black and Azov Seas). Over the historical period (1850-present), Caspian Sea levels have varied between -25 and -29 m below sea level (Arpe et al., 2012), resulting in considerable changes to the area of the lake (currently 371,000 km2). Given projections of future climate change and the importance of the Caspian Sea for fisheries, agriculture, and industry, it is vital to understand how sea levels may vary in the future. Hydrological models can be used to assess the impacts of climate change on hydrological processes for future forecasts. However, it is critical to first evaluate such models using observational data for the present and recent past, and to understand the key hydrological processes driving past changes in sea level. In this study, the Terrestrial Hydrological Model (THMB) (Coe, 2000, 2002) is applied and evaluated to investigate the hydrological processes of the Ponto-Caspian basin for the historical period 1900 to 2000. The model is forced with observational reanalysis datasets (ERA-Interim, ERA-20) and historical climate model outputs (from the CESM and HadCM3 models) to investigate the variability in the Caspian Sea level and the major river discharges. We examine the differences produced by driving the hydrological model with reanalysis data versus climate models, and we evaluate the model performance against observational discharge measurements and Caspian Sea level data. Second, we investigate the sensitivity of historical Caspian Sea level variations to different aspects of climate change to identify the most important processes involved over this period.

  6. Managing complexity in simulations of land surface and near-surface processes

    DOE PAGES

    Coon, Ethan T.; Moulton, J. David; Painter, Scott L.

    2016-01-12

    Increasing computing power and the growing role of simulation in Earth systems science have led to an increase in the number and complexity of processes in modern simulators. We present a multiphysics framework that specifies interfaces for coupled processes and automates weak and strong coupling strategies to manage this complexity. Process management is enabled by viewing the system of equations as a tree, where individual equations are associated with leaf nodes and coupling strategies with internal nodes. A dynamically generated dependency graph connects a variable to its dependencies, streamlining and automating model evaluation, easing model development, and ensuring models are modular and flexible. Additionally, the dependency graph is used to ensure that data requirements are consistent between all processes in a given simulation. Here we discuss the design and implementation of these concepts within the Arcos framework, and demonstrate their use for verification testing and hypothesis evaluation in numerical experiments.
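    The dependency-graph idea (each variable declares its dependencies, and requesting a variable triggers recursive, cached evaluation of everything it needs) can be sketched in a few lines. This is a generic illustration of the concept, not the Arcos API:

```python
# Toy dependency-graph evaluator: each node holds its dependency names and
# an update function; get() recursively evaluates and caches dependencies.
class DependencyGraph:
    def __init__(self):
        self.nodes, self.cache = {}, {}

    def add(self, name, deps, func):
        self.nodes[name] = (deps, func)

    def get(self, name):
        if name in self.cache:
            return self.cache[name]
        deps, func = self.nodes[name]
        value = func(*[self.get(d) for d in deps])  # evaluate dependencies first
        self.cache[name] = value
        return value

g = DependencyGraph()
g.add("pressure", [], lambda: 101325.0)
g.add("temperature", [], lambda: 293.15)
g.add("density", ["pressure", "temperature"],
      lambda p, T: p / (287.05 * T))   # ideal gas law for dry air
print(round(g.get("density"), 3))
```

    Because every variable's requirements are declared explicitly, the same graph structure can also be walked to check that data requirements are consistent across coupled processes, as the abstract describes.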

  7. Process evaluation of the Enabling Mothers to Prevent Pediatric Obesity Through Web-Based Learning and Reciprocal Determinism (EMPOWER) randomized control trial.

    PubMed

    Knowlden, Adam P; Sharma, Manoj

    2014-09-01

    Family-and-home-based interventions are an important vehicle for preventing childhood obesity. Systematic process evaluations have not been routinely conducted in assessment of these interventions. The purpose of this study was to plan and conduct a process evaluation of the Enabling Mothers to Prevent Pediatric Obesity Through Web-Based Learning and Reciprocal Determinism (EMPOWER) randomized control trial. The trial was composed of two web-based, mother-centered interventions for prevention of obesity in children between 4 and 6 years of age. Process evaluation used the components of program fidelity, dose delivered, dose received, context, reach, and recruitment. Categorical process evaluation data (program fidelity, dose delivered, dose exposure, and context) were assessed using Program Implementation Index (PII) values. Continuous process evaluation variables (dose satisfaction and recruitment) were assessed using ANOVA tests to evaluate mean differences between groups (experimental and control) and sessions (sessions 1 through 5). Process evaluation results found that both groups (experimental and control) were equivalent, and interventions were administered as planned. Analysis of web-based intervention process objectives requires tailoring of process evaluation models for online delivery. Dissemination of process evaluation results can advance best practices for implementing effective online health promotion programs. © 2014 Society for Public Health Education.

  8. An Evaluation of Understandability of Patient Journey Models in Mental Health

    PubMed Central

    2016-01-01

    Background There is a significant trend toward implementing health information technology to reduce administrative costs and improve patient care. Unfortunately, little awareness exists of the challenges of integrating information systems with existing clinical practice. The systematic integration of clinical processes with information system and health information technology can benefit the patients, staff, and the delivery of care. Objectives This paper presents a comparison of the degree of understandability of patient journey models. In particular, the authors demonstrate the value of a relatively new patient journey modeling technique called the Patient Journey Modeling Architecture (PaJMa) when compared with traditional manufacturing based process modeling tools. The paper also presents results from a small pilot case study that compared the usability of 5 modeling approaches in a mental health care environment. Method Five business process modeling techniques were used to represent a selected patient journey. A mix of both qualitative and quantitative methods was used to evaluate these models. Techniques included a focus group and survey to measure usability of the various models. Results The preliminary evaluation of the usability of the 5 modeling techniques has shown increased staff understanding of the representation of their processes and activities when presented with the models. Improved individual role identification throughout the models was also observed. The extended version of the PaJMa methodology provided the most clarity of information flows for clinicians. Conclusions The extended version of PaJMa provided a significant improvement in the ease of interpretation for clinicians and increased the engagement with the modeling process. The use of color and its effectiveness in distinguishing the representation of roles was a key feature of the framework not present in other modeling approaches. 
Future research should focus on extending the pilot case study to a more diversified group of clinicians and health care support workers. PMID:27471006

  9. Experimental Evaluation of a Serious Game for Teaching Software Process Modeling

    ERIC Educational Resources Information Center

    Chaves, Rafael Oliveira; von Wangenheim, Christiane Gresse; Furtado, Julio Cezar Costa; Oliveira, Sandro Ronaldo Bezerra; Santos, Alex; Favero, Eloi Luiz

    2015-01-01

    Software process modeling (SPM) is an important area of software engineering because it provides a basis for managing, automating, and supporting software process improvement (SPI). Teaching SPM is a challenging task, mainly because it lays great emphasis on theory and offers few practical exercises. Furthermore, as yet few teaching approaches…

  10. Evaluation of the Triple Code Model of numerical processing: reviewing past neuroimaging and clinical findings.

    PubMed

    Siemann, Julia; Petermann, Franz

    2018-01-01

    This review reconciles past findings on numerical processing with key assumptions of the predominant model of arithmetic in the literature, the Triple Code Model (TCM). We do so by reporting diverse findings in the literature, ranging from behavioral studies of basic arithmetic operations, through neuroimaging studies of numerical processing, to developmental studies concerned with arithmetic acquisition, with a special focus on developmental dyscalculia (DD). We evaluate whether these studies corroborate the model and discuss possible reasons for contradictory findings. A separate section is dedicated to the transfer of TCM to arithmetic development and to alternative accounts focusing on developmental questions of numerical processing. We conclude with recommendations for future directions of arithmetic research, raising questions that require answers in models of healthy as well as abnormal mathematical development. This review assesses the leading model in the field of arithmetic processing (Triple Code Model) by presenting knowledge from interdisciplinary research. It assesses the observed contradictory findings and integrates the resulting opposing viewpoints. The focus is on the development of arithmetic expertise as well as abnormal mathematical development. The original aspect of this article is that it points to a gap in research on these topics and provides possible solutions for future models. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Combining Mechanistic Approaches for Studying Eco-Hydro-Geomorphic Coupling

    NASA Astrophysics Data System (ADS)

    Francipane, A.; Ivanov, V.; Akutina, Y.; Noto, V.; Istanbullouglu, E.

    2008-12-01

    Vegetation interacts with hydrology and with the geomorphic form and processes of a river basin in profound ways. Despite recent advances in hydrological modeling, the dynamic coupling between these processes is yet to be adequately captured at the basin scale to elucidate key features of process interaction and their role in the organization of vegetation and landscape morphology. In this study, we present a blueprint for integrating a geomorphic component into the physically-based, spatially distributed ecohydrological model tRIBS-VEGGIE, which reproduces essential water and energy processes over the complex topography of a river basin and links them to the basic plant life regulatory processes. We present a preliminary design of the integrated modeling framework in which hillslope and channel erosion processes at the catchment scale will be coupled with vegetation-hydrology dynamics. We evaluate the developed framework by applying the integrated model to Lucky Hills basin, a sub-catchment of the Walnut Gulch Experimental Watershed (Arizona). The evaluation is carried out by comparing sediment yields at the basin outlet, following a detailed verification of the simulated land-surface energy partition, biomass dynamics, and soil moisture states.

  12. DYNAMIC EVALUATION OF REGIONAL AIR QUALITY MODELS: ASSESSING CHANGES TO O3 STEMMING FROM CHANGES IN EMISSIONS AND METEOROLOGY

    EPA Science Inventory

    Regional-scale air quality models are used to estimate the response of air pollutants to potential emission control strategies as part of the decision-making process. Traditionally, the model predicted pollutant concentrations are evaluated for the “base case” to assess a model’s...

  13. Development of state and transition model assumptions used in National Forest Plan revision

    Treesearch

    Eric B. Henderson

    2008-01-01

    State and transition models are being utilized in forest management analysis processes to evaluate assumptions about disturbances and succession. These models assume valid information about seral class successional pathways and timing. The Forest Vegetation Simulator (FVS) was used to evaluate seral class succession assumptions for the Hiawatha National Forest in...

  14. Simulating forage crop production in a northern climate with the Integrated Farm System Model

    USDA-ARS?s Scientific Manuscript database

    Whole-farm simulation models are useful tools for evaluating the effect of management practices and climate variability on the agro-environmental and economic performance of farms. A few process-based farm-scale models have been developed, but none have been evaluated in a northern region with a sho...

  15. Yield model development project implementation plan

    NASA Technical Reports Server (NTRS)

    Ambroziak, R. A.

    1982-01-01

    Tasks remaining to be completed are summarized for the following major project elements: (1) evaluation of crop yield models; (2) crop yield model research and development; (3) data acquisition, processing, and storage; (4) related yield research: defining spectral and/or remote sensing data requirements, developing input for driving and testing crop growth/yield models, and real-time testing of wheat plant process models; and (5) project management and support.

  16. Is the Closet Door Still Closed in 2014? A CIPP Model Program Evaluation of Preservice Diversity Training Regarding LGBT Issues

    ERIC Educational Resources Information Center

    Woodruff, Joseph

    2014-01-01

    The purpose of this program evaluation was to examine the four components of the CIPP evaluation model (Context, Input, Process, and Product evaluations) in the diversity training program conceptualization and design delivered to College of Education K-12 preservice teachers at a large university in the southeastern United States (referred to in…

  17. Food-chain contamination evaluations in ecological risk assessments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Linder, G.

    Food-chain models have become increasingly important within the ecological risk assessment process. This is the case particularly when acute effects are not readily apparent, or the contaminants of concern are not readily detoxified, have a high likelihood for partitioning into lipids, or have specific target organs or tissues that may increase their significance in evaluating their potential adverse effects. An overview of food-chain models -- conceptual, theoretical, and empirical -- will be considered through a series of papers that will focus on their application within the ecological risk assessment process. Whether a food-chain evaluation is being developed to address relatively simple questions related to chronic effects of toxicants on target populations, or whether a more complex food-web model is being developed to address questions related to multiple-trophic level transfers of toxicants, the elements within the food chain contamination evaluation can be generalized to address the mechanisms of toxicant accumulation in individual organisms. This can then be incorporated into more elaborate models that consider these organismal-level processes within the context of a species life-history or community-level responses that may be associated with long-term exposures.
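    The organismal-level accumulation mechanisms described above can be illustrated with a one-compartment uptake/elimination model. This is a generic sketch, not a model from the papers in the overview; the rate constants and water concentration are hypothetical.

```python
# One-compartment bioaccumulation sketch: uptake from water at rate k_uptake,
# first-order elimination at rate k_elim. All parameter values are hypothetical.

def simulate_body_burden(c_water, k_uptake, k_elim, t_end, dt=0.01):
    """Forward-Euler integration of dC/dt = k_uptake * c_water - k_elim * C."""
    c = 0.0
    steps = int(t_end / dt)
    for _ in range(steps):
        c += dt * (k_uptake * c_water - k_elim * c)
    return c

if __name__ == "__main__":
    # The body burden approaches (k_uptake / k_elim) * c_water at steady state,
    # i.e. a bioconcentration factor of 20 times the water concentration here.
    c_ss = simulate_body_burden(c_water=2.0, k_uptake=10.0, k_elim=0.5, t_end=50.0)
    print(round(c_ss, 2))  # → 40.0
```

    Chaining this steady-state body burden into the diet of the next trophic level is how such organismal models are extended to the multiple-trophic-level transfers the overview mentions.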

  18. Process Evaluation for Improving K12 Program Effectiveness: Case Study of a National Institutes of Health Building Interdisciplinary Research Careers in Women's Health Research Career Development Program.

    PubMed

    Raymond, Nancy C; Wyman, Jean F; Dighe, Satlaj; Harwood, Eileen M; Hang, Mikow

    2018-06-01

    Process evaluation is an important tool in quality improvement efforts. This article illustrates how a systematic and continuous evaluation process can be used to improve the quality of faculty career development programs by using the University of Minnesota's Building Interdisciplinary Research Careers in Women's Health (BIRCWH) K12 program as an exemplar. Data from a rigorous process evaluation incorporating quantitative and qualitative measurements were analyzed and reviewed by the BIRCWH program leadership on a regular basis. Examples are provided of how this evaluation model and processes were used to improve many aspects of the program, thereby improving scholar, mentor, and advisory committee members' satisfaction and scholar outcomes. A rigorous evaluation plan can increase the effectiveness and impact of a research career development plan.

  19. Modelling biological Cr(VI) reduction in aquifer microcosm column systems.

    PubMed

    Molokwane, Pulane E; Chirwa, Evans M N

    2013-01-01

    Several chrome processing facilities in South Africa release hexavalent chromium (Cr(VI)) into groundwater resources. Pump-and-treat remediation processes have been implemented at some of the sites but have not been successful in reducing contamination levels. The current study is aimed at developing an environmentally friendly, cost-effective and self-sustained biological method to curb the spread of chromium at the contaminated sites. An indigenous Cr(VI)-reducing mixed culture of bacteria was demonstrated to reduce high levels of Cr(VI) in laboratory samples. The effect of Cr(VI) on the removal rate was evaluated at concentrations up to 400 mg/L. Following the detailed evaluation of fundamental processes for biological Cr(VI) reduction, a predictive model for Cr(VI) breakthrough through aquifer microcosm reactors was developed. The batch reaction rate followed non-competitive kinetics with a Cr(VI) inhibition threshold concentration of approximately 99 mg/L. This study evaluates the application of the kinetic parameters determined in the batch reactors to the continuous flow process. The model, developed from advection-reaction kinetics in porous media, best fitted the effluent Cr(VI) concentration. The model was also used to elucidate the logistic nature of biomass growth in the reactor systems.
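    A minimal sketch of batch removal under a non-competitive inhibition rate law of the general kind the abstract describes. The 99 mg/L inhibition constant echoes the reported threshold; the exact rate-law form, the other parameters, and the initial concentrations are illustrative assumptions, not the study's fitted values.

```python
# Batch Cr(VI) removal under a non-competitive inhibition rate law:
#   dC/dt = -k_max * C / (K_s + C) * 1 / (1 + C / K_i)
# K_i = 99 mg/L follows the abstract; k_max, K_s, and the initial
# concentrations are invented for illustration.

def cr6_batch(c0, k_max, k_s, k_i, t_end, dt=0.01):
    """Forward-Euler integration of the inhibited Monod-type rate law."""
    c = c0
    steps = int(t_end / dt)
    for _ in range(steps):
        rate = k_max * c / (k_s + c) / (1.0 + c / k_i)
        c = max(c - dt * rate, 0.0)
    return c

if __name__ == "__main__":
    low = cr6_batch(c0=50.0, k_max=20.0, k_s=30.0, k_i=99.0, t_end=5.0)
    high = cr6_batch(c0=200.0, k_max=20.0, k_s=30.0, k_i=99.0, t_end=5.0)
    # Above the inhibition threshold, fractional removal is slower.
    print((50.0 - low) / 50.0 > (200.0 - high) / 200.0)  # → True
```

    The inhibition term makes the specific removal rate fall off as Cr(VI) rises, which is what distinguishes this form from plain Monod kinetics.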

  20. Pain management: a review of organisation models with integrated processes for the management of pain in adult cancer patients.

    PubMed

    Brink-Huis, Anita; van Achterberg, Theo; Schoonhoven, Lisette

    2008-08-01

    This paper reports a review of the literature conducted to identify organisation models in cancer pain management that contain integrated care processes and describe their effectiveness. Pain is experienced by 30-50% of cancer patients receiving treatment and by 70-90% of those with advanced disease. Efforts to improve pain management have been made through the development and dissemination of clinical guidelines. Early improvements in pain management were focussed on just one or two single processes such as pain assessment and patient education. Little is known about organisational models with multiple integrated processes throughout the course of the disease trajectory and concerning all stages of the care process. Systematic review. The review involved a systematic search of the literature published between 1986 and 2006. Subject-specific keywords describing patients, disease, pain management interventions, and integrated care processes relevant to this review were selected using the database thesauri. Institutional models, clinical pathways and consultation services are three alternative models for the integration of care processes in cancer pain management. A clinical pathway is a comprehensive institutionalisation model, whereas a pain consultation service is a 'stand-alone' model that can be integrated in a clinical pathway. Positive patient and process outcomes have been described for all three models, although the level of evidence is generally low. Evaluation of the quality of pain management must involve standardised measurements of both patient and process outcomes. We recommend the development of policies for referrals to a pain consultation service. These policies can be integrated within a clinical pathway. To evaluate the effectiveness of pain management models standardised outcome measures are needed.

  1. Self Evaluation of Organizations.

    ERIC Educational Resources Information Center

    Pooley, Richard C.

    Evaluation within human service organizations is defined in terms of accepted evaluation criteria, with reasonable expectations shown and structured into a model of systematic evaluation practice. The evaluation criteria of program effort, performance, adequacy, efficiency and process mechanisms are discussed, along with measurement information…

  2. Evaluation of a Theory of Instructional Sequences for Physics Instruction

    ERIC Educational Resources Information Center

    Wackermann, Rainer; Trendel, Georg; Fischer, Hans E.

    2010-01-01

    The background of the study is the theory of "basis models of teaching and learning", a comprehensive set of models of learning processes which includes, for example, learning through experience and problem-solving. The combined use of different models of learning processes has not been fully investigated and it is frequently not clear…

  3. Systematic iteration between model and methodology: A proposed approach to evaluating unintended consequences.

    PubMed

    Morell, Jonathan A

    2018-06-01

    This article argues that evaluators could better deal with unintended consequences if they improved their methods of systematically and methodically combining empirical data collection and model building over the life cycle of an evaluation. This process would be helpful because it can increase the timespan from when the need for a change in methodology is first suspected to the time when the new element of the methodology is operational. The article begins with an explanation of why logic models are so important in evaluation, and why the utility of models is limited if they are not continually revised based on empirical evaluation data. It sets the argument within the larger context of the value and limitations of models in the scientific enterprise. Following will be a discussion of various issues that are relevant to model development and revision. What is the relevance of complex system behavior for understanding predictable and unpredictable unintended consequences, and the methods needed to deal with them? How might understanding of unintended consequences be improved with an appreciation of generic patterns of change that are independent of any particular program or change effort? What are the social and organizational dynamics that make it rational and adaptive to design programs around single-outcome solutions to multi-dimensional problems? How does cognitive bias affect our ability to identify likely program outcomes? Why is it hard to discern change as a result of programs being embedded in multi-component, continually fluctuating, settings? The last part of the paper outlines a process for actualizing systematic iteration between model and methodology, and concludes with a set of research questions that speak to how the model/data process can be made efficient and effective. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Standard model light-by-light scattering in SANC: Analytic and numeric evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bardin, D. Yu., E-mail: bardin@nu.jinr.ru; Kalinovskaya, L. V., E-mail: kalinov@nu.jinr.ru; Uglov, E. D., E-mail: corner@nu.jinr.r

    2010-11-15

    The implementation of the Standard Model process γγ → γγ through a fermion and boson loop into the framework of the SANC system, and the additional precomputation modules used for calculation of massive box diagrams, are described. The computation of this process takes into account the nonzero mass of loop particles. The covariant and helicity amplitudes for this process, some particular cases of the D₀ and C₀ Passarino-Veltman functions, and also numerical results of the corresponding SANC module evaluation are presented. Whenever possible, the results are compared with those existing in the literature.

  5. Developing, delivering and evaluating primary mental health care: the co-production of a new complex intervention.

    PubMed

    Reeve, Joanne; Cooper, Lucy; Harrington, Sean; Rosbottom, Peter; Watkins, Jane

    2016-09-06

    Health services face the challenges created by complex problems, and so need complex intervention solutions. However, they also experience ongoing difficulties in translating findings from research in this area into quality improvement changes on the ground. BounceBack was a service development innovation project which sought to examine this issue through the implementation and evaluation, in a primary care setting, of a novel complex intervention. The project was a collaboration between a local mental health charity, an academic unit, and GP practices. The aim was to translate the charity's model of care into practice-based evidence describing delivery and impact. Normalisation Process Theory (NPT) was used to support the implementation of the new model of primary mental health care into six GP practices. An integrated process evaluation examined the process and impact of care. Implementation quickly stalled as we identified problems with the described model of care when applied in a changing and variable primary care context. The team therefore switched to using the NPT framework to support the systematic identification and modification of the components of the complex intervention, including the core components that made it distinct (the consultation approach) and the variable components (organisational issues) that made it work in practice. The extra work significantly reduced the time available for outcome evaluation. However, findings demonstrated moderately successful implementation of the model and a suggestion of hypothesised changes in outcomes. The BounceBack project demonstrates the development of a complex intervention from practice. It highlights the use of Normalisation Process Theory to support development, and not just implementation, of a complex intervention; and describes the use of the research process in the generation of practice-based evidence. 
Implications for future translational complex intervention research supporting practice change through scholarship are discussed.

  6. Chemical kinetics and photochemical data for use in stratospheric modeling: Evaluation number 11

    NASA Technical Reports Server (NTRS)

    Demore, W. B.; Sander, S. P.; Golden, D. M.; Hampson, R. F.; Kurylo, M. J.; Howard, C. J.; Ravishankara, A. R.; Kolb, C. E.; Molina, M. J.

    1994-01-01

    This is the eleventh in a series of evaluated sets of rate constants and photochemical cross sections compiled by the NASA Panel for Data Evaluation. The primary application of the data is in the modeling of stratospheric processes, with special emphasis on the ozone layer and its possible perturbation by anthropogenic and natural phenomena.

  7. Chemical Kinetics and Photochemical Data for Use in Stratospheric Modeling. Evaluation No. 12

    NASA Technical Reports Server (NTRS)

    DeMore, W. B.; Sander, S. P.; Golden, D. M.; Hampson, R. F.; Kurylo, M. J.; Howard, C. J.; Ravishankara, A. R.; Kolb, C. E.; Molina, M. J.

    1997-01-01

    This is the twelfth in a series of evaluated sets of rate constants and photochemical cross sections compiled by the NASA Panel for Data Evaluation. The primary application of the data is in the modeling of stratospheric processes, with special emphasis on the ozone layer and its possible perturbation by anthropogenic and natural phenomena.

  8. Models and Procedures for Improving the Planning, Management, and Evaluation of Cooperative Education Programs. Final Report. Volume I.

    ERIC Educational Resources Information Center

    Blaschke, Charles L.; Steiger, JoAnn

    This report of a project to design a set of training guidelines for planning, managing, and evaluating cooperative education programs describes briefly the procedures used in developing the guidelines and model; discusses the various components of the planning, management, and evaluation process; and presents guidelines and criteria for designing…

  9. What Counts is not Falling … but Landing: Strategic Analysis: An Adapted Model for Implementation Evaluation.

    PubMed

    Brousselle, Astrid

    2004-04-01

    Implementation evaluations, also called process evaluations, involve studying the development of programmes, and identifying and understanding their strengths and weaknesses. Undertaking an implementation evaluation offers insights into evaluation objectives, but does not help the researcher develop a research strategy. During the implementation analysis of the UNAIDS drug access initiative in Chile, the strategic analysis model developed by Crozier and Friedberg was used. However, a major incompatibility was noted between the procedure put forward by Crozier and Friedberg and the specific characteristics of the programme being evaluated. In this article, an adapted strategic analysis model for programme evaluation is proposed.

  10. Evaluation methodologies for an advanced information processing system

    NASA Technical Reports Server (NTRS)

    Schabowsky, R. S., Jr.; Gai, E.; Walker, B. K.; Lala, J. H.; Motyka, P.

    1984-01-01

    The system concept and requirements for an Advanced Information Processing System (AIPS) are briefly described, but the emphasis of this paper is on the evaluation methodologies being developed and utilized in the AIPS program. The evaluation tasks include hardware reliability, maintainability and availability, software reliability, performance, and performability. Hardware RMA and software reliability are addressed with Markov modeling techniques. The performance analysis for AIPS is based on queueing theory. Performability is a measure of merit which combines system reliability and performance measures. The probability laws of the performance measures are obtained from the Markov reliability models. Scalar functions of this law such as the mean and variance provide measures of merit in the AIPS performability evaluations.
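    The Markov reliability modeling mentioned above can be sketched with a toy two-state chain: an operational state that fails at a constant rate, for which reliability decays exponentially. The failure rate below is an arbitrary illustration, not an AIPS figure.

```python
# Two-state Markov reliability model: operational -> failed at constant
# rate lam, so R(t) = P(operational at t) = exp(-lam * t).

import math

def reliability(lam, t, steps=100_000):
    """Forward-Euler integration of dP/dt = -lam * P from P(0) = 1."""
    p_op = 1.0
    dt = t / steps
    for _ in range(steps):
        p_op += dt * (-lam * p_op)
    return p_op

if __name__ == "__main__":
    lam = 1e-4  # failures per hour (illustrative)
    r = reliability(lam, t=10_000.0)
    print(round(r, 4), round(math.exp(-1.0), 4))  # → 0.3679 0.3679
```

    Performability measures of the kind described then combine such state probabilities with a performance level attached to each state, e.g. a reward rate per state, whose mean and variance follow from the same probability law.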

  11. Evaluation of ceramics for stator application: Gas turbine engine report

    NASA Technical Reports Server (NTRS)

    Trela, W.; Havstad, P. H.

    1978-01-01

    Current ceramic materials, component fabrication processes, and reliability prediction capability for ceramic stators in an automotive gas turbine engine environment are assessed. Simulated engine duty cycle testing of stators conducted at temperatures up to 1093 °C is discussed. Materials evaluated are SiC and Si3N4 fabricated from two near-net-shape processes: slip casting and injection molding. Stators for durability cycle evaluation, test specimens for material property characterization, and a reliability prediction model for stator performance in the simulated engine environment are considered. The status and description of the work performed for the reliability prediction modeling, stator fabrication, material property characterization, and ceramic stator evaluation efforts are reported.

  12. An Approach to the Evaluation of Hypermedia.

    ERIC Educational Resources Information Center

    Knussen, Christina; And Others

    1991-01-01

    Discusses methods that may be applied to the evaluation of hypermedia, based on six models described by Lawton. Techniques described include observation, self-report measures, interviews, automated measures, psychometric tests, checklists and criterion-based techniques, process models, Experimentally Measuring Usability (EMU), and a naturalistic…

  13. Multi-trait, multi-breed conception rate evaluations

    USDA-ARS?s Scientific Manuscript database

    Heifer and cow conception rates (HCR and CCR) were evaluated with multi-trait, multi-breed models including crossbred cows instead of the previous single-trait, single-breed models. Fertility traits benefit from multi-trait processing because of high genetic correlations and many missing observation...

  14. Faculty Performance Management System: The Faculty Development/Evaluation System at Beaufort Technical College, 1986-1987. Revised.

    ERIC Educational Resources Information Center

    Tobias, Earole; And Others

    Designed for faculty members at Beaufort Technical College (BTC) in South Carolina, this handbook describes the college's faculty evaluation process and procedures. The first sections of the handbook explain the rationale and method for the faculty evaluation process, state the purposes and objectives of the system, and offer a model which breaks…

  15. Quantitative Acoustic Model for Adhesion Evaluation of Pmma/silicon Film Structures

    NASA Astrophysics Data System (ADS)

    Ju, H. S.; Tittmann, B. R.

    2010-02-01

    A poly(methyl methacrylate) (PMMA) film on a silicon substrate is a key structure for photolithography in semiconductor manufacturing processes. This paper presents the potential of scanning acoustic microscopy (SAM) for nondestructive evaluation of the PMMA/Si film structure, whose adhesion failure is commonly encountered during fabrication and post-fabrication processes. A physical model employing a partial discontinuity in displacement is developed for rigorously quantitative evaluation of interfacial weakness. The model is incorporated into the matrix method for surface acoustic wave (SAW) propagation in anisotropic media. Our results show that the predicted variations in SAW velocity and reflectance are sensitive to the adhesion condition. Experimental results from the v(z) technique and SAW velocity reconstruction verify the prediction.

  16. Whittling Down the Wait Time: Exploring Models to Minimize the Delay from Initial Concern to Diagnosis and Treatment of Autism Spectrum Disorder.

    PubMed

    Gordon-Lipkin, Eliza; Foster, Jessica; Peacock, Georgina

    2016-10-01

    The path from initial concerns to diagnosis of autism spectrum disorder (ASD) can be long and complicated. The traditional model for evaluation and diagnosis of ASD often involves long wait-lists and evaluations, resulting in a roughly 2-year gap between the earliest signs of ASD and the mean age of diagnosis. Multiple factors contribute to this diagnostic bottleneck, including time-consuming evaluations, cost of care, lack of providers, and the limited comfort of primary care providers in diagnosing autism. This article explores innovative clinical models that have been implemented to address this bottleneck, as well as future directions and opportunities. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. The performance evaluation model of mining project founded on the weight optimization entropy value method

    NASA Astrophysics Data System (ADS)

    Mao, Chao; Chen, Shou

    2017-01-01

    Because the traditional entropy value method still yields low accuracy when evaluating the performance of mining projects, a performance evaluation model for mining projects founded on an improved entropy value method is proposed. First, a new weight assignment model is established, combining compatibility matrix analysis from the analytic hierarchy process (AHP) with the entropy value method: once the compatibility matrix analysis meets the consistency requirement, any difference between the subjective and objective weights is resolved by moderately adjusting their respective proportions; a fuzzy evaluation matrix is then built on this basis for the performance evaluation. Simulation experiments show that, compared with the traditional entropy value method and compatibility matrix analysis, the proposed performance evaluation model based on the improved entropy value method achieves higher assessment accuracy.
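    The entropy value method at the core of such a model can be sketched as follows. The decision matrix, the subjective (AHP-style) weights, and the 50/50 blend ratio are illustrative assumptions; the paper's actual rule for reconciling the two weight sets is not reproduced here.

```python
# Entropy value method: objective weights derived from the dispersion of
# each criterion column, blended with subjective (AHP-style) weights.

import math

def entropy_weights(matrix):
    """Objective weights: column entropies mapped to 1 - e, then normalized."""
    n, m = len(matrix), len(matrix[0])
    col_sums = [sum(row[j] for row in matrix) for j in range(m)]
    k = 1.0 / math.log(n)
    divergences = []
    for j in range(m):
        e = 0.0
        for row in matrix:
            p = row[j] / col_sums[j]
            if p > 0.0:
                e -= k * p * math.log(p)
        divergences.append(1.0 - e)  # more dispersion -> lower entropy -> more weight
    total = sum(divergences)
    return [d / total for d in divergences]

def blend(objective, subjective, alpha=0.5):
    """Convex combination of objective and subjective weights, renormalized."""
    w = [alpha * o + (1.0 - alpha) * s for o, s in zip(objective, subjective)]
    total = sum(w)
    return [x / total for x in w]

if __name__ == "__main__":
    scores = [[8.0, 120.0, 3.0],   # one row per project,
              [6.0, 90.0, 7.0],    # one column per criterion
              [9.0, 100.0, 5.0]]
    w_obj = entropy_weights(scores)
    w = blend(w_obj, subjective=[0.5, 0.3, 0.2])
    print([round(x, 3) for x in w])
```

    The blended weights would then scale the fuzzy evaluation matrix in the scoring step.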

  18. Improving the Impact and Implementation of Disaster Education: Programs for Children Through Theory-Based Evaluation.

    PubMed

    Johnson, Victoria A; Ronan, Kevin R; Johnston, David M; Peace, Robin

    2016-11-01

    A main weakness in the evaluation of disaster education programs for children is evaluators' propensity to judge program effectiveness based on changes in children's knowledge. Few studies have articulated an explicit program theory of how children's education would achieve desired outcomes and impacts related to disaster risk reduction in households and communities. This article describes the advantages of constructing program theory models for the purpose of evaluating disaster education programs for children. Following a review of some potential frameworks for program theory development, including the logic model, the program theory matrix, and the stage step model, the article provides working examples of these frameworks. The first example is the development of a program theory matrix used in an evaluation of ShakeOut, an earthquake drill practiced in two Washington State school districts. The model illustrates a theory of action; specifically, the effectiveness of school earthquake drills in preventing injuries and deaths during disasters. The second example is the development of a stage step model used for a process evaluation of What's the Plan Stan?, a voluntary teaching resource distributed to all New Zealand primary schools for curricular integration of disaster education. The model illustrates a theory of use; specifically, expanding the reach of disaster education for children through increased promotion of the resource. The process of developing the program theory models for the purpose of evaluation planning is discussed, as well as the advantages and shortcomings of the theory-based approaches. © 2015 Society for Risk Analysis.

  19. Improving predictions of large scale soil carbon dynamics: Integration of fine-scale hydrological and biogeochemical processes, scaling, and benchmarking

    NASA Astrophysics Data System (ADS)

    Riley, W. J.; Dwivedi, D.; Ghimire, B.; Hoffman, F. M.; Pau, G. S. H.; Randerson, J. T.; Shen, C.; Tang, J.; Zhu, Q.

    2015-12-01

    Numerical model representations of decadal- to centennial-scale soil-carbon dynamics are a dominant cause of uncertainty in climate change predictions. Recent attempts by some Earth System Model (ESM) teams to integrate previously unrepresented soil processes (e.g., explicit microbial processes, abiotic interactions with mineral surfaces, vertical transport), poor performance of many ESM land models against large-scale and experimental manipulation observations, and complexities associated with spatial heterogeneity highlight the nascent nature of our community's ability to accurately predict future soil carbon dynamics. I will present recent work from our group to develop a modeling framework to integrate pore-, column-, watershed-, and global-scale soil process representations into an ESM (ACME), and apply the International Land Model Benchmarking (ILAMB) package for evaluation. At the column scale and across a wide range of sites, observed depth-resolved carbon stocks and their 14C derived turnover times can be explained by a model with explicit representation of two microbial populations, a simple representation of mineralogy, and vertical transport. Integrating soil and plant dynamics requires a 'process-scaling' approach, since all aspects of the multi-nutrient system cannot be explicitly resolved at ESM scales. I will show that one approach, the Equilibrium Chemistry Approximation, improves predictions of forest nitrogen and phosphorus experimental manipulations and leads to very different global soil carbon predictions. Translating model representations from the site- to ESM-scale requires a spatial scaling approach that either explicitly resolves the relevant processes, or more practically, accounts for fine-resolution dynamics at coarser scales. To that end, I will present recent watershed-scale modeling work that applies reduced order model methods to accurately scale fine-resolution soil carbon dynamics to coarse-resolution simulations. 
Finally, we contend that creating believable soil carbon predictions requires a robust, transparent, and community-available benchmarking framework. I will present an ILAMB evaluation of several of the above-mentioned approaches in ACME, and attempt to motivate community adoption of this evaluation approach.
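    The turnover-time arguments above rest on the basic relation turnover time = stock / input flux. A minimal two-pool, first-order sketch of that relation, with invented pool structure and rate constants (this is an illustration, not the ACME or ILAMB implementation):

```python
# Minimal two-pool, first-order soil-carbon model (illustrative only).
# At steady state, each pool's turnover time is simply stock / outflux.

def steady_state_stocks(input_flux, fast_k, slow_k, transfer_frac):
    """Steady-state carbon stocks for a fast pool feeding a slow pool.

    input_flux     -- litter input to the fast pool (kgC/m2/yr)
    fast_k, slow_k -- first-order decay rates (1/yr)
    transfer_frac  -- fraction of fast-pool decay routed to the slow pool
    """
    fast = input_flux / fast_k                     # dC_f/dt = I - k_f*C_f = 0
    slow = transfer_frac * fast_k * fast / slow_k  # dC_s/dt = a*k_f*C_f - k_s*C_s = 0
    return fast, slow

# Invented parameter values for illustration.
fast, slow = steady_state_stocks(input_flux=0.3, fast_k=0.5,
                                 slow_k=0.01, transfer_frac=0.3)
turnover = (fast + slow) / 0.3   # bulk turnover time = total stock / input flux
```

Even this toy version shows why depth-resolved stocks and 14C turnover times constrain the model: the slow pool dominates total stock while contributing little to surface flux.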

  20. Evaluating treatment process redesign by applying the EFQM Excellence Model.

    PubMed

    Nabitz, Udo; Schramade, Mark; Schippers, Gerard

    2006-10-01

    To evaluate a treatment process redesign programme implementing evidence-based treatment as part of total quality management in a Dutch addiction treatment centre. Quality management was monitored over a period of more than 10 years in an addiction treatment centre with 550 professionals. Changes were evaluated by comparing the scores on the nine criteria of the European Foundation for Quality Management (EFQM) Excellence Model before and after a major redesign of treatment processes and ISO certification. In the course of 10 years, most intake, care, and cure processes were reorganized, the support processes were restructured and ISO certified, 29 evidence-based treatment protocols were developed and implemented, and patient follow-up measurement was established to make clinical outcomes transparent. Comparing the situation before and after the changes shows that client satisfaction scores are stable, that the evaluations by personnel and society are inconsistent, and that clinical, production, and financial outcomes are positive. The overall EFQM assessment by external assessors in 2004 shows much higher scores on the nine criteria than the assessment in 1994. Evidence-based treatment can successfully be implemented in addiction treatment centres through treatment process redesign as part of a total quality management strategy, but not all results are positive.

  1. Using Active Learning for Speeding up Calibration in Simulation Models.

    PubMed

    Cevik, Mucahit; Ergun, Mehmet Ali; Stout, Natasha K; Trentham-Dietz, Amy; Craven, Mark; Alagoz, Oguzhan

    2016-07-01

    Most cancer simulation models include unobservable parameters that determine disease onset and tumor growth. These parameters play an important role in matching key outcomes such as cancer incidence and mortality, and their values are typically estimated via a lengthy calibration procedure, which involves evaluating a large number of combinations of parameter values via simulation. The objective of this study is to demonstrate how machine learning approaches can be used to accelerate the calibration process by reducing the number of parameter combinations that are actually evaluated. Active learning is a popular machine learning method that enables a learning algorithm such as artificial neural networks to interactively choose which parameter combinations to evaluate. We developed an active learning algorithm to expedite the calibration process. Our algorithm determines the parameter combinations that are more likely to produce desired outputs and therefore reduces the number of simulation runs performed during calibration. We demonstrate our method using the previously developed University of Wisconsin breast cancer simulation model (UWBCS). In a recent study, calibration of the UWBCS required the evaluation of 378 000 input parameter combinations to build a race-specific model, and only 69 of these combinations produced results that closely matched observed data. By using the active learning algorithm in conjunction with standard calibration methods, we identify all 69 parameter combinations by evaluating only 5620 of the 378 000 combinations. Machine learning methods hold potential in guiding model developers in the selection of more promising parameter combinations and hence speeding up the calibration process. Applying our machine learning algorithm to one model shows that evaluating only 1.49% of all parameter combinations would be sufficient for the calibration. © The Author(s) 2015.
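    The core loop of the method can be sketched at toy scale: an inexpensive surrogate scores unevaluated parameter combinations, and the expensive simulator is run only on the most promising ones. Everything below (the target region, grid, surrogate, and budget) is invented for illustration; it is not the UWBCS calibration and uses a nearest-match surrogate rather than the paper's neural networks:

```python
import random

random.seed(0)

# Hypothetical stand-in for one expensive simulation run: a parameter
# combination "matches observed data" if it lies near a target point.
def run_simulation(theta):
    return (theta[0] - 0.62) ** 2 + (theta[1] - 0.31) ** 2 < 0.01

# Candidate grid of parameter combinations (the calibration search space).
grid = [(i / 25, j / 25) for i in range(26) for j in range(26)]

# Seed the surrogate with a small random batch of true evaluations.
evaluated = {t: run_simulation(t) for t in random.sample(grid, 80)}

def promise(theta):
    """1-nearest-match surrogate: candidates near a known match score higher."""
    matches = [t for t, ok in evaluated.items() if ok]
    if not matches:
        return random.random()  # nothing learned yet; explore at random
    return -min((theta[0] - t[0]) ** 2 + (theta[1] - t[1]) ** 2 for t in matches)

# Active-learning loop: spend the simulation budget only on promising points.
BUDGET = 120
for _ in range(BUDGET):
    pool = [t for t in grid if t not in evaluated]
    best = max(pool, key=promise)
    evaluated[best] = run_simulation(best)

found = sum(ok for ok in evaluated.values())
total_matches = sum(run_simulation(t) for t in grid)
```

The loop concentrates its 120 follow-up runs around discovered matches instead of sweeping all 676 grid points, which is the same economy the paper reports at full scale (5620 of 378 000 combinations).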

  2. Using Active Learning for Speeding up Calibration in Simulation Models

    PubMed Central

    Cevik, Mucahit; Ali Ergun, Mehmet; Stout, Natasha K.; Trentham-Dietz, Amy; Craven, Mark; Alagoz, Oguzhan

    2015-01-01

    Background Most cancer simulation models include unobservable parameters that determine disease onset and tumor growth. These parameters play an important role in matching key outcomes such as cancer incidence and mortality, and their values are typically estimated via a lengthy calibration procedure, which involves evaluating a large number of combinations of parameter values via simulation. The objective of this study is to demonstrate how machine learning approaches can be used to accelerate the calibration process by reducing the number of parameter combinations that are actually evaluated. Methods Active learning is a popular machine learning method that enables a learning algorithm such as artificial neural networks to interactively choose which parameter combinations to evaluate. We developed an active learning algorithm to expedite the calibration process. Our algorithm determines the parameter combinations that are more likely to produce desired outputs, thereby reducing the number of simulation runs performed during calibration. We demonstrate our method using the previously developed University of Wisconsin Breast Cancer Simulation Model (UWBCS). Results In a recent study, calibration of the UWBCS required the evaluation of 378,000 input parameter combinations to build a race-specific model, and only 69 of these combinations produced results that closely matched observed data. By using the active learning algorithm in conjunction with standard calibration methods, we identify all 69 parameter combinations by evaluating only 5620 of the 378,000 combinations. Conclusion Machine learning methods hold potential in guiding model developers in the selection of more promising parameter combinations and hence speeding up the calibration process. Applying our machine learning algorithm to one model shows that evaluating only 1.49% of all parameter combinations would be sufficient for the calibration. PMID:26471190

  3. Self-assembly kinetics of microscale components: A parametric evaluation

    NASA Astrophysics Data System (ADS)

    Carballo, Jose M.

    The goal of the present work is to develop and evaluate a parametric model of a basic microscale Self-Assembly (SA) interaction that provides scaling predictions of process rates as a function of key process variables. At the microscale, assembly by "grasp and release" is generally challenging. Recent research efforts have proposed adapting nanoscale self-assembly (SA) processes to the microscale. SA offers the potential for reduced equipment cost and increased throughput by harnessing attractive forces (most commonly, capillary) to spontaneously assemble components. However, there are challenges to implementing microscale SA as a commercial process. The existing lack of design tools prevents simple process optimization. Previous efforts have each characterized a specific aspect of the SA process, but the existing microscale SA models do not characterize the inter-component interactions. All existing models have simplified the outcome of SA interactions as an experimentally derived value specific to a particular configuration, instead of evaluating the outcome as a function of component-level parameters (such as speed, geometry, bonding energy, and direction). The present study parameterizes the outcome of interactions and evaluates the effect of key parameters, closing the gap in existing microscale SA models and adding a key piece toward a complete design tool for general microscale SA process modeling. First, this work proposes a simple model for defining the probability of assembly of basic SA interactions. A basic SA interaction is defined as the event where a single part arrives at an assembly site. The model describes the probability of assembly as a function of kinetic energy, binding energy, orientation, and incidence angle for the component and the assembly site. Secondly, an experimental SA system was designed and implemented to create individual SA interactions while controlling process parameters independently.
    SA experiments measured the outcome of SA interactions while studying the independent effects of each parameter. As a first step towards a complete scaling model, experiments were performed to evaluate the effects of part geometry and part travel direction under low kinetic energy conditions. Experimental results show minimal dependence of assembly yield on the incidence angle of the parts, and significant effects induced by changes in part geometry. The results from this work indicate that SA could be modeled as an energy-based process because of the small path-dependence effects. Assembly probability is linearly related to orientation probability; the proportionality constant is based on the area fraction of the sites with an amplification factor, which accounts for the ability of capillary forces to align parts with only very small areas of contact when they have low kinetic energy. These results provide unprecedented insight into SA interactions. The present study is a key step towards completing a basic model of a general SA process. Moreover, the outcome of this work can complement existing SA process models in order to create a complete design tool for microscale SA systems. In addition to SA experiments, Monte Carlo simulations of experimental part-site interactions were conducted. This study confirmed that a major contributor to experimental variation is the stochastic nature of experimental SA interactions and the limited sample size of the experiments. Furthermore, the simulations serve as a tool for defining an optimum sampling strategy to minimize the uncertainty in future SA experiments.
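    The Monte Carlo point about sample size can be sketched directly: if each part-site interaction is a Bernoulli trial, the run-to-run spread of the observed yield shrinks as 1/sqrt(n). The per-interaction probability below is an invented value, not a measured one:

```python
import random
import statistics

random.seed(1)

# Treat each part-site interaction as a Bernoulli trial with an assumed
# per-interaction assembly probability (the 0.35 value is illustrative).
P_ASSEMBLE = 0.35

def run_experiment(n_interactions):
    """Simulate one SA experiment; return the observed assembly yield."""
    hits = sum(random.random() < P_ASSEMBLE for _ in range(n_interactions))
    return hits / n_interactions

# Replicate each experiment size many times and record run-to-run spread.
mean_yield, spread = {}, {}
for n in (20, 200, 2000):
    yields = [run_experiment(n) for _ in range(500)]
    mean_yield[n] = statistics.mean(yields)
    spread[n] = statistics.stdev(yields)
```

The spread at n = 20 interactions is roughly ten times that at n = 2000, which is why a simulation of this kind is useful for choosing a sampling strategy before running physical experiments.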

  4. Modelling evapotranspiration during precipitation deficits: Identifying critical processes in a land surface model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ukkola, Anna M.; Pitman, Andy J.; Decker, Mark

    Surface fluxes from land surface models (LSMs) have traditionally been evaluated against monthly, seasonal or annual mean states. The limited ability of LSMs to reproduce observed evaporative fluxes under water-stressed conditions has been previously noted, but very few studies have systematically evaluated these models during rainfall deficits. We evaluated latent heat fluxes simulated by the Community Atmosphere Biosphere Land Exchange (CABLE) LSM across 20 flux tower sites at sub-annual to inter-annual timescales, in particular focusing on model performance during seasonal-scale rainfall deficits. The importance of key model processes in capturing the latent heat flux was explored by employing alternative representations of hydrology, leaf area index, soil properties and stomatal conductance. We found that the representation of hydrological processes was critical for capturing observed declines in latent heat during rainfall deficits. By contrast, the effects of soil properties, LAI and stomatal conductance were highly site-specific. Whilst the standard model performs reasonably well at annual scales as measured by common metrics, it grossly underestimates latent heat during rainfall deficits. A new version of CABLE, with a more physically consistent representation of hydrology, captures the variation in the latent heat flux during seasonal-scale rainfall deficits better than earlier versions, but remaining biases point to future research needs. Lastly, our results highlight the importance of evaluating LSMs under water-stressed conditions and across multiple plant functional types and climate regimes.
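    The key evaluation idea here, conditioning a flux metric on rainfall-deficit periods rather than averaging over everything, is easy to sketch. The monthly series below are synthetic values invented to illustrate the conditional metric; they are not CABLE or flux-tower output:

```python
# Synthetic monthly rainfall (mm) and latent heat (W/m2) series.
rain     = [90, 80, 10,  5,  8, 85, 95, 12,  7, 100, 90,  6]
obs_le   = [80, 75, 30, 20, 25, 78, 82, 32, 22,  85, 80, 21]
model_le = [78, 74, 45, 40, 42, 77, 80, 46, 41,  84, 79, 40]

def mean_bias(pairs):
    """Mean (model - observed) over a set of months."""
    return sum(m - o for o, m in pairs) / len(pairs)

all_pairs = list(zip(obs_le, model_le))
deficit_pairs = [p for p, r in zip(all_pairs, rain) if r < 20]  # dry months only

overall_bias = mean_bias(all_pairs)      # modest when averaged over all months
deficit_bias = mean_bias(deficit_pairs)  # much larger during rainfall deficits
```

In this toy series the all-months bias looks acceptable while the deficit-only bias is more than double, which is exactly the failure mode that annual-mean evaluation hides.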

  5. Modelling evapotranspiration during precipitation deficits: Identifying critical processes in a land surface model

    DOE PAGES

    Ukkola, Anna M.; Pitman, Andy J.; Decker, Mark; ...

    2016-06-21

    Surface fluxes from land surface models (LSMs) have traditionally been evaluated against monthly, seasonal or annual mean states. The limited ability of LSMs to reproduce observed evaporative fluxes under water-stressed conditions has been previously noted, but very few studies have systematically evaluated these models during rainfall deficits. We evaluated latent heat fluxes simulated by the Community Atmosphere Biosphere Land Exchange (CABLE) LSM across 20 flux tower sites at sub-annual to inter-annual timescales, in particular focusing on model performance during seasonal-scale rainfall deficits. The importance of key model processes in capturing the latent heat flux was explored by employing alternative representations of hydrology, leaf area index, soil properties and stomatal conductance. We found that the representation of hydrological processes was critical for capturing observed declines in latent heat during rainfall deficits. By contrast, the effects of soil properties, LAI and stomatal conductance were highly site-specific. Whilst the standard model performs reasonably well at annual scales as measured by common metrics, it grossly underestimates latent heat during rainfall deficits. A new version of CABLE, with a more physically consistent representation of hydrology, captures the variation in the latent heat flux during seasonal-scale rainfall deficits better than earlier versions, but remaining biases point to future research needs. Lastly, our results highlight the importance of evaluating LSMs under water-stressed conditions and across multiple plant functional types and climate regimes.

  6. Modeling the Hydrologic Processes of a Permeable Pavement System

    EPA Science Inventory

    A permeable pavement system can capture stormwater to reduce runoff volume and flow rate, improve onsite groundwater recharge, and enhance pollutant controls within the site. A new unit process model for evaluating the hydrologic performance of a permeable pavement system has be...

  7. Multiobjective Sensitivity Analysis Of Sediment And Nitrogen Processes With A Watershed Model

    EPA Science Inventory

    This paper presents a computational analysis for evaluating critical non-point-source sediment and nutrient (specifically nitrogen) processes and management actions at the watershed scale. In the analysis, model parameters that bear key uncertainties were presumed to reflect the ...

  8. The heuristic-analytic theory of reasoning: extension and evaluation.

    PubMed

    Evans, Jonathan St B T

    2006-06-01

    An extensively revised heuristic-analytic theory of reasoning is presented incorporating three principles of hypothetical thinking. The theory assumes that reasoning and judgment are facilitated by the formation of epistemic mental models that are generated one at a time (singularity principle) by preconscious heuristic processes that contextualize problems in such a way as to maximize relevance to current goals (relevance principle). Analytic processes evaluate these models but tend to accept them unless there is good reason to reject them (satisficing principle). At a minimum, analytic processing of models is required so as to generate inferences or judgments relevant to the task instructions, but more active intervention may result in modification or replacement of default models generated by the heuristic system. Evidence for this theory is provided by a review of a wide range of literature on thinking and reasoning.

  9. Method Development for Clinical Comprehensive Evaluation of Pediatric Drugs Based on Multi-Criteria Decision Analysis: Application to Inhaled Corticosteroids for Children with Asthma.

    PubMed

    Yu, Yuncui; Jia, Lulu; Meng, Yao; Hu, Lihua; Liu, Yiwei; Nie, Xiaolu; Zhang, Meng; Zhang, Xuan; Han, Sheng; Peng, Xiaoxia; Wang, Xiaoling

    2018-04-01

    Establishing a comprehensive clinical evaluation system is critical in enacting national drug policy and promoting rational drug use. In China, the 'Clinical Comprehensive Evaluation System for Pediatric Drugs' (CCES-P) project, which aims to compare drugs based on clinical efficacy and cost effectiveness to help decision makers, was recently proposed; therefore, a systematic and objective method is required to guide the process. An evidence-based multi-criteria decision analysis model that involved an analytic hierarchy process (AHP) was developed, consisting of nine steps: (1) select the drugs to be reviewed; (2) establish the evaluation criterion system; (3) determine the criterion weight based on the AHP; (4) construct the evidence body for each drug under evaluation; (5) select comparative measures and calculate the original utility score; (6) place a common utility scale and calculate the standardized utility score; (7) calculate the comprehensive utility score; (8) rank the drugs; and (9) perform a sensitivity analysis. The model was applied to the evaluation of three different inhaled corticosteroids (ICSs) used for asthma management in children (a total of 16 drugs with different dosage forms and strengths or different manufacturers). By applying the drug analysis model, the 16 ICSs under review were successfully scored and evaluated. Budesonide suspension for inhalation (drug ID number: 7) ranked the highest, with comprehensive utility score of 80.23, followed by fluticasone propionate inhaled aerosol (drug ID number: 16), with a score of 79.59, and budesonide inhalation powder (drug ID number: 6), with a score of 78.98. In the sensitivity analysis, the ranking of the top five and lowest five drugs remains unchanged, suggesting this model is generally robust. An evidence-based drug evaluation model based on AHP was successfully developed. 
The model incorporates sufficient utility and flexibility for aiding the decision-making process, and can be a useful tool for the CCES-P.
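    Step 3 of the nine-step model (criterion weights from the AHP) reduces to deriving a weight vector from a pairwise comparison matrix. A sketch using the geometric-mean approximation of the principal eigenvector; the criteria and judgments below are invented, not the CCES-P values:

```python
import math

# Pairwise comparison matrix on Saaty's 1-9 scale for three illustrative
# criteria (say, efficacy vs. safety vs. cost); entries are reciprocal:
# A[i][j] states how much more important criterion i is than criterion j.
A = [
    [1.0,   3.0,   5.0],
    [1 / 3, 1.0,   3.0],
    [1 / 5, 1 / 3, 1.0],
]

def ahp_weights(matrix):
    """Geometric-mean approximation of the AHP principal eigenvector."""
    gms = [math.prod(row) ** (1 / len(row)) for row in matrix]
    total = sum(gms)
    return [g / total for g in gms]

w = ahp_weights(A)  # weights sum to 1, ordered by judged importance
```

The resulting weights then multiply each drug's standardized utility scores (steps 5-7) to produce the comprehensive utility score used for ranking.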

  10. Modeling and Advanced Control for Sustainable Process Systems

    EPA Science Inventory

    This book chapter introduces a novel process systems engineering framework that integrates process control with sustainability assessment tools for the simultaneous evaluation and optimization of process operations. The implemented control strategy consists of a biologically-insp...

  11. Evaluating the Facilities Planning, Design, and Construction Department: The Capital Programs Management Audit.

    ERIC Educational Resources Information Center

    Kaiser, Harvey H.; Kirkwood, Dennis M.

    2000-01-01

    Presents a diagnostic model for assessing the state of an institution's capital programs management (CPM) by delineating "work processes" which comprise that function. What capital programs management is, its resources, and its phases and work processes are described, followed by case studies of the CPM Process Model as an assessment tool. (GR)

  12. VARTM Variability and Substantiation

    DOT National Transportation Integrated Search

    2008-06-18

    Presentation overview: establish the fundamental understanding of the various VARTM processes; Flow model is fully developed for SCRIMP, VAP, and CAPRI process; Compaction behavior has been evaluated for all processes; Dry compaction during debulking...

  13. Advanced Method to Estimate Fuel Slosh Simulation Parameters

    NASA Technical Reports Server (NTRS)

    Schlee, Keith; Gangadharan, Sathya; Ristow, James; Sudermann, James; Walker, Charles; Hubert, Carl

    2005-01-01

    The nutation (wobble) of a spinning spacecraft in the presence of energy dissipation is a well-known problem in dynamics and is of particular concern for space missions. The nutation of a spacecraft spinning about its minor axis typically grows exponentially, and the rate of growth is characterized by the Nutation Time Constant (NTC). For launch vehicles using spin-stabilized upper stages, fuel slosh in the spacecraft propellant tanks is usually the primary source of energy dissipation. For analytical prediction of the NTC, this fuel slosh is commonly modeled using simple mechanical analogies such as pendulums or rigid rotors coupled to the spacecraft. Identifying model parameter values which adequately represent the sloshing dynamics is the most important step in obtaining an accurate NTC estimate. Analytic determination of the slosh model parameters has met with mixed success and is made even more difficult by the introduction of propellant management devices and elastomeric diaphragms. By subjecting full-sized fuel tanks with actual flight fuel loads to motion similar to that experienced in flight and measuring the forces experienced by the tanks, these parameters can be determined experimentally. Currently, the identification of the model parameters is a laborious trial-and-error process in which the equations of motion for the mechanical analog are hand-derived, evaluated, and their results compared with the experimental results. The proposed research is an effort to automate the process of identifying the parameters of the slosh model using a MATLAB/SimMechanics-based computer simulation of the experimental setup. Different parameter estimation and optimization approaches are evaluated and compared in order to arrive at a reliable and effective parameter identification process. To evaluate each parameter identification approach, a simple one-degree-of-freedom pendulum experiment is constructed and motion is induced using an electric motor.
By applying the estimation approach to a simple, accurately modeled system, its effectiveness and accuracy can be evaluated. The same experimental setup can then be used with fluid-filled tanks to further evaluate the effectiveness of the process. Ultimately, the proven process can be applied to the full-sized spinning experimental setup to quickly and accurately determine the slosh model parameters for a particular spacecraft mission. Automating the parameter identification process will save time, allow more changes to be made to proposed designs, and lower the cost in the initial design stages.
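    The automated identification loop can be sketched on a linearised 1-DOF pendulum: generate "experimental" data with a known damping ratio, then let an optimizer recover it by minimising the fit error. The dynamics, parameter values, and grid search below are illustrative stand-ins for the MATLAB/SimMechanics workflow:

```python
def simulate(zeta, omega=5.0, theta0=0.2, dt=0.01, steps=200):
    """Semi-implicit Euler integration of
    theta'' + 2*zeta*omega*theta' + omega**2 * theta = 0."""
    theta, vel, history = theta0, 0.0, []
    for _ in range(steps):
        vel += (-2.0 * zeta * omega * vel - omega ** 2 * theta) * dt
        theta += vel * dt
        history.append(theta)
    return history

# Stand-in for measured experimental data: a run with known damping.
TRUE_ZETA = 0.08
measured = simulate(TRUE_ZETA)

def cost(zeta):
    """Sum of squared errors between a candidate simulation and the data."""
    return sum((s - m) ** 2 for s, m in zip(simulate(zeta), measured))

# Coarse grid search over candidate damping ratios; a real workflow would
# hand this cost function to a proper optimizer instead.
candidates = [i / 500 for i in range(1, 100)]
identified = min(candidates, key=cost)
```

Because the same cost function works for any candidate parameter set, swapping the grid search for an optimization routine automates exactly the trial-and-error comparison described above.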

  14. Mechanisms of Family Impact on African American Adolescents’ HIV-Related Behavior

    PubMed Central

    Kogan, Steven M.; Brody, Gene H.; Gibbons, Frederick X.; Chen, Yi-fu; Grange, Christina M.; Simons, Ronald L.; Gerrard, Meg; Cutrona, Carolyn E.

    2010-01-01

    A longitudinal model that tested mediating pathways between protective family processes and HIV-related behavior was evaluated with 195 African American youth. Three waves of data were collected when the youth were 13, 15, and 19 years old. Evidence of mediation and temporal priority was assessed for three constructs: academic engagement, evaluations of prototypical risk-taking peers, and affiliations with risk-promoting peers. Structural equation modeling indicated that protective family processes assessed during early adolescence were associated with HIV-related behavior during emerging adulthood and that academic engagement, evaluations of prototypical risk-taking peers, and affiliations with risk-promoting peers accounted for this association. Evidence of a specific pathway emerged: protective family processes → academic engagement → negative evaluations of prototypical risk-taking peers → affiliations with risk-promoting peers → HIV-related behavior. Academic engagement also was a direct predictor of HIV-related risk behavior. PMID:21643492

  15. The Latent Curve ARMA (P, Q) Panel Model: Longitudinal Data Analysis in Educational Research and Evaluation

    ERIC Educational Resources Information Center

    Sivo, Stephen; Fan, Xitao

    2008-01-01

    Autocorrelated residuals are widely reported as common in longitudinal data. Yet few, if any, researchers modeling growth processes evaluate a priori whether their data have this feature. Sivo, Fan, and Witta (2005) found that not modeling autocorrelated residuals present in longitudinal data severely biases latent curve…

  16. Evaluation of average daily gain predictions by the integrated farm system model for forage-finished beef steers

    USDA-ARS?s Scientific Manuscript database

    Representing the performance of cattle finished on an all forage diet in process-based whole farm system models has presented a challenge. To address this challenge, a study was done to evaluate average daily gain (ADG) predictions of the Integrated Farm System Model (IFSM) for steers consuming all-...

  17. Field evaluations of a forestry version of DRAINMOD-NII model

    Treesearch

    S. Tian; M. A. Youssef; R.W. Skaggs; D.M. Amatya; G.M. Chescheir

    2010-01-01

    This study evaluated the performance of the newly developed forestry version of DRAINMOD-NII model using a long term (21-year) data set collected from an artificially drained loblolly pine (Pinus taeda L.) plantation in eastern North Carolina, U.S.A. The model simulates the main hydrological and biogeochemical processes in drained forested lands. The...

  18. Evaluation of forest snow processes models (SnowMKIP2)

    Treesearch

    Nick Rutter; Richard Essery; John Pomeroy; Nuria Altimir; Kostas Andreadis; Ian Baker; Alan Barr; Paul Bartlett; Aaron Boone; Huiping Deng; Herve Douville; Emanuel Dutra; Kelly Elder; others

    2009-01-01

    Thirty-three snowpack models of varying complexity and purpose were evaluated across a wide range of hydrometeorological and forest canopy conditions at five Northern Hemisphere locations, for up to two winter snow seasons. Modeled estimates of snow water equivalent (SWE) or depth were compared to observations at forest and open sites at each location. Precipitation...

  19. Software Quality Evaluation Models Applicable in Health Information and Communications Technologies. A Review of the Literature.

    PubMed

    Villamor Ordozgoiti, Alberto; Delgado Hito, Pilar; Guix Comellas, Eva María; Fernandez Sanchez, Carlos Manuel; Garcia Hernandez, Milagros; Lluch Canut, Teresa

    2016-01-01

    The use of Information and Communications Technologies (ICT) in healthcare has increased the need to consider quality criteria through standardised processes. The aim of this study was to analyse the software quality evaluation models applicable to healthcare from the perspective of ICT purchasers. Through a systematic literature review with the keywords software, product, quality, evaluation and health, we selected and analysed 20 original research papers published from 2005 to 2016 in health science and technology databases. The results showed four main topics: non-ISO models, software quality evaluation models based on ISO/IEC standards, studies analysing software quality evaluation models, and studies analysing ISO standards for software quality evaluation. The models provide cost-efficiency criteria for specific software and improve use outcomes. The ISO/IEC 25000 standard is shown to be the most suitable for evaluating the quality of ICTs for healthcare use from the perspective of institutional acquisition.

  20. Validating and determining the weight of items used for evaluating clinical governance implementation based on analytic hierarchy process model.

    PubMed

    Hooshmand, Elaheh; Tourani, Sogand; Ravaghi, Hamid; Vafaee Najar, Ali; Meraji, Marziye; Ebrahimipour, Hossein

    2015-04-08

    The purpose of implementing a system such as Clinical Governance (CG) is to integrate, establish and globalize distinct policies in order to improve quality through increasing the professional knowledge and accountability of healthcare professionals toward providing clinical excellence. Since CG is related to change, and change requires money and time, CG implementation has to be focused on priority areas that are in more dire need of change. The purpose of the present study was to validate and determine the significance of items used for evaluating CG implementation. The present study was descriptive-quantitative in method and design. Items used for evaluating CG implementation were first validated by the Delphi method and then compared with one another and ranked based on the Analytic Hierarchy Process (AHP) model. The items that were validated for evaluating CG implementation in Iran include performance evaluation, training and development, personnel motivation, clinical audit, clinical effectiveness, risk management, resource allocation, policies and strategies, external audit, information system management, research and development, CG structure, implementation prerequisites, the management of patients' non-medical needs, complaints and patients' participation in the treatment process. The most important items based on their degree of significance were training and development, performance evaluation, and risk management. The least important items included the management of patients' non-medical needs, patients' participation in the treatment process and research and development. The fundamental requirements of CG implementation included having an effective policy at the national level, avoiding perfectionism, using the expertise and potential of the entire country, and coordinating this model with other models of quality improvement such as accreditation and patient safety. © 2015 by Kerman University of Medical Sciences.
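    A standard companion to AHP ranking is Saaty's consistency check, which guards against contradictory pairwise judgments before the item weights are trusted. A sketch with an invented, perfectly consistent matrix (not the study's data):

```python
# Saaty's random-index values for matrix sizes 1..5.
RANDOM_INDEX = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}

# Illustrative pairwise matrix; it is perfectly consistent because every
# entry equals w_i / w_j for the weight vector (4/7, 2/7, 1/7).
A = [
    [1.0,  2.0, 4.0],
    [0.5,  1.0, 2.0],
    [0.25, 0.5, 1.0],
]
weights = [4 / 7, 2 / 7, 1 / 7]

def consistency_ratio(matrix, w):
    """CR = CI / RI; judgments are conventionally accepted when CR < 0.1."""
    n = len(matrix)
    aw = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam = sum(aw[i] / w[i] for i in range(n)) / n  # estimate of lambda_max
    ci = (lam - n) / (n - 1)                       # consistency index
    ri = RANDOM_INDEX[n]
    return ci / ri if ri else 0.0

cr = consistency_ratio(A, weights)  # ~0 for a perfectly consistent matrix
```

In practice the Delphi-validated judgments would be re-elicited whenever CR exceeds the conventional 0.1 threshold.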

  1. Validation of the NASA Dryden X-31 simulation and evaluation of mechanization techniques

    NASA Technical Reports Server (NTRS)

    Dickes, Edward; Kay, Jacob; Ralston, John

    1994-01-01

    This paper discusses the evaluation of the original Dryden X-31 aerodynamic math model, the processes involved in the justification and creation of the modified database, and time-history comparisons of the model response with flight-test data.

  2. Chemical kinetic and photochemical data for use in stratospheric modelling

    NASA Technical Reports Server (NTRS)

    Demore, W. B.; Stief, L. J.; Kaufman, F.; Golden, D. M.; Hampton, R. F.; Kurylo, M. J.; Margitan, J. J.; Molina, M. J.; Watson, R. T.

    1979-01-01

    An evaluated set of rate constants and photochemical cross sections was compiled for use in modelling stratospheric processes. The data are primarily relevant to the ozone layer and its possible perturbation by anthropogenic activities. The evaluation is current to approximately January 1979.

  3. A Survey of Computer Science Capstone Course Literature

    ERIC Educational Resources Information Center

    Dugan, Robert F., Jr.

    2011-01-01

    In this article, we surveyed literature related to undergraduate computer science capstone courses. The survey was organized around course and project issues. Course issues included: course models, learning theories, course goals, course topics, student evaluation, and course evaluation. Project issues included: software process models, software…

  4. Chemical kinetics and photochemical data for use in stratospheric modeling. Evaluation number 6

    NASA Technical Reports Server (NTRS)

    Demore, W. B.; Molina, M. J.; Watson, R. T.; Golden, D. M.; Hampson, R. F.; Kurylo, M. J.; Howard, C. J.; Ravishankara, A. R.

    1983-01-01

    Evaluated sets of rate constants and photochemical cross sections are presented. The primary application of the data is in the modeling of stratospheric processes, with particular emphasis on the ozone layer and its possible perturbation by anthropogenic and natural phenomena.

  5. A Model for Implementing a Career Education System.

    ERIC Educational Resources Information Center

    Ryan, T. Antoinette

    The model for career education implementation defines three major functions which constitute the essential elements in the implementation process: planning, implementation, and evaluation. Emphasis is placed on the interrelatedness of implementation to both planning and evaluation of career education. The 11 subsystems involved in implementing…

  6. Chemical kinetics and photochemical data for use in stratospheric modeling: Evaluation number 5

    NASA Technical Reports Server (NTRS)

    Demore, W. B.

    1982-01-01

    Evaluated sets of rate constants and photochemical cross sections are compiled. The primary application of the data is in the modeling of stratospheric processes, with emphasis on the ozone layer and its possible perturbation by anthropogenic and natural phenomena.

  7. BUILDING MODEL ANALYSIS APPLICATIONS WITH THE JOINT UNIVERSAL PARAMETER IDENTIFICATION AND EVALUATION OF RELIABILITY (JUPITER) API

    EPA Science Inventory

    The open-source, public domain JUPITER (Joint Universal Parameter IdenTification and Evaluation of Reliability) API (Application Programming Interface) provides conventions and Fortran-90 modules to develop applications (computer programs) for analyzing process models. The input ...

  8. [Simulation and data analysis of stereological modeling based on virtual slices].

    PubMed

    Wang, Hao; Shen, Hong; Bai, Xiao-yan

    2008-05-01

    To establish a computer-assisted stereological model for simulating the process of slice sectioning and to evaluate the relationship between the section surface and the estimated three-dimensional structure. The model was designed mathematically and implemented as Win32 software based on MFC, using Microsoft Visual Studio as the IDE, to simulate repeated sectioning and to analyze the data derived from the model. The linearity of the model's fit was evaluated by comparison with the traditional formula. The Win32 software based on this algorithm allowed random sectioning of particles distributed randomly in an ideal virtual cube. The stereological parameters showed very high pass rates (>94.5% and 92%) in homogeneity and independence tests. The density, shape, and size data of the sections were tested and conformed to a normal distribution. The output of the model and that from the image analysis system showed statistical correlation and consistency. The algorithm described can be used for evaluating the stereological parameters of the structure of tissue slices.
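
    The random sectioning described above can be illustrated with a minimal Monte Carlo sketch (this is not the authors' software: the fixed sphere radius, uniform plane offset, and sample size are illustrative assumptions):

```python
import math
import random

def section_radii(sphere_radius=1.0, n=10000, seed=42):
    """Cut spheres of a fixed radius with random parallel planes and
    return the resulting circular section radii (Wicksell-type
    sectioning: plane offset from the centre uniform in [0, R])."""
    rng = random.Random(seed)
    radii = []
    for _ in range(n):
        d = rng.uniform(0.0, sphere_radius)  # plane distance from centre
        radii.append(math.sqrt(sphere_radius ** 2 - d ** 2))
    return radii

radii = section_radii()
mean_r = sum(radii) / len(radii)
# For R = 1 the section-radius mean converges to pi/4 (about 0.785),
# illustrating why observed profiles underestimate true particle size.
```

    Comparing such simulated section-radius distributions against an analytical expectation is one way to check the linearity of a stereological fit, as the abstract describes.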

  9. A risk-based approach to management of leachables utilizing statistical analysis of extractables.

    PubMed

    Stults, Cheryl L M; Mikl, Jaromir; Whelehan, Oliver; Morrical, Bradley; Duffield, William; Nagao, Lee M

    2015-04-01

    To incorporate quality by design concepts into the management of leachables, an emphasis is often put on understanding the extractable profile for the materials of construction for manufacturing disposables, container-closure, or delivery systems. Component manufacturing processes may also impact the extractable profile. An approach was developed to (1) identify critical components that may be sources of leachables, (2) enable an understanding of manufacturing process factors that affect extractable profiles, (3) determine if quantitative models can be developed that predict the effect of those key factors, and (4) evaluate the practical impact of the key factors on the product. A risk evaluation for an inhalation product identified injection molding as a key process. Designed experiments were performed to evaluate the impact of molding process parameters on the extractable profile from an ABS inhaler component. Statistical analysis of the resulting GC chromatographic profiles identified processing factors that were correlated with peak levels in the extractable profiles. The combination of statistically significant molding process parameters was different for different types of extractable compounds. ANOVA models were used to obtain optimal process settings and predict extractable levels for a selected number of compounds. The proposed paradigm may be applied to evaluate the impact of material composition and processing parameters on extractable profiles and utilized to manage product leachables early in the development process and throughout the product lifecycle.
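
    A one-way ANOVA of the kind used to relate a molding process parameter to an extractable peak level can be sketched from scratch (the factor levels and peak areas below are fabricated for illustration; the study's designed experiments and models are not reproduced here):

```python
import statistics

def one_way_anova_f(*groups):
    """Classic one-way ANOVA F statistic: between-group mean square
    divided by within-group mean square."""
    all_vals = [x for g in groups for x in g]
    grand_mean = statistics.fmean(all_vals)
    k = len(groups)                      # number of factor levels
    n = len(all_vals)                    # total observations
    ss_between = sum(len(g) * (statistics.fmean(g) - grand_mean) ** 2
                     for g in groups)
    ss_within = sum((x - statistics.fmean(g)) ** 2
                    for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical GC peak areas at three molding-temperature levels
low = [10.1, 9.8, 10.3, 10.0]
mid = [11.0, 11.4, 10.9, 11.2]
high = [12.5, 12.1, 12.8, 12.4]

f_stat = one_way_anova_f(low, mid, high)
# A large F relative to the critical value suggests the molding
# parameter is correlated with the extractable peak level.
```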

  10. Integrating Human Factors into Crew Exploration Vehicle (CEV) Design

    NASA Technical Reports Server (NTRS)

    Whitmore, Mihriban; Holden, Kritina; Baggerman, Susan; Campbell, Paul

    2007-01-01

    The purpose of this design process is to apply Human Engineering (HE) requirements and guidelines to hardware/software and to provide HE design, analysis and evaluation of crew interfaces. The topics include: 1) Background/Purpose; 2) HE Activities; 3) CASE STUDY: Net Habitable Volume (NHV) Study; 4) CASE STUDY: Human Modeling Approach; 5) CASE STUDY: Human Modeling Results; 6) CASE STUDY: Human Modeling Conclusions; 7) CASE STUDY: Human-in-the-Loop Evaluation Approach; 8) CASE STUDY: Unsuited Evaluation Results; 9) CASE STUDY: Suited Evaluation Results; 10) CASE STUDY: Human-in-the-Loop Evaluation Conclusions; 11) Near-Term Plan; and 12) In Conclusion

  11. Regulatory ozone modeling: status, directions, and research needs.

    PubMed Central

    Georgopoulos, P G

    1995-01-01

    The Clean Air Act Amendments (CAAA) of 1990 have established selected comprehensive, three-dimensional, Photochemical Air Quality Simulation Models (PAQSMs) as the required regulatory tools for analyzing the urban and regional problem of high ambient ozone levels across the United States. These models are currently applied to study and establish strategies for meeting the National Ambient Air Quality Standard (NAAQS) for ozone in nonattainment areas; State Implementation Plans (SIPs) resulting from these efforts must be submitted to the U.S. Environmental Protection Agency (U.S. EPA) in November 1994. The following presentation provides an overview and discussion of the regulatory ozone modeling process and its implications. First, the PAQSM-based ozone attainment demonstration process is summarized in the framework of the 1994 SIPs. Then, following a brief overview of the representation of physical and chemical processes in PAQSMs, the essential attributes of standard modeling systems currently in regulatory use are presented in a nonmathematical, self-contained format, intended to provide a basic understanding of both model capabilities and limitations. The types of air quality, emission, and meteorological data needed for applying and evaluating PAQSMs are discussed, as well as the sources, availability, and limitations of existing databases. The issue of evaluating a model's performance in order to accept it as a tool for policy making is discussed, and various methodologies for implementing this objective are summarized. Selected interim results from diagnostic analyses, which are performed as a component of the regulatory ozone modeling process for the Philadelphia-New Jersey region, are also presented to provide some specific examples related to the general issues discussed in this work. 
Finally, research needs related to a) the evaluation and refinement of regulatory ozone modeling, b) the characterization of uncertainty in photochemical modeling, and c) the improvement of the model-based ozone-attainment demonstration process are presented to identify future directions in this area. PMID:7614934

  12. EVALUATING THE REGIONAL PREDICTIVE CAPACITY OF A PROCESS-BASED MERCURY EXPOSURE MODEL (R-MCM) FOR LAKES ACROSS VERMONT AND NEW HAMPSHIRE, USA

    EPA Science Inventory

    Regulatory agencies are confronted with a daunting task of developing fish consumption advisories for a large number of lakes and rivers with little resources. A feasible mechanism to develop region-wide fish advisories is by using a process-based mathematical model. One model of...

  13. Evaluating crown fire rate of spread predictions from physics-based models

    Treesearch

    C. M. Hoffman; J. Ziegler; J. Canfield; R. R. Linn; W. Mell; C. H. Sieg; F. Pimont

    2015-01-01

    Modeling the behavior of crown fires is challenging due to the complex set of coupled processes that drive the characteristics of a spreading wildfire and the large range of spatial and temporal scales over which these processes occur. Detailed physics-based modeling approaches such as FIRETEC and the Wildland Urban Interface Fire Dynamics Simulator (WFDS) simulate...

  14. Pursuing the method of multiple working hypotheses to understand differences in process-based snow models

    NASA Astrophysics Data System (ADS)

    Clark, Martyn; Essery, Richard

    2017-04-01

    When faced with the complex and interdisciplinary challenge of building process-based land models, different modelers make different decisions at different points in the model development process. These modeling decisions are generally based on several considerations, including fidelity (e.g., which approaches faithfully simulate observed processes), complexity (e.g., which processes should be represented explicitly), practicality (e.g., what is the computational cost of the model simulations; are there sufficient resources to implement the desired modeling concepts), and data availability (e.g., is there sufficient data to force and evaluate models). Consequently, the research community, comprising modelers of diverse background, experience, and modeling philosophy, has amassed a wide range of models, which differ in almost every aspect of their conceptualization and implementation. Model comparison studies have been undertaken to explore model differences, but have not been able to meaningfully attribute inter-model differences in predictive ability to individual model components because there are often too many structural and implementation differences among the models considered. As a consequence, model comparison studies to date have provided limited insight into the causes of differences in model behavior, and model development has often relied on the inspiration and experience of individual modelers rather than on a systematic analysis of model shortcomings. This presentation will summarize the use of "multiple-hypothesis" modeling frameworks to understand differences in process-based snow models. Multiple-hypothesis frameworks define a master modeling template and include a wide variety of process parameterizations and spatial configurations that are used in existing models. 
Such frameworks provide the capability to decompose complex models into the individual decisions that are made as part of model development, and evaluate each decision in isolation. It is hence possible to attribute differences in system-scale model predictions to individual modeling decisions, providing scope to mimic the behavior of existing models, understand why models differ, characterize model uncertainty, and identify productive pathways to model improvement. Results will be presented applying multiple hypothesis frameworks to snow model comparison projects, including PILPS, SnowMIP, and the upcoming ESM-SnowMIP project.

  15. Enhanced human service transportation models : joint demonstration. Phase 1, System planning and design process evaluation : baseline analysis

    DOT National Transportation Integrated Search

    2007-11-13

    This document presents the findings from the baseline phase of the evaluation of the process being used by eight sites to develop a design for a Travel Management Coordination Center (TMCC) for improved coordination of human service transportation wi...

  16. Evaluating Tenured Teachers: A Practical Approach.

    ERIC Educational Resources Information Center

    DePasquale, Daniel, Jr.

    1990-01-01

    Teachers with higher order needs benefit from expressing their creativity and exercising valued skills. The evaluation process should encourage experienced teachers to grow professionally and move toward self-actualization. The suggested evaluation model includes an evaluation conference, a choice of evaluation method, a planning conference, an…

  17. Numerical Modeling of River Ice Processes on the Lower Nelson River

    NASA Astrophysics Data System (ADS)

    Malenchak, Jarrod Joseph

    Water resource infrastructure in cold regions of the world can be significantly impacted by the existence of river ice. Major engineering concerns related to river ice include ice jam flooding, the design and operation of hydropower facilities and other hydraulic structures, water supplies, as well as ecological, environmental, and morphological effects. The use of numerical simulation models has been identified as one of the most efficient means by which river ice processes can be studied and the effects of river ice evaluated. The continued advancement of these simulation models will help to develop new theories and evaluate potential mitigation alternatives for these ice issues. In this thesis, a literature review of existing river ice numerical models, of anchor ice formation and modeling studies, and of aufeis formation and modeling studies is conducted. A high-level summary of the two-dimensional CRISSP numerical model is presented, as well as the developed freeze-up model, with a focus specifically on the anchor ice and aufeis growth processes. This model includes development in the detailed heat transfer calculations, an improved surface ice mass exchange model which includes the rapids entrainment process, and an improved dry bed treatment model along with the expanded anchor ice and aufeis growth model. The developed sub-models are tested in an ideal channel setting as a form of model confirmation. A case study of significant anchor ice and aufeis growth on the Nelson River in northern Manitoba, Canada, will be the primary field test case for the anchor ice and aufeis model. A second case study on the same river will be used to evaluate the surface ice components of the model in a field setting. The results from these case studies will be used to highlight the capabilities and deficiencies of the numerical model and to identify areas of further research and model development.

  18. Analysis and modeling of wafer-level process variability in 28 nm FD-SOI using split C-V measurements

    NASA Astrophysics Data System (ADS)

    Pradeep, Krishna; Poiroux, Thierry; Scheer, Patrick; Juge, André; Gouget, Gilles; Ghibaudo, Gérard

    2018-07-01

    This work details the analysis of wafer-level global process variability in 28 nm FD-SOI using split C-V measurements. The proposed approach initially evaluates the native on-wafer process variability using efficient extraction methods on split C-V measurements. The on-wafer threshold voltage (VT) variability is first studied and modeled using a simple analytical model. Then, a statistical model based on the Leti-UTSOI compact model is proposed to describe the total C-V variability under different bias conditions. This statistical model is finally used to study the contribution of each process parameter to the total C-V variability.

  19. Model Identification of Integrated ARMA Processes

    ERIC Educational Resources Information Center

    Stadnytska, Tetiana; Braun, Simone; Werner, Joachim

    2008-01-01

    This article evaluates the Smallest Canonical Correlation Method (SCAN) and the Extended Sample Autocorrelation Function (ESACF), automated methods for the Autoregressive Integrated Moving-Average (ARIMA) model selection commonly available in current versions of SAS for Windows, as identification tools for integrated processes. SCAN and ESACF can…

  20. Exploring Learning-Oriented Assessment Processes

    ERIC Educational Resources Information Center

    Carless, David

    2015-01-01

    This paper proposes a model of learning-oriented assessment to inform assessment theory and practice. The model focuses on three interrelated processes: the assessment tasks which students undertake; students' development of self-evaluative capacities; and student engagement with feedback. These three strands are explored through the analysis of…

  1. Benchmark Simulation Model No 2: finalisation of plant layout and default control strategy.

    PubMed

    Nopens, I; Benedetti, L; Jeppsson, U; Pons, M-N; Alex, J; Copp, J B; Gernaey, K V; Rosen, C; Steyer, J-P; Vanrolleghem, P A

    2010-01-01

    The COST/IWA Benchmark Simulation Model No 1 (BSM1) has been available for almost a decade. Its primary purpose has been to create a platform for control strategy benchmarking of activated sludge processes. The fact that the research work related to the benchmark simulation models has resulted in more than 300 publications worldwide demonstrates the interest in and need for such tools within the research community. Recent efforts within the IWA Task Group on "Benchmarking of control strategies for WWTPs" have focused on an extension of the benchmark simulation model. This extension aims at facilitating control strategy development and performance evaluation at a plant-wide level and, consequently, includes both pretreatment of wastewater as well as the processes describing sludge treatment. The motivation for the extension is the increasing interest in and need to operate and control wastewater treatment systems not only at an individual process level but also on a plant-wide basis. To facilitate the changes, the evaluation period has been extended to one year. A prolonged evaluation period allows for long-term control strategies to be assessed and enables the use of control handles that cannot be evaluated in a realistic fashion in the one-week BSM1 evaluation period. In this paper, the finalised plant layout is summarised and, as was done for BSM1, a default control strategy is proposed. A demonstration of how BSM2 can be used to evaluate control strategies is also given.

  2. Evaluation of a Multiple Mediator Model of the Relationship between Core Self-Evaluations and Job Satisfaction in Employed Individuals with Disabilities

    ERIC Educational Resources Information Center

    Smedema, Susan Miller; Kesselmayer, Rachel Friefeld; Peterson, Lauren

    2018-01-01

    Purpose: To test a mediation model of the relationship between core self-evaluations (CSE) and job satisfaction in employed individuals with disabilities. Method: A quantitative descriptive design using Hayes's (2012) PROCESS macro for SPSS and multiple regression analysis. Two-hundred fifty-nine employed persons with disabilities were recruited…

  3. Evaluating Computer-Based Assessment in a Risk-Based Model

    ERIC Educational Resources Information Center

    Zakrzewski, Stan; Steven, Christine; Ricketts, Chris

    2009-01-01

    There are three purposes for evaluation: evaluation for action to aid the decision making process, evaluation for understanding to further enhance enlightenment and evaluation for control to ensure compliance to standards. This article argues that the primary function of evaluation in the "Catherine Wheel" computer-based assessment (CBA)…

  4. Options as information: rational reversals of evaluation and preference.

    PubMed

    Sher, Shlomi; McKenzie, Craig R M

    2014-06-01

    This article develops a rational analysis of an important class of apparent preference reversals: joint-separate reversals traditionally explained by the evaluability hypothesis. The "options-as-information" model considers a hypothetical rational actor with limited knowledge about the market distribution of a stimulus attribute. The actor's evaluations are formed via a two-stage process: an inferential stage in which beliefs are updated on the basis of the sample of options received, followed by an assessment stage in which options are evaluated in light of these updated beliefs. This process generates joint-separate reversals in standard experimental designs. The normative model explains why the evaluability hypothesis works when it does, identifies boundary conditions for the hypothesis, and clarifies some common misconceptions about these effects. In particular, it implies that joint-separate reversals are not irrational; in fact, they are not preference reversals. However, in expanded designs where more than two options are jointly evaluated, the model predicts that genuine (and rational) preference reversals will sometimes emerge. Results of three experiments suggest an excellent fit between the rational actor model and the judgments of human actors in joint-separate experiments. PsycINFO Database Record (c) 2014 APA, all rights reserved.
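
    A toy numerical sketch of that two-stage process (the prior mean, prior weight, and attribute values below are illustrative assumptions, not the paper's parameterization):

```python
def evaluate_options(options, prior_mean=50.0, prior_weight=1.0):
    """Two-stage toy evaluation: (1) inferential stage -- update the
    believed market mean from the sample of options received;
    (2) assessment stage -- score each option against that belief."""
    n = len(options)
    posterior_mean = (prior_weight * prior_mean + sum(options)) / (prior_weight + n)
    return [x - posterior_mean for x in options]

# Evaluated separately, an attribute value of 70 beats the inferred
# market mean under a weak prior of 50...
alone = evaluate_options([70.0])
# ...but evaluated jointly next to 95, the same option falls below the
# updated market mean -- an apparent joint-separate reversal produced
# by rational belief updating.
joint = evaluate_options([70.0, 95.0])
```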

  5. Evaluating Vertical Moisture Structure of the Madden-Julian Oscillation in Contemporary GCMs

    NASA Astrophysics Data System (ADS)

    Guan, B.; Jiang, X.; Waliser, D. E.

    2013-12-01

    The Madden-Julian Oscillation (MJO) remains a major challenge in our understanding and modeling of tropical convection and circulation. Many models have trouble realistically simulating key characteristics of the MJO, such as its strength, period, and eastward propagation. For models that do simulate aspects of the MJO, it remains to be understood which parameters and processes are most critical in determining the quality of the simulations. This study focuses on the vertical structure of moisture in MJO simulations, with the aim of identifying and understanding the relationship between MJO simulation quality and key moisture-related parameters. A series of 20-year simulations conducted by 26 GCMs are analyzed, including four that are coupled to ocean models and two that have a two-dimensional cloud-resolving model embedded (i.e., superparameterized). TRMM precipitation and the ERA-Interim reanalysis are used to evaluate the model simulations. MJO simulation quality is evaluated based on pattern correlations of lead/lag regressions of precipitation, a measure of the model representation of the eastward-propagating MJO convection. Models with the strongest and weakest MJOs (top and bottom quartiles) are compared in terms of differences in moisture content, moisture convergence, moistening rate, and moist static energy. It is found that models with the strongest MJOs better represent the observed vertical tilt of moisture. The relative importance of convection, advection, the boundary layer, and large-scale convection/precipitation is discussed in terms of their contributions to the moistening process. The results highlight the overall importance of vertical moisture structure in MJO simulations. The work contributes to the climatological component of the joint WCRP-WWRP/THORPEX YOTC MJO Task Force and the GEWEX Atmosphere System Study (GASS) global model evaluation project focused on the vertical structure and diabatic processes of the MJO.
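
    The pattern-correlation metric used to rank simulations can be sketched in a few lines (a generic centred pattern correlation; the fields below are tiny made-up grids, not TRMM or ERA-Interim data):

```python
import math

def pattern_correlation(field_a, field_b):
    """Centred pattern correlation between two gridded fields:
    the Pearson correlation of the flattened anomaly values."""
    a = [v for row in field_a for v in row]
    b = [v for row in field_b for v in row]
    mean_a = sum(a) / len(a)
    mean_b = sum(b) / len(b)
    da = [v - mean_a for v in a]       # anomalies of field A
    db = [v - mean_b for v in b]       # anomalies of field B
    num = sum(x * y for x, y in zip(da, db))
    den = math.sqrt(sum(x * x for x in da)) * math.sqrt(sum(y * y for y in db))
    return num / den

# A field correlates perfectly with itself and anti-correlates with
# its negation; model ranking would compare simulated regression maps
# against the observed one with this score.
obs = [[1.0, 2.0], [3.0, 4.0]]
neg = [[-1.0, -2.0], [-3.0, -4.0]]
```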

  6. FLBEIA : A simulation model to conduct Bio-Economic evaluation of fisheries management strategies

    NASA Astrophysics Data System (ADS)

    Garcia, Dorleta; Sánchez, Sonia; Prellezo, Raúl; Urtizberea, Agurtzane; Andrés, Marga

    Fishery systems are complex systems that need to be managed in order to ensure a sustainable and efficient exploitation of marine resources. Traditionally, fisheries management has relied on biological models. However, in recent years the focus on mathematical models which incorporate economic and social aspects has increased. Here, we present FLBEIA, a flexible software tool to conduct bio-economic evaluation of fisheries management strategies. The model is multi-stock, multi-fleet, stochastic, and seasonal. The fishery system is described as a sum of processes, which are internally assembled in a predetermined way. There are several functions available to describe the dynamics of each process, and new functions can be added to satisfy specific requirements.

  7. Process Engineering with the Evolutionary Spiral Process Model. Version 01.00.06

    DTIC Science & Technology

    1994-01-01

    program. Process Definition and Modeling Guidebook (SPC-92041-CMC): provides methods for defining and documenting processes so they can be analyzed, modified...and Program Evaluation and Review Technique (PERT) support the activity of developing a project schedule. A variety of automated tools, such as...keep the organization from becoming disoriented during the improvement program (Curtis, Kellner, and Over 1992). Analyzing and documenting how

  8. Fuzzy Evaluating Customer Satisfaction of Jet Fuel Companies

    NASA Astrophysics Data System (ADS)

    Cheng, Haiying; Fang, Guoyi

    Based on the market characteristics of jet fuel companies, the paper proposes an evaluation index system for jet fuel company customer satisfaction across five dimensions: time, business, security, fee, and service. A multi-level fuzzy evaluation model combining the analytic hierarchy process (AHP) and fuzzy evaluation approaches is then given. Finally, a customer satisfaction evaluation of one jet fuel company is studied as a case; the evaluation results reflect the perceptions of the company's customers, showing that the fuzzy evaluation model is effective and efficient.
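
    A minimal sketch of such a two-step model, combining AHP weighting with a sum-product fuzzy composition (the pairwise comparisons and membership grades below are invented for illustration; the paper's actual matrices are not reproduced):

```python
def ahp_weights(pairwise):
    """Approximate AHP priority weights: normalise each column of the
    pairwise-comparison matrix, then average across each row."""
    n = len(pairwise)
    col_sums = [sum(pairwise[i][j] for i in range(n)) for j in range(n)]
    norm = [[pairwise[i][j] / col_sums[j] for j in range(n)] for i in range(n)]
    return [sum(row) / n for row in norm]

def fuzzy_evaluate(weights, membership):
    """Weighted fuzzy composition B = W * R (sum-product operator)."""
    m = len(membership[0])
    return [sum(w * row[j] for w, row in zip(weights, membership))
            for j in range(m)]

# Hypothetical 3-criterion comparison (e.g. time vs. security vs. service)
pairwise = [[1.0, 3.0, 5.0],
            [1.0 / 3.0, 1.0, 3.0],
            [1.0 / 5.0, 1.0 / 3.0, 1.0]]
w = ahp_weights(pairwise)

# Membership grades of each criterion over {poor, fair, good}
R = [[0.1, 0.3, 0.6],
     [0.2, 0.5, 0.3],
     [0.3, 0.4, 0.3]]
b = fuzzy_evaluate(w, R)   # overall satisfaction grade distribution
```

    The largest component of `b` indicates the dominant satisfaction grade; a consistency check on the pairwise matrix would precede this in a full AHP application.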

  9. Modeling and evaluation of the oil-spill emergency response capability based on linguistic variables.

    PubMed

    Kang, Jian; Zhang, Jixin; Bai, Yongqiang

    2016-12-15

    An evaluation of the oil-spill emergency response capability (OS-ERC) currently in place in modern marine management is required to prevent pollution and loss accidents. The objective of this paper is to develop a novel OS-ERC evaluation model, the importance of which stems from the current lack of integrated approaches for interpreting, ranking and assessing OS-ERC performance factors. In the first part of this paper, the factors influencing OS-ERC are analyzed and classified to generate a global evaluation index system. Then, a semantic tree is adopted to illustrate linguistic variables in the evaluation process, followed by the application of a combination of Fuzzy Cognitive Maps (FCM) and the Analytic Hierarchy Process (AHP) to construct and calculate the weight distribution. Finally, considering that the OS-ERC evaluation process is a complex system, a fuzzy comprehensive evaluation (FCE) is employed to calculate the OS-ERC level. The entire evaluation framework obtains the overall level of OS-ERC, and also highlights the potential major issues concerning OS-ERC, as well as expert opinions for improving the feasibility of oil-spill accident prevention and protection. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Modeling phosphorus removal and recovery from anaerobic digester supernatant through struvite crystallization in a fluidized bed reactor.

    PubMed

    Rahaman, Md Saifur; Mavinic, Donald S; Meikleham, Alexandra; Ellis, Naoko

    2014-03-15

    The cost associated with the disposal of phosphate-rich sludge, the stringent regulations to limit phosphate discharge into aquatic environments, and resource shortages resulting from limited phosphorus rock reserves, have diverted attention to phosphorus recovery in the form of struvite (MAP: MgNH4PO4·6H2O) crystals, which can essentially be used as a slow release fertilizer. Fluidized-bed crystallization is one of the most efficient unit processes used in struvite crystallization from wastewater. In this study, a comprehensive mathematical model, incorporating solution thermodynamics, struvite precipitation kinetics and reactor hydrodynamics, was developed to illustrate phosphorus depletion through struvite crystal growth in a continuous, fluidized-bed crystallizer. A thermodynamic equilibrium model for struvite precipitation was linked to the fluidized-bed reactor model. While the equilibrium model provided information on supersaturation generation, the reactor model captured the dynamic behavior of the crystal growth processes, as well as the effect of the reactor hydrodynamics on the overall process performance. The model was then used for performance evaluation of the reactor, in terms of removal efficiencies of struvite constituent species (Mg, NH4 and PO4), and the average product crystal sizes. The model also determined the variation of species concentration of struvite within the crystal bed height. The species concentrations at two extreme ends (inlet and outlet) were used to evaluate the reactor performance. The model predictions provided a reasonably good fit with the experimental results for PO4-P, NH4-N and Mg removals. Predicted average crystal sizes also matched fairly well with the experimental observations. Therefore, this model can be used as a tool for performance evaluation and process optimization of struvite crystallization in a fluidized-bed reactor. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.

  11. Air Pollution Data for Model Evaluation and Application

    EPA Science Inventory

    One objective of designing an air pollution monitoring network is to obtain data for evaluating air quality models that are used in the air quality management process and scientific discovery. A common use is to relate emissions to air quality, including assessing ...

  12. Support of surgical process modeling by using adaptable software user interfaces

    NASA Astrophysics Data System (ADS)

    Neumuth, T.; Kaschek, B.; Czygan, M.; Goldstein, D.; Strauß, G.; Meixensberger, J.; Burgert, O.

    2010-03-01

    Surgical Process Modeling (SPM) is a powerful method for acquiring data about the evolution of surgical procedures. Surgical Process Models are used in a variety of use cases including evaluation studies, requirements analysis and procedure optimization, surgical education, and workflow management scheme design. This work proposes the use of adaptive, situation-aware user interfaces for observation support software for SPM. We developed a method to support the observer's modeling work by using an ontological knowledge base, which drives the graphical user interface and restricts the terminology search space depending on the current situation. The evaluation study shows that the observer's workload was decreased significantly by using adaptive user interfaces. Fifty-four SPM observation protocols were analyzed using the NASA Task Load Index, showing that the adaptive user interface significantly reduces the observer's workload in the criteria of effort, mental demand, and temporal demand, helping the observer concentrate on the essential task of modeling the surgical process.

  13. Performance analysis of different tuning rules for an isothermal CSTR using integrated EPC and SPC

    NASA Astrophysics Data System (ADS)

    Roslan, A. H.; Karim, S. F. Abd; Hamzah, N.

    2018-03-01

    This paper demonstrates the integration of Engineering Process Control (EPC) and Statistical Process Control (SPC) for the control of product concentration in an isothermal CSTR. The objectives of this study are to evaluate the performance of the Ziegler-Nichols (Z-N), Direct Synthesis (DS), and Internal Model Control (IMC) tuning methods and to determine the most effective method for this process. The simulation model was obtained from past literature and reconstructed in SIMULINK (MATLAB) to evaluate the process response. Additionally, the process stability, capability, and normality were analyzed using Process Capability Sixpack reports in Minitab. Based on the results, DS displays the best response, having the smallest rise time, settling time, overshoot, undershoot, Integral Time Absolute Error (ITAE), and Integral Square Error (ISE). Based on the statistical analysis, DS also emerges as the best tuning method, exhibiting the highest process stability and capability.
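
    The ISE and ITAE indices used to compare the tuning methods can be sketched as discrete approximations of their defining integrals (the error signal below is a made-up exponential decay, not the CSTR response):

```python
import math

def ise(errors, dt):
    """Integral of Squared Error, rectangle rule: sum(e^2) * dt."""
    return sum(e * e for e in errors) * dt

def itae(errors, dt):
    """Integral of Time-weighted Absolute Error: sum(t * |e|) * dt."""
    return sum((i * dt) * abs(e) for i, e in enumerate(errors)) * dt

# Hypothetical error signal e(t) = exp(-t), sampled at dt = 0.01 s
dt = 0.01
errors = [math.exp(-i * dt) for i in range(1000)]

ise_value = ise(errors, dt)    # analytically -> 1/2 over a long horizon
itae_value = itae(errors, dt)  # analytically -> 1 over a long horizon
```

    ITAE penalizes errors that persist late in the response, which is why it discriminates between tunings with similar initial transients.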

  14. A Biopsychological Model of Anti-drug PSA Processing: Developing Effective Persuasive Messages.

    PubMed

    Hohman, Zachary P; Keene, Justin Robert; Harris, Breanna N; Niedbala, Elizabeth M; Berke, Collin K

    2017-11-01

    For the current study, we developed and tested a biopsychological model to combine research on psychological tension, the Limited Capacity Model of Motivated Mediated Message Processing, and the endocrine system to predict and understand how people process anti-drug PSAs. We predicted that co-presentation of pleasant and unpleasant information, vs. solely pleasant or unpleasant, will trigger evaluative tension about the target behavior in persuasive messages and result in a biological response (increase in cortisol, alpha amylase, and heart rate). In experiment 1, we assessed the impact of co-presentation of pleasant and unpleasant information in persuasive messages on evaluative tension (conceptualized as attitude ambivalence), in experiment 2, we explored the impact of co-presentation on endocrine system responses (salivary cortisol and alpha amylase), and in experiment 3, we assessed the impact of co-presentation on heart rate. Across all experiments, we demonstrated that co-presentation of pleasant and unpleasant information, vs. solely pleasant or unpleasant, in persuasive communications leads to increases in attitude ambivalence, salivary cortisol, salivary alpha amylase, and heart rate. Taken together, the results support the initial paths of our biopsychological model of persuasive message processing and indicate that including both pleasant and unpleasant information in a message impacts the viewer. We predict that increases in evaluative tension and biological responses will aid in memory and cognitive processing of the message. However, future research is needed to test that hypothesis.

  15. QaaS (quality as a service) model for web services using big data technologies

    NASA Astrophysics Data System (ADS)

    Ahmad, Faisal; Sarkar, Anirban

    2017-10-01

    Quality of service (QoS) determines service usability and utility, both of which influence the service selection process. QoS varies from one service provider to another, and each web service has its own methodology for evaluating it. The lack of a transparent QoS evaluation model makes service selection challenging. Moreover, most QoS evaluation processes do not consider historical data, which not only helps in obtaining more accurate QoS values but also supports future prediction, recommendation and knowledge discovery. QoS-driven service selection demands a model where QoS can be provided as a service to end users. This paper proposes a layered QaaS (quality as a service) model, along the same lines as PaaS and software as a service, in which users provide QoS attributes as inputs and the model returns services satisfying the user's QoS expectations. The paper covers the key aspects in this context: selection of data sources, their transformation, evaluation, classification and storage of QoS. It uses server logs as the source for evaluating QoS values, a common methodology for their evaluation, and big data technologies for their transformation and analysis. The paper also establishes that Spark outperforms Pig with respect to evaluating QoS from logs.
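    As a rough illustration of the log-based evaluation step, the snippet below derives two QoS attributes from access-log lines. The log format, service name and regular expression are hypothetical stand-ins, not the paper's actual schema, and plain Python is used in place of Spark or Pig:

    ```python
    import re
    from statistics import mean

    # hypothetical access-log lines: "<timestamp> <service> <status> <latency_ms>"
    LOG_LINES = [
        "2017-10-01T10:00:00 orderSvc 200 120",
        "2017-10-01T10:00:01 orderSvc 500 340",
        "2017-10-01T10:00:02 orderSvc 200 90",
    ]

    PATTERN = re.compile(r"^\S+ \S+ (\d{3}) (\d+)$")

    def qos_from_log(lines):
        """Derive two illustrative QoS attributes: availability and mean latency."""
        records = [PATTERN.match(line).groups() for line in lines]
        ok = [int(latency) for status, latency in records if status.startswith("2")]
        return {"availability": len(ok) / len(records),
                "mean_latency_ms": mean(ok)}

    qos = qos_from_log(LOG_LINES)
    ```

    In a big-data setting the same parse/filter/aggregate pipeline would be expressed as Spark transformations over the full log corpus, with the resulting attributes stored for later selection queries.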

  16. Nurturing Professional Growth: A Peer Review Model for Independent Evaluators

    ERIC Educational Resources Information Center

    Bond, Sally L.; Ray, Marilyn L.

    2006-01-01

    There has been a recent groundswell of support in the American Evaluation Association's Independent Consulting Topical Interest Group (IC TIG) for evaluating evaluators' work just as evaluators evaluate the work of their clients. To facilitate this self-evaluation, the IC TIG elected to create a peer review process that focuses on written…

  17. MIMO model of an interacting series process for Robust MPC via System Identification.

    PubMed

    Wibowo, Tri Chandra S; Saad, Nordin

    2010-07-01

    This paper discusses empirical modeling using system identification techniques, with a focus on an interacting series process. The study is carried out experimentally using a gaseous pilot plant whose dynamics are typical of an interacting series process. Three practical approaches are investigated and their performances evaluated. The models developed are also examined in a real-time implementation of linear model predictive control. The selected model is able to reproduce the main dynamic characteristics of the plant in open loop and produces zero steady-state error in the closed-loop control system. Several issues concerning the identification process and the construction of a MIMO state-space model for an interacting series process are discussed.

  18. Real-time slicing algorithm for Stereolithography (STL) CAD model applied in additive manufacturing industry

    NASA Astrophysics Data System (ADS)

    Adnan, F. A.; Romlay, F. R. M.; Shafiq, M.

    2018-04-01

    With the advent of Industry 4.0, further evaluation of the computational processes applied in additive manufacturing, particularly slicing, is non-trivial. This paper evaluates a real-time algorithm for slicing an STL-formatted computer-aided design (CAD) model. A line-plane intersection equation is applied to perform the slicing procedure at any given height. The algorithm has been found to provide better computational time regardless of the number of facets in the STL model. Its performance is evaluated by comparing computational times for different geometries.
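    The line-plane intersection underlying such slicers is compact. The sketch below is an assumed minimal form, not the authors' implementation: it intersects a single triangular STL facet with the plane Z = z, and degenerate edges lying exactly in the plane are ignored:

    ```python
    def slice_triangle(tri, z):
        """Intersect one triangular STL facet with the plane Z = z.

        Returns the 2D contour points (0 or 2 of them); edges lying exactly
        in the slicing plane are a degenerate case skipped in this sketch.
        """
        points = []
        for (x1, y1, z1), (x2, y2, z2) in zip(tri, tri[1:] + tri[:1]):
            if (z1 - z) * (z2 - z) < 0:        # edge strictly crosses the plane
                t = (z - z1) / (z2 - z1)       # parametric line-plane solution
                points.append((x1 + t * (x2 - x1), y1 + t * (y2 - y1)))
        return points

    # facet spanning z = 0..1; slicing at z = 0.5 yields one contour segment
    tri = [(0.0, 0.0, 0.0), (1.0, 0.0, 1.0), (0.0, 1.0, 1.0)]
    segment = slice_triangle(tri, 0.5)
    ```

    A full slicer repeats this over every facet at each layer height and chains the resulting segments into closed contours; the per-facet cost is constant, which is why computational time scales with facet count rather than geometry.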

  19. Hydrologic consistency as a basis for assessing complexity of monthly water balance models for the continental United States

    NASA Astrophysics Data System (ADS)

    Martinez, Guillermo F.; Gupta, Hoshin V.

    2011-12-01

    Methods to select parsimonious and hydrologically consistent model structures are useful for evaluating dominance of hydrologic processes and representativeness of data. While information criteria (appropriately constrained to obey underlying statistical assumptions) can provide a basis for evaluating appropriate model complexity, it is not sufficient to rely upon the principle of maximum likelihood (ML) alone. We suggest that one must also call upon a "principle of hydrologic consistency," meaning that selected ML structures and parameter estimates must be constrained (as well as possible) to reproduce desired hydrological characteristics of the processes under investigation. This argument is demonstrated in the context of evaluating the suitability of candidate model structures for lumped water balance modeling across the continental United States, using data from 307 snow-free catchments. The models are constrained to satisfy several tests of hydrologic consistency, a flow space transformation is used to ensure better consistency with underlying statistical assumptions, and information criteria are used to evaluate model complexity relative to the data. The results clearly demonstrate that the principle of consistency provides a sensible basis for guiding selection of model structures and indicate strong spatial persistence of certain model structures across the continental United States. Further work to untangle reasons for model structure predominance can help to relate conceptual model structures to physical characteristics of the catchments, facilitating the task of prediction in ungaged basins.
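    As a toy illustration of the complexity-selection step, the snippet below computes the Akaike information criterion for least-squares residuals under the usual i.i.d. Gaussian assumption; the residual arrays are synthetic stand-ins, not the paper's catchment data:

    ```python
    import numpy as np

    def aic(residuals, k):
        """Akaike information criterion for a least-squares fit with k
        parameters, assuming i.i.d. Gaussian errors: AIC = n*ln(RSS/n) + 2k."""
        n = len(residuals)
        rss = float(np.sum(np.square(residuals)))
        return n * np.log(rss / n) + 2 * k

    # synthetic monthly residuals: the 3-parameter structure fits only
    # marginally better than the 2-parameter one, so the penalty dominates
    rng = np.random.default_rng(0)
    res_simple = rng.normal(0.0, 1.0, 120)   # residuals, parsimonious model
    res_complex = res_simple * 0.999         # near-identical fit, one more parameter
    prefer_simple = aic(res_simple, 2) < aic(res_complex, 3)
    ```

    The paper's point is that such criteria alone are insufficient: among structures with acceptable AIC, one must still screen out those that fail hydrologic-consistency tests (e.g., implausible water-balance behavior) before accepting a model.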

  20. Evaluating the Credibility of Transport Processes in the Global Modeling Initiative 3D Model Simulations of Ozone Recovery

    NASA Technical Reports Server (NTRS)

    Strahan, Susan E.; Douglass, Anne R.

    2003-01-01

    The Global Modeling Initiative has integrated two 35-year simulations of an ozone recovery scenario with an offline chemistry and transport model using two different meteorological inputs. Physically based diagnostics, derived from satellite and aircraft data sets, are described and then used to evaluate the realism of temperature and transport processes in the simulations. Processes evaluated include barrier formation in the subtropics and polar regions, and extratropical wave-driven transport. Some diagnostics are especially relevant to simulation of lower stratospheric ozone, but most are applicable to any stratospheric simulation. The temperature evaluation, which is relevant to gas phase chemical reactions, showed that both sets of meteorological fields have near climatological values at all latitudes and seasons at 30 hPa and below. Both simulations showed weakness in upper stratospheric wave driving. The simulation using input from a general circulation model (GMI-GCM) showed a very good residual circulation in the tropics and northern hemisphere. The simulation with input from a data assimilation system (GMI-DAS) performed better in the midlatitudes than at high latitudes. Neither simulation forms a realistic barrier at the vortex edge, leading to uncertainty in the fate of ozone-depleted vortex air. Overall, tracer transport in the offline GMI-GCM has greater fidelity throughout the stratosphere than the GMI-DAS.

  1. Evaluate Yourself. Evaluation: Research-Based Decision Making Series, Number 9304.

    ERIC Educational Resources Information Center

    Fetterman, David M.

    This document considers both self-examination and external evaluation of gifted and talented education programs. Principles of the self-examination process are offered, noting similarities to external evaluation models. Principles of self-evaluation efforts include the importance of maintaining a nonjudgmental orientation, soliciting views from…

  2. Using a Model of Analysts' Judgments to Augment an Item Calibration Process

    ERIC Educational Resources Information Center

    Hauser, Carl; Thum, Yeow Meng; He, Wei; Ma, Lingling

    2015-01-01

    When conducting item reviews, analysts evaluate an array of statistical and graphical information to assess the fit of a field test (FT) item to an item response theory model. The process can be tedious, particularly when the number of human reviews (HR) to be completed is large. Furthermore, such a process leads to decisions that are susceptible…

  3. [Decision modeling for economic evaluation of health technologies].

    PubMed

    de Soárez, Patrícia Coelho; Soares, Marta Oliveira; Novaes, Hillegonda Maria Dutilh

    2014-10-01

    Most economic evaluations that participate in decision-making processes for incorporation and financing of technologies of health systems use decision models to assess the costs and benefits of the compared strategies. Despite the large number of economic evaluations conducted in Brazil, there is a pressing need to conduct an in-depth methodological study of the types of decision models and their applicability in our setting. The objective of this literature review is to contribute to the knowledge and use of decision models in the national context of economic evaluations of health technologies. This article presents general definitions about models and concerns with their use; it describes the main models: decision trees, Markov chains, micro-simulation, simulation of discrete and dynamic events; it discusses the elements involved in the choice of model; and exemplifies the models addressed in national economic evaluation studies of diagnostic and therapeutic preventive technologies and health programs.

  4. Process-oriented Observational Metrics for CMIP6 Climate Model Assessments

    NASA Astrophysics Data System (ADS)

    Jiang, J. H.; Su, H.

    2016-12-01

    Observational metrics based on satellite data have been developed and effectively applied during post-CMIP5 model evaluation and improvement projects. As new physics and parameterizations continue to be included in models for the upcoming CMIP6, it is important to continue objective comparisons between observations and model results. This talk will summarize process-oriented observational metrics and methodologies for constraining climate models with A-Train satellite observations in support of CMIP6 model assessments. We target parameters and processes related to atmospheric clouds and water vapor, which are critically important for Earth's radiative budget, climate feedbacks, and water and energy cycles, and thus for reducing uncertainties in climate models.

  5. Application of structured analysis to a telerobotic system

    NASA Technical Reports Server (NTRS)

    Dashman, Eric; Mclin, David; Harrison, F. W.; Soloway, Donald; Young, Steven

    1990-01-01

    The analysis and evaluation of a multiple arm telerobotic research and demonstration system developed by the NASA Intelligent Systems Research Laboratory (ISRL) is described. Structured analysis techniques were used to develop a detailed requirements model of an existing telerobotic testbed. Performance models generated during this process were used to further evaluate the total system. A commercial CASE tool called Teamwork was used to carry out the structured analysis and development of the functional requirements model. A structured analysis and design process using the ISRL telerobotic system as a model is described. Evaluation of this system focused on the identification of bottlenecks in this implementation. The results demonstrate that the use of structured methods and analysis tools can give useful performance information early in a design cycle. This information can be used to ensure that the proposed system meets its design requirements before it is built.

  6. Post-processing techniques to enhance reliability of assignment algorithm based performance measures : [technical summary].

    DOT National Transportation Integrated Search

    2011-01-01

    Travel demand modeling plays a key role in the transportation system planning and evaluation process. The four-step sequential travel demand model is the most widely used technique in practice. Traffic assignment is the key step in the conventional f...

  7. Community-based health care for indigenous women in Mexico: a qualitative evaluation.

    PubMed

    Pelcastre-Villafuerte, Blanca; Ruiz, Myriam; Meneses, Sergio; Amaya, Claudia; Márquez, Margarita; Taboada, Arianna; Careaga, Katherine

    2014-01-06

    Indigenous women in Mexico represent a vulnerable population in which three kinds of discrimination converge (ethnicity, gender and class), having direct repercussions on health status. The discrimination and inequity in health care settings brought this population to the fore as a priority group for institutional action. The objective of this study was to evaluate the processes and performance of the "Casa de la Mujer Indígena", a community-based project for culturally and linguistically appropriate service delivery for indigenous women. The evaluation summarizes perspectives from diverse stakeholders involved in the implementation of the model, including users, local authorities, and institutional representatives. The study covered five Casas implementation sites located in four Mexican states. A qualitative process evaluation focused on systematically analyzing the Casas project processes and performance was conducted using archival information and semi-structured interviews. Sixty-two interviews were conducted, and a grounded theory approach was applied for data analysis. Few similarities were observed between the proposed model of service delivery and its implementation in diverse locations, signaling discordant operating processes. Evidence gathered from Casas personnel highlighted their ability to detect obstetric emergencies and domestic violence cases, as well as contribute to the empowerment of women in the indigenous communities served by the project. These themes directly translated to increases in the reporting of abuse and referrals for obstetric emergencies. The model's cultural and linguistic competency, and contributions to increased referrals for obstetric emergencies and abuse, are notable successes. The flexibility and community-based nature of the model has allowed it to be adapted to the particularities of diverse indigenous contexts. Local, culturally appropriate implementation has been facilitated by the fact that the Casas have been implemented with local leadership and local women have taken ownership. Users express overall satisfaction with service delivery, while providing constructive feedback for the improvement of existing Casas, as well as more cost-effective implementation of the model in new sites. Integration of users' input obtained from this process evaluation into future planning will undoubtedly increase buy-in. The Casas model is pertinent and viable to other contexts where indigenous women experience disparities in care.

  8. Changing the world: the design and implementation of comprehensive continuous integrated systems of care for individuals with co-occurring disorders.

    PubMed

    Minkoff, Kenneth; Cline, Christie A

    2004-12-01

    This article has described the CCISC model and the process of systemic implementation of co-occurring disorder service enhancements within the context of existing resources. Four projects were described as illustrations of current implementation activities. Clearly, there is a need for improved services for these individuals, and increasing recognition of the need for systemic change models that are effective and efficient. The CCISC model has been recognized by SAMHSA as a consensus best practice for system design, and initial efforts at implementation appear to be promising. The existing toolkit may permit a more formal process of data-driven evaluation of system, program, clinician, and client outcomes, to better measure the effectiveness of this approach. Some projects have begun such formal evaluation processes, but more work is needed, not only with individual projects, but also to develop opportunities for multi-system evaluation as more projects come on line.

  9. Evaluation of Student Models on Current Socio-Scientific Topics Based on System Dynamics

    ERIC Educational Resources Information Center

    Nuhoglu, Hasret

    2014-01-01

    This study aims to 1) enable primary school students to develop models that will help them understand and analyze a system, through a learning process based on system dynamics approach, 2) examine and evaluate students' models related to socio-scientific issues using certain criteria. The research method used is a case study. The study sample…

  10. The Evaluation of Modelling Competences: Difficulties and Potentials for the Learning of the Sciences

    ERIC Educational Resources Information Center

    Lopes, J. Bernardino; Costa, Nilza

    2007-01-01

    Modelling is an inherent process for the construction and use of science concepts that mobilize diverse specific competences. The aims of this work are to put forward a means of evaluating modelling competences that is relevant for physics teaching and science education research and to identify the potentials and constraints in the development of…

  11. Evaluation of Disaster Preparedness Based on Simulation Exercises: A Comparison of Two Models.

    PubMed

    Rüter, Andres; Kurland, Lisa; Gryth, Dan; Murphy, Jason; Rådestad, Monica; Djalali, Ahmadreza

    2016-08-01

    The objective of this study was to highlight 2 models, the Hospital Incident Command System (HICS) and the Disaster Management Indicator model (DiMI), for evaluating the in-hospital management of a disaster situation through simulation exercises. Two disaster exercises, A and B, with similar scenarios were performed. Both exercises were evaluated with regard to actions, processes, and structures. After the exercises, the results were calculated and compared. In exercise A the HICS model indicated that 32% of the required positions for the immediate phase were taken under consideration with an average performance of 70%. For exercise B, the corresponding scores were 42% and 68%, respectively. According to the DiMI model, the results for exercise A were a score of 68% for management processes and 63% for management structure (staff skills). In B the results were 77% and 86%, respectively. Both models demonstrated acceptable results in relation to previous studies. More research in this area is needed to validate which of these methods best evaluates disaster preparedness based on simulation exercises or whether the methods are complementary and should therefore be used together. (Disaster Med Public Health Preparedness. 2016;10:544-548).

  12. Local spatio-temporal analysis in vision systems

    NASA Astrophysics Data System (ADS)

    Geisler, Wilson S.; Bovik, Alan; Cormack, Lawrence; Ghosh, Joydeep; Gildeen, David

    1994-07-01

    The aims of this project are the following: (1) develop a physiologically and psychophysically based model of low-level human visual processing (a key component of which are local frequency coding mechanisms); (2) develop image models and image-processing methods based upon local frequency coding; (3) develop algorithms for performing certain complex visual tasks based upon local frequency representations, (4) develop models of human performance in certain complex tasks based upon our understanding of low-level processing; and (5) develop a computational testbed for implementing, evaluating and visualizing the proposed models and algorithms, using a massively parallel computer. Progress has been substantial on all aims. The highlights include the following: (1) completion of a number of psychophysical and physiological experiments revealing new, systematic and exciting properties of the primate (human and monkey) visual system; (2) further development of image models that can accurately represent the local frequency structure in complex images; (3) near completion in the construction of the Texas Active Vision Testbed; (4) development and testing of several new computer vision algorithms dealing with shape-from-texture, shape-from-stereo, and depth-from-focus; (5) implementation and evaluation of several new models of human visual performance; and (6) evaluation, purchase and installation of a MasPar parallel computer.

  13. The Air Quality Model Evaluation International Initiative ...

    EPA Pesticide Factsheets

    This presentation provides an overview of the Air Quality Model Evaluation International Initiative (AQMEII). It contains a synopsis of the three phases of AQMEII, including objectives, logistics, and timelines. It also provides a number of examples of analyses conducted through AQMEII with a particular focus on past and future analyses of deposition. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.

  14. M4SF-17LL010301071: Thermodynamic Database Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zavarin, M.; Wolery, T. J.

    2017-09-05

    This progress report (Level 4 Milestone Number M4SF-17LL010301071) summarizes research conducted at Lawrence Livermore National Laboratory (LLNL) within the Argillite Disposal R&D Work Package Number M4SF-17LL01030107. The DR Argillite Disposal R&D control account is focused on the evaluation of important processes in the analysis of disposal design concepts and related materials for nuclear fuel disposal in clay-bearing repository media. The objectives of this work package are to develop model tools for evaluating impacts of THMC process on long-term disposal of spent fuel in argillite rocks, and to establish the scientific basis for high thermal limits. This work is contributing to the GDSA model activities to identify gaps, develop process models, provide parameter feeds and support requirements providing the capability for a robust repository performance assessment model by 2020.

  15. Atmospheric Modeling And Sensor Simulation (AMASS) study

    NASA Technical Reports Server (NTRS)

    Parker, K. G.

    1984-01-01

    The capabilities of the atmospheric modeling and sensor simulation (AMASS) system were studied in order to enhance them. This system is used in processing atmospheric measurements which are utilized in the evaluation of sensor performance, conducting design-concept simulation studies, and also in the modeling of the physical and dynamical nature of atmospheric processes. The study tasks proposed in order to both enhance the AMASS system utilization and to integrate the AMASS system with other existing equipment to facilitate the analysis of data for modeling and image processing are enumerated. The following array processors were evaluated for anticipated effectiveness and/or improvements in throughput by attachment of the device to the P-e: (1) Floating Point Systems AP-120B; (2) Floating Point Systems 5000; (3) CSP, Inc. MAP-400; (4) Analogic AP500; (5) Numerix MARS-432; and (6) Star Technologies, Inc. ST-100.

  16. Final Report: Development of a Chemical Model to Predict the Interactions between Supercritical CO2, Fluid and Rock in EGS Reservoirs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McPherson, Brian J.; Pan, Feng

    2014-09-24

    This report summarizes development of a coupled-process reservoir model for simulating enhanced geothermal systems (EGS) that utilize supercritical carbon dioxide as a working fluid. Specifically, the project team developed an advanced chemical kinetic model for evaluating important processes in EGS reservoirs, such as mineral precipitation and dissolution at elevated temperature and pressure, and for evaluating potential impacts on EGS surface facilities by related chemical processes. We assembled a new database for better-calibrated simulation of water/brine/rock/CO2 interactions in EGS reservoirs. This database utilizes existing kinetic and other chemical data, and we updated those data to reflect corrections for elevated temperature and pressure conditions of EGS reservoirs.

  17. EVALUATING THE USE OF OUTPUTS FROM COMPREHENSIVE METEOROLOGICAL MODELS IN AIR QUALITY MODELING APPLICATIONS

    EPA Science Inventory

    Currently used dispersion models, such as the AMS/EPA Regulatory Model (AERMOD), process routinely available meteorological observations to construct model inputs. Thus, model estimates of concentrations depend on the availability and quality of Meteorological observations, as we...

  18. Designing an evaluation framework for WFME basic standards for medical education.

    PubMed

    Tackett, Sean; Grant, Janet; Mmari, Kristin

    2016-01-01

    To create an evaluation plan for the World Federation for Medical Education (WFME) accreditation standards for basic medical education. We conceptualized the 100 basic standards from "Basic Medical Education: WFME Global Standards for Quality Improvement: The 2012 Revision" as medical education program objectives. Standards were simplified into evaluable items, which were then categorized as inputs, processes, outputs and/or outcomes to generate a logic model and corresponding plan for data collection. WFME standards posed significant challenges to evaluation due to complex wording, inconsistent formatting and lack of existing assessment tools. Our resulting logic model contained 244 items. Standard B 5.1.1 separated into 24 items, the most for any single standard. A large proportion of items (40%) required evaluation of more than one input, process, output and/or outcome. Only one standard (B 3.2.2) was interpreted as requiring evaluation of a program outcome. Current WFME standards are difficult to use for evaluation planning. Our analysis may guide adaptation and revision of standards to make them more evaluable. Our logic model and data collection plan may be useful to medical schools planning an institutional self-review and to accrediting authorities wanting to provide guidance to schools under their purview.

  19. GREENSCOPE: Sustainable Process Modeling

    EPA Science Inventory

    EPA researchers are responding to environmental problems by incorporating sustainability into process design and evaluation. EPA researchers are also developing a tool that allows users to assess modifications to existing and new chemical processes to determine whether changes in...

  20. Investigation of dimensional variation in parts manufactured by fused deposition modeling using Gauge Repeatability and Reproducibility

    NASA Astrophysics Data System (ADS)

    Mohamed, Omar Ahmed; Hasan Masood, Syed; Lal Bhowmik, Jahar

    2018-02-01

    In the additive manufacturing (AM) market, industry and AM users frequently ask how reproducible and repeatable the fused deposition modeling (FDM) process is in providing good dimensional accuracy. This paper investigates and evaluates the repeatability and reproducibility of the FDM process through a systematic approach to answer this question. A case study based on the statistical gage repeatability and reproducibility (gage R&R) technique is proposed to investigate dimensional variations in parts printed by the FDM process. After running the simulation and analyzing the data, the FDM process capability is evaluated, which helps industry better understand the performance of FDM technology.
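    A gage R&R study partitions measurement variance into repeatability (equipment), reproducibility (operator) and part-to-part components. The sketch below is a naive moment-based decomposition on hypothetical printed-dimension data, not the full ANOVA procedure a study like the one above would use:

    ```python
    import numpy as np

    # hypothetical measurements[part, operator, trial] of one printed dimension (mm)
    data = np.array([
        [[10.02, 10.01], [10.03, 10.04]],
        [[10.11, 10.12], [10.10, 10.13]],
        [[ 9.95,  9.94], [ 9.97,  9.96]],
    ])

    # repeatability: pooled variance of repeated trials (equipment variation)
    repeatability = data.var(axis=2, ddof=1).mean()
    # reproducibility: variance of operator means within each part, pooled
    reproducibility = data.mean(axis=2).var(axis=1, ddof=1).mean()
    # part-to-part variation: variance of the part means
    part_var = data.mean(axis=(1, 2)).var(ddof=1)

    total = repeatability + reproducibility + part_var
    grr_pct = 100.0 * (repeatability + reproducibility) / total  # % gage R&R
    ```

    A low gage R&R percentage means most observed variation comes from the parts themselves rather than the measurement system, which is the condition under which FDM dimensional capability can be judged reliably.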

  1. Introducing Multisensor Satellite Radiance-Based Evaluation for Regional Earth System Modeling

    NASA Technical Reports Server (NTRS)

    Matsui, T.; Santanello, J.; Shi, J. J.; Tao, W.-K.; Wu, D.; Peters-Lidard, C.; Kemp, E.; Chin, M.; Starr, D.; Sekiguchi, M.

    2014-01-01

    Earth System modeling has become more complex, and its evaluation using satellite data has also become more difficult due to model and data diversity. Therefore, the fundamental methodology of using satellite direct measurements with instrumental simulators should be addressed especially for modeling community members lacking a solid background of radiative transfer and scattering theory. This manuscript introduces principles of multisatellite, multisensor radiance-based evaluation methods for a fully coupled regional Earth System model: NASA-Unified Weather Research and Forecasting (NU-WRF) model. We use a NU-WRF case study simulation over West Africa as an example of evaluating aerosol-cloud-precipitation-land processes with various satellite observations. NU-WRF-simulated geophysical parameters are converted to the satellite-observable raw radiance and backscatter under nearly consistent physics assumptions via the multisensor satellite simulator, the Goddard Satellite Data Simulator Unit. We present varied examples of simple yet robust methods that characterize forecast errors and model physics biases through the spatial and statistical interpretation of various satellite raw signals: infrared brightness temperature (Tb) for surface skin temperature and cloud top temperature, microwave Tb for precipitation ice and surface flooding, and radar and lidar backscatter for aerosol-cloud profiling simultaneously. Because raw satellite signals integrate many sources of geophysical information, we demonstrate user-defined thresholds and a simple statistical process to facilitate evaluations, including the infrared-microwave-based cloud types and lidar/radar-based profile classifications.

  2. Monetary and affective judgments of consumer goods: modes of evaluation matter.

    PubMed

    Seta, John J; Seta, Catherine E; McCormick, Michael; Gallagher, Ashleigh H

    2014-01-01

    Participants who evaluated 2 positively valued items separately reported more positive attraction (using affective and monetary measures) than those who evaluated the same two items as a unit. In Experiments 1-3, this separate/unitary evaluation effect was obtained when participants evaluated products that they were purchasing for a friend. Similar findings were obtained in Experiments 4 and 5 when we considered the amount participants were willing to spend to purchase insurance for items that they currently owned. The averaging/summation model was contrasted with several theoretical perspectives and implicated averaging and summation integration processes in how items are evaluated. The procedural and theoretical similarities and differences between this work and related research on unpacking, comparison processes, public goods, and price bundling are discussed. Overall, the results support the operation of integration processes and contribute to an understanding of how these processes influence the evaluation and valuation of private goods.

  3. Evaluating Innovation and Navigating Unseen Boundaries: Systems, Processes and People

    ERIC Educational Resources Information Center

    Fleet, Alma; De Gioia, Katey; Madden, Lorraine; Semann, Anthony

    2018-01-01

    This paper illustrates an evaluation model emerging from Australian research. With reference to a range of contexts, its usefulness is demonstrated through application to two professional development initiatives designed to improve continuity of learning in the context of the transition to school. The model reconceptualises approaches to…

  4. Evaluating the mitigation of greenhouse gas emissions and adaptation in dairy production.

    USDA-ARS?s Scientific Manuscript database

    Process-level modeling at the farm scale provides a tool for evaluating strategies for both mitigating greenhouse gas emissions and adapting to climate change. The Integrated Farm System Model (IFSM) simulates representative crop, beef or dairy farms over many years of weather to predict performance...

  5. Diagnostic Evaluation of Ozone Production and Horizontal Transport in a Regional Photochemical Air Quality Modeling System

    EPA Science Inventory

    A diagnostic model evaluation effort has been performed to focus on photochemical ozone formation and the horizontal transport process since they strongly impact the temporal evolution and spatial distribution of ozone (O3) within the lower troposphere. Results from th...

  6. Evaluating and Improving Cloud Processes in the Multi-Scale Modeling Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ackerman, Thomas P.

    2015-03-01

    The research performed under this grant was intended to improve the embedded cloud model in the Multi-scale Modeling Framework (MMF) for convective clouds by using a 2-moment microphysics scheme rather than the single-moment scheme used in all MMF runs to date. The technical report and associated documents describe the results of testing the cloud-resolving model with fixed boundary conditions and evaluating the model results against data. The overarching conclusion is that such model evaluations are problematic because errors in the forcing fields control the results so strongly that variations in parameterization values cannot be usefully constrained.

  7. Evolution in Cloud Population Statistics of the MJO: From AMIE Field Observations to Global Cloud-Permitting Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Chidong

    Motivated by the success of the AMIE/DYNAMO field campaign, which collected unprecedented observations of cloud and precipitation from the tropical Indian Ocean in October 2011 – March 2012, this project explored how such observations can be applied to assist the development of global cloud-permitting models through evaluating and correcting model biases in cloud statistics. The main accomplishments of this project fall into four categories: generating observational products for model evaluation, using AMIE/DYNAMO observations to validate global model simulations, using AMIE/DYNAMO observations in numerical studies of cloud-permitting models, and providing leadership in the field. Results from this project provide valuable information for building a seamless bridge between the DOE ASR program's component on process-level understanding of cloud processes in the tropics and the RGCM focus on global variability and regional extremes. In particular, experience gained from this project would be directly applicable to the evaluation and improvement of ACME, especially as it transitions to a non-hydrostatic variable-resolution model.

  8. A Study of Implementation in Seven Follow Through Educational Models and How Instructional Processes Relate to Child Outcomes.

    ERIC Educational Resources Information Center

    Stallings, Jane

    The purpose of the Follow Through Classroom Observation Evaluation was to assess the implementation of seven Follow Through sponsor models included in the study and to examine the relationships between classroom instructional processes and child outcomes. The seven programs selected for study include two behavioristic models, an open school model…

  9. Multimedia modeling of engineered nanoparticles with SimpleBox4nano: model definition and evaluation.

    PubMed

    Meesters, Johannes A J; Koelmans, Albert A; Quik, Joris T K; Hendriks, A Jan; van de Meent, Dik

    2014-05-20

    Screening level models for environmental assessment of engineered nanoparticles (ENP) are not generally available. Here, we present SimpleBox4Nano (SB4N) as the first model of this type, assess its validity, and evaluate it by comparisons with a known material flow model. SB4N expresses ENP transport and concentrations in and across air, rain, surface waters, soil, and sediment, accounting for nanospecific processes such as aggregation, attachment, and dissolution. The model solves simultaneous mass balance equations (MBE) using simple matrix algebra. The MBEs link all concentrations and transfer processes using first-order rate constants for all processes known to be relevant for ENPs. The first-order rate constants are obtained from the literature. The output of SB4N is mass concentrations of ENPs as free dispersive species, heteroaggregates with natural colloids, and larger natural particles in each compartment in time and at steady state. Known scenario studies for Switzerland were used to demonstrate the impact of the transport processes included in SB4N on the prediction of environmental concentrations. We argue that SB4N-predicted environmental concentrations are useful as background concentrations in environmental risk assessment.
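    The steady-state mass-balance approach described above can be sketched in a few lines: compartment masses solve a linear system built from first-order rate constants. The two-compartment rate matrix and emission vector below are made-up illustrative numbers, not SB4N parameters.

    ```python
    import numpy as np

    # Toy steady-state mass balance in the spirit of SB4N: dM/dt = e + K @ M = 0,
    # so the steady-state masses are M = solve(-K, e). All rate constants (1/s)
    # and the emission vector (kg/s) are illustrative assumptions, not SB4N values.
    K = np.array([
        [-1.0e-3,  1.0e-5],   # air: total first-order loss; gain from water
        [ 5.0e-4, -2.0e-4],   # water: gain from air deposition; total first-order loss
    ])
    e = np.array([1.0, 0.0])  # emission into the air compartment only

    M_steady = np.linalg.solve(-K, e)  # steady-state mass per compartment (kg)
    ```

    At steady state the balance closes exactly: `K @ M_steady + e` is the zero vector, which is the matrix-algebra solution of the simultaneous mass balance equations the abstract refers to.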

  10. Multiple attribute decision making model and application to food safety risk evaluation.

    PubMed

    Ma, Lihua; Chen, Hong; Yan, Huizhe; Yang, Lifeng; Wu, Lifeng

    2017-01-01

    Decision making for supermarket food purchases is characterized by network relationships. This paper analyzes factors that influence supermarket food selection and proposes a supplier evaluation index system based on the whole process of food production. The authors establish an intuitionistic interval-valued fuzzy set evaluation model based on characteristics of the network relationship among decision makers, and validate it in a multiple attribute decision making case study. Thus, the proposed model provides a reliable, accurate method for multiple attribute decision making.

  11. Chemical kinetics and photochemical data for use in stratospheric modeling

    NASA Technical Reports Server (NTRS)

    Demore, W. B.; Sander, S. P.; Golden, D. M.; Hampson, R. F.; Kurylo, M. J.; Howard, C. J.; Ravishankara, A. R.; Kolb, C. E.; Molina, M. J.

    1992-01-01

    As part of a series of evaluated sets, rate constants and photochemical cross sections compiled by the NASA Panel for Data Evaluation are provided. The primary application of the data is in the modeling of stratospheric processes, with particular emphasis on the ozone layer and its possible perturbation by anthropogenic and natural phenomena. Copies of this evaluation are available from the Jet Propulsion Laboratory.

  12. An Exploration of Teachers' and Administrators' Perspectives: The Collaborative Process Using the Danielson Framework for Teaching Model

    ERIC Educational Resources Information Center

    Landolfi, Adrienne M.

    2016-01-01

    As accountability measures continue to increase within education, public school systems have integrated standards-based evaluation systems to formally assess professional practices among educators. The purpose of this study was to explore the extent in which the communication process between evaluators and teachers impacts teacher performance…

  13. Modeling and Simulation Roadmap to Enhance Electrical Energy Security of U.S. Naval Bases

    DTIC Science & Technology

    2012-03-01

    ...evaluating power system architectures and technologies and, therefore, can become a valuable tool for the implementation of the described plan for Navy... A well-validated and consistent process for evaluating power system architectures and component technologies is needed to support the development and implementation of these new...

  14. Low cost silicon solar array project large area silicon sheet task: Silicon web process development

    NASA Technical Reports Server (NTRS)

    Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Blais, P. D.; Davis, J. R., Jr.

    1977-01-01

    Growth configurations were developed which produced crystals having low residual stress levels. The properties of a 106 mm diameter round crucible were evaluated and it was found that this design had greatly enhanced temperature fluctuations arising from convection in the melt. Thermal modeling efforts were directed to developing finite element models of the 106 mm round crucible and an elongated susceptor/crucible configuration. Also, the thermal model for the heat loss modes from the dendritic web was examined for guidance in reducing the thermal stress in the web. An economic analysis was prepared to evaluate the silicon web process in relation to price goals.

  15. Theory Building through Praxis Discourse: A Theory- and Practice-Informed Model of Transformative Participatory Evaluation

    ERIC Educational Resources Information Center

    Harnar, Michael A.

    2012-01-01

    Stakeholder participation in evaluation, where the evaluator engages stakeholders in the process, is prevalent in evaluation practice and is an important focus of evaluation research. Cousins and Whitmore proposed a bifurcation of participatory evaluation into the two streams of transformative participatory and practical participatory evaluation…

  16. NOAA Atmospheric Sciences Modeling Division support to the US Environmental Protection Agency

    NASA Astrophysics Data System (ADS)

    Poole-Kober, Evelyn M.; Viebrock, Herbert J.

    1991-07-01

    During FY-1990, the Atmospheric Sciences Modeling Division provided meteorological research and operational support to the U.S. Environmental Protection Agency. Basic meteorological operational support consisted of applying dispersion models and conducting dispersion studies and model evaluations. The primary research effort was the development and evaluation of air quality simulation models using numerical and physical techniques supported by field studies. Modeling emphasis was on the dispersion of photochemical oxidants and particulate matter on urban and regional scales, dispersion in complex terrain, and the transport, transformation, and deposition of acidic materials. Highlights included expansion of the Regional Acid Deposition Model/Engineering Model family to consist of the Tagged Species Engineering Model, the Non-Depleting Model, and the Sulfate Tracking Model; completion of the Acid-MODES field study; completion of the RADM2.1 evaluation; completion of the atmospheric processes section of the National Acid Precipitation Assessment Program 1990 Integrated Assessment; conduct of the first field study to examine the transport and entrainment processes of convective clouds; development of a Regional Oxidant Model-Urban Airshed Model interface program; conduct of an international sodar intercomparison experiment; incorporation of building wake dispersion in numerical models; conduct of wind-tunnel simulations of stack-tip downwash; and initiation of the publication of SCRAM NEWS.

  17. The role of interior watershed processes in improving parameter estimation and performance of watershed models

    USDA-ARS?s Scientific Manuscript database

    Watershed models typically are evaluated solely through comparison of in-stream water and nutrient fluxes with measured data using established performance criteria, whereas processes and responses within the interior of the watershed that govern these global fluxes often are neglected. Due to the l...

  18. Process Relationships for Evaluating the Role of Light-induced Inactivation of Enterococci at Selected Beaches and Nearby Tributaries of the Great Lakes

    EPA Science Inventory

    One approach to predictive modeling of biological contamination of recreational waters and drinking water sources involves applying process-based models that consider microbial sources, hydrodynamic transport, and microbial fate. Fecal indicator bacteria such as enterococci have ...

  19. Nurturing the Imagination: Creativity Processes and Innovative Qualitative Research Projects

    ERIC Educational Resources Information Center

    Mulvihill, Thalia M.; Swaminathan, Raji

    2012-01-01

    This article explores the creativity processes involved in designing and analyzing innovative qualitative research projects and evaluates examples of recent models and typologies that illustrate a variety of ways to approach qualitative inquiry. Using Gardner's Five Minds (2006) typology, Boyer's Model of Scholarship (1997) and Bloom's Taxonomy of…

  20. A model of evaluation planning, implementation and management: Toward a "culture of information" within organizations

    NASA Astrophysics Data System (ADS)

    Bhola, H. S.

    1992-03-01

    The argument underlying the ongoing "paradigm shift" from logical positivism to constructionism is briefly laid out. A model of evaluation planning, implementation and management (called the P-I-M Model, for short) is then presented that assumes a complementarity between the two paradigms. The model further implies that for effective decision-making within human organizations, both "evaluative data" and "descriptive data" are needed. "Evaluative data" generated by evaluation studies must, therefore, be undergirded by an appropriate management information system (MIS) that can generate "descriptive data", concurrently with the process of program implementation. The P-I-M Model, if fully actualized, will enable human organizations to become vibrant "cultures of information" where "informed" decision-making becomes a shared norm among all stakeholders.

  1. Interactive knowledge discovery with the doctor-in-the-loop: a practical example of cerebral aneurysms research.

    PubMed

    Girardi, Dominic; Küng, Josef; Kleiser, Raimund; Sonnberger, Michael; Csillag, Doris; Trenkler, Johannes; Holzinger, Andreas

    2016-09-01

    Established process models for knowledge discovery find the domain-expert in a customer-like and supervising role. In the field of biomedical research, it is necessary to move the domain-experts into the center of this process with far-reaching consequences for both their research output and the process itself. In this paper, we revise the established process models for knowledge discovery and propose a new process model for domain-expert-driven interactive knowledge discovery. Furthermore, we present a research infrastructure which is adapted to this new process model and demonstrate how the domain-expert can be deeply integrated even into the highly complex data-mining process and data-exploration tasks. We evaluated this approach in the medical domain for the case of cerebral aneurysms research.

  2. Modelling coupled microbial processes in the subsurface: Model development, verification, evaluation and application

    NASA Astrophysics Data System (ADS)

    Masum, Shakil A.; Thomas, Hywel R.

    2018-06-01

    To study subsurface microbial processes, a coupled model developed within a Thermal-Hydraulic-Chemical-Mechanical (THCM) framework is presented. The work presented here focuses on microbial transport, growth, and decay mechanisms under the influence of multiphase flow and bio-geochemical reactions. In this paper, theoretical formulations and numerical implementations of the microbial model are presented. The model has been verified and evaluated against relevant experimental results. Simulated results show that the microbial processes have been accurately implemented and that their impacts on porous media properties can be predicted qualitatively, quantitatively, or both. The model has been applied to investigate biofilm growth in a sandstone core subjected to two-phase flow and variable pH conditions. The results indicate that biofilm growth (if not limited by substrates) in a multiphase system largely depends on the hydraulic properties of the medium. When the change in porewater pH caused by dissolution of carbon dioxide gas is considered, growth processes are affected: for the given parameter regime, net biofilm growth is favoured at higher pH, whilst the processes are considerably retarded at lower pH values. The capabilities of the model to predict microbial respiration in a fully coupled multiphase flow condition and microbial fermentation leading to production of a gas phase are also demonstrated.

  3. Measuring societal effects of transdisciplinary research projects: design and application of an evaluation method.

    PubMed

    Walter, Alexander I; Helgenberger, Sebastian; Wiek, Arnim; Scholz, Roland W

    2007-11-01

    Most Transdisciplinary Research (TdR) projects combine scientific research with the building of decision making capacity for the involved stakeholders. These projects usually deal with complex, societally relevant, real-world problems. This paper focuses on TdR projects, which integrate the knowledge of researchers and stakeholders in a collaborative transdisciplinary process through structured methods of mutual learning. Previous research on the evaluation of TdR has insufficiently explored the intended effects of transdisciplinary processes on the real world (societal effects). We developed an evaluation framework for assessing the societal effects of transdisciplinary processes. Outputs (measured as procedural and product-related involvement of the stakeholders), impacts (intermediate effects connecting outputs and outcomes) and outcomes (enhanced decision making capacity) are distinguished as three types of societal effects. Our model links outputs and outcomes of transdisciplinary processes via the impacts using a mediating variables approach. We applied this model in an ex post evaluation of a transdisciplinary process. 84 out of 188 agents participated in a survey. The results show significant mediation effects of the two impacts "network building" and "transformation knowledge". These results indicate an influence of a transdisciplinary process on the decision making capacity of stakeholders, especially through social network building and the generation of knowledge relevant for action.

  4. Challenges in Integrating Nondestructive Evaluation and Finite Element Methods for Realistic Structural Analysis

    NASA Technical Reports Server (NTRS)

    Abdul-Aziz, Ali; Baaklini, George Y.; Zagidulin, Dmitri; Rauser, Richard W.

    2000-01-01

    Capabilities and expertise related to the development of links between nondestructive evaluation (NDE) and finite element analysis (FEA) at Glenn Research Center (GRC) are demonstrated. Current tools for analyzing data produced by computed tomography (CT) scans are exercised to help assess the damage state in high-temperature structural composite materials. A utility translator was written to convert Velocity (an image-processing software package) STL data files to a suitable CAD-FEA file type. Finite element analyses are carried out with MARC, a commercial nonlinear finite element code, and the analytical results are discussed. Modeling was established by building an MSC/Patran (a pre- and post-processing finite element package) model and comparing it to a model generated by Velocity in conjunction with MSC/Patran Graphics. Modeling issues and results are discussed in this paper. The entire process that ties the data extracted via NDE to the finite element modeling and analysis is fully described.

  5. The Nursing Leadership Institute program evaluation: a critique

    PubMed Central

    Havaei, Farinaz; MacPhee, Maura

    2015-01-01

    A theory-driven program evaluation was conducted for a nursing leadership program, as a collaborative project between university faculty, the nurses’ union, the provincial Ministry of Health, and its chief nursing officers. A collaborative logic model process was used to engage stakeholders, and mixed methods approaches were used to answer evaluation questions. Despite demonstrated, successful outcomes, the leadership program was not supported with continued funding. This paper examines what happened during the evaluation process: What factors failed to sustain this program? PMID:29355180

  6. Comparisons of in vitro root caries models.

    PubMed

    Wefel, J S; Heilman, J R; Jordan, T H

    1995-01-01

    The purpose of this article is to compare various model systems for the production of in vitro root caries and to assess their ability to simulate the naturally occurring root caries process. Partially saturated buffer models and gel models were evaluated using polarized light microscopy and both qualitative and quantitative microradiography. All model systems showed very similar lesion formation when examined under polarized light. When microradiographs were compared, the systems that contained fluoride showed clear radiopaque bands within the lesion. The bands, which occurred only in the presence of fluoride, appeared to be due to remineralization. When using an in vitro system that simulates the natural root caries process, it is imperative to understand the components of the particular model, as well as its limitations, and to be aware of the need for more than one evaluative technique.

  7. An evaluation of soil moisture models for countermine application

    NASA Astrophysics Data System (ADS)

    Mason, George L.

    2004-09-01

    The focus of this study is the evaluation of emerging soil moisture models as they apply to infrared, radar, and acoustic sensors within the scope of countermine operations. Physical, chemical, and biological processes that change the signature of the ground are considered. The available models were not run in-house but were evaluated on the basis of the theory by which they were constructed and the supporting documentation. The study was conducted between September and October of 2003 and represents a subset of existing models. The objective was to identify those models suited for simulation, define the general constraints of the models, and summarize the emerging functionalities that would support sensor modeling for mine detection.

  8. Process Correlation Analysis Model for Process Improvement Identification

    PubMed Central

    Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by process assessment identifying strengths and weaknesses and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what to be carried out in process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of software development process. However, in the current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to significant efforts and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data. PMID:24977170
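    The core computation behind a correlation analysis of process elements can be sketched directly: build a projects-by-elements score matrix and compute pairwise correlations between elements. The practice IDs and assessment scores below are hypothetical, not the CMMI data used in the paper.

    ```python
    import numpy as np

    # Hypothetical assessment data: rows are assessed projects, columns are
    # process elements (the practice IDs are made up for illustration).
    elements = ["REQM.GP2.2", "PP.SP1.1", "PMC.SP1.6"]
    scores = np.array([
        [3, 2, 2],
        [4, 3, 3],
        [2, 2, 1],
        [5, 4, 4],
    ], dtype=float)

    # Element-by-element correlation matrix; strongly correlated elements are
    # candidates for joint (rather than isolated) improvement actions.
    corr = np.corrcoef(scores, rowvar=False)
    ```

    In practice one would threshold `corr` (e.g. flag pairs above 0.8) to surface candidate correlations for expert review, mirroring the paper's goal of making such correlations visible in improvement planning.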

  9. Process correlation analysis model for process improvement identification.

    PubMed

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by process assessment identifying strengths and weaknesses and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what to be carried out in process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of software development process. However, in the current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to significant efforts and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.

  10. Application of activated carbon derived from scrap tires for adsorption of Rhodamine B.

    PubMed

    Li, Li; Liu, Shuangxi; Zhu, Tan

    2010-01-01

    Activated carbon derived from solid hazardous waste scrap tires was evaluated as a potential adsorbent for cationic dye removal. The adsorption process with respect to operating parameters was investigated to evaluate the adsorption characteristics of the activated pyrolytic tire char (APTC) for Rhodamine B (RhB). Systematic research including equilibrium, kinetic, and thermodynamic studies was performed. The results showed that APTC is a promising adsorbent for RhB, with a higher adsorption capacity than most adsorbents. Solution pH and temperature exerted significant influence, while ionic strength showed little effect on the adsorption process. The adsorption equilibrium data obey the Langmuir isotherm, and the kinetic data were well described by the pseudo-second-order kinetic model. The adsorption process followed the intra-particle diffusion model, with more than one process affecting the adsorption. The thermodynamic study confirmed that the adsorption was a physisorption process with spontaneous, endothermic, and random characteristics.
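    The Langmuir fit mentioned above is commonly done through the linearized form Ce/qe = Ce/qmax + 1/(KL·qmax), so qmax and KL drop out of a straight-line fit. The data points below are synthetic, generated from assumed parameter values, not the paper's measurements.

    ```python
    import numpy as np

    # Synthetic equilibrium data generated from an assumed Langmuir isotherm
    # qe = qmax*KL*Ce / (1 + KL*Ce); qmax and KL here are illustrative values.
    Ce = np.array([5.0, 10.0, 20.0, 40.0, 80.0])          # equilibrium conc. (mg/L)
    qmax_true, KL_true = 50.0, 0.1
    qe = qmax_true * KL_true * Ce / (1.0 + KL_true * Ce)  # uptake (mg/g)

    # Linearized Langmuir: Ce/qe vs Ce is a line with slope 1/qmax and
    # intercept 1/(KL*qmax), so a least-squares line recovers both parameters.
    slope, intercept = np.polyfit(Ce, Ce / qe, 1)
    qmax_fit = 1.0 / slope
    KL_fit = slope / intercept
    ```

    With noiseless synthetic data the fit recovers qmax = 50 and KL = 0.1 essentially exactly; with real isotherm data the same regression yields the reported capacity and affinity estimates.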

  11. Moral judgment as information processing: an integrative review.

    PubMed

    Guglielmo, Steve

    2015-01-01

    How do humans make moral judgments about others' behavior? This article reviews dominant models of moral judgment, organizing them within an overarching framework of information processing. This framework poses two distinct questions: (1) What input information guides moral judgments? and (2) What psychological processes generate these judgments? Information Models address the first question, identifying critical information elements (including causality, intentionality, and mental states) that shape moral judgments. A subclass of Biased Information Models holds that perceptions of these information elements are themselves driven by prior moral judgments. Processing Models address the second question, and existing models have focused on the relative contribution of intuitive versus deliberative processes. This review organizes existing moral judgment models within this framework and critically evaluates them on empirical and theoretical grounds; it then outlines a general integrative model grounded in information processing, and concludes with conceptual and methodological suggestions for future research. The information-processing framework provides a useful theoretical lens through which to organize extant and future work in the rapidly growing field of moral judgment.

  12. Moral judgment as information processing: an integrative review

    PubMed Central

    Guglielmo, Steve

    2015-01-01

    How do humans make moral judgments about others’ behavior? This article reviews dominant models of moral judgment, organizing them within an overarching framework of information processing. This framework poses two distinct questions: (1) What input information guides moral judgments? and (2) What psychological processes generate these judgments? Information Models address the first question, identifying critical information elements (including causality, intentionality, and mental states) that shape moral judgments. A subclass of Biased Information Models holds that perceptions of these information elements are themselves driven by prior moral judgments. Processing Models address the second question, and existing models have focused on the relative contribution of intuitive versus deliberative processes. This review organizes existing moral judgment models within this framework and critically evaluates them on empirical and theoretical grounds; it then outlines a general integrative model grounded in information processing, and concludes with conceptual and methodological suggestions for future research. The information-processing framework provides a useful theoretical lens through which to organize extant and future work in the rapidly growing field of moral judgment. PMID:26579022

  13. Beyond positivist ecology: toward an integrated ecological ethics.

    PubMed

    Norton, Bryan G

    2008-12-01

    A post-positivist understanding of ecological science and the call for an "ecological ethic" indicate the need for a radically new approach to evaluating environmental change. The positivist view of science cannot capture the essence of the environmental sciences, because the recent work of "reflexive" ecological modelers shows that this requires a reconceptualization of the way in which values and ecological models interact in the scientific process. Reflexive modelers are ecological modelers who believe it is appropriate for ecologists to examine the motives for their choices in developing models; this self-reflexive approach opens the door to a new way of integrating values into public discourse and to a more comprehensive approach to evaluating ecological change. This reflexive building of ecological models is introduced through the transformative simile of Aldo Leopold, which shows that learning to "think like a mountain" involves a shift in both ecological modeling and in values and responsibility. An adequate, interdisciplinary approach to ecological valuation requires re-framing the evaluation questions in entirely new ways: a review of the current status of interdisciplinary value theory with respect to ecological values reveals that neither of the widely accepted theories of environmental value (economic utilitarianism and intrinsic value theory, i.e., environmental ethics) provides a foundation for an ecologically sensitive evaluation process. Thus, a new, ecologically sensitive, and more comprehensive approach to evaluating ecological change would include an examination of the metaphors that motivate the models used to describe environmental change.

  14. Kernel Regression Estimation of Fiber Orientation Mixtures in Diffusion MRI

    PubMed Central

    Cabeen, Ryan P.; Bastin, Mark E.; Laidlaw, David H.

    2016-01-01

    We present and evaluate a method for kernel regression estimation of fiber orientations and associated volume fractions for diffusion MR tractography and population-based atlas construction in clinical imaging studies of brain white matter. This is a model-based image processing technique in which representative fiber models are estimated from collections of component fiber models in model-valued image data. This extends prior work in nonparametric image processing and multi-compartment processing to provide computational tools for image interpolation, smoothing, and fusion with fiber orientation mixtures. In contrast to related work on multi-compartment processing, this approach is based on directional measures of divergence and includes data-adaptive extensions for model selection and bilateral filtering. This is useful for reconstructing complex anatomical features in clinical datasets analyzed with the ball-and-sticks model, and our framework’s data-adaptive extensions are potentially useful for general multi-compartment image processing. We experimentally evaluate our approach with both synthetic data from computational phantoms and in vivo clinical data from human subjects. With synthetic data experiments, we evaluate performance based on errors in fiber orientation, volume fraction, compartment count, and tractography-based connectivity. With in vivo data experiments, we first show improved scan-rescan reproducibility and reliability of quantitative fiber bundle metrics, including mean length, volume, streamline count, and mean volume fraction. We then demonstrate the creation of a multi-fiber tractography atlas from a population of 80 human subjects. In comparison to single tensor atlasing, our multi-fiber atlas shows more complete features of known fiber bundles and includes reconstructions of the lateral projections of the corpus callosum and complex fronto-parietal connections of the superior longitudinal fasciculus I, II, and III. PMID:26691524
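    The kernel regression at the heart of this record can be sketched in its generic scalar form: estimate a representative value at a point as a kernel-weighted average of nearby samples. The paper applies this idea to model-valued voxel data (fiber-orientation mixtures, with directional divergence measures); here plain scalars and a Gaussian kernel stand in for those models as an assumption of the sketch.

    ```python
    import numpy as np

    # Generic Nadaraya-Watson kernel regression: the estimate at x0 is a
    # Gaussian-kernel-weighted average of the sample values ys at locations xs.
    def kernel_regress(x0, xs, ys, h=1.0):
        w = np.exp(-0.5 * ((xs - x0) / h) ** 2)  # kernel weights, bandwidth h
        return np.sum(w * ys) / np.sum(w)

    xs = np.array([0.0, 1.0, 2.0, 3.0])
    ys = np.array([0.0, 1.0, 2.0, 3.0])
    est = kernel_regress(1.5, xs, ys, h=0.5)  # interpolate between samples
    ```

    Replacing the scalar average with a divergence-minimizing average over fiber models gives the model-valued interpolation, smoothing, and fusion operations the abstract describes.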

  15. A new approach for handling longitudinal count data with zero-inflation and overdispersion: Poisson geometric process model.

    PubMed

    Wan, Wai-Yin; Chan, Jennifer S K

    2009-08-01

    In biomedical applications, time series of count data often exhibit correlated measurements, clustering, and excessive zeros simultaneously. Ignoring such effects can lead to misleading treatment conclusions. A generalized mixture Poisson geometric process (GMPGP) model and a zero-altered mixture Poisson geometric process (ZMPGP) model are developed from the geometric process model, which was originally developed for modelling positive continuous data and has been extended to handle count data. These models are motivated by the need to evaluate the trend in new tumour counts for bladder cancer patients and to identify covariates that affect the count level. The models are implemented using Bayesian methods with Markov chain Monte Carlo (MCMC) algorithms and are assessed using the deviance information criterion (DIC).
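
    The zero-inflation idea the abstract describes can be illustrated with a minimal zero-inflated Poisson log-likelihood. This is a simplified sketch, not the authors' GMPGP/ZMPGP models (which add mixtures and geometric-process trends fitted by MCMC): with probability pi a count is a structural zero, otherwise it is Poisson(lam).

```python
import numpy as np
from math import lgamma

def zip_loglik(y, pi, lam):
    """Log-likelihood of a zero-inflated Poisson mixture."""
    y = np.asarray(y)
    # A zero can come from the structural-zero mass or the Poisson at 0.
    ll_zero = np.log(pi + (1.0 - pi) * np.exp(-lam))
    # Positive counts can only come from the Poisson component.
    ll_pos = (np.log(1.0 - pi) - lam + y * np.log(lam)
              - np.array([lgamma(v + 1.0) for v in y]))
    return float(np.where(y == 0, ll_zero, ll_pos).sum())

counts = np.array([0, 0, 0, 1, 2, 0, 3])  # zero-heavy toy data
# Allowing excess zeros (pi > 0) should beat a plain Poisson fit here.
print(zip_loglik(counts, pi=0.4, lam=1.5))
```

On data with more zeros than Poisson(lam) predicts, the likelihood with pi > 0 dominates the plain Poisson (pi near 0), which is the motivation for the zero-altered models in the record.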

  16. Earth Sciences Data and Information System (ESDIS) program planning and evaluation methodology development

    NASA Technical Reports Server (NTRS)

    Dickinson, William B.

    1995-01-01

    An Earth Sciences Data and Information System (ESDIS) Project Management Plan (PMP) is prepared. An ESDIS Project Systems Engineering Management Plan (SEMP) consistent with the developed PMP is also prepared. ESDIS and related EOS program requirements development, management, and analysis processes are evaluated. Opportunities to improve the effectiveness of these processes and program/project responsiveness to requirements are identified. Overall ESDIS cost estimation processes are evaluated, and recommendations to improve cost estimating and modeling techniques are developed. ESDIS schedules and scheduling tools are evaluated. Risk assessment, risk mitigation strategies and approaches, and the use of risk information in management decision-making are addressed.

  17. Implementation of a mezzo-level HOV carpool model for Texas. Final report, September 1986-April 1990

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benson, J.D.; Mullins, J.A.; Stokes, R.W.

    1989-11-01

    The report presents the results of an evaluation and adaptation of three existing high-occupancy vehicle (HOV) lane carpool demand estimation models for possible use in Houston and other large Texas cities. These models use trip tables, networks, and zone structures that are consistent with the regional travel demand modeling process currently in use in Texas. By implementing the HOV carpool models in a structure that is consistent with the regional travel demand modeling process, it is possible to estimate the carpool demand for an HOV facility and to evaluate the effects of the following changes in HOV lane configuration and operating strategies: (1) effects of additional and/or alternative access points; (2) effects of extending an HOV lane; and (3) effects of changing the definition of eligible HOV carpools. The models have produced promising results in test applications in Houston.

  18. ANALYZING NUMERICAL ERRORS IN DOMAIN HEAT TRANSPORT MODELS USING THE CVBEM.

    USGS Publications Warehouse

    Hromadka, T.V.

    1985-01-01

    Besides providing an exact solution for steady-state heat conduction processes (Laplace and Poisson equations), the CVBEM (complex variable boundary element method) can be used for the numerical error analysis of domain model solutions. For problems where soil water phase change latent heat effects dominate the thermal regime, heat transport can be approximately modeled as a time-stepped steady-state condition in the thawed and frozen regions, respectively. The CVBEM provides an exact solution of the two-dimensional steady-state heat transport problem, and also provides the error in matching the prescribed boundary conditions by the development of a modeling error distribution or an approximative boundary generation. This error evaluation can be used to develop highly accurate CVBEM models of the heat transport process, and the resulting model can be used as a test case for evaluating the precision of domain models based on finite elements or finite differences.

  19. On viewer motivation, unit of analysis, and the VIMAP. Comment on "Move me, astonish me ... delight my eyes and brain: The Vienna Integrated Model of top-down and bottom-up processes in Art Perception (VIMAP) and corresponding affective, evaluative, and neurophysiological correlates" by Matthew Pelowski et al.

    NASA Astrophysics Data System (ADS)

    Tinio, Pablo P. L.

    2017-07-01

    The Vienna Integrated Model of Art Perception (VIMAP; [5]) is the most comprehensive model of the art experience today. The model incorporates bottom-up and top-down cognitive processes and accounts for different outcomes of the art experience, such as aesthetic evaluations, emotions, and physiological and neurological responses to art. In their presentation of the model, Pelowski et al. also present hypotheses that are amenable to empirical testing. These features make the VIMAP an ambitious model that attempts to explain how meaningful, complex, and profound aspects of the art experience come about, which is a significant extension of previous models of the art experience (e.g., [1-3,10]), and which gives the VIMAP good explanatory power.

  20. Evaluating and improving count-based population inference: A case study from 31 years of monitoring Sandhill Cranes

    USGS Publications Warehouse

    Gerber, Brian D.; Kendall, William L.

    2017-01-01

    Monitoring animal populations can be difficult. Limited resources often force monitoring programs to rely on unadjusted or smoothed counts as an index of abundance. Smoothing counts is commonly done using a moving-average estimator to dampen sampling variation. These indices are commonly used to inform management decisions, although their reliability is often unknown. We outline a process to evaluate the biological plausibility of annual changes in population counts and indices from a typical monitoring scenario and compare results with a hierarchical Bayesian time series (HBTS) model. We evaluated spring and fall counts, fall indices, and model-based predictions for the Rocky Mountain population (RMP) of Sandhill Cranes (Antigone canadensis) by integrating juvenile recruitment, harvest, and survival into a stochastic stage-based population model. We used simulation to evaluate population indices from the HBTS model and the commonly used 3-yr moving average estimator. We found counts of the RMP to exhibit biologically unrealistic annual change, while the fall population index was largely biologically realistic. HBTS model predictions suggested that the RMP changed little over 31 yr of monitoring, but the pattern depended on assumptions about the observational process. The HBTS model fall population predictions were biologically plausible if observed crane harvest mortality was compensatory up to natural mortality, as empirical evidence suggests. Simulations indicated that the predicted mean of the HBTS model was generally a more reliable estimate of the true population than population indices derived using a moving 3-yr average estimator. Practitioners could gain considerable advantages from modeling population counts using a hierarchical Bayesian autoregressive approach. 
Advantages would include: (1) obtaining measures of uncertainty; (2) incorporating direct knowledge of the observational and population processes; (3) accommodating missing years of data; and (4) forecasting population size.
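
    The 3-yr moving-average index that the study compares against can be sketched in a few lines. This is a generic trailing-window smoother of annual counts (the common index criticized in the abstract), not the authors' hierarchical Bayesian time series model:

```python
import numpy as np

def moving_average_index(counts, window=3):
    """Smooth raw annual counts with a moving average to damp
    sampling variation; years without a full window are dropped."""
    counts = np.asarray(counts, dtype=float)
    kernel = np.ones(window) / window
    # 'valid' keeps only positions where the full window overlaps the data.
    return np.convolve(counts, kernel, mode="valid")

annual_counts = [100, 160, 90, 150, 110]  # hypothetical survey counts
print(moving_average_index(annual_counts))
```

The smoother dampens year-to-year noise but, unlike the hierarchical Bayesian approach, provides no uncertainty measure and cannot accommodate missing years, which is the paper's central criticism.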

  1. The Improvement of the Closed Bounded Volume (CBV) Evaluation Methods to Compute a Feasible Rough Machining Area Based on Faceted Models

    NASA Astrophysics Data System (ADS)

    Hadi Sutrisno, Himawan; Kiswanto, Gandjar; Istiyanto, Jos

    2017-06-01

    Rough machining shapes a workpiece toward its final form. This process takes up a large proportion of the total machining time because of the bulk material that must be removed. For certain models, rough machining has limitations, especially on surfaces such as turbine blades and impellers. CBV evaluation is one of the concepts used to detect areas admissible for machining. While previous research detected the CBV area using a pair of normal vectors, in this research the authors simplified the process by detecting the CBV area with a slicing line for each point cloud formed. The method consists of three steps: (1) triangulation of the CAD design models; (2) development of CC points from the point cloud; and (3) a slicing-line evaluation of each point-cloud position (under the CBV or outside it). The result of this evaluation method can be used as a tool for orientation set-up at each CC point position of feasible areas in rough machining.

  2. Modeling and Analysis of Process Parameters for Evaluating Shrinkage Problems During Plastic Injection Molding of a DVD-ROM Cover

    NASA Astrophysics Data System (ADS)

    Öktem, H.

    2012-01-01

    Plastic injection molding plays a key role in the production of high-quality plastic parts. Shrinkage is one of the most significant quality problems for plastic parts in plastic injection molding. This article focuses on modeling and analyzing the effects of process parameters on shrinkage by evaluating the quality of a DVD-ROM cover made of acrylonitrile butadiene styrene (ABS) polymer. An effective regression model was developed to determine the mathematical relationship between the process parameters (mold temperature, melt temperature, injection pressure, injection time, and cooling time) and the volumetric shrinkage by utilizing the analysis data. Finite element (FE) analyses designed by Taguchi (L27) orthogonal arrays were run in the Moldflow simulation program. Analysis of variance (ANOVA) was then performed to check the adequacy of the regression model and to determine the effect of the process parameters on the shrinkage. Experiments were conducted to check the accuracy of the regression model against the FE analyses obtained from Moldflow. The results show that the regression model agrees very well with the FE analyses and the experiments. It can therefore be concluded that this study succeeded in modeling the shrinkage problem in this application.
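
    The regression step can be sketched as a least-squares fit of shrinkage on process parameters. The runs below are hypothetical stand-ins for two of the five parameters, not data from the study, and the model is first-order rather than the paper's full regression:

```python
import numpy as np

def fit_linear_model(X, y):
    """Least-squares fit of shrinkage vs. process parameters;
    the prepended column of ones gives the intercept."""
    Xd = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    return beta

# Hypothetical runs: (melt temp degC, injection pressure MPa) -> shrinkage %.
X = np.array([[220, 60], [240, 60], [220, 80], [240, 80], [230, 70]])
y = np.array([0.60, 0.70, 0.50, 0.62, 0.60])
beta = fit_linear_model(X, y)
print(beta)  # intercept, temperature effect (+), pressure effect (-)
```

ANOVA on such a fit then partitions the residual and regression sums of squares to test whether each parameter's effect is significant, as described in the abstract.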

  3. New atmospheric sensor analysis study

    NASA Technical Reports Server (NTRS)

    Parker, K. G.

    1989-01-01

    The functional capabilities of the ESAD Research Computing Facility are discussed. The system is used in processing atmospheric measurements which are used in the evaluation of sensor performance, conducting design-concept simulation studies, and also in modeling the physical and dynamical nature of atmospheric processes. The results may then be evaluated to furnish inputs into the final design specifications for new space sensors intended for future Spacelab, Space Station, and free-flying missions. In addition, data gathered from these missions may subsequently be analyzed to provide better understanding of requirements for numerical modeling of atmospheric phenomena.

  4. Evaluation and Prediction of Water Resources Based on AHP

    NASA Astrophysics Data System (ADS)

    Li, Shuai; Sun, Anqi

    2017-01-01

    Nowadays, the shortage of water resources is a growing threat. To address the problem of water resources being constrained by many factors, this paper establishes a water resources evaluation index (WREI) model that adopts fuzzy comprehensive evaluation (FCE) based on the analytic hierarchy process (AHP). After considering the factors influencing water resources, we ignore secondary factors, arrange the main factors hierarchically by class, and set up a three-layer structure whose top layer is the WREI. AHP is first used to determine the weights, and fuzzy judgment is then used to evaluate the target; combining the two algorithms reduces the subjective influence of AHP and overcomes the disadvantages of multi-level evaluation. To validate the model, we choose India as a target region. On the basis of the water resources evaluation index model, we use Matlab and combine grey prediction with linear prediction to discuss India's ability to provide clean water and the trend of India's water resources over the next 15 years. The model, with theoretical support and practical significance, will help provide reliable data and a reference for plans to improve water quality.
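
    The AHP weighting step mentioned above can be sketched as the classical principal-eigenvector method: given a pairwise-comparison matrix over criteria, the normalized dominant eigenvector gives the weights. The comparison matrix below is hypothetical, not taken from the paper:

```python
import numpy as np

def ahp_weights(pairwise):
    """Derive AHP criterion weights as the normalized principal
    eigenvector of a pairwise-comparison matrix."""
    A = np.asarray(pairwise, dtype=float)
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)          # dominant (Perron) eigenvalue
    w = np.abs(vecs[:, k].real)       # its eigenvector, made positive
    return w / w.sum()

# Hypothetical comparisons among three water-resource criteria
# (Saaty 1-9 scale; A[i][j] = importance of i relative to j).
A = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 2.0],
     [1/5, 1/2, 1.0]]
w = ahp_weights(A)
print(w)  # weights sum to 1; the first criterion dominates
```

In the paper's pipeline these weights would then feed the fuzzy comprehensive evaluation, which scores each criterion against membership grades before aggregating with the AHP weights.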

  5. Evaluation of reliability modeling tools for advanced fault tolerant systems

    NASA Technical Reports Server (NTRS)

    Baker, Robert; Scheper, Charlotte

    1986-01-01

    The Computer Aided Reliability Estimation (CARE III) and Automated Reliability Interactive Estimation System (ARIES 82) reliability tools were evaluated for application to advanced fault tolerant aerospace systems. To determine reliability modeling requirements, the evaluation focused on the Draper Laboratories' Advanced Information Processing System (AIPS) architecture as an example architecture for fault tolerant aerospace systems. Advantages and limitations were identified for each reliability evaluation tool. The CARE III program was designed primarily for analyzing ultrareliable flight control systems. The ARIES 82 program's primary use was to support university research and teaching. Neither CARE III nor ARIES 82 was suited for determining the reliability of the complex nodal networks used to interconnect processing sites in the AIPS architecture. It was concluded that ARIES was not suitable for modeling advanced fault tolerant systems. It was further concluded that, subject to some limitations (difficulty in modeling systems with unpowered spare modules, systems where equipment maintenance must be considered, systems where failure depends on the sequence in which faults occurred, and systems where multiple faults beyond double near-coincident faults must be considered), CARE III is best suited for evaluating the reliability of advanced fault tolerant systems for air transport.

  6. Statistical modeling for visualization evaluation through data fusion.

    PubMed

    Chen, Xiaoyu; Jin, Ran

    2017-11-01

    There is high demand for data visualization that provides insights to users in various applications. However, a consistent, online visualization evaluation method to quantify mental workload or user preference is lacking, which leads to an inefficient visualization and user interface design process. Recently, advances in interactive and sensing technologies have made electroencephalogram (EEG) signals, eye movements, and visualization logs available for user-centered evaluation. This paper proposes a data fusion model and an application procedure for quantitative, online visualization evaluation. Fifteen participants joined the study based on three different visualization designs. The results provide a regularized regression model that can accurately predict a user's evaluation of task complexity, and they indicate the significance of all three types of sensing data for visualization evaluation. This model can be widely applied to data visualization evaluation, as well as to other user-centered design evaluation and data analysis in human factors and ergonomics. Copyright © 2016 Elsevier Ltd. All rights reserved.
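
    The regularized-regression core of such a fusion model can be sketched with closed-form ridge regression. The features here are random stand-ins for fused sensing channels (e.g. an EEG band power, a fixation count, a log-derived metric); the paper's actual features, regularizer, and coefficients are not reproduced:

```python
import numpy as np

def ridge_fit(X, y, alpha=1.0):
    """Closed-form ridge regression: solve (X'X + alpha*I) beta = X'y."""
    X = np.asarray(X, dtype=float)
    n_feat = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(n_feat), X.T @ y)

rng = np.random.default_rng(0)
# Synthetic stand-ins for three fused sensing features per trial.
X = rng.normal(size=(40, 3))
# Task-complexity ratings generated from known coefficients plus noise.
y = X @ np.array([2.0, -1.0, 0.5]) + 0.1 * rng.normal(size=40)
beta = ridge_fit(X, y, alpha=0.1)
print(beta)  # close to the generating coefficients [2, -1, 0.5]
```

The ridge penalty keeps the fit stable when sensing channels are correlated, which is the usual case when EEG, eye-movement, and log features are fused.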

  7. Real-time computing platform for spiking neurons (RT-spike).

    PubMed

    Ros, Eduardo; Ortigosa, Eva M; Agís, Rodrigo; Carrillo, Richard; Arnold, Michael

    2006-07-01

    A computing platform is described for simulating arbitrary networks of spiking neurons in real time. A hybrid computing scheme is adopted that uses both software and hardware components to manage the tradeoff between flexibility and computational power; the neuron model is implemented in hardware and the network model and the learning are implemented in software. The incremental transition of the software components into hardware is supported. We focus on a spike response model (SRM) for a neuron where the synapses are modeled as input-driven conductances. The temporal dynamics of the synaptic integration process are modeled with a synaptic time constant that results in a gradual injection of charge. This type of model is computationally expensive and is not easily amenable to existing software-based event-driven approaches. As an alternative we have designed an efficient time-based computing architecture in hardware, where the different stages of the neuron model are processed in parallel. Further improvements occur by computing multiple neurons in parallel using multiple processing units. This design is tested using reconfigurable hardware and its scalability and performance evaluated. Our overall goal is to investigate biologically realistic models for the real-time control of robots operating within closed action-perception loops, and so we evaluate the performance of the system on simulating a model of the cerebellum where the emulation of the temporal dynamics of the synaptic integration process is important.
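
    The "gradual injection of charge" described above comes from modeling each synapse as an input-driven conductance that decays with a time constant. A minimal software sketch of that dynamic (the hardware pipeline and SRM details of the paper are not reproduced; parameter values are illustrative):

```python
import numpy as np

def simulate_synapse(spike_times, tau=5.0, dt=0.1, t_end=50.0):
    """Conductance that jumps on each input spike and decays
    exponentially with time constant tau, so charge enters the
    neuron gradually rather than instantaneously."""
    steps = int(t_end / dt)
    g = np.zeros(steps)
    spike_steps = {int(t / dt) for t in spike_times}
    for i in range(1, steps):
        g[i] = g[i - 1] * (1.0 - dt / tau)  # Euler step of dg/dt = -g/tau
        if i in spike_steps:
            g[i] += 1.0  # unit conductance increment per spike
    return g

g = simulate_synapse([10.0, 12.0])  # two spikes 2 ms apart
print(g.max())  # second spike rides on the decaying first one
```

Because each spike's contribution persists for several milliseconds, this model is costly for event-driven software simulators, which is why the paper moves this stage into parallel hardware.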

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wieder, William R.; Allison, Steven D.; Davidson, Eric A.

    Microbes influence soil organic matter (SOM) decomposition and the long-term stabilization of carbon (C) in soils. We contend that by revising the representation of microbial processes and their interactions with the physicochemical soil environment, Earth system models (ESMs) may make more realistic global C cycle projections. Explicit representation of microbial processes presents considerable challenges due to the scale at which these processes occur. Thus, applying microbial theory in ESMs requires a framework to link micro-scale process-level understanding and measurements to macro-scale models used to make decadal- to century-long projections. Here, we review the diversity, advantages, and pitfalls of simulating soil biogeochemical cycles using microbial-explicit modeling approaches. We present a roadmap for how to begin building, applying, and evaluating reliable microbial-explicit model formulations that can be applied in ESMs. Drawing from experience with traditional decomposition models we suggest: (1) guidelines for common model parameters and output that can facilitate future model intercomparisons; (2) development of benchmarking and model-data integration frameworks that can be used to effectively guide, inform, and evaluate model parameterizations with data from well-curated repositories; and (3) the application of scaling methods to integrate microbial-explicit soil biogeochemistry modules within ESMs. With contributions across scientific disciplines, we feel this roadmap can advance our fundamental understanding of soil biogeochemical dynamics and more realistically project likely soil C response to environmental change at global scales.

  9. Using manufacturing simulators to evaluate important processing decisions in the furniture and cabinet industries

    Treesearch

    Janice K. Wiedenbeck; Philip A. Araman

    1995-01-01

    We've been telling the wood industry about our process simulation modeling research and development work for several years. We've demonstrated our crosscut-first and rip-first rough mill simulation and animation models. We've advised companies on how they could use simulation modeling to help make critically important, pending decisions related to mill layout...

  10. Understanding Collaboration: A Formative Process Evaluation of a State-Funded School-University Partnership

    ERIC Educational Resources Information Center

    Corbin, J. Hope; Chu, Marilyn; Carney, Joanne; Donnelly, Susan; Clancy, Andrea

    2017-01-01

    School-university partnerships are widely promoted yet little is known about what contributes to their effectiveness. This paper presents a participatory formative evaluation of a state-funded school-university partnership. The study employed an empirically derived systems model--the Bergen Model of Collaborative Functioning (BMCF)--as the…

  11. Models Matter--The Final Report of the National Longitudinal Evaluation of Comprehensive School Reform

    ERIC Educational Resources Information Center

    Aladjem, Daniel K.; LeFloch, Kerstin Carlson; Zhang, Yu; Kurki, Anja; Boyle, Andrea; Taylor, James E.; Herrmann, Suzannah; Uekawa, Kazuaki; Thomsen, Kerri; Fashola, Olatokunbo

    2006-01-01

    The National Longitudinal Evaluation of Comprehensive School Reform (NLECSR) is a quantitative and qualitative study of behavior, decisions, processes, and outcomes. It employs a quasi-experimental design with matched treatment and comparison schools. NLECSR seeks to determine the effects of CSR models on student achievement in about 650…

  12. Evaluating Attitudes, Skill, and Performance in a Learning-Enhanced Quantitative Methods Course: A Structural Modeling Approach.

    ERIC Educational Resources Information Center

    Harlow, Lisa L.; Burkholder, Gary J.; Morrow, Jennifer A.

    2002-01-01

    Used a structural modeling approach to evaluate relations among attitudes, initial skills, and performance in a Quantitative Methods course that involved students in active learning. Results largely confirmed hypotheses offering support for educational reform efforts that propose actively involving students in the learning process, especially in…

  13. A Conceptual Model and Assessment Template for Capacity Evaluation in Adult Guardianship

    ERIC Educational Resources Information Center

    Moye, Jennifer; Butz, Steven W.; Marson, Daniel C.; Wood, Erica

    2007-01-01

    Purpose: We develop a conceptual model and associated assessment template that is usable across state jurisdictions for evaluating the independent-living capacity of older adults in guardianship proceedings. Design and Methods: We used an iterative process in which legal provisions for guardianship and prevailing clinical practices for capacity…

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gastelum, Zoe N.; Harvey, Julia B.

    In The International Atomic Energy Agency State Evaluation Process: The Role of Information Analysis in Reaching Safeguards Conclusions (Mathews et al. 2008), several examples of nonproliferation models using analytical software were developed that may assist the IAEA with collecting, visualizing, analyzing, and reporting information in support of the State Evaluation Process. This paper focuses on one of those examples: a set of models developed in the Proactive Scenario Production, Evidence Collection, and Testing (ProSPECT) software that evaluates the status and nature of a state's nuclear activities. The models use three distinct subject areas to perform this assessment: the presence of nuclear activities, the consistency of those nuclear activities with national nuclear energy goals, and the geopolitical context in which those nuclear activities are taking place. As a proof of concept for the models, a crude case study was performed. The study, which attempted to evaluate the nuclear activities taking place in Syria prior to September 2007, yielded illustrative yet inconclusive results. Because of the inconclusive nature of the case study results, changes that may improve the models' efficiency and accuracy are proposed.

  15. A Developmental Perspective on Peer Rejection, Deviant Peer Affiliation, and Conduct Problems Among Youth.

    PubMed

    Chen, Diane; Drabick, Deborah A G; Burgers, Darcy E

    2015-12-01

    Peer rejection and deviant peer affiliation are linked consistently to the development and maintenance of conduct problems. Two proposed models may account for longitudinal relations among these peer processes and conduct problems: the (a) sequential mediation model, in which peer rejection in childhood and deviant peer affiliation in adolescence mediate the link between early externalizing behaviors and more serious adolescent conduct problems; and (b) parallel process model, in which peer rejection and deviant peer affiliation are considered independent processes that operate simultaneously to increment risk for conduct problems. In this review, we evaluate theoretical models and evidence for associations among conduct problems and (a) peer rejection and (b) deviant peer affiliation. We then consider support for the sequential mediation and parallel process models. Next, we propose an integrated model incorporating both the sequential mediation and parallel process models. Future research directions and implications for prevention and intervention efforts are discussed.

  16. A Developmental Perspective on Peer Rejection, Deviant Peer Affiliation, and Conduct Problems among Youth

    PubMed Central

    Chen, Diane; Drabick, Deborah A. G.; Burgers, Darcy E.

    2015-01-01

    Peer rejection and deviant peer affiliation are linked consistently to the development and maintenance of conduct problems. Two proposed models may account for longitudinal relations among these peer processes and conduct problems: the (a) sequential mediation model, in which peer rejection in childhood and deviant peer affiliation in adolescence mediate the link between early externalizing behaviors and more serious adolescent conduct problems; and (b) parallel process model, in which peer rejection and deviant peer affiliation are considered independent processes that operate simultaneously to increment risk for conduct problems. In this review, we evaluate theoretical models and evidence for associations among conduct problems and (a) peer rejection and (b) deviant peer affiliation. We then consider support for the sequential mediation and parallel process models. Next, we propose an integrated model incorporating both the sequential mediation and parallel process models. Future research directions and implications for prevention and intervention efforts are discussed. PMID:25410430

  17. Measurement-based reliability/performability models

    NASA Technical Reports Server (NTRS)

    Hsueh, Mei-Chen

    1987-01-01

    Measurement-based models based on real error-data collected on a multiprocessor system are described. Model development from the raw error-data to the estimation of cumulative reward is also described. A workload/reliability model is developed based on low-level error and resource usage data collected on an IBM 3081 system during its normal operation in order to evaluate the resource usage/error/recovery process in a large mainframe system. Thus, both normal and erroneous behavior of the system are modeled. The results provide an understanding of the different types of errors and recovery processes. The measured data show that the holding times in key operational and error states are not simple exponentials and that a semi-Markov process is necessary to model the system behavior. A sensitivity analysis is performed to investigate the significance of using a semi-Markov process, as opposed to a Markov process, to model the measured system.
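
    The key finding above is that holding times in the measured states are not exponential, so a semi-Markov process is needed. A semi-Markov chain keeps Markov transition probabilities between states but allows arbitrary holding-time distributions. A minimal simulation sketch (the two-state structure, Weibull holding times, and all parameter values are illustrative assumptions, not the IBM 3081 measurements):

```python
import numpy as np

def simulate_semi_markov(P, sample_holding, start=0, horizon=100.0, seed=1):
    """Semi-Markov chain: next state follows transition matrix P, but the
    holding time in each state may come from any distribution, unlike a
    Markov process whose holding times must be exponential."""
    rng = np.random.default_rng(seed)
    state, t, visits = start, 0.0, [start]
    while t < horizon:
        t += sample_holding(rng, state)           # non-exponential dwell
        state = rng.choice(len(P), p=P[state])    # Markov transition
        visits.append(state)
    return visits

# State 0 = operational, state 1 = error; illustrative transition matrix.
P = np.array([[0.0, 1.0],
              [0.8, 0.2]])
# Heavy-tailed Weibull holding times, longer dwell in the operational state.
holding = lambda rng, s: rng.weibull(0.7) * (10.0 if s == 0 else 1.0)
visits = simulate_semi_markov(P, holding)
print(len(visits), visits[:5])
```

Replacing the Weibull sampler with an exponential one recovers a continuous-time Markov chain, which is exactly the simplification the paper's sensitivity analysis tests.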

  18. Mountain-Scale Coupled Processes (TH/THC/THM)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    P. Dixon

    The purpose of this Model Report is to document the development of the Mountain-Scale Thermal-Hydrological (TH), Thermal-Hydrological-Chemical (THC), and Thermal-Hydrological-Mechanical (THM) Models and evaluate the effects of coupled TH/THC/THM processes on mountain-scale UZ flow at Yucca Mountain, Nevada. This Model Report was planned in ''Technical Work Plan (TWP) for: Performance Assessment Unsaturated Zone'' (BSC 2002 [160819], Section 1.12.7), and was developed in accordance with AP-SIII.10Q, Models. In this Model Report, any reference to ''repository'' means the nuclear waste repository at Yucca Mountain, and any reference to ''drifts'' means the emplacement drifts at the repository horizon. This Model Report provides themore » necessary framework to test conceptual hypotheses for analyzing mountain-scale hydrological/chemical/mechanical changes and predict flow behavior in response to heat release by radioactive decay from the nuclear waste repository at the Yucca Mountain site. The mountain-scale coupled TH/THC/THM processes models numerically simulate the impact of nuclear waste heat release on the natural hydrogeological system, including a representation of heat-driven processes occurring in the far field. The TH simulations provide predictions for thermally affected liquid saturation, gas- and liquid-phase fluxes, and water and rock temperature (together called the flow fields). The main focus of the TH Model is to predict the changes in water flux driven by evaporation/condensation processes, and drainage between drifts. The TH Model captures mountain-scale three dimensional (3-D) flow effects, including lateral diversion at the PTn/TSw interface and mountain-scale flow patterns. The Mountain-Scale THC Model evaluates TH effects on water and gas chemistry, mineral dissolution/precipitation, and the resulting impact to UZ hydrological properties, flow and transport. 
The THM Model addresses changes in permeability due to mechanical and thermal disturbances in stratigraphic units above and below the repository host rock. The Mountain-Scale THM Model focuses on evaluating the changes in 3-D UZ flow fields arising out of thermal stress and rock deformation during and after the thermal periods.« less

  19. A survey of Applied Psychological Services' models of the human operator

    NASA Technical Reports Server (NTRS)

    Siegel, A. I.; Wolf, J. J.

    1979-01-01

    A historical perspective is presented on the major features and status of two families of computer simulation models in which the human operator plays the primary role. Both task-oriented and message-oriented models are included. Two other recent efforts dealing with visual information processing are summarized; they involve not whole-model development but a family of subroutines customized to add human aspects to existing models. A global diagram of the generalized model development/validation process is presented and related to 15 criteria for model evaluation.

  20. ON JOINT DETERMINISTIC GRID MODELING AND SUB-GRID VARIABILITY CONCEPTUAL FRAMEWORK FOR MODEL EVALUATION

    EPA Science Inventory

    The general situation (exemplified in urban areas) where a significant degree of sub-grid variability (SGV) exists in grid models poses problems when comparing grid-based air quality modeling results with observations. Typically, grid models ignore or parameterize processes ...

  1. An Evaluation Tool for CONUS-Scale Estimates of Components of the Water Balance

    NASA Astrophysics Data System (ADS)

    Saxe, S.; Hay, L.; Farmer, W. H.; Markstrom, S. L.; Kiang, J. E.

    2016-12-01

    Numerous research groups are independently developing data products to represent various components of the water balance (e.g. runoff, evapotranspiration, recharge, snow water equivalent, soil moisture, and climate) at the scale of the conterminous United States. These data products are derived from a range of sources, including direct measurement, remotely sensed measurement, and statistical and deterministic model simulations. An evaluation tool is needed to compare these data products and the components of the water balance they contain, in order to identify gaps in the understanding and representation of continental-scale hydrologic processes. An ideal tool would be an objective, universally agreed-upon framework to address questions related to closing the water balance. This type of generic, model-agnostic evaluation tool would facilitate collaboration among different hydrologic research groups and improve modeling capabilities with respect to continental-scale water resources. By adopting a comprehensive framework that considers hydrologic modeling in the context of a complete water balance, it is possible to identify weaknesses in process modeling, data product representation, and regional hydrologic variation. As part of its National Water Census initiative, the U.S. Geological Survey is facilitating this dialogue by developing prototype evaluation tools.

  2. Evaluating synoptic systems in the CMIP5 climate models over the Australian region

    NASA Astrophysics Data System (ADS)

    Gibson, Peter B.; Uotila, Petteri; Perkins-Kirkpatrick, Sarah E.; Alexander, Lisa V.; Pitman, Andrew J.

    2016-10-01

    Climate models are our principal tool for generating the projections used to inform climate change policy. Our confidence in projections depends, in part, on how realistically they simulate present day climate and associated variability over a range of time scales. Traditionally, climate models are less commonly assessed at time scales relevant to daily weather systems. Here we explore the utility of a self-organizing maps (SOMs) procedure for evaluating the frequency, persistence and transitions of daily synoptic systems in the Australian region simulated by state-of-the-art global climate models. In terms of skill in simulating the climatological frequency of synoptic systems, large spread was observed between models. A positive association between all metrics was found, implying that relative skill in simulating the persistence and transitions of systems is related to skill in simulating the climatological frequency. Considering all models and metrics collectively, model performance was found to be related to model horizontal resolution but unrelated to vertical resolution or representation of the stratosphere. In terms of the SOM procedure, the timespan over which evaluation was performed had some influence on model performance skill measures, as did the number of circulation types examined. These findings have implications for selecting models most useful for future projections over the Australian region, particularly for projections related to synoptic scale processes and phenomena. More broadly, this study has demonstrated the utility of the SOMs procedure in providing a process-based evaluation of climate models.
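The climatological-frequency skill measure described above can be sketched once daily fields have been classified into circulation types (SOM nodes). The SOM training step itself is omitted here; the daily type labels below are made up for illustration:

```python
from collections import Counter

# Hypothetical daily circulation-type labels (e.g. SOM node indices)
# for a reanalysis "truth" and one climate model; purely illustrative.
reanalysis = [0, 0, 1, 2, 1, 0, 2, 2, 1, 0, 1, 2, 0, 1, 2, 0]
model      = [0, 1, 1, 2, 1, 1, 2, 2, 1, 0, 1, 2, 1, 1, 2, 0]

def type_frequencies(labels, n_types):
    """Relative frequency of each circulation type."""
    counts = Counter(labels)
    return [counts.get(k, 0) / len(labels) for k in range(n_types)]

n_types = 3
f_obs = type_frequencies(reanalysis, n_types)
f_mod = type_frequencies(model, n_types)

# Mean absolute error in climatological type frequency: one simple
# scalar skill measure (0 = perfect agreement in frequency).
freq_mae = sum(abs(o - m) for o, m in zip(f_obs, f_mod)) / n_types
print(f"frequency MAE = {freq_mae:.3f}")
```

Analogous counts over label pairs (today's type, tomorrow's type) would give the persistence and transition statistics the study also evaluates.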

  3. Multi-indicator Evaluation System for Broadsword, Rod, Sword and Spear Athletes Based on Analytic Hierarchy Process

    NASA Astrophysics Data System (ADS)

    Luo, Lin

    2017-08-01

    In the practical selection of Wushu athletes, the objective evaluation of the level of athletes lacks sufficient technical indicators and often relies on the coach's subjective judgment. It is difficult to accurately and objectively reflect the overall quality of the athletes without a fully quantified indicator system, thus limiting the improvement of Wushu competition. The analytic hierarchy process (AHP) is a systematic analysis method combining quantitative and qualitative analysis. This paper realizes a structured, hierarchical and quantified decision-making process for evaluating broadsword, rod, sword and spear athletes with the AHP. Combining characteristics of the athletes, analysis is carried out from three aspects, i.e., the athlete's body shape, physical function and sports quality, and 18 specific evaluation indicators are established. Then, combining expert advice and practical experience, a pairwise comparison matrix is determined, and the weights of the indicators and a comprehensive evaluation coefficient are obtained to establish the evaluation model for the athletes, thus providing a scientific theoretical basis for the selection of Wushu athletes. The evaluation model proposed in this paper realizes an evaluation system for broadsword, rod, sword and spear athletes, which has effectively improved the scientific level of Wushu athlete selection in practical application.
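The AHP weighting step described in the abstract can be illustrated compactly. The 3×3 pairwise comparison matrix below is a hypothetical example over the three top-level criteria named in the abstract (the paper's actual judgments and 18 sub-indicators are not reproduced), and the row geometric-mean method is a standard approximation to Saaty's eigenvector weights:

```python
import math

# Hypothetical Saaty-scale pairwise comparison matrix for the three
# top-level criteria; the values are illustrative only.
A = [
    [1.0, 1/3, 1/2],   # body shape
    [3.0, 1.0, 2.0],   # physical function
    [2.0, 1/2, 1.0],   # sports quality
]
n = len(A)

# Approximate priority weights via the row geometric-mean method.
gmeans = [math.prod(row) ** (1.0 / n) for row in A]
total = sum(gmeans)
weights = [g / total for g in gmeans]

# Consistency check: estimate the principal eigenvalue lambda_max,
# then the consistency ratio CR (CR < 0.1 is conventionally acceptable).
Aw = [sum(A[i][j] * weights[j] for j in range(n)) for i in range(n)]
lambda_max = sum(Aw[i] / weights[i] for i in range(n)) / n
CI = (lambda_max - n) / (n - 1)
RI = 0.58          # Saaty's random index for n = 3
CR = CI / RI

print("weights:", [round(w, 3) for w in weights], "CR:", round(CR, 3))
```

In a full AHP model the same procedure is repeated for each criterion's sub-indicators, and the athlete's comprehensive evaluation coefficient is the weight-sum of the indicator scores.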

  4. A Case Study of Group Processes and Student Evaluation of Teaching

    ERIC Educational Resources Information Center

    Mortenson, Kristian G.; Sathe, Richard S.

    2017-01-01

    This paper documents a case study undertaken to understand the effect of group processes on student evaluation of teaching (SET). The study used interviews to investigate the experiences of students in a cohort model Master of Science in Accountancy degree program and how those experiences influenced SET. The cohort served as an extreme example in…

  5. Evaluating the agreement between measurements and models of net ecosystem exchange at different times and timescales using wavelet coherence: an example using data from the North American Carbon Program Site-Level Interim Synthesis

    Treesearch

    P.C. Stoy; M.C. Dietze; A.D. Richardson; R. Vargas; A.G. Barr; R.S. Anderson; M.A. Arain; I.T. Baker; T.A. Black; J.M. Chen; R.B. Cook; C.M. Gough; R.F. Grant; D.Y. Hollinger; R.C. Izaurralde; C.J. Kucharik; P. Lafleur; B.E. Law; S. Liu; E. Lokupitiya; Y. Luo; J. W. Munger; C. Peng; B. Poulter; D.T. Price; D. M. Ricciuto; W. J. Riley; A. K. Sahoo; K. Schaefer; C.R. Schwalm; H. Tian; H. Verbeeck; E. Weng

    2013-01-01

    Earth system processes exhibit complex patterns across time, as do the models that seek to replicate these processes. Model output may or may not be significantly related to observations at different times and on different frequencies. Conventional model diagnostics provide an aggregate view of model-data agreement, but usually do not identify the time and frequency...

  6. Determining the potential productivity of food crops in controlled environments

    NASA Technical Reports Server (NTRS)

    Bugbee, Bruce

    1992-01-01

    The quest to determine the maximum potential productivity of food crops is greatly benefitted by crop growth models. Many models have been developed to analyze and predict crop growth in the field, but it is difficult to predict biological responses to stress conditions. Crop growth models for the optimal environments of a Controlled Environment Life Support System (CELSS) can be highly predictive. This paper discusses the application of a crop growth model to CELSS; the model is used to evaluate factors limiting growth. The model separately evaluates the following four physiological processes: absorption of PPF by photosynthetic tissue, carbon fixation (photosynthesis), carbon use (respiration), and carbon partitioning (harvest index). These constituent processes determine potentially achievable productivity. An analysis of each process suggests that low harvest index is the factor most limiting to yield. PPF absorption by plant canopies and respiration efficiency are also of major importance. Research concerning productivity in a CELSS should emphasize: (1) the development of gas exchange techniques to continuously monitor plant growth rates and (2) environmental techniques to reduce plant height in communities.

  7. Evaluation of the hydrological flow paths in a gravel bed filter modeling a horizontal subsurface flow wetland by using a multi-tracer experiment.

    PubMed

    Birkigt, Jan; Stumpp, Christine; Małoszewski, Piotr; Nijenhuis, Ivonne

    2018-04-15

    In recent years, constructed wetland systems have come into focus as a means of cost-efficient organic contaminant management. Wetland systems provide a highly reactive environment in which several removal pathways of organic chemicals may be present at the same time; however, specific elimination processes and hydraulic conditions are usually investigated separately and thus not fully understood. The flow system in a three-dimensional pilot-scale horizontal subsurface constructed wetland was investigated by applying a multi-tracer test combined with a mathematical model to evaluate the flow and transport processes. The results indicate the existence of a multiple flow system with two distinct flow paths through the gravel bed and a preferential flow at the bottom transporting 68% of tracer mass, resulting from the inflow design of the model wetland system. The removal of the main contaminant, chlorobenzene, was up to 52% based on different calculation approaches. With determined retention times in the range of 22 d to 32.5 d, the wetland has a heterogeneous flow pattern. Differences between simulated and measured tracer concentrations in the upper sediment indicate diffusion-dominated processes due to stagnant water zones. The tracer study, combining experimental evaluation with mathematical modeling, demonstrated the complexity of flow and transport processes in constructed wetlands, which needs to be taken into account when interpreting the determining attenuation processes. Copyright © 2017 Elsevier B.V. All rights reserved.
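A basic quantity behind retention-time estimates of this kind is the first temporal moment of a tracer breakthrough curve. The sketch below uses an invented breakthrough curve (not the study's data) to show the moment calculation:

```python
# Toy tracer breakthrough curve: sampling times (d) and measured outlet
# concentrations (mg/L).  Values are illustrative, not from the study.
times = [0, 5, 10, 15, 20, 25, 30, 35, 40]
conc  = [0.0, 0.1, 0.8, 2.0, 2.6, 1.9, 0.9, 0.3, 0.0]

def trapz(y, x):
    """Trapezoidal integration of y over x."""
    return sum((y[i] + y[i + 1]) * (x[i + 1] - x[i]) / 2.0
               for i in range(len(x) - 1))

# Mean residence time from the first temporal moment of the curve:
# t_mean = integral(t * C dt) / integral(C dt)
m0 = trapz(conc, times)
m1 = trapz([t * c for t, c in zip(times, conc)], times)
t_mean = m1 / m0

print(f"mean residence time = {t_mean:.1f} d")
```

In a multi-tracer, multi-flow-path system like the one in the study, separate moments would be fitted to each identified flow component rather than to the bulk curve.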

  8. Enriching the Web Processing Service

    NASA Astrophysics Data System (ADS)

    Wosniok, Christoph; Bensmann, Felix; Wössner, Roman; Kohlus, Jörn; Roosmann, Rainer; Heidmann, Carsten; Lehfeldt, Rainer

    2014-05-01

    The OGC Web Processing Service (WPS) provides a standard for implementing geospatial processes in service-oriented networks. In its current version 1.0.0 it defines the operations GetCapabilities, DescribeProcess and Execute, which can be used to offer custom processes based on single or multiple sub-processes. A large range of ready-to-use, fine-granular, fundamental geospatial processes has been developed by the GIS community in the past. However, modern use cases and whole workflow processes demand lifecycle management and service orchestration. Orchestrating smaller sub-processes is a step towards interoperability; comprehensive documentation using appropriate metadata is also required. Though different approaches have been tested in the past, developing complex WPS applications still requires programming skills, knowledge of the software libraries in use and considerable integration effort. Our toolset RichWPS aims at providing a better overall experience by setting up two major components. The RichWPS ModelBuilder enables the graphics-aided design of workflow processes based on existing local and distributed processes and geospatial services. Once tested by the RichWPS Server, a composition can be deployed for production use on the RichWPS Server. The ModelBuilder obtains the necessary processes and services from a directory service, the RichWPS semantic proxy. It manages the lifecycle and is able to visualize results and debugging information. One aim is to generate reproducible results; the workflow should be documented by metadata that can be integrated in Spatial Data Infrastructures. The RichWPS Server provides a set of interfaces to the ModelBuilder for, among others, testing composed workflow sequences, estimating their performance and publishing them as common processes.
Therefore the server is oriented towards the upcoming WPS 2.0 standard and its ability to transactionally deploy and undeploy processes making use of a WPS-T interface. In order to deal with the results of these processing workflows, a server-side extension enables the RichWPS Server and its clients to use WPS presentation directives (WPS-PD), a content-related enhancement of the standardized WPS schema. We identified essential requirements for the components of our toolset by applying two use cases. The first enables the simplified comparison of modelled and measured data, a common task in hydro-engineering to validate the accuracy of a model. An implementation of the workflow includes reading, harmonizing and comparing two datasets in NetCDF format. 2D water-level data from the German Bight can be chosen, presented and evaluated in a web client with interactive plots. The second use case is motivated by the Marine Strategy Directive (MSD) of the EU, which demands monitoring, action plans and at least an evaluation of the ecological situation in the marine environment. Information technology adapted to that of INSPIRE should be used. One of the parameters monitored and evaluated for the MSD is the expansion and quality of seagrass fields. With a view towards other evaluation parameters, we decompose the complex process of seagrass evaluation into reusable process steps and implement those packages as configurable WPS.

  9. Drift-Scale Coupled Processes (DST and THC Seepage) Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    E. Sonnenthal; N. Spycher

    The purpose of this Analysis/Model Report (AMR) is to document the Near-Field Environment (NFE) and Unsaturated Zone (UZ) models used to evaluate the potential effects of coupled thermal-hydrologic-chemical (THC) processes on unsaturated zone flow and transport. This is in accordance with the ''Technical Work Plan (TWP) for Unsaturated Zone Flow and Transport Process Model Report'', Addendum D, Attachment D-4 (Civilian Radioactive Waste Management System (CRWMS) Management and Operating Contractor (M and O) 2000 [153447]) and ''Technical Work Plan for Nearfield Environment Thermal Analyses and Testing'' (CRWMS M and O 2000 [153309]). These models include the Drift Scale Test (DST) THC Model and several THC seepage models. These models provide the framework to evaluate THC coupled processes at the drift scale, predict flow and transport behavior for specified thermal loading conditions, and predict the chemistry of waters and gases entering potential waste-emplacement drifts. The intended use of this AMR is to provide input for the following: (1) Performance Assessment (PA); (2) Abstraction of Drift-Scale Coupled Processes AMR (ANL-NBS-HS-000029); (3) UZ Flow and Transport Process Model Report (PMR); and (4) Near-Field Environment (NFE) PMR. The work scope for this activity is presented in the TWPs cited above, and summarized as follows: continue development of the repository drift-scale THC seepage model used in support of the TSPA in-drift geochemical model; incorporate heterogeneous fracture property realizations; study sensitivity of results to changes in input data and mineral assemblage; validate the DST model by comparison with field data; perform simulations to predict mineral dissolution and precipitation and their effects on fracture properties and chemistry of water (but not flow rates) that may seep into drifts; submit modeling results to the TDMS and document the models.
The model development, input data, sensitivity and validation studies described in this AMR are required to fully document and address the requirements of the TWPs.

  10. Drift-Scale Coupled Processes (DST and THC Seepage) Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    E. Sonnenthal

    The purpose of this Analysis/Model Report (AMR) is to document the Near-Field Environment (NFE) and Unsaturated Zone (UZ) models used to evaluate the potential effects of coupled thermal-hydrologic-chemical (THC) processes on unsaturated zone flow and transport. This is in accordance with the ''Technical Work Plan (TWP) for Unsaturated Zone Flow and Transport Process Model Report'', Addendum D, Attachment D-4 (Civilian Radioactive Waste Management System (CRWMS) Management and Operating Contractor (M&O) 2000 [1534471]) and ''Technical Work Plan for Nearfield Environment Thermal Analyses and Testing'' (CRWMS M&O 2000 [153309]). These models include the Drift Scale Test (DST) THC Model and several THC seepage models. These models provide the framework to evaluate THC coupled processes at the drift scale, predict flow and transport behavior for specified thermal loading conditions, and predict the chemistry of waters and gases entering potential waste-emplacement drifts. The intended use of this AMR is to provide input for the following: Performance Assessment (PA); Near-Field Environment (NFE) PMR; Abstraction of Drift-Scale Coupled Processes AMR (ANL-NBS-HS-000029); and UZ Flow and Transport Process Model Report (PMR). The work scope for this activity is presented in the TWPs cited above, and summarized as follows: Continue development of the repository drift-scale THC seepage model used in support of the TSPA in-drift geochemical model; incorporate heterogeneous fracture property realizations; study sensitivity of results to changes in input data and mineral assemblage; validate the DST model by comparison with field data; perform simulations to predict mineral dissolution and precipitation and their effects on fracture properties and chemistry of water (but not flow rates) that may seep into drifts; submit modeling results to the TDMS and document the models.
The model development, input data, sensitivity and validation studies described in this AMR are required to fully document and address the requirements of the TWPs.

  11. Performance and evaluation of real-time multicomputer control systems

    NASA Technical Reports Server (NTRS)

    Shin, K. G.

    1983-01-01

    New performance measures, detailed examples, modeling of the error detection process, performance evaluation of rollback recovery methods, experiments on FTMP, and the optimal size of an NMR cluster are discussed.

  12. Containerless processing of undercooled melts

    NASA Technical Reports Server (NTRS)

    Shong, D. S.; Graves, J. A.; Ujiie, Y.; Perepezko, J. H.

    1987-01-01

    Containerless drop tube processing allows for significant levels of liquid undercooling through control of parameters such as sample size, surface coating and cooling rate. A laboratory scale (3 m) drop tube has been developed which allows the undercooling and solidification behavior of powder samples to be evaluated under low gravity free-fall conditions. The level of undercooling obtained in an InSb-Sb eutectic alloy has been evaluated by comparing the eutectic spacing in drop tube samples with a spacing/undercooling relationship established using thermal analysis techniques. Undercoolings of 0.17 and 0.23 T(e) were produced by processing under vacuum and He gas conditions respectively. Alternatively, the formation of an amorphous phase in a Ni-Nb eutectic alloy indicates that undercooling levels of approximately 500 C were obtained by drop tube processing. The influence of droplet size and gas environment on undercooling behavior in the Ni-Nb eutectic was evaluated through their effect on the amorphous/crystalline phase ratio. To supplement the structural analysis, heat flow modeling has been developed to describe the undercooling history during drop tube processing, and the model has been tested experimentally.

  13. Evaluation of Model Recognition for Grammar-Based Automatic 3d Building Model Reconstruction

    NASA Astrophysics Data System (ADS)

    Yu, Qian; Helmholz, Petra; Belton, David

    2016-06-01

    In recent years, 3D city models have been in high demand by many public and private organisations, and demand continues to grow steadily in both quality and quantity. The quality evaluation of these 3D models is a relevant issue from both the scientific and practical points of view. In this paper, we present a method for the quality evaluation of 3D building models which are reconstructed automatically from terrestrial laser scanning (TLS) data based on an attributed building grammar. The entire evaluation process has been performed in all three dimensions in terms of completeness and correctness of the reconstruction. Six quality measures are introduced and applied to four datasets of reconstructed building models in order to describe the quality of the automatic reconstruction, and their validity from the evaluation point of view is assessed.
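Completeness and correctness, the two evaluation notions named in the abstract, are simple ratios over matched and unmatched elements. The confusion counts below are invented for illustration; the paper's six specific measures are not reproduced here:

```python
# Illustrative confusion counts from comparing a reconstructed model
# against a reference (e.g. matched vs. unmatched building elements).
true_pos, false_pos, false_neg = 87, 9, 13

# Completeness: how much of the reference was actually reconstructed.
completeness = true_pos / (true_pos + false_neg)

# Correctness: how much of the reconstruction corresponds to reality.
correctness = true_pos / (true_pos + false_pos)

print(f"completeness = {completeness:.2f}, correctness = {correctness:.2f}")
```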

  14. WEPP Model applications for evaluations of best management practices

    Treesearch

    D. C. Flanagan; W. J. Elliott; J. R. Frankenberger; C. Huang

    2010-01-01

    The Water Erosion Prediction Project (WEPP) model is a process-based erosion prediction technology for application to small watersheds and hillslope profiles, under agricultural, forested, rangeland, and other land management conditions. Developed by the United States Department of Agriculture (USDA) over the past 25 years, WEPP simulates many of the physical processes...

  15. ATMOSPHERIC AMMONIA EMISSIONS FROM THE LIVESTOCK SECTOR: DEVELOPMENT AND EVALUATION OF A PROCESS-BASED MODELING APPROACH

    EPA Science Inventory

    We propose multi-faceted research to enhance our understanding of NH3 emissions from livestock feeding operations. A process-based emissions modeling approach will be used, and we will investigate ammonia emissions from the scale of the individual farm out to impacts on region...

  16. Simulating the phase partitioning of NH3, HNO3, and HCl with size-resolved particles over northern Colorado in winter

    EPA Science Inventory

    Numerical modeling of inorganic aerosol processes is useful in air quality management, but comprehensive evaluation of modeled aerosol processes is rarely possible due to the lack of comprehensive datasets. During the Nitrogen, Aerosol Composition, and Halogens on a Tall Tower (N...

  17. Design and evaluation of a parametric model for cardiac sounds.

    PubMed

    Ibarra-Hernández, Roilhi F; Alonso-Arévalo, Miguel A; Cruz-Gutiérrez, Alejandro; Licona-Chávez, Ana L; Villarreal-Reyes, Salvador

    2017-10-01

    Heart sound analysis plays an important role in the auscultative diagnosis process to detect the presence of cardiovascular diseases. In this paper we propose a novel parametric heart sound model that accurately represents normal and pathological cardiac audio signals, also known as phonocardiograms (PCG). The proposed model considers that the PCG signal is formed by the sum of two parts: one deterministic and the other stochastic. The first part contains most of the acoustic energy. This part is modeled by the Matching Pursuit (MP) algorithm, which performs an analysis-synthesis procedure to represent the PCG signal as a linear combination of elementary waveforms. The second part, also called the residual, is obtained after subtracting the deterministic signal from the original heart sound recording and can be accurately represented as an autoregressive process using the Linear Predictive Coding (LPC) technique. We evaluate the proposed heart sound model by performing subjective and objective tests using signals corresponding to different pathological cardiac sounds. The results of the objective evaluation show an average percentage root-mean-square difference of approximately 5% between the original heart sound and the reconstructed signal. For the subjective test we conducted a formal methodology for perceptual evaluation of audio quality with the assistance of medical experts. Statistical results of the subjective evaluation show that our model provides a highly accurate approximation of real heart sound signals. We are not aware of any previous heart sound model as rigorously evaluated as ours. Copyright © 2017 Elsevier Ltd. All rights reserved.
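The deterministic-plus-stochastic decomposition the abstract describes can be sketched on a synthetic signal. This is a simplified illustration, not the paper's implementation: the "PCG" is two Gaussian-windowed tones plus noise, the MP dictionary is a small set of Gabor-like atoms, and the LPC step solves the autocorrelation normal equations directly rather than via Levinson-Durbin:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a PCG recording: two Gaussian-windowed tones
# (S1/S2-like bursts) plus white noise.  All values are illustrative.
fs = 1000
t = np.arange(fs) / fs

def gabor(center, freq):
    return np.exp(-((t - center) / 0.02) ** 2) * np.sin(2 * np.pi * freq * t)

signal = gabor(0.2, 50.0) + 0.7 * gabor(0.6, 80.0) \
         + 0.05 * rng.standard_normal(fs)

# Dictionary of unit-norm Gabor-like atoms, a simplification of the
# dictionaries typically used with Matching Pursuit.
atoms = [gabor(c, f) for c in np.arange(0.1, 0.95, 0.05)
         for f in (30.0, 50.0, 80.0, 120.0)]
D = np.array([a / np.linalg.norm(a) for a in atoms])

# Greedy Matching Pursuit: repeatedly subtract the best-matching atom.
residual = signal.copy()
deterministic = np.zeros_like(signal)
for _ in range(10):
    corr = D @ residual
    k = int(np.argmax(np.abs(corr)))
    deterministic += corr[k] * D[k]
    residual -= corr[k] * D[k]

# LPC (autocorrelation method, order p) models the residual as an
# autoregressive process, i.e. the stochastic part of the signal.
p = 8
r = np.correlate(residual, residual, mode="full")[fs - 1:]
R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])
lpc_coeffs = np.linalg.solve(R, r[1:p + 1])

energy_captured = 1.0 - residual @ residual / (signal @ signal)
print(f"MP captured {energy_captured:.0%} of the signal energy")
```

The reconstructed signal is then the MP sum plus an AR-synthesized residual, which is the structure the paper's objective test evaluates.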

  18. Chemical kinetics and photochemical data for use in stratospheric modeling evaluation Number 8

    NASA Technical Reports Server (NTRS)

    Demore, W. B.; Molina, M. J.; Sander, S. P.; Golden, D. M.; Hampson, R. F.; Kurylo, M. J.; Howard, C. J.; Ravishankara, A. R.

    1987-01-01

    This is the eighth in a series of evaluated sets of rate constants and photochemical cross sections compiled by the NASA Panel for Data Evaluation. The primary application of the data is in the modeling of stratospheric processes, with particular emphasis on the ozone layer and its possible perturbation by anthropogenic and natural phenomena. Copies of this evaluation are available from the Jet Propulsion Laboratory, Documentation Section, 111-116B, California Institute of Technology, Pasadena, California, 91109.

  19. Evaluating the Credibility of Transport Processes in Simulations of Ozone Recovery using the Global Modeling Initiative Three-dimensional Model

    NASA Technical Reports Server (NTRS)

    Strahan, Susan E.; Douglass, Anne R.

    2004-01-01

    The Global Modeling Initiative (GMI) has integrated two 36-year simulations of an ozone recovery scenario with an offline chemistry and transport model using two different meteorological inputs. Physically based diagnostics, derived from satellite and aircraft data sets, are described and then used to evaluate the realism of temperature and transport processes in the simulations. Processes evaluated include barrier formation in the subtropics and polar regions, and extratropical wave-driven transport. Some diagnostics are especially relevant to simulation of lower stratospheric ozone, but most are applicable to any stratospheric simulation. The global temperature evaluation, which is relevant to gas phase chemical reactions, showed that both sets of meteorological fields have near climatological values at all latitudes and seasons at 30 hPa and below. Both simulations showed weakness in upper stratospheric wave driving. The simulation using input from a general circulation model (GMI(GCM)) showed a very good residual circulation in the tropics and Northern Hemisphere. The simulation with input from a data assimilation system (GMI(DAS)) performed better in the midlatitudes than it did at high latitudes. Neither simulation forms a realistic barrier at the vortex edge, leading to uncertainty in the fate of ozone-depleted vortex air. Overall, tracer transport in the offline GMI(GCM) has greater fidelity throughout the stratosphere than it does in the GMI(DAS).

  20. Integrated Model for E-Learning Acceptance

    NASA Astrophysics Data System (ADS)

    Ramadiani; Rodziah, A.; Hasan, S. M.; Rusli, A.; Noraini, C.

    2016-01-01

    E-learning will not work if the system is not used in accordance with user needs. The user interface is very important in encouraging use of the application. Many theories have discussed user-interface usability evaluation and technology acceptance separately; here we correlate interface usability evaluation with user acceptance to enhance the e-learning process. An evaluation model for e-learning interface acceptance is therefore important to investigate. The aim of this study is to propose an integrated e-learning user-interface acceptance evaluation model. This model combines several theories of e-learning interface measurement, such as user learning style, usability evaluation, and user benefit. We formulated these constructs in questionnaires distributed to 125 English Language School (ELS) students. Statistical analysis used structural equation modeling (LISREL v8.80) and MANOVA.

  1. The evaluator as technical assistant: A model for systemic reform support

    NASA Astrophysics Data System (ADS)

    Century, Jeanne Rose

    This study explored evaluation of systemic reform. Specifically, it focused on the evaluation of a systemic effort to improve K-8 science, mathematics and technology education. The evaluation was of particular interest because it used both technical assistance and evaluation strategies. Through studying the combination of these roles, this investigation set out to increase understanding of potentially new evaluator roles, distinguish important characteristics of the evaluator/project participant relationship, and identify how these roles and characteristics contribute to effective evaluation of systemic science education reform. This qualitative study used interview, document analysis, and participant observation as methods of data collection. Interviews were conducted with project leaders, project participants, and evaluators and focused on the evaluation strategies and process, the use of the evaluation, and technical assistance. Documents analyzed included transcripts of evaluation team meetings and reports, memoranda and other print materials generated by the project leaders and the evaluators. Data analysis consisted of analytic and interpretive procedures consistent with the qualitative data collected and entailed a combined process of coding transcripts of interviews and meetings, field notes, and other documents; analyzing and organizing findings; writing of reflective and analytic memos; and designing and diagramming conceptual relationships. The data analysis resulted in the development of the Multi-Function Model for Systemic Reform Support. This model organizes systemic reform support into three functions: evaluation, technical assistance, and a third, named here as "systemic perspective." These functions work together to support the project's educational goals as well as a larger goal--building capacity in project participants. This model can now serve as an informed starting point or "blueprint" for strategically supporting systemic reform.

  2. Modelling the impacts of pests and diseases on agricultural systems.

    PubMed

    Donatelli, M; Magarey, R D; Bregaglio, S; Willocquet, L; Whish, J P M; Savary, S

    2017-07-01

    The improvement and application of pest and disease models to analyse and predict yield losses, including those due to climate change, is still a challenge for the scientific community. Applied modelling of crop diseases and pests has mostly targeted the development of support capabilities to schedule scouting or pesticide applications. There is a need for research to both broaden the scope and evaluate the capabilities of pest and disease models. Key research questions involve not only the assessment of the potential effects of climate change on known pathosystems, but also on new pathogens which could alter the (still incompletely documented) impacts of pests and diseases on agricultural systems. Yield loss data collected in various current environments may no longer represent an adequate reference for developing tactical, decision-oriented models for plant diseases and pests and their impacts, because of the ongoing changes in climate patterns. Process-based agricultural simulation modelling, on the other hand, appears to represent a viable methodology to estimate the impacts of these potential effects. A new generation of tools based on state-of-the-art knowledge and technologies is needed to allow systems analysis including key processes and their dynamics over an appropriate range of environmental variables. This paper offers a brief overview of the current state of development in coupling pest and disease models to crop models, and discusses technical and scientific challenges. We propose a five-stage roadmap to improve the simulation of the impacts caused by plant diseases and pests: i) improve the quality and availability of data for model inputs; ii) improve the quality and availability of data for model evaluation; iii) improve the integration with crop models; iv) improve the processes for model evaluation; and v) develop a community of plant pest and disease modellers.
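The kind of coupling the abstract discusses can be illustrated with a deliberately minimal toy: a logistic epidemic whose severity scales down the growth term of a logistic crop model. Every equation and parameter here is an illustrative assumption, not a model from the paper:

```python
# Minimal toy coupling of a disease model to a crop growth model:
# a logistic epidemic reduces the fraction of healthy, productive
# tissue.  All parameters and equations are illustrative assumptions.
def simulate(days=120, r_epi=0.12, onset=40):
    biomass, severity = 1.0, 0.0
    for day in range(days):
        if day == onset:
            severity = 0.001                       # initial infection
        # logistic epidemic growth (fraction of tissue diseased)
        severity += r_epi * severity * (1.0 - severity)
        # logistic crop growth, scaled by the healthy-tissue fraction
        growth = 0.08 * biomass * (1.0 - biomass / 200.0)
        biomass += growth * (1.0 - severity)
    return biomass, severity

healthy, _ = simulate(r_epi=0.0)                   # epidemic suppressed
diseased, final_sev = simulate()                   # epidemic active
yield_loss = 1.0 - diseased / healthy

print(f"simulated biomass loss: {yield_loss:.0%}")
```

Even this caricature shows why coupling matters for the roadmap's stage iii): the simulated loss depends jointly on epidemic timing and the crop's growth stage, which neither model captures alone.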

  3. Approximate Model of Zone Sedimentation

    NASA Astrophysics Data System (ADS)

    Dzianik, František

    2011-12-01

    The process of zone sedimentation is affected by many factors that cannot be expressed analytically. For this reason, zone settling is evaluated in practice experimentally or by applying an empirical mathematical description of the process. The paper presents the development of an approximate model of zone settling, i.e. a general function which should properly approximate the behaviour of the settling process within its entire range and under various conditions. Furthermore, the specification of the model parameters by regression analysis of settling test results is shown. The suitability of the model is reviewed by graphical dependencies and by statistical correlation coefficients. The approximate model could also be useful in simplifying the process design of continuous settling tanks and thickeners.
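The regression step described above can be sketched with one common empirical form, an exponential decay of the interface height towards its final value; the functional form, the assumed final height and the data points are all illustrative, not the paper's model:

```python
import math

# Hypothetical batch settling test: time (min) vs. interface height (cm).
# The data roughly follow h(t) = h_inf + (h0 - h_inf) * exp(-k t).
data = [(0, 40.0), (5, 31.1), (10, 24.9), (20, 17.4), (30, 13.7), (60, 10.4)]
h_inf = 10.0   # assumed final sediment height (cm)

# Linearize: ln(h - h_inf) = ln(h0 - h_inf) - k t, then fit slope and
# intercept by ordinary least squares.
pts = [(t, math.log(h - h_inf)) for t, h in data]
n = len(pts)
sx = sum(t for t, _ in pts)
sy = sum(y for _, y in pts)
sxx = sum(t * t for t, _ in pts)
sxy = sum(t * y for t, y in pts)
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
k = -slope
h0_fit = h_inf + math.exp((sy - slope * sx) / n)

print(f"k = {k:.3f} 1/min, fitted initial height = {h0_fit:.1f} cm")
```

Correlation of the fitted curve against the measured heights would then give the statistical suitability check the abstract mentions.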

  4. Integrated Main Propulsion System Performance Reconstruction Process/Models

    NASA Technical Reports Server (NTRS)

    Lopez, Eduardo; Elliott, Katie; Snell, Steven; Evans, Michael

    2013-01-01

    The Integrated Main Propulsion System (MPS) Performance Reconstruction process provides the MPS post-flight data files needed for postflight reporting to the project integration management and key customers to verify flight performance. This process/model was used as the baseline for the currently ongoing Space Launch System (SLS) work. The process utilizes several methodologies, including multiple software programs, to model integrated propulsion system performance through space shuttle ascent. It is used to evaluate integrated propulsion systems, including propellant tanks, feed systems, rocket engine, and pressurization systems performance throughout ascent based on flight pressure and temperature data. The latest revision incorporates new methods based on main engine power balance model updates to model higher mixture ratio operation at lower engine power levels.

  5. Community-based health care for indigenous women in Mexico: a qualitative evaluation

    PubMed Central

    2014-01-01

    Introduction Indigenous women in Mexico represent a vulnerable population in which three kinds of discrimination converge (ethnicity, gender and class), having direct repercussions on health status. The discrimination and inequity in health care settings brought this population to the fore as a priority group for institutional action. The objective of this study was to evaluate the processes and performance of the “Casa de la Mujer Indígena”, a community based project for culturally and linguistically appropriate service delivery for indigenous women. The evaluation summarizes perspectives from diverse stakeholders involved in the implementation of the model, including users, local authorities, and institutional representatives. Methods The study covered five Casas implementation sites located in four Mexican states. A qualitative process evaluation focused on systematically analyzing the Casas project processes and performance was conducted using archival information and semi-structured interviews. Sixty-two interviews were conducted, and a grounded theory approach was applied for data analysis. Results Few similarities were observed between the proposed model of service delivery and its implementation in diverse locations, signaling discordant operating processes. Evidence gathered from Casas personnel highlighted their ability to detect obstetric emergencies and domestic violence cases, as well as contribute to the empowerment of women in the indigenous communities served by the project. These themes directly translated to increases in the reporting of abuse and referrals for obstetric emergencies. Conclusions The model’s cultural and linguistic competency, and contributions to increased referrals for obstetric emergencies and abuse are notable successes. The flexibility and community-based nature of the model has allowed it to be adapted to the particularities of diverse indigenous contexts. 
Local, culturally appropriate implementation has been facilitated by the fact that the Casas have been implemented with local leadership and local women have taken ownership. Users express overall satisfaction with service delivery, while providing constructive feedback for the improvement of existing Casas, as well as more cost-effective implementation of the model in new sites. Integration of users' input obtained from this process evaluation into future planning will undoubtedly increase buy-in. The Casas model is pertinent and viable in other contexts where indigenous women experience disparities in care. PMID:24393517

  6. Meta-analysis for genome-wide association studies using case-control design: application and practice

    PubMed Central

    2016-01-01

    This review aimed to arrange the process of a systematic review of genome-wide association studies in order to practice and apply a genome-wide meta-analysis (GWMA). The process has a series of five steps: searching and selection, extraction of related information, evaluation of validity, meta-analysis by type of genetic model, and evaluation of heterogeneity. In contrast to intervention meta-analyses, GWMA has to evaluate the Hardy–Weinberg equilibrium (HWE) in the third step and conduct meta-analyses by five potential genetic models, including dominant, recessive, homozygote contrast, heterozygote contrast, and allelic contrast in the fourth step. The ‘genhwcci’ and ‘metan’ commands of STATA software evaluate the HWE and calculate a summary effect size, respectively. A meta-regression using the ‘metareg’ command of STATA should be conducted to evaluate related factors of heterogeneities. PMID:28092928
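
    The third step above, evaluating Hardy–Weinberg equilibrium, reduces to a one-degree-of-freedom chi-square goodness-of-fit test on genotype counts at a biallelic locus. A minimal Python sketch of that test (the genotype counts are hypothetical; this mirrors the kind of check STATA's 'genhwcci' performs, not its exact implementation):

```python
from scipy.stats import chi2

def hwe_chi_square(n_AA, n_Aa, n_aa):
    """Chi-square goodness-of-fit test for Hardy-Weinberg equilibrium
    at a biallelic locus (1 degree of freedom)."""
    n = n_AA + n_Aa + n_aa
    p = (2 * n_AA + n_Aa) / (2 * n)          # frequency of allele A
    q = 1.0 - p
    expected = [n * p * p, 2 * n * p * q, n * q * q]
    observed = [n_AA, n_Aa, n_aa]
    chi_sq = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
    return chi_sq, chi2.sf(chi_sq, df=1)

chi_eq, p_eq = hwe_chi_square(640, 320, 40)    # counts exactly at HWE
chi_dev, p_dev = hwe_chi_square(650, 300, 50)  # counts deviating from HWE
print(f"at equilibrium: chi2 = {chi_eq:.3f}, p = {p_eq:.3f}")
print(f"deviating:      chi2 = {chi_dev:.3f}, p = {p_dev:.3f}")
```

    Studies whose control groups fail this test (small p-value) are commonly flagged or excluded before the summary effect size is pooled.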

  7. Development of the e-Baby serious game with regard to the evaluation of oxygenation in preterm babies: contributions of the emotional design.

    PubMed

    Fonseca, Luciana Mara Monti; Dias, Danielle Monteiro Vilela; Góes, Fernanda Dos Santos Nogueira; Seixas, Carlos Alberto; Scochi, Carmen Gracinda Silvan; Martins, José Carlos Amado; Rodrigues, Manuel Alves

    2014-09-01

    The present study aimed to describe the development process of a serious game that enables users to evaluate the respiratory process in a preterm infant based on an emotional design model. The e-Baby serious game was built to feature the simulated environment of an incubator, in which the user performs a clinical evaluation of the respiratory process in a virtual preterm infant. The user learns about the preterm baby's history, chooses the tools for the clinical evaluation, evaluates the baby, and determines whether his/her evaluation is appropriate. The e-Baby game presents phases that contain respiratory process impairments of higher or lower complexity in the virtual preterm baby. Included links give the user the option of recording the entire evaluation procedure and sharing his/her performance on a social network. e-Baby integrates a Clinical Evaluation of the Preterm Baby course in the Moodle virtual environment. This game, which evaluates the respiratory process in preterm infants, could support a more flexible, attractive, and interactive teaching and learning process that includes simulations with features very similar to neonatal unit realities, thus allowing more appropriate training for clinical oxygenation evaluations in at-risk preterm infants. e-Baby allows advanced user-technology-educational interactions because it requires active participation in the process and is emotionally integrated.

  8. The Modular Modeling System (MMS): User's Manual

    USGS Publications Warehouse

    Leavesley, G.H.; Restrepo, Pedro J.; Markstrom, S.L.; Dixon, M.; Stannard, L.G.

    1996-01-01

    The Modular Modeling System (MMS) is an integrated system of computer software that has been developed to provide the research and operational framework needed to support development, testing, and evaluation of physical-process algorithms and to facilitate integration of user-selected sets of algorithms into operational physical-process models. MMS uses a module library that contains modules for simulating a variety of water, energy, and biogeochemical processes. A model is created by selectively coupling the most appropriate modules from the library to create a 'suitable' model for the desired application. Where existing modules do not provide appropriate process algorithms, new modules can be developed. The MMS user's manual provides installation instructions and a detailed discussion of system concepts, module development, and model development and application using the MMS graphical user interface.

  9. A simple hyperbolic model for communication in parallel processing environments

    NASA Technical Reports Server (NTRS)

    Stoica, Ion; Sultan, Florin; Keyes, David

    1994-01-01

    We introduce a model for communication costs in parallel processing environments called the 'hyperbolic model,' which generalizes two-parameter dedicated-link models in an analytically simple way. Dedicated interprocessor links parameterized by a latency and a transfer rate that are independent of load are assumed by many existing communication models; such models are unrealistic for workstation networks. The communication system is modeled as a directed communication graph in which terminal nodes represent the application processes that initiate the sending and receiving of the information and in which internal nodes, called communication blocks (CBs), reflect the layered structure of the underlying communication architecture. The direction of graph edges specifies the flow of the information carried through messages. Each CB is characterized by a two-parameter hyperbolic function of the message size that represents the service time needed for processing the message. The parameters are evaluated in the limits of very large and very small messages. Rules are given for reducing a communication graph consisting of many to an equivalent two-parameter form, while maintaining an approximation for the service time that is exact in both large and small limits. The model is validated on a dedicated Ethernet network of workstations by experiments with communication subprograms arising in scientific applications, for which a tight fit of the model predictions with actual measurements of the communication and synchronization time between end processes is demonstrated. The model is then used to evaluate the performance of two simple parallel scientific applications from partial differential equations: domain decomposition and time-parallel multigrid. In an appropriate limit, we also show the compatibility of the hyperbolic model with the recently proposed LogP model.
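
    As a hedged illustration of the idea, one convenient two-parameter hyperbola with the right asymptotes is T(m) = sqrt(tau^2 + (m/rho)^2): it approaches the latency tau for small messages and the bandwidth-limited cost m/rho for large ones. This is an illustrative form, not necessarily the exact function used in the paper, and the tau and rho values below are hypothetical:

```python
import math

def hyperbolic_service_time(m, tau, rho):
    """Illustrative hyperbolic service time for an m-byte message:
    tends to the latency tau as m -> 0 and to m/rho as m -> infinity."""
    return math.sqrt(tau ** 2 + (m / rho) ** 2)

tau = 1e-3   # latency: 1 ms (hypothetical)
rho = 1e7    # transfer rate: 10 MB/s (hypothetical)

for m in (1, 1_000, 1_000_000, 100_000_000):
    print(f"{m:>11} bytes -> {hyperbolic_service_time(m, tau, rho):.6f} s")
```

    Unlike the linear model tau + m/rho, a hyperbola of this kind stays exact in both limits while bending smoothly between them, which is what makes the graph-reduction rules in the paper possible.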

  10. Evaluate transport processes in MERRA driven chemical transport models using updated 222Rn emission inventories and global observations

    NASA Astrophysics Data System (ADS)

    Zhang, B.; Liu, H.; Crawford, J. H.; Fairlie, T. D.; Chen, G.; Chambers, S. D.; Kang, C. H.; Williams, A. G.; Zhang, K.; Considine, D. B.; Payer Sulprizio, M.; Yantosca, R.

    2015-12-01

    Convective and synoptic processes play a major role in determining the transport and distribution of trace gases and aerosols in the troposphere. The representation of these processes in global models (at ~100-1000 km horizontal resolution) is challenging, because convection is a sub-grid process that must be parameterized, while synoptic processes are close to the grid scale. Depending on the parameterization schemes used in climate models, the role of convection in transporting trace gases and aerosols may vary from model to model. 222Rn is a chemically inert radioactive gas constantly emitted from soil, with a half-life (3.8 days) comparable to the synoptic timescale, which makes it an effective tracer for convective and synoptic transport. In this study, we evaluate the convective and synoptic transport in two chemical transport models (GMI and GEOS-Chem), both driven by NASA's MERRA reanalysis. Considering the uncertainties in 222Rn emissions, we incorporate two more recent scenarios with regionally varying 222Rn emissions into GEOS-Chem/MERRA and compare the simulation results with those using the relatively uniform 222Rn emissions in the standard model. We evaluate the global distribution and seasonality of 222Rn concentrations simulated by the two models against an extended collection of 222Rn observations from the 1970s to the 2010s. The intercomparison will improve our understanding of the spatial variability in global 222Rn emissions, including the suspected excessive 222Rn emissions in East Asia, and provide useful feedback on 222Rn emission models. We will assess 222Rn vertical distributions at different latitudes in the models using observations at surface sites and in the upper troposphere and lower stratosphere. Results will be compared with previous models driven by other meteorological fields (e.g., fvGCM and GEOS4). 
Since the decay of 222Rn is the source of 210Pb, a useful radionuclide tracer attached to submicron aerosols, improved understanding of emissions and transport of 222Rn will provide insights into the transport, distribution, and wet deposition of 210Pb aerosols.
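
    The synoptic-timescale argument rests on simple exponential decay: with a 3.8-day half-life, roughly a quarter of emitted 222Rn survives one week of transport, and the decayed fraction is the source term for 210Pb. A one-line check:

```python
import math

HALF_LIFE_DAYS = 3.8  # 222Rn half-life

def surviving_fraction(t_days):
    """Fraction of an initial 222Rn amount remaining after t days."""
    return 2.0 ** (-t_days / HALF_LIFE_DAYS)

decay_constant = math.log(2) / HALF_LIFE_DAYS  # per day
print(f"decay constant: {decay_constant:.4f} / day")
for t in (1, 3.8, 7, 14):
    print(f"after {t:>4} days: {surviving_fraction(t):.3f} remaining")
```
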

  11. EMISSION AND SURFACE EXCHANGE PROCESS

    EPA Science Inventory

    This task supports the development, evaluation, and application of emission and dry deposition algorithms in air quality simulation models, such as the Models-3/Community Multiscale Air Quality (CMAQ) modeling system. Emission estimates influence greatly the accuracy of air qual...

  12. Scale Issues in Air Quality Modeling

    EPA Science Inventory

    This presentation reviews past model evaluation studies investigating the impact of horizontal grid spacing on model performance. It also presents several examples of using a spectral decomposition technique to separate the forcings from processes operating on different time scal...

  13. Ground robotic measurement of aeolian processes

    USDA-ARS?s Scientific Manuscript database

    Models of aeolian processes rely on accurate measurements of the rates of sediment transport by wind, and careful evaluation of the environmental controls of these processes. Existing field approaches typically require intensive, event-based experiments involving dense arrays of instruments. These d...

  14. Modeling interdependencies between business and communication processes in hospitals.

    PubMed

    Brigl, Birgit; Wendt, Thomas; Winter, Alfred

    2003-01-01

    The optimization and redesign of business processes in hospitals is an important challenge for hospital information management, which has to design and implement a suitable HIS architecture. Nevertheless, no tools are available that specialize in modeling information-driven business processes and their consequences for the communication between information processing tools. Therefore, we present an approach that facilitates the representation and analysis of business processes and the resulting communication processes between application components, together with their interdependencies. This approach aims not only to visualize those processes but also to evaluate whether there are weaknesses in the information processing infrastructure that hinder the smooth implementation of the business processes.

  15. What Counts is not Falling … but Landing

    PubMed Central

    BROUSSELLE, ASTRID

    2012-01-01

    Implementation evaluations, also called process evaluations, involve studying the development of programmes, and identifying and understanding their strengths and weaknesses. Undertaking an implementation evaluation offers insights into evaluation objectives, but does not help the researcher develop a research strategy. During the implementation analysis of the UNAIDS drug access initiative in Chile, the strategic analysis model developed by Crozier and Friedberg was used. However, a major incompatibility was noted between the procedure put forward by Crozier and Friedberg and the specific characteristics of the programme being evaluated. In this article, an adapted strategic analysis model for programme evaluation is proposed. PMID:23526306

  16. A Management Information System Model for Program Management. Ph.D. Thesis - Oklahoma State Univ.; [Computerized Systems Analysis

    NASA Technical Reports Server (NTRS)

    Shipman, D. L.

    1972-01-01

    The development of a model to simulate the information system of a program management type of organization is reported. The model statistically determines the following parameters: type of messages, destinations, delivery durations, type of processing, processing durations, communication channels, outgoing messages, and priorities. The total management information system of the program management organization is considered, including formal and informal information flows and both facilities and equipment. The model is written in the General Purpose System Simulation 2 computer programming language for use on the Univac 1108, Executive 8 computer. The model is simulated on a daily basis and collects queue and resource utilization statistics for each decision point. The statistics are then used by management to evaluate proposed resource allocations, to evaluate proposed changes to the system, and to identify potential problem areas. The model employs both empirical and theoretical distributions, which are adjusted to simulate the information flow being studied.
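
    The queueing logic described above can be miniaturized outside GPSS. The sketch below is a single-server stand-in with hypothetical rates (not the thesis's parameters): it draws exponential interarrival and processing times and reports the average queueing delay, the kind of statistic the model collects at each decision point:

```python
import random

def simulate_message_center(n_messages, mean_interarrival, mean_processing, seed=1):
    """Minimal discrete-event sketch of one message-processing point:
    Poisson arrivals, exponential service, FIFO queue, single server.
    Returns the mean time a message waits before processing starts."""
    rng = random.Random(seed)
    t = 0.0
    server_free_at = 0.0
    total_wait = 0.0
    for _ in range(n_messages):
        t += rng.expovariate(1.0 / mean_interarrival)  # next arrival time
        start = max(t, server_free_at)                 # wait if server is busy
        total_wait += start - t
        server_free_at = start + rng.expovariate(1.0 / mean_processing)
    return total_wait / n_messages

mean_wait = simulate_message_center(10_000, mean_interarrival=10.0, mean_processing=6.0)
print(f"average queueing delay: {mean_wait:.2f} time units")
```

    For these rates the analytic M/M/1 mean queueing delay is rho/(mu - lambda) = 9 time units, which the simulated estimate should approximate.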

  17. Development and implementation of an independence rating scale and evaluation process for nursing orientation of new graduates.

    PubMed

    Durkin, Gregory J

    2010-01-01

    A wide variety of evaluation formats are available for new graduate nurses, but most of them are single-point evaluation tools that do not provide a clear picture of progress for orientee or educator. This article describes the development of a Web-based evaluation tool that combines learning taxonomies with the Synergy model into a rating scale based on independent performance. The evaluation tool and process provides open 24/7 access to evaluation documentation for members of the orientation team, demystifying the process and clarifying expectations. The implementation of the tool has proven to be transformative in the perceptions of evaluation and performance expectations of new graduates. This tool has been successful at monitoring progress, altering education, and opening dialogue about performance for over 125 new graduate nurses since inception.

  18. Damage to the ventromedial prefrontal cortex is associated with impairments in both spontaneous and deliberative moral judgments.

    PubMed

    Cameron, C Daryl; Reber, Justin; Spring, Victoria L; Tranel, Daniel

    2018-03-01

    Implicit moral evaluations-spontaneous, unintentional judgments about the moral status of actions or persons-are thought to play a pivotal role in moral experience, suggesting a need for research to model these moral evaluations in clinical populations. Prior research reveals that the ventromedial prefrontal cortex (vmPFC) is a critical area underpinning affect and morality, and patients with vmPFC lesions show abnormalities in moral judgment and moral behavior. We use indirect measurement and multinomial modeling to understand differences in implicit moral evaluations among patients with vmPFC lesions. Our model quantifies multiple processes of moral judgment: implicit moral evaluations in response to distracting moral transgressions (Unintentional Judgment), accurate moral judgments about target actions (Intentional Judgment), and a directional tendency to judge actions as morally wrong (Response Bias). Compared to individuals with non-vmPFC brain damage and neurologically healthy comparisons, patients with vmPFC lesions showed a dual deficit in processes of moral judgment. First, patients with vmPFC lesions showed reduced Unintentional Judgment about moral transgressions, but not about non-moral negative affective distracters. Second, patients with vmPFC lesions showed reduced Intentional Judgment about target actions. These findings highlight the utility of a formal modeling approach in moral psychology, revealing a dual deficit in multiple component processes of moral judgment among patients with vmPFC lesions. Copyright © 2018 Elsevier Ltd. All rights reserved.
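
    To make the three-parameter structure concrete, here is a hypothetical multinomial-processing-tree sketch, not the authors' actual model equations: with probability I the Intentional Judgment of the target drives the response; failing that, with probability U the Unintentional Judgment of the distractor drives it; failing both, Response Bias B decides:

```python
def p_wrong(target_is_wrong, distractor_is_wrong, I, U, B):
    """Probability of a 'morally wrong' response under a hypothetical
    three-process tree (an illustration, not the published model)."""
    target = 1.0 if target_is_wrong else 0.0
    distractor = 1.0 if distractor_is_wrong else 0.0
    return I * target + (1 - I) * (U * distractor + (1 - U) * B)

# purely illustrative parameter values: lowering I and U together
# shifts the predicted response rates on moral-distractor trials
healthy = p_wrong(False, True, I=0.8, U=0.6, B=0.5)
lesion = p_wrong(False, True, I=0.5, U=0.3, B=0.5)
print(f"P(wrong | neutral target, moral distractor): {healthy:.3f} vs {lesion:.3f}")
```

    Fitting such a tree to response frequencies across trial types is what lets the parameters I, U, and B be estimated separately, which is the sense in which the modeling reveals a dual deficit.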

  19. Parameterization and Sensitivity Analysis of a Complex Simulation Model for Mosquito Population Dynamics, Dengue Transmission, and Their Control

    PubMed Central

    Ellis, Alicia M.; Garcia, Andres J.; Focks, Dana A.; Morrison, Amy C.; Scott, Thomas W.

    2011-01-01

    Models can be useful tools for understanding the dynamics and control of mosquito-borne disease. More detailed models may be more realistic and better suited for understanding local disease dynamics; however, evaluating model suitability, accuracy, and performance becomes increasingly difficult with greater model complexity. Sensitivity analysis is a technique that permits exploration of complex models by evaluating the sensitivity of the model to changes in parameters. Here, we present results of sensitivity analyses of two interrelated complex simulation models of mosquito population dynamics and dengue transmission. We found that dengue transmission may be influenced most by survival in each life stage of the mosquito, mosquito biting behavior, and duration of the infectious period in humans. The importance of these biological processes for vector-borne disease models and the overwhelming lack of knowledge about them make acquisition of relevant field data on these biological processes a top research priority. PMID:21813844
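
    The simplest form of such an analysis perturbs one parameter at a time. The toy transmission proxy below is an assumption for illustration (the actual simulation models are far more detailed); the one-at-a-time loop is the reusable part:

```python
def r0_proxy(survival, biting_rate, infectious_days):
    """Toy transmission-intensity proxy (not the authors' model):
    biting enters twice because both acquisition and onward
    transmission require a bite."""
    return (biting_rate ** 2) * survival * infectious_days

baseline = dict(survival=0.85, biting_rate=0.3, infectious_days=5.0)

def oat_sensitivity(model, params, delta=0.10):
    """Relative output change per +10% change in each parameter,
    holding the others at their baseline values."""
    base = model(**params)
    out = {}
    for name, value in params.items():
        perturbed = dict(params, **{name: value * (1 + delta)})
        out[name] = (model(**perturbed) - base) / base
    return out

for name, s in oat_sensitivity(r0_proxy, baseline).items():
    print(f"{name:>16}: {s:+.1%} output change per +10% input change")
```

    Parameters whose perturbation moves the output most (here the squared biting rate) are the ones the abstract flags as top priorities for field data collection.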

  20. Dawn: A Simulation Model for Evaluating Costs and Tradeoffs of Big Data Science Architectures

    NASA Astrophysics Data System (ADS)

    Cinquini, L.; Crichton, D. J.; Braverman, A. J.; Kyo, L.; Fuchs, T.; Turmon, M.

    2014-12-01

    In many scientific disciplines, scientists and data managers are bracing for an upcoming deluge of big data volumes, which will increase the size of current data archives by a factor of 10-100 times. For example, the next Climate Model Inter-comparison Project (CMIP6) will generate a global archive of model output of approximately 10-20 Peta-bytes, while the upcoming next generation of NASA decadal Earth Observing instruments are expected to collect tens of Giga-bytes/day. In radio-astronomy, the Square Kilometre Array (SKA) will collect data in the Exa-bytes/day range, of which (after reduction and processing) around 1.5 Exa-bytes/year will be stored. The effective and timely processing of these enormous data streams will require the design of new data reduction and processing algorithms, new system architectures, and new techniques for evaluating computation uncertainty. Yet at present no general software tool or framework exists that will allow system architects to model their expected data processing workflow, and determine the network, computational and storage resources needed to prepare their data for scientific analysis. In order to fill this gap, at NASA/JPL we have been developing a preliminary model named DAWN (Distributed Analytics, Workflows and Numerics) for simulating arbitrary complex workflows composed of any number of data processing and movement tasks. The model can be configured with a representation of the problem at hand (the data volumes, the processing algorithms, the available computing and network resources), and is able to evaluate tradeoffs between different possible workflows based on several estimators: overall elapsed time, separate computation and transfer times, resulting uncertainty, and others. So far, we have been applying DAWN to analyze architectural solutions for 4 different use cases from distinct science disciplines: climate science, astronomy, hydrology and a generic cloud computing use case. 
This talk will present preliminary results and discuss how DAWN can be evolved into a powerful tool for designing system architectures for data intensive science.
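
    A DAWN-style tradeoff evaluation can be caricatured as summing transfer and compute times along a workflow. The two architectures and all of the numbers below are hypothetical, not outputs of DAWN itself:

```python
def workflow_time(stages):
    """Total elapsed time (hours) of a linear workflow: each stage
    transfers its input over a link, then processes it (no overlap)."""
    total = 0.0
    for volume_tb, bandwidth_gbps, compute_tb_per_hr in stages:
        transfer_hr = volume_tb * 8_000 / bandwidth_gbps / 3600  # TB over Gb/s
        process_hr = volume_tb / compute_tb_per_hr
        total += transfer_hr + process_hr
    return total

# hypothetical tradeoff: ship 100 TB to a fast remote cluster, or
# reduce in place first and ship only the 5 TB product
ship_raw = [(100.0, 10.0, 50.0)]
reduce_first = [(100.0, 1e9, 20.0),  # local reduce (huge "bandwidth" = no link)
                (5.0, 10.0, 50.0)]   # then ship the reduced product

print(f"ship raw data:  {workflow_time(ship_raw):.1f} h")
print(f"reduce locally: {workflow_time(reduce_first):.1f} h")
```

    Even this crude estimator reproduces the familiar conclusion that reducing data near its source can beat shipping raw volumes over a constrained link, which is the kind of tradeoff DAWN is built to quantify with added estimators for uncertainty.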

  1. SEIPS-based process modeling in primary care.

    PubMed

    Wooldridge, Abigail R; Carayon, Pascale; Hundt, Ann Schoofs; Hoonakker, Peter L T

    2017-04-01

    Process mapping, often used as part of the human factors and systems engineering approach to improve care delivery and outcomes, should be expanded to represent the complex, interconnected sociotechnical aspects of health care. Here, we propose a new sociotechnical process modeling method to describe and evaluate processes, using the SEIPS model as the conceptual framework. The method produces a process map and supplementary table, which identify work system barriers and facilitators. In this paper, we present a case study applying this method to three primary care processes. We used purposeful sampling to select staff (care managers, providers, nurses, administrators and patient access representatives) from two clinics to observe and interview. We show the proposed method can be used to understand and analyze healthcare processes systematically and identify specific areas of improvement. Future work is needed to assess usability and usefulness of the SEIPS-based process modeling method and further refine it. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. SEIPS-Based Process Modeling in Primary Care

    PubMed Central

    Wooldridge, Abigail R.; Carayon, Pascale; Hundt, Ann Schoofs; Hoonakker, Peter

    2016-01-01

    Process mapping, often used as part of the human factors and systems engineering approach to improve care delivery and outcomes, should be expanded to represent the complex, interconnected sociotechnical aspects of health care. Here, we propose a new sociotechnical process modeling method to describe and evaluate processes, using the SEIPS model as the conceptual framework. The method produces a process map and supplementary table, which identify work system barriers and facilitators. In this paper, we present a case study applying this method to three primary care processes. We used purposeful sampling to select staff (care managers, providers, nurses, administrators and patient access representatives) from two clinics to observe and interview. We show the proposed method can be used to understand and analyze healthcare processes systematically and identify specific areas of improvement. Future work is needed to assess usability and usefulness of the SEIPS-based process modeling method and further refine it. PMID:28166883

  3. A Bayesian Framework of Uncertainties Integration in 3D Geological Model

    NASA Astrophysics Data System (ADS)

    Liang, D.; Liu, X.

    2017-12-01

    A 3D geological model can describe complicated geological phenomena in an intuitive way, but its application may be limited by uncertain factors. Great progress has been made over the years, yet many studies decompose the uncertainties of a geological model and analyze them separately, ignoring the comprehensive impact of multi-source uncertainties. To evaluate the synthetical uncertainty, we choose probability distributions to quantify uncertainty and propose a Bayesian framework for uncertainty integration. With this framework, we integrate data errors, spatial randomness, and cognitive information into a posterior distribution to evaluate the synthetical uncertainty of the geological model. Uncertainties propagate and accumulate in the modeling process, so the gradual integration of multi-source uncertainty is a kind of simulation of this propagation. Bayesian inference accomplishes the uncertainty updating in the modeling process. The maximum entropy principle works well for estimating the prior probability distribution, ensuring that the prior is subject to the constraints supplied by the given information with minimum prejudice. In the end, we obtain a posterior distribution that evaluates the synthetical uncertainty of the geological model. This posterior distribution represents the combined impact of all the uncertain factors on the spatial structure of the geological model. The framework provides a solution for evaluating the combined impact of multi-source uncertainties on a geological model and an approach to studying the uncertainty propagation mechanism in geological modeling.
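
    The sequential integration described above can be sketched on one discretized quantity, here a hypothetical interface depth: a maximum-entropy (uniform) prior given only bounds, updated in turn by two uncertain observations with Gaussian measurement error. All numbers are invented for illustration:

```python
import numpy as np

# discretized depth (m) of a hypothetical geological interface
depth = np.linspace(90.0, 110.0, 201)

# maximum-entropy prior given only the bounds: uniform on [90, 110]
prior = np.ones_like(depth)
prior /= prior.sum()

def update(prior_probs, observation, sigma):
    """Bayesian update with a Gaussian measurement-error likelihood."""
    likelihood = np.exp(-0.5 * ((depth - observation) / sigma) ** 2)
    posterior = prior_probs * likelihood
    return posterior / posterior.sum()

# integrate two uncertain data sources sequentially (hypothetical picks)
posterior = update(prior, observation=101.0, sigma=3.0)     # e.g. borehole pick
posterior = update(posterior, observation=99.0, sigma=2.0)  # e.g. seismic pick

mean = float((depth * posterior).sum())
std = float(np.sqrt(((depth - mean) ** 2 * posterior).sum()))
print(f"posterior depth: {mean:.2f} +/- {std:.2f} m")
```

    Each update multiplies the current distribution by a likelihood and renormalizes; chaining the updates is exactly the gradual accumulation of multi-source uncertainty into one posterior that the abstract describes.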

  4. The Importance of Uncertainty and Sensitivity Analysis in Process-based Models of Carbon and Nitrogen Cycling in Terrestrial Ecosystems with Particular Emphasis on Forest Ecosystems — Selected Papers from a Workshop Organized by the International Society for Ecological Modelling (ISEM) at the Third Biennal Meeting of the International Environmental Modelling and Software Society (IEMSS) in Burlington, Vermont, USA, August 9-13, 2006

    USGS Publications Warehouse

    Larocque, Guy R.; Bhatti, Jagtar S.; Liu, Jinxun; Ascough, James C.; Gordon, Andrew M.

    2008-01-01

    Many process-based models of carbon (C) and nitrogen (N) cycles have been developed for terrestrial ecosystems, including forest ecosystems. They address many basic issues of ecosystem structure and functioning, such as the role of internal feedback in ecosystem dynamics. The critical factor in these phenomena is scale, as these processes operate at scales from the minute (e.g. particulate pollution impacts on trees and other organisms) to the global (e.g. climate change). Research efforts remain important to improve the capability of such models to better represent the dynamics of terrestrial ecosystems, including the C, nutrient (e.g. N), and water cycles. Existing models are sufficiently well advanced to help decision makers develop sustainable management policies and planning of terrestrial ecosystems, as they make realistic predictions when used appropriately. However, decision makers must be aware of their limitations by having the opportunity to evaluate the uncertainty associated with process-based models (Smith and Heath, 2001; Allen et al., 2004). The variation in scale of issues currently being addressed by modelling efforts makes the evaluation of uncertainty a daunting task.

  5. 2016 International Land Model Benchmarking (ILAMB) Workshop Report

    NASA Technical Reports Server (NTRS)

    Hoffman, Forrest M.; Koven, Charles D.; Keppel-Aleks, Gretchen; Lawrence, David M.; Riley, William J.; Randerson, James T.; Ahlstrom, Anders; Abramowitz, Gabriel; Baldocchi, Dennis D.; Best, Martin J.

    2016-01-01

    As earth system models (ESMs) become increasingly complex, there is a growing need for comprehensive and multi-faceted evaluation of model projections. To advance understanding of terrestrial biogeochemical processes and their interactions with hydrology and climate under conditions of increasing atmospheric carbon dioxide, new analysis methods are required that use observations to constrain model predictions, inform model development, and identify needed measurements and field experiments. Better representations of biogeochemistry–climate feedbacks and ecosystem processes in these models are essential for reducing the acknowledged substantial uncertainties in 21st century climate change projections.

  6. 2016 International Land Model Benchmarking (ILAMB) Workshop Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoffman, Forrest M.; Koven, Charles D.; Keppel-Aleks, Gretchen

    As Earth system models become increasingly complex, there is a growing need for comprehensive and multi-faceted evaluation of model projections. To advance understanding of biogeochemical processes and their interactions with hydrology and climate under conditions of increasing atmospheric carbon dioxide, new analysis methods are required that use observations to constrain model predictions, inform model development, and identify needed measurements and field experiments. Better representations of biogeochemistry–climate feedbacks and ecosystem processes in these models are essential for reducing uncertainties associated with projections of climate change during the remainder of the 21st century.

  7. Engaged for Change: A Community-Engaged Process for Developing Interventions to Reduce Health Disparities.

    PubMed

    Rhodes, Scott D; Mann-Jackson, Lilli; Alonzo, Jorge; Simán, Florence M; Vissman, Aaron T; Nall, Jennifer; Abraham, Claire; Aronson, Robert E; Tanner, Amanda E

    2017-12-01

    The science underlying the development of individual, community, system, and policy interventions designed to reduce health disparities has lagged behind other innovations. Few models, theoretical frameworks, or processes exist to guide intervention development. Our community-engaged research partnership has been developing, implementing, and evaluating efficacious interventions to reduce HIV disparities for over 15 years. Based on our intervention research experiences, we propose a novel 13-step process designed to demystify and guide intervention development. Our intervention development process includes steps such as establishing an intervention team to manage the details of intervention development; assessing community needs, priorities, and assets; generating intervention priorities; evaluating and incorporating theory; developing a conceptual or logic model; crafting activities; honing materials; administering a pilot, noting its process, and gathering feedback from all those involved; and editing the intervention based on what was learned. Here, we outline and describe each of these 13 steps.

  8. Evaluating models of healthcare delivery using the Model of Care Evaluation Tool (MCET).

    PubMed

    Hudspeth, Randall S; Vogt, Marjorie; Wysocki, Ken; Pittman, Oralea; Smith, Susan; Cooke, Cindy; Dello Stritto, Rita; Hoyt, Karen Sue; Merritt, T Jeanne

    2016-08-01

    Our aim was to provide the outcome of a structured Model of Care (MoC) Evaluation Tool (MCET), developed by an FAANP Best-practices Workgroup, that can be used to guide the evaluation of existing MoCs being considered for use in clinical practice. Multiple MoCs are available, but deciding which model of health care delivery to use can be confusing. This five-component tool provides a structured assessment approach to model selection and has universal application. A literature review using CINAHL, PubMed, Ovid, and EBSCO was conducted. The MCET evaluation process includes five sequential components with a feedback loop from component 5 back to component 3 for reevaluation of any refinements. The components are as follows: (1) Background, (2) Selection of an MoC, (3) Implementation, (4) Evaluation, and (5) Sustainability and Future Refinement. This practical resource considers an evidence-based approach to use in determining the best model to implement based on need, stakeholder considerations, and feasibility. ©2015 American Association of Nurse Practitioners.

  9. An improved method to represent DEM uncertainty in glacial lake outburst flood propagation using stochastic simulations

    NASA Astrophysics Data System (ADS)

    Watson, Cameron S.; Carrivick, Jonathan; Quincey, Duncan

    2015-10-01

    Modelling glacial lake outburst floods (GLOFs), or 'jökulhlaups', necessarily involves the propagation of large and often stochastic uncertainties throughout the source-to-impact process chain. Since flood routing is primarily a function of underlying topography, communication of digital elevation model (DEM) uncertainty should accompany such modelling efforts. Here, a new stochastic first-pass assessment technique was evaluated against an existing GIS-based model and an existing 1D hydrodynamic model, using three DEMs with different spatial resolutions. The analysis revealed the effect of DEM uncertainty and model choice on several flood parameters and on the prediction of socio-economic impacts. Our new model, which we call MC-LCP (Monte Carlo Least Cost Path) and which is distributed in the supplementary information, demonstrated enhanced 'stability' when compared to the two existing methods, and this 'stability' was independent of DEM choice. The MC-LCP model outputs an uncertainty continuum within its extent, from which relative socio-economic risk can be evaluated. In a comparison of all DEM and model combinations, results derived from the Shuttle Radar Topography Mission (SRTM) DEM exhibited fewer artefacts than those derived from the Advanced Spaceborne Thermal Emission and Reflection Radiometer Global Digital Elevation Model (ASTER GDEM), and were comparable to those derived from a finer-resolution DEM based on the Advanced Land Observing Satellite Panchromatic Remote-sensing Instrument for Stereo Mapping (ALOS PRISM). Overall, we contend that the variability we find between flood routing model results suggests that consideration of DEM uncertainty and pre-processing methods is important when assessing flow routing and when evaluating potential socio-economic implications of a GLOF event. Incorporation of a stochastic variable provides an illustration of uncertainty that is important when modelling and communicating assessments of an inherently complex process.
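
    The core MC-LCP idea, perturbing the DEM and re-routing a least-cost path many times, can be sketched in a few lines. Everything below (the synthetic valley DEM, the 2 m Gaussian vertical error, the 4-neighbour Dijkstra router) is an illustrative assumption, not the published implementation, which is distributed in the paper's supplementary information:

```python
import heapq
import random

def least_cost_path(cost, start, goal):
    """4-neighbour Dijkstra; returns the set of cells on the cheapest path."""
    rows, cols = len(cost), len(cost[0])
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            break
        if d > dist.get((r, c), float("inf")):
            continue  # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(pq, (nd, (nr, nc)))
    path, node = {start}, goal
    while node != start:  # walk back from goal to start
        path.add(node)
        node = prev[node]
    return path

random.seed(42)
ROWS = COLS = 30
# Synthetic DEM: a valley sloping downhill from row 0 to row 29 (toy data).
dem = [[20.0 * ((c / (COLS - 1) - 0.5) * 2) ** 2 + 10.0 * (1 - r / (ROWS - 1))
        for c in range(COLS)] for r in range(ROWS)]

N_RUNS, SIGMA = 50, 2.0  # assumed vertical DEM error (m), SRTM-like
start, goal = (0, COLS // 2), (ROWS - 1, COLS // 2)
visits = {}
for _ in range(N_RUNS):
    noisy = [[z + random.gauss(0.0, SIGMA) for z in row] for row in dem]
    floor = min(min(row) for row in noisy) - 0.1
    noisy = [[z - floor for z in row] for row in noisy]  # strictly positive costs
    for cell in least_cost_path(noisy, start, goal):
        visits[cell] = visits.get(cell, 0) + 1
likelihood = {cell: n / N_RUNS for cell, n in visits.items()}
```

    Cells visited in every realisation (likelihood 1.0) form the stable flow corridor; cells visited only occasionally trace the uncertainty continuum from which relative socio-economic risk can be read.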

  10. Tinnitus: a management model.

    PubMed

    Stephens, S D; Hallam, R S; Jakes, S C

    1986-08-01

    A comprehensive model of tinnitus management is proposed. As it is rarely possible to abolish the symptom, management of the tinnitus patient must aim at precipitating the habituation process. The model is split into 'evaluation' and 'remediation' sections. In each section the various aspects of management are discussed. Together with traditional factors, the importance of psychological processes is stressed. The role of the expectations of the patient in limiting remedial possibilities is also discussed.

  11. A Microsoft Project-Based Planning, Tracking, and Management Tool for the National Transonic Facility's Model Changeover Process

    NASA Technical Reports Server (NTRS)

    Vairo, Daniel M.

    1998-01-01

    The removal and installation of sting-mounted wind tunnel models in the National Transonic Facility (NTF) is a multi-task process having a large impact on the annual throughput of the facility. Approximately ten model removal and installation cycles occur annually at the NTF, with each cycle requiring slightly over five days to complete. The various tasks of the model changeover process were modeled in Microsoft Project as a template to provide a planning, tracking, and management tool. The template can also be used as a tool to evaluate improvements to this process. This document describes the development of the template and provides step-by-step instructions on its use as a planning and tracking tool. A secondary role of this document is to provide an overview of the model changeover process and briefly describe the tasks associated with it.

  12. Designer's unified cost model

    NASA Technical Reports Server (NTRS)

    Freeman, William T.; Ilcewicz, L. B.; Swanson, G. D.; Gutowski, T.

    1992-01-01

    A conceptual and preliminary designers' cost prediction model has been initiated. The model will provide a technically sound method for evaluating the relative cost of different composite structural designs, fabrication processes, and assembly methods that can be compared to equivalent metallic parts or assemblies. The feasibility of developing cost prediction software in a modular form for interfacing with state-of-the-art preliminary design tools and computer aided design programs is being evaluated. The goal of this task is to establish theoretical cost functions that relate geometric design features to summed material cost and labor content in terms of process mechanics and physics. The output of the designers' present analytical tools will be input for the designers' cost prediction model to provide the designer with a database and deterministic cost methodology that allows one to trade and synthesize designs with both cost and weight as objective functions for optimization. The approach, goals, plans, and progress are presented for development of COSTADE (Cost Optimization Software for Transport Aircraft Design Evaluation).

  13. The Effect of Persuasion on the Utilization of Program Evaluation Information: A Preliminary Study.

    ERIC Educational Resources Information Center

    Eason, Sandra H.; Thompson, Bruce

    The utilization of program evaluation may be made more effective by means of the application of contemporary persuasion theory. The Elaboration Likelihood Model--a model of cognitive processing, ability, and motivation--was used in this study to test the persuasive effects of source credibility and involvement on message acceptance of evaluation…

  14. Evaluating the Classical Versus an Emerging Conceptual Model of Peatland Methane Dynamics

    Treesearch

    Wendy H. Yang; Gavin McNicol; Yit Arn Teh; Katerina Estera-Molina; Tana E. Wood; Whendee L. Silver

    2017-01-01

    Methane (CH4) is a potent greenhouse gas that is both produced and consumed in soils by microbially mediated processes sensitive to soil redox. We evaluated the classical conceptual model of peatland CH4 dynamics—in which the water table position determines the vertical distribution of methanogenesis and methanotrophy—...

  15. Farm simulation: a tool for evaluating the mitigation of greenhouse gas emissions and the adaptation of dairy production to climate change

    USDA-ARS?s Scientific Manuscript database

    Process-level modeling at the farm scale provides a tool for evaluating both strategies for mitigating greenhouse gas emissions and strategies for adapting to climate change. The Integrated Farm System Model (IFSM) simulates representative crop, beef or dairy farms over many years of weather to pred...

  16. An Evaluation Model To Select an Integrated Learning System in a Large, Suburban School District.

    ERIC Educational Resources Information Center

    Curlette, William L.; And Others

    The systematic evaluation process used in Georgia's DeKalb County School System to purchase comprehensive instructional software--an integrated learning system (ILS)--is described, and the decision-making model for selection is presented. Selection and implementation of an ILS were part of an instructional technology plan for the DeKalb schools…

  17. Evidence Evaluation: Measure "Z" Corresponds to Human Utility Judgments Better than Measure "L" and Optimal-Experimental-Design Models

    ERIC Educational Resources Information Center

    Rusconi, Patrice; Marelli, Marco; D'Addario, Marco; Russo, Selena; Cherubini, Paolo

    2014-01-01

    Evidence evaluation is a crucial process in many human activities, spanning from medical diagnosis to impression formation. The present experiments investigated which, if any, normative model best conforms to people's intuition about the value of the obtained evidence. Psychologists, epistemologists, and philosophers of science have proposed…

  18. Land-use evaluation for sustainable construction in a protected area: A case of Sara mountain national park.

    PubMed

    Ristić, Vladica; Maksin, Marija; Nenković-Riznić, Marina; Basarić, Jelena

    2018-01-15

    The process of making decisions on sustainable development and construction begins in spatial and urban planning, when defining the suitability of using land for sustainable construction in a protected area (PA) and its immediate and regional surroundings. The aim of this research is to propose and assess a model for evaluating land-use suitability for sustainable construction in a PA and its surroundings. The methodological approach of Multi-Criteria Decision Analysis was used in the formation of this model and adapted for the research; it was combined with an adapted Analytic Hierarchy Process and the Delphi process, and supported by a geographical information system (GIS) within the framework of ESRI ArcGIS Spatial Analyst. The model is applied to the case study of Sara mountain National Park in Kosovo. The result of the model is a "map of integrated assessment of land-use suitability for sustainable construction in a PA for the natural factor". Copyright © 2017 Elsevier Ltd. All rights reserved.
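
    The Analytic Hierarchy Process step of such a Multi-Criteria Decision Analysis can be sketched without GIS software: derive priority weights from a pairwise-comparison matrix, check consistency, then apply the weights as a weighted overlay. The comparison values and criterion scores below are made up for illustration and are not taken from the study:

```python
# Hypothetical pairwise-comparison matrix (Saaty's 1-9 scale) for three
# suitability criteria: slope, infrastructure access, habitat sensitivity.
A = [[1.0, 3.0, 5.0],
     [1 / 3, 1.0, 3.0],
     [1 / 5, 1 / 3, 1.0]]
n = len(A)

# Principal eigenvector by power iteration -> AHP priority weights.
w = [1.0 / n] * n
for _ in range(100):
    v = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
    s = sum(v)
    w = [x / s for x in v]

# lambda_max and Saaty's consistency ratio (RI = 0.58 for n = 3);
# CR below 0.1 means the judgments are acceptably consistent.
Aw = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
lam = sum(Aw[i] / w[i] for i in range(n)) / n
cr = ((lam - n) / (n - 1)) / 0.58

# Weighted overlay of normalised criterion scores for two candidate cells.
cells = [[0.9, 0.5, 0.3],   # cell 1: slope, access, habitat scores
         [0.4, 0.8, 0.9]]   # cell 2
suitability = [sum(wi * ci for wi, ci in zip(w, cell)) for cell in cells]
```

    In the full model, the same weighted sum is evaluated per raster cell by Spatial Analyst, yielding the suitability map.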

  19. Internal Catchment Process Simulation in a Snow-Dominated Basin: Performance Evaluation with Spatiotemporally Variable Runoff Generation and Groundwater Dynamics

    NASA Astrophysics Data System (ADS)

    Kuras, P. K.; Weiler, M.; Alila, Y.; Spittlehouse, D.; Winkler, R.

    2006-12-01

    Hydrologic models have been increasingly used in forest hydrology to overcome the limitations of paired watershed experiments, where vegetative recovery and natural variability obscure the inferences and conclusions that can be drawn from such studies. Models, however, are also plagued by uncertainty stemming from a limited understanding of hydrological processes in forested catchments, and parameter equifinality is a common concern. This has created the necessity to improve our understanding of how hydrological systems work, through the development of hydrological measures, analyses and models that address the question: are we getting the right answers for the right reasons? Hence, physically-based, spatially-distributed hydrologic models should be validated with high-quality experimental data describing multiple concurrent internal catchment processes under a range of hydrologic regimes. The distributed hydrology soil vegetation model (DHSVM) frequently used in forest management applications is an example of a process-based model used to address the aforementioned circumstances, and this study takes a novel approach to collectively examining the ability of a pre-calibrated model application to realistically simulate outlet flows along with the spatial-temporal variation of internal catchment processes including: continuous groundwater dynamics at 9 locations, stream and road network flow at 67 locations for six individual days throughout the freshet, and pre-melt season snow distribution. Model efficiency was improved over prior evaluations due to continuous efforts in improving the quality of meteorological data in the watershed. Road and stream network flows were very well simulated for a range of hydrological conditions, and the spatial distribution of the pre-melt season snowpack was in general agreement with observed values.
The model was effective in simulating the spatial variability of subsurface flow generation, except at locations where strong stream-groundwater interactions existed, as the model is not capable of simulating such processes and subsurface flows always drain to the stream network. The model has proven overall to be quite capable in realistically simulating internal catchment processes in the watershed, which creates more confidence in future model applications exploring the effects of various forest management scenarios on the watershed's hydrological processes.

  20. Self and Superior Assessment.

    DTIC Science & Technology

    1986-06-01

    model of the self-evaluation process as it differs from the evaluation process used by superiors. Symbolic Interactionism One view of self assessment is...supplied by the symbolic interactionists (Cooley, 1902; Mead, 1934), who state that self perceptions are generated largely from individuals...disagreements remained even immediately after an appraisal interview in which a great deal of feedback was given. Research on the symbolic interactionist

  1. Physical Education Resources, Class Management, and Student Physical Activity Levels: A Structure-Process-Outcome Approach to Evaluating Physical Education Effectiveness

    ERIC Educational Resources Information Center

    Bevans, Katherine B.; Fitzpatrick, Leslie-Anne; Sanchez, Betty M.; Riley, Anne W.; Forrest, Christopher

    2010-01-01

    Background: This study was conducted to empirically evaluate specific human, curricular, and material resources that maximize student opportunities for physical activity during physical education (PE) class time. A structure-process-outcome model was proposed to identify the resources that influence the frequency of PE and intensity of physical…

  2. Application of Hierarchy Theory to Cross-Scale Hydrologic Modeling of Nutrient Loads

    EPA Science Inventory

    We describe a model called Regional Hydrologic Modeling for Environmental Evaluation 16 (RHyME2) for quantifying annual nutrient loads in stream networks and watersheds. RHyME2 is 17 a cross-scale statistical and process-based water-quality model. The model ...

  3. Method of evaluating the impact of ERP implementation critical success factors - a case study in oil and gas industries

    NASA Astrophysics Data System (ADS)

    Gajic, Gordana; Stankovski, Stevan; Ostojic, Gordana; Tesic, Zdravko; Miladinovic, Ljubomir

    2014-01-01

    The enterprise resource planning (ERP) systems implemented so far have in many cases failed to meet the requirements regarding business process control, the decrease of business costs and the increase of company profit margins. Therefore, there is a real need for an evaluation of the influence of ERP on the company's performance indicators. Proposed in this article is an advanced model for the evaluation of the success of ERP implementation on organisational and operational performance indicators in oil-gas companies. The recommended method establishes a correlation between a process-based method, a scorecard model and ERP critical success factors. The method was verified and tested on two case studies in oil-gas companies using the following procedure: the model was developed, tested and implemented in a pilot oil-gas company, while the results were implemented and verified in another oil-gas company.

  4. Improved Geothermometry Through Multivariate Reaction-path Modeling and Evaluation of Geomicrobiological Influences on Geochemical Temperature Indicators: Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mattson, Earl; Smith, Robert; Fujita, Yoshiko

    2015-03-01

    The project was aimed at demonstrating that the geothermometric predictions can be improved through the application of multi-element reaction path modeling that accounts for lithologic and tectonic settings, while also accounting for biological influences on geochemical temperature indicators. The limited utilization of chemical signatures by individual traditional geothermometers in the development of reservoir temperature estimates may have been constraining their reliability for evaluation of potential geothermal resources. This project, however, was intended to build a geothermometry tool which can integrate multi-component reaction path modeling with process-optimization capability that can be applied to dilute, low-temperature water samples to consistently predict reservoir temperature within ±30 °C. The project was also intended to evaluate the extent to which microbiological processes can modulate the geochemical signals in some thermal waters and influence the geothermometric predictions.

  5. Program management model study

    NASA Technical Reports Server (NTRS)

    Connelly, J. J.; Russell, J. E.; Seline, J. R.; Sumner, N. R., Jr.

    1972-01-01

    Two models, a system performance model and a program assessment model, have been developed to assist NASA management in the evaluation of development alternatives for the Earth Observations Program. Two computer models were developed and demonstrated on the Goddard Space Flight Center Computer Facility. Procedures have been outlined to guide the user of the models through specific evaluation processes, and the preparation of inputs describing earth observation needs and earth observation technology. These models are intended to assist NASA in increasing the effectiveness of the overall Earth Observation Program by providing a broader view of system and program development alternatives.

  6. Evaluation of Rainfall-Runoff Models for Mediterranean Subcatchments

    NASA Astrophysics Data System (ADS)

    Cilek, A.; Berberoglu, S.; Donmez, C.

    2016-06-01

    The development and application of rainfall-runoff models have been a corner-stone of hydrological research for many decades. The amount of rainfall and its intensity and variability control the generation of runoff and the erosional processes operating at different scales. These interactions can be greatly variable in Mediterranean catchments with marked hydrological fluctuations. The aim of the study was to evaluate the performance of a rainfall-runoff model for rainfall-runoff simulation in a Mediterranean subcatchment. The Pan-European Soil Erosion Risk Assessment (PESERA), a simplified hydrological process-based approach, was used in this study to combine hydrological surface runoff factors. In total, 128 input layers are required to run the model, derived from a data set that includes climate, topography, land use, crop type, planting date, and soil characteristics. Initial ground cover was estimated from the Landsat ETM data provided by ESA. This hydrological model was evaluated in terms of its performance in the Goksu River Watershed, located in the Central Eastern Mediterranean Basin of Turkey. The area is approximately 2000 km². The landscape is dominated by bare ground, agriculture, and forests. The average annual rainfall is 636.4 mm. This study is significant in evaluating different model performances in a complex Mediterranean basin. The results provided comprehensive insight, including advantages and limitations of modelling approaches in the Mediterranean environment.

  7. The difference between energy consumption and energy cost: Modelling energy tariff structures for water resource recovery facilities.

    PubMed

    Aymerich, I; Rieger, L; Sobhani, R; Rosso, D; Corominas, Ll

    2015-09-15

    The objective of this paper is to demonstrate the importance of incorporating more realistic energy cost models (based on current energy tariff structures) into existing water resource recovery facility (WRRF) process models when evaluating technologies and cost-saving control strategies. In this paper, we first introduce a systematic framework to model energy usage at WRRFs and a generalized structure to describe energy tariffs, including the most common billing terms. Secondly, this paper introduces a detailed energy cost model based on a Spanish energy tariff structure coupled with a WRRF process model to evaluate several control strategies and provide insights into the selection of the contracted power structure. The results of a 1-year evaluation of a 115,000 population-equivalent WRRF showed monthly cost differences ranging from 7 to 30% when comparing the detailed energy cost model to an average energy price. The evaluation of different aeration control strategies also showed that using average energy prices and neglecting energy tariff structures may lead to biased conclusions when selecting operating strategies or comparing technologies or equipment. The proposed framework demonstrated that for cost minimization, control strategies should be paired with a specific optimal contracted power. Hence, the design of operational and control strategies must take into account the local energy tariff. Copyright © 2015 Elsevier Ltd. All rights reserved.
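
    The paper's central point, that a flat average price misrepresents costs under a time-of-use tariff, is easy to reproduce in miniature. The three-period prices and load profile below are assumed illustrative values loosely modelled on a Spanish peak/flat/valley structure, not the tariff or facility from the study:

```python
# Hourly aeration load profile for one day (kW), illustrative numbers only.
load = [180] * 8 + [260] * 8 + [220] * 8          # night / day / evening

# Three-period time-of-use energy prices (EUR/kWh), assumed values.
tou_price = [0.08] * 8 + [0.16] * 8 + [0.11] * 8  # valley / peak / flat

# Tariff-aware cost: price applied hour by hour.
tou_cost = sum(p * q for p, q in zip(tou_price, load))

# Naive cost: one average price applied to total consumption.
avg_price = sum(tou_price) / len(tou_price)
flat_cost = avg_price * sum(load)

# Consumption is skewed toward expensive hours, so the estimates diverge.
bias_pct = 100.0 * (tou_cost - flat_cost) / flat_cost
```

    Even in this toy example the flat-price estimate is biased low by roughly 4%; the paper reports monthly differences of 7 to 30% for a real facility, which is why control strategies should be evaluated against the actual tariff.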

  8. Neonatal Intensive Care Nursing Curriculum Challenges based on Context, Input, Process, and Product Evaluation Model: A Qualitative Study.

    PubMed

    Ashghali-Farahani, Mansoureh; Ghaffari, Fatemeh; Hoseini-Esfidarjani, Sara-Sadat; Hadian, Zahra; Qomi, Robabeh; Dargahi, Helen

    2018-01-01

    Weakness of curriculum development in nursing education results in lack of professional skills in graduates. This study was done on master's students in nursing to evaluate challenges of neonatal intensive care nursing curriculum based on context, input, process, and product (CIPP) evaluation model. This study was conducted with qualitative approach, which was completed according to the CIPP evaluation model. The study was conducted from May 2014 to April 2015. The research community included neonatal intensive care nursing master's students, the graduates, faculty members, neonatologists, nurses working in neonatal intensive care unit (NICU), and mothers of infants who were hospitalized in such wards. Purposeful sampling was applied. The data analysis showed that there were two main categories: "inappropriate infrastructure" and "unknown duties," which influenced the context formation of NICU master's curriculum. The input was formed by five categories, including "biomedical approach," "incomprehensive curriculum," "lack of professional NICU nursing mentors," "inappropriate admission process of NICU students," and "lack of NICU skill labs." Three categories were extracted in the process, including "more emphasize on theoretical education," "the overlap of credits with each other and the inconsistency among the mentors," and "ineffective assessment." Finally, five categories were extracted in the product, including "preferring routine work instead of professional job," "tendency to leave the job," "clinical incompetency of graduates," "the conflict between graduates and nursing staff expectations," and "dissatisfaction of graduates." Some changes are needed in NICU master's curriculum by considering the nursing experts' comments and evaluating the consequences of such program by them.

  9. DEVELOPMENT OF A CHEMICAL PROCESS MODELING ENVIRONMENT BASED ON CAPE-OPEN INTERFACE STANDARDS AND THE MICROSOFT .NET FRAMEWORK

    EPA Science Inventory

    Chemical process simulation has long been used as a design tool in the development of chemical plants, and has long been considered a means to evaluate different design options. With the advent of large scale computer networks and interface models for program components, it is po...

  10. A Model for Long Range Planning for Seminole Community College.

    ERIC Educational Resources Information Center

    Miner, Norris

    A model for long-range planning designed to maximize involvement of college personnel, to improve communication among various areas of the college, to provide a process for evaluation of long-range plans and the planning process, to adjust to changing conditions, to utilize data developed at a level useful for actual operations, and to have…

  11. A Structural Equation Model of the Writing Process in Typically-Developing Sixth Grade Children

    ERIC Educational Resources Information Center

    Koutsoftas, Anthony D.; Gray, Shelley

    2013-01-01

    The purpose of this study was to evaluate how sixth grade children planned, translated, and revised written narrative stories using a task reflecting current instructional and assessment practices. A modified version of the Hayes and Flower (1980) writing process model was used as the theoretical framework for the study. Two hundred one…

  12. Model of Values-Based Management Process in Schools: A Mixed Design Study

    ERIC Educational Resources Information Center

    Dogan, Soner

    2016-01-01

    The aim of this paper is to evaluate the school administrators' values-based management behaviours according to the teachers' perceptions and opinions and, accordingly, to build a model of values-based management process in schools. The study was conducted using explanatory design which is inclusive of both quantitative and qualitative methods.…

  13. An Interdisciplinary Approach to Designing Online Learning: Fostering Pre-Service Mathematics Teachers' Capabilities in Mathematical Modelling

    ERIC Educational Resources Information Center

    Geiger, Vince; Mulligan, Joanne; Date-Huxtable, Liz; Ahlip, Rehez; Jones, D. Heath; May, E. Julian; Rylands, Leanne; Wright, Ian

    2018-01-01

    In this article we describe and evaluate processes utilized to develop an online learning module on mathematical modelling for pre-service teachers. The module development process involved a range of professionals working within the STEM disciplines including mathematics and science educators, mathematicians, scientists, in-service and pre-service…

  14. Oncology Modeling for Fun and Profit! Key Steps for Busy Analysts in Health Technology Assessment.

    PubMed

    Beca, Jaclyn; Husereau, Don; Chan, Kelvin K W; Hawkins, Neil; Hoch, Jeffrey S

    2018-01-01

    In evaluating new oncology medicines, two common modeling approaches are state transition (e.g., Markov and semi-Markov) and partitioned survival. Partitioned survival models have become more prominent in oncology health technology assessment processes in recent years. Our experience in conducting and evaluating models for economic evaluation has highlighted many important and practical pitfalls. As there is little guidance available on best practices for those who wish to conduct them, we provide guidance in the form of 'Key steps for busy analysts,' who may have very little time and require highly favorable results. Our guidance highlights the continued need for rigorous conduct and transparent reporting of economic evaluations regardless of the modeling approach taken, and the importance of modeling that better reflects reality, which includes better approaches to considering plausibility, estimating relative treatment effects, dealing with post-progression effects, and appropriate characterization of the uncertainty from modeling itself.
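
    The distinction the authors draw can be made concrete: in a partitioned survival model, state occupancy is read directly from the overall survival (OS) and progression-free survival (PFS) curves rather than built up from transition probabilities as in a Markov model. A minimal sketch, assuming illustrative exponential curves with invented hazards:

```python
import math

# Illustrative exponential survival curves; the monthly hazards are assumed,
# not drawn from any real trial.
def pfs(t):  # progression-free survival at month t
    return math.exp(-0.08 * t)

def osv(t):  # overall survival at month t
    return math.exp(-0.04 * t)

def occupancy(t):
    """State occupancy at month t in a three-state partitioned survival model."""
    progression_free = pfs(t)
    progressed = osv(t) - pfs(t)   # alive but post-progression
    dead = 1.0 - osv(t)
    return progression_free, progressed, dead

# Restricted mean time in each state over a 10-year horizon (trapezoid rule).
dt, horizon = 0.25, 120.0
grid = [i * dt for i in range(int(horizon / dt) + 1)]

def trapz(f):
    return dt * (sum(f(t) for t in grid) - 0.5 * (f(grid[0]) + f(grid[-1])))

mean_pf = trapz(pfs)                           # months progression-free
mean_progressed = trapz(lambda t: osv(t) - pfs(t))
```

    The "partition" is simply the subtraction in `occupancy`; the model is valid only while OS lies at or above PFS, which is one of the plausibility checks the guidance calls for.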

  15. Post Occupancy Evaluation of Educational Buildings and Equipment.

    ERIC Educational Resources Information Center

    Watson, Chris

    1997-01-01

    Details the post occupancy evaluation (POE) process for public buildings. POEs are used to improve design and optimize educational building and equipment use. The evaluation participants, the method used, the results and recommendations, model schools, and classroom alterations using POE are described. (9 references.) (RE)

  16. Cost of ownership for inspection equipment

    NASA Astrophysics Data System (ADS)

    Dance, Daren L.; Bryson, Phil

    1993-08-01

    Cost of Ownership (CoO) models are increasingly a part of the semiconductor equipment evaluation and selection process. These models enable semiconductor manufacturers and equipment suppliers to quantify a system in terms of dollars per wafer. Because of the complex nature of the semiconductor manufacturing process, there are several key attributes that must be considered in order to accurately reflect the true 'cost of ownership'. While most CoO work to date has been applied to production equipment, the need to understand cost of ownership for inspection and metrology equipment presents unique challenges. Critical parameters such as detection sensitivity as a function of size and type of defect are not included in current CoO models yet are, without question, major factors in the technical evaluation process and life-cycle cost. This paper illustrates the relationship between these parameters, as components of the alpha and beta risk, and cost of ownership.
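
    A minimal dollars-per-wafer CoO calculation, in the spirit of (but much simpler than) the models the paper builds on, shows why throughput, utilization, and yield dominate. All input numbers are invented, and the inspection-specific terms the paper argues for (the costs of missed defects and false alarms, i.e. the beta and alpha risks) are deliberately left out:

```python
def cost_per_wafer(capital, useful_life_yr, annual_operating,
                   throughput_wph, utilization, yield_rate):
    """Dollars per wafer: lifetime cost divided by lifetime good wafers.

    A deliberately simplified CoO formula; fuller models add floor space,
    support labor, scrap, and, for inspection tools, the cost of escapes.
    """
    hours_per_yr = 8760.0
    good_wafers = (throughput_wph * hours_per_yr * utilization
                   * yield_rate * useful_life_yr)
    lifetime_cost = capital + annual_operating * useful_life_yr
    return lifetime_cost / good_wafers

# Invented example: a $2M tool run for 5 years.
coo = cost_per_wafer(capital=2_000_000, useful_life_yr=5,
                     annual_operating=300_000, throughput_wph=60,
                     utilization=0.75, yield_rate=0.98)
```

    The paper's point is that for inspection and metrology equipment this denominator-and-cost view is incomplete: two tools with identical CoO by this formula can differ greatly in detection sensitivity, and the undetected-defect cost belongs in the comparison.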

  17. Valence and arousal-based affective evaluations of foods.

    PubMed

    Woodward, Halley E; Treat, Teresa A; Cameron, C Daryl; Yegorova, Vitaliya

    2017-01-01

    We investigated the nutrient-specific and individual-specific validity of dual-process models of valenced and arousal-based affective evaluations of foods across the disordered eating spectrum. 283 undergraduate women provided implicit and explicit valence and arousal-based evaluations of 120 food photos with known nutritional information on structurally similar indirect and direct affect misattribution procedures (AMP; Payne et al., 2005, 2008), and completed questionnaires assessing body mass index (BMI), hunger, restriction, and binge eating. Nomothetically, added fat and added sugar enhance evaluations of foods. Idiographically, hunger and binge eating enhance activation, whereas BMI and restriction enhance pleasantness. Added fat is salient for women who are heavier, hungrier, or who restrict; added sugar is influential for less hungry women. Restriction relates only to valence, whereas binge eating relates only to arousal. Findings are similar across implicit and explicit affective evaluations, albeit stronger for explicit, providing modest support for dual-process models of affective evaluation of foods. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Evaluation of an aged care nurse practitioner service: quality of care within a residential aged care facility hospital avoidance service.

    PubMed

    Dwyer, Trudy; Craswell, Alison; Rossi, Dolene; Holzberger, Darren

    2017-01-13

    Reducing avoidable hospitalisation of aged care facility (ACF) residents can improve the resident experience and their health outcomes. Consequently, many variations of hospital avoidance (HA) programs continue to evolve. Nurse practitioners (NPs) with expertise in aged care have the potential to make a unique contribution to hospital avoidance programs. However, little attention has been dedicated to service evaluation of this model and the quality of care provided. The purpose of this study was to evaluate the quality of an aged care NP model of care situated within a HA service in a regional area of Australia. Donabedian's structure, process and outcome framework was applied to evaluate the quality of the NP model of care. The Australian Nurse Practitioner Study standardised interview schedules for evaluating NP models of care guided the semi-structured interviews of nine health professionals (including ACF nurses, medical doctors and allied health professionals), four ACF residents and their families, and two NPs. Theory-driven coding consistent with the Donabedian framework guided analysis of interview data and presentation of findings. Structural dimensions identified included the 'in-reach' nature of the HA service, distance, limitations of professional regulation and the residential care model. These dimensions influenced the process of referring the resident to the NP, the NPs' timely response and interactions with other professionals. The processes whereby the NPs take time connecting with residents, initiate collaborative care plans, up-skill aged care staff and function as intra- and interprofessional boundary spanners all contributed to quality outcomes. Quality outcomes in this study were about timely intervention, HA, timely return home, partnering with residents and family (knowing what they want), and resident and health professional satisfaction.
This study provides valuable insights into the contribution of the NP model of care within an aged care, HA service and how staff manipulated the process dimensions to improve referral to the NPs. NP service in this study was dynamic, flexible and responsive to both patient and organisational demands.

  19. Computational models of human vision with applications

    NASA Technical Reports Server (NTRS)

    Wandell, B. A.

    1985-01-01

Perceptual problems in aeronautics were studied. The mechanism by which color constancy is achieved in human vision was examined. A computable algorithm was developed to model the arrangement of retinal cones in spatial vision; the spatial frequency spectra of the model are similar to the spectra of actual cone mosaics. The Hartley transform was evaluated as a tool of image processing, and it is suggested that it could be used in signal processing and image processing applications.

  20. Modeling Healthcare Processes Using Commitments: An Empirical Evaluation.

    PubMed

    Telang, Pankaj R; Kalia, Anup K; Singh, Munindar P

    2015-01-01

The two primary objectives of this paper are: (a) to demonstrate how Comma, a business modeling methodology based on commitments, can be applied in healthcare process modeling, and (b) to evaluate the effectiveness of such an approach in producing healthcare process models. We apply the Comma approach to a breast cancer diagnosis process adapted from an HHS committee report, and present the results of an empirical study that compares Comma with a traditional approach based on the HL7 Messaging Standard (Traditional-HL7). Our empirical study involved 47 subjects and two phases. In the first phase, we partitioned the subjects into two approximately equal groups. We gave each group the same requirements based on a process scenario for breast cancer diagnosis. Members of one group first applied Traditional-HL7 and then Comma, whereas members of the second group first applied Comma and then Traditional-HL7, each on the above-mentioned requirements. Thus, each subject produced two models, each model being a set of UML Sequence Diagrams. In the second phase, we repartitioned the subjects into two groups with approximately equal distributions from both original groups. We developed exemplar Traditional-HL7 and Comma models; we gave one repartitioned group our Traditional-HL7 model and the other repartitioned group our Comma model. We provided the same changed set of requirements to all subjects and asked them to modify the provided exemplar model to satisfy the new requirements. We assessed solutions produced by subjects in both phases with respect to measures of flexibility, time, difficulty, objective quality, and subjective quality. Our study found that Comma is superior to Traditional-HL7 in flexibility and objective quality, as validated via Student's t-test at the 10% level of significance. Comma is a promising new approach for modeling healthcare processes. Further gains could be made through improved tooling and enhanced training of modeling personnel.
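The between-group comparison reported above rests on a two-sample Student's t-test. As a hedged sketch of the test mechanics only, with invented quality scores (the paper's data are not reproduced here):

```python
from math import sqrt

def two_sample_t(a, b):
    """Pooled (equal-variance) two-sample Student's t statistic."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)  # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)  # pooled variance
    t = (ma - mb) / sqrt(sp2 * (1 / na + 1 / nb))
    return t, na + nb - 2  # statistic and degrees of freedom

# Invented objective-quality scores for two groups (illustration only):
comma_scores = [4.1, 3.8, 4.4, 4.0, 4.2, 3.9]
hl7_scores = [3.2, 3.5, 3.1, 3.6, 3.0, 3.4]
t, df = two_sample_t(comma_scores, hl7_scores)
# |t| is then compared against the t-distribution critical value for df
# at the chosen significance level (10% two-sided in the paper).
```

The paper reports only that the difference was significant at the 10% level; the scores and group sizes above are placeholders.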

  1. Modeling Healthcare Processes Using Commitments: An Empirical Evaluation

    PubMed Central

    2015-01-01

The two primary objectives of this paper are: (a) to demonstrate how Comma, a business modeling methodology based on commitments, can be applied in healthcare process modeling, and (b) to evaluate the effectiveness of such an approach in producing healthcare process models. We apply the Comma approach to a breast cancer diagnosis process adapted from an HHS committee report, and present the results of an empirical study that compares Comma with a traditional approach based on the HL7 Messaging Standard (Traditional-HL7). Our empirical study involved 47 subjects and two phases. In the first phase, we partitioned the subjects into two approximately equal groups. We gave each group the same requirements based on a process scenario for breast cancer diagnosis. Members of one group first applied Traditional-HL7 and then Comma, whereas members of the second group first applied Comma and then Traditional-HL7, each on the above-mentioned requirements. Thus, each subject produced two models, each model being a set of UML Sequence Diagrams. In the second phase, we repartitioned the subjects into two groups with approximately equal distributions from both original groups. We developed exemplar Traditional-HL7 and Comma models; we gave one repartitioned group our Traditional-HL7 model and the other repartitioned group our Comma model. We provided the same changed set of requirements to all subjects and asked them to modify the provided exemplar model to satisfy the new requirements. We assessed solutions produced by subjects in both phases with respect to measures of flexibility, time, difficulty, objective quality, and subjective quality. Our study found that Comma is superior to Traditional-HL7 in flexibility and objective quality, as validated via Student's t-test at the 10% level of significance. Comma is a promising new approach for modeling healthcare processes. Further gains could be made through improved tooling and enhanced training of modeling personnel. 
PMID:26539985

  2. Development and testing of controller performance evaluation methodology for multi-input/multi-output digital control systems

    NASA Technical Reports Server (NTRS)

    Pototzky, Anthony; Wieseman, Carol; Hoadley, Sherwood Tiffany; Mukhopadhyay, Vivek

    1991-01-01

Described here are the development and implementation of an on-line, near-real-time controller performance evaluation (CPE) capability. Briefly discussed are the structure of data flow, the signal processing methods used to process the data, and the software developed to generate the transfer functions. This methodology is generic in nature and can be used in any type of multi-input/multi-output (MIMO) digital controller application, including digital flight control systems, digitally controlled spacecraft structures, and actively controlled wind tunnel models. Results of applying the CPE methodology to evaluate (in near real time) MIMO digital flutter suppression systems being tested on the Rockwell Active Flexible Wing (AFW) wind tunnel model are presented to demonstrate the CPE capability.

  3. Psychometric assessment of the processes of change scale for sun protection.

    PubMed

    Sillice, Marie A; Babbin, Steven F; Redding, Colleen A; Rossi, Joseph S; Paiva, Andrea L; Velicer, Wayne F

    2018-01-01

The fourteen-factor Processes of Change Scale for Sun Protection assesses behavioral and experiential strategies that underlie the process of sun protection acquisition and maintenance. Variations of this measure have been used effectively in several randomized sun protection trials, both for evaluation and as a basis for intervention. However, there are no published studies, to date, that evaluate the psychometric properties of the scale. The present study evaluated factorial invariance and scale reliability in a national sample (N = 1360) of adults involved in a Transtheoretical model tailored intervention for exercise and sun protection, at baseline. Invariance testing ranged from least to most restrictive: Configural Invariance (constrains only the factor structure and zero loadings); Pattern Identity Invariance (equal factor loadings across target groups); and Strong Factorial Invariance (equal factor loadings and measurement errors). Multi-sample structural equation modeling tested the invariance of the measurement model across seven subgroups: age, education, ethnicity, gender, race, skin tone, and Stage of Change for Sun Protection. Strong factorial invariance was found across all subgroups. Internal consistency coefficient alpha and factor rho reliability, respectively, were .83 and .80 for behavioral processes, .91 and .89 for experiential processes, and .93 and .91 for the global scale. These results provide strong empirical evidence that the scale is consistent, has internal validity and can be used in research interventions with population-based adult samples.
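As a rough illustration of the internal-consistency coefficients reported above, coefficient alpha can be computed from item-level scores. This is a minimal sketch of the standard formula; the items and respondents below are invented, not the study's data:

```python
def cronbach_alpha(items):
    """Coefficient alpha for a scale.
    items: list of equal-length score lists, one list per scale item."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[i] for item in items) for i in range(n)]  # total score per respondent
    item_var_sum = sum(var(item) for item in items)
    return k / (k - 1) * (1 - item_var_sum / var(totals))

# Invented 3-item, 5-respondent example:
items = [[4, 5, 3, 4, 5],
         [4, 4, 3, 5, 5],
         [3, 5, 3, 4, 4]]
alpha = cronbach_alpha(items)
```

The study's reported alphas (.83 to .93) come from the actual 14-factor scale data; the toy example merely shows the computation.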

  4. NASA GPM GV Science Implementation

    NASA Technical Reports Server (NTRS)

    Petersen, W. A.

    2009-01-01

Pre-launch algorithm development & post-launch product evaluation: The GPM GV paradigm moves beyond traditional direct validation/comparison activities by incorporating improved algorithm physics & model applications (end-to-end validation) in the validation process. Three approaches: 1) National Network (surface): Operational networks to identify and resolve first order discrepancies (e.g., bias) between satellite and ground-based precipitation estimates. 2) Physical Process (vertical column): Cloud system and microphysical studies geared toward testing and refinement of physically-based retrieval algorithms. 3) Integrated (4-dimensional): Integration of satellite precipitation products into coupled prediction models to evaluate strengths/limitations of satellite precipitation products.

  5. Modeling snow accumulation and ablation processes in forested environments

    NASA Astrophysics Data System (ADS)

    Andreadis, Konstantinos M.; Storck, Pascal; Lettenmaier, Dennis P.

    2009-05-01

    The effects of forest canopies on snow accumulation and ablation processes can be very important for the hydrology of midlatitude and high-latitude areas. A mass and energy balance model for snow accumulation and ablation processes in forested environments was developed utilizing extensive measurements of snow interception and release in a maritime mountainous site in Oregon. The model was evaluated using 2 years of weighing lysimeter data and was able to reproduce the snow water equivalent (SWE) evolution throughout winters both beneath the canopy and in the nearby clearing, with correlations to observations ranging from 0.81 to 0.99. Additionally, the model was evaluated using measurements from a Boreal Ecosystem-Atmosphere Study (BOREAS) field site in Canada to test the robustness of the canopy snow interception algorithm in a much different climate. Simulated SWE was relatively close to the observations for the forested sites, with discrepancies evident in some cases. Although the model formulation appeared robust for both types of climates, sensitivity to parameters such as snow roughness length and maximum interception capacity suggested the magnitude of improvements of SWE simulations that might be achieved by calibration.
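The evaluation described above reports correlations between simulated and observed SWE; that comparison can be sketched with a plain Pearson correlation. The weekly SWE series below are invented placeholders, not the lysimeter data:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation between a simulated and an observed series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented weekly SWE (mm) over a winter, illustration only:
observed = [10, 45, 80, 120, 140, 110, 60, 20]
simulated = [12, 40, 85, 115, 150, 100, 55, 25]
r = pearson_r(simulated, observed)
# the study reports r between 0.81 and 0.99 against lysimeter observations
```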

  6. An Overview of Atmospheric Chemistry and Air Quality Modeling

    NASA Technical Reports Server (NTRS)

    Johnson, Matthew S.

    2017-01-01

This presentation will draw on my personal research experience to provide the participants of the NASA Student Airborne Research Program (SARP 2017) with an overview of atmospheric chemistry and air quality modeling. The presentation will also provide examples of ways to apply airborne observations to chemical transport model (CTM) and air quality (AQ) model evaluation. CTMs and AQ models are important tools for understanding tropospheric-stratospheric composition, atmospheric chemistry processes, meteorology, and air quality. This presentation will focus on how NASA scientists currently apply CTMs and AQ models to better understand these topics. Finally, the importance of airborne observations in evaluating these topics, and how in situ and remote sensing observations can be used to evaluate and improve CTM and AQ model predictions, will be highlighted.

  7. Using the learning management evaluation model for advancing to life skills of lower secondary students in the 21st century

    NASA Astrophysics Data System (ADS)

    Kansaart, Preecha; Suikraduang, Arun; Panya, Piyatida

    2018-01-01

The aim of this study was to develop a Learning Management Evaluation Model (LMEM) for advancing the life skills of lower secondary students in the 21st century, using a research and development (R&D) process. The research was administered in four steps. First, indicators for assessing learning that advances 21st-century life skills were synthesized from interviews with four educational experts. Second, a draft LMEM was developed from this information, and its suitability and feasibility were checked by twelve specialists through a Multi-Attribute Consensus Reaching (MACR) technique; the resulting Assessment and Evaluation Guide (AEG) was then reviewed by five professionals to ensure proper coverage and clarity before use. Third, the LMEM was trialled in seven purposively selected schools in the Secondary Educational Service Area Office 26 (Maha Sarakham). Fourth, the LMEM was assessed against educational evaluation criteria by the 35 evaluators involved in the trial, using interview forms with rubric scores and five-point rating scales; both qualitative and quantitative data were analyzed. It was found that the LMEM is a chart structure that ties together six relevant components of evaluation: the purpose of the assessment, the evaluation focus, the assessment methods, the evaluators, the evaluation technique, and the evaluation criteria.
The evaluation targets were the management of learning, the factors contributing to learning, teachers' learning-management practices, and the learning outcomes. The evaluation methods comprised the evaluation process, the tools used to evaluate, and the duration of assessment. The LMEM was found appropriate for assessing learning that advances students' life skills in the 21st century, and students rated its opportuneness, practicability, reasonableness and respectability, in terms of overall benefit, at a high level.

  8. NARSTO critical review of photochemical models and modeling

    NASA Astrophysics Data System (ADS)

    Russell, Armistead; Dennis, Robin

Photochemical air quality models play a central role both in scientific investigation of how pollutants evolve in the atmosphere and in developing policies to manage air quality. In the past 30 years, these models have evolved from rather crude representations of the physics and chemistry affecting trace species to their current state: comprehensive, but not complete. The evolution has included advancements not only in the level of process descriptions, but also in the computational implementation, including numerical methods. As part of the NARSTO Critical Reviews, this article discusses the current strengths and weaknesses of air quality models and the modeling process. Current Eulerian models are found to represent well the primary processes affecting the evolution of trace species in most cases, though some exceptions may exist. For example, sub-grid-scale processes, such as concentrated power plant plumes, are treated only approximately. It is not apparent how much such approximations affect model results and the policies based upon those results. A significant weakness has been in how investigators have addressed, and communicated, such uncertainties. Studies find that major uncertainties are due to model inputs, e.g., emissions and meteorology, more so than the model itself. One of the primary weaknesses identified is in the modeling process, not the models. Evaluation has been limited, in part by data constraints: seldom is there ample observational data to conduct a detailed model intercomparison using consistent data (e.g., the same emissions and meteorology). Further model advancement, and development of greater confidence in the use of models, is hampered by the lack of thorough evaluation and intercomparisons. 
Model advances are seen in the use of new tools for extending the interpretation of model results (e.g., process and sensitivity analysis), in modeling systems that facilitate their use, and in extensions of model capabilities (e.g., aerosol dynamics and sub-grid-scale representations). Another possible direction is the development and widespread use of a community model acting as a platform for multiple groups and agencies to collaborate and progress more rapidly.

  9. Experimental and Numerical Simulations of Phase Transformations Occurring During Continuous Annealing of DP Steel Strips

    NASA Astrophysics Data System (ADS)

    Wrożyna, Andrzej; Pernach, Monika; Kuziak, Roman; Pietrzyk, Maciej

    2016-04-01

Due to their exceptional strength properties combined with good workability, Advanced High-Strength Steels (AHSS) are commonly used in the automotive industry. Manufacturing of these steels is a complex process which requires precise control of technological parameters during thermo-mechanical treatment. Design of these processes can be significantly improved by numerical models of phase transformations. The objective of the paper was to evaluate the predictive capabilities of such models as far as their applicability to simulating thermal cycles for AHSS is concerned. Two models were considered. The former was an upgrade of the JMAK equation, while the latter was an upgrade of the Leblond model. The models can be applied to any AHSS, though the examples quoted in the paper refer to Dual Phase (DP) steel. Three series of experimental simulations were performed. The first included various thermal cycles going beyond the limitations of continuous annealing lines; the objective was to validate the models' behavior under more complex cooling conditions. The second set of tests included experimental simulations of the thermal cycle characteristic of continuous annealing lines, to evaluate the capability of the models to properly describe phase transformations in this process. The third set included data from an industrial continuous annealing line. Validation and verification of the models confirmed their good predictive capabilities. Since it does not require application of the additivity rule, the upgrade of the Leblond model was selected as the better one for simulation of industrial processes in AHSS production.
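The first model family mentioned above builds on the classical JMAK (Johnson-Mehl-Avrami-Kolmogorov) equation, which gives the transformed phase fraction as X(t) = 1 - exp(-k t^n). As a hedged sketch of that baseline form only (the paper's model is an upgrade, and the kinetic constants below are invented for illustration):

```python
from math import exp

def jmak_fraction(t, k, n):
    """Classical JMAK transformed fraction: X(t) = 1 - exp(-k * t**n)."""
    return 1.0 - exp(-k * t ** n)

# Illustrative (invented) kinetic constants, not fitted to DP steel data:
k, n = 0.01, 2.0
times = list(range(0, 31, 5))  # arbitrary time units
fractions = [jmak_fraction(t, k, n) for t in times]
# the fraction rises sigmoidally from 0 toward 1 as the transformation completes
```

The paper's upgraded model, like the Leblond alternative, addresses non-isothermal cycles; the classical form above assumes isothermal kinetics and would otherwise require the additivity rule.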

  10. Genetic evaluation of mastitis liability and recovery through longitudinal analysis of transition probabilities

    PubMed Central

    2012-01-01

    Background Many methods for the genetic analysis of mastitis use a cross-sectional approach, which omits information on, e.g., repeated mastitis cases during lactation, somatic cell count fluctuations, and recovery process. Acknowledging the dynamic behavior of mastitis during lactation and taking into account that there is more than one binary response variable to consider, can enhance the genetic evaluation of mastitis. Methods Genetic evaluation of mastitis was carried out by modeling the dynamic nature of somatic cell count (SCC) within the lactation. The SCC patterns were captured by modeling transition probabilities between assumed states of mastitis and non-mastitis. A widely dispersed SCC pattern generates high transition probabilities between states and vice versa. This method can model transitions to and from states of infection simultaneously, i.e. both the mastitis liability and the recovery process are considered. A multilevel discrete time survival model was applied to estimate breeding values on simulated data with different dataset sizes, mastitis frequencies, and genetic correlations. Results Correlations between estimated and simulated breeding values showed that the estimated accuracies for mastitis liability were similar to those from previously tested methods that used data of confirmed mastitis cases, while our results were based on SCC as an indicator of mastitis. In addition, unlike the other methods, our method also generates breeding values for the recovery process. Conclusions The developed method provides an effective tool for the genetic evaluation of mastitis when considering the whole disease course and will contribute to improving the genetic evaluation of udder health. PMID:22475575
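The core idea above, estimating transition probabilities between mastitis and non-mastitis states from within-lactation SCC patterns, can be sketched by counting state transitions. The two-state sequences below are invented, and the study itself uses a multilevel discrete-time survival model rather than this raw count estimator:

```python
def transition_probs(sequences):
    """Estimate a 2-state transition matrix (0 = healthy, 1 = mastitis)
    from per-lactation state sequences, e.g. derived from SCC thresholds."""
    counts = [[0, 0], [0, 0]]
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):  # consecutive test-day pairs
            counts[a][b] += 1
    probs = []
    for row in counts:
        total = sum(row)
        probs.append([c / total if total else 0.0 for c in row])
    return probs

# Invented weekly state sequences for three cows (1 = SCC above threshold):
cows = [
    [0, 0, 0, 1, 1, 0, 0],
    [0, 0, 1, 1, 1, 1, 0],
    [0, 0, 0, 0, 0, 0, 0],
]
p = transition_probs(cows)
# p[0][1] relates to mastitis liability, p[1][0] to the recovery process
```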

  11. Cost Models for MMC Manufacturing Processes

    NASA Technical Reports Server (NTRS)

    Elzey, Dana M.; Wadley, Haydn N. G.

    1996-01-01

    The quality cost modeling (QCM) tool is intended to be a relatively simple-to-use device for obtaining a first-order assessment of the quality-cost relationship for a given process-material combination. The QCM curve is a plot of cost versus quality (an index indicating microstructural quality), which is unique for a given process-material combination. The QCM curve indicates the tradeoff between cost and performance, thus enabling one to evaluate affordability. Additionally, the effect of changes in process design, raw materials, and process conditions on the cost-quality relationship can be evaluated. Such results might indicate the most efficient means to obtain improved quality at reduced cost by process design refinements, the implementation of sensors and models for closed loop process control, or improvement in the properties of raw materials being fed into the process. QCM also allows alternative processes for producing the same or similar material to be compared in terms of their potential for producing competitively priced, high quality material. Aside from demonstrating the usefulness of the QCM concept, this is one of the main foci of the present research program, namely to compare processes for making continuous fiber reinforced, metal matrix composites (MMC's). Two processes, low pressure plasma spray deposition and tape casting are considered for QCM development. This document consists of a detailed look at the design of the QCM approach, followed by discussion of the application of QCM to each of the selected MMC manufacturing processes along with results, comparison of processes, and finally, a summary of findings and recommendations.

  12. Memory bias in health anxiety is related to the emotional valence of health-related words.

    PubMed

    Ferguson, Eamonn; Moghaddam, Nima G; Bibby, Peter A

    2007-03-01

    A model based on the associative strength of object evaluations is tested to explain why those who score higher on health anxiety have a better memory for health-related words. Sixty participants observed health and nonhealth words. A recognition memory task followed a free recall task and finally subjects provided evaluations (emotionality, imageability, and frequency) for all the words. Hit rates for health words, d', c, and psychological response times (PRTs) for evaluations were examined using multi-level modelling (MLM) and regression. Health words had a higher hit rate, which was greater for those with higher levels of health anxiety. The higher hit rate for health words is partly mediated by the extent to which health words are evaluated as emotionally unpleasant, and this was stronger for (moderated by) those with higher levels of health anxiety. Consistent with the associative strength model, those with higher levels of health anxiety demonstrated faster PRTs when making emotional evaluations of health words compared to nonhealth words, while those lower in health anxiety were slower to evaluate health words. Emotional evaluations speed the recognition of health words for high health anxious individuals. These findings are discussed with respect to the wider literature on cognitive processes in health anxiety, automatic processing, implicit attitudes, and emotions in decision making.
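The sensitivity (d') and bias (c) measures examined above come from standard signal detection theory: d' = z(H) - z(F) and c = -(z(H) + z(F)) / 2, where z is the inverse normal CDF, H the hit rate and F the false-alarm rate. A minimal sketch with invented rates (not the study's data):

```python
from statistics import NormalDist

def dprime_and_c(hit_rate, fa_rate):
    """Signal-detection sensitivity d' and response bias c
    from hit and false-alarm rates (both strictly between 0 and 1)."""
    z = NormalDist().inv_cdf  # inverse standard normal CDF
    d_prime = z(hit_rate) - z(fa_rate)
    c = -(z(hit_rate) + z(fa_rate)) / 2
    return d_prime, c

# Invented rates for illustration:
d_prime, c = dprime_and_c(hit_rate=0.80, fa_rate=0.30)
# a higher hit rate for health words would raise d' for that word category
```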

  13. Using Instrument Simulators and a Satellite Database to Evaluate Microphysical Assumptions in High-Resolution Simulations of Hurricane Rita

    NASA Astrophysics Data System (ADS)

    Hristova-Veleva, S. M.; Chao, Y.; Chau, A. H.; Haddad, Z. S.; Knosp, B.; Lambrigtsen, B.; Li, P.; Martin, J. M.; Poulsen, W. L.; Rodriguez, E.; Stiles, B. W.; Turk, J.; Vu, Q.

    2009-12-01

    Improving forecasting of hurricane intensity remains a significant challenge for the research and operational communities. Many factors determine a tropical cyclone’s intensity. Ultimately, though, intensity is dependent on the magnitude and distribution of the latent heating that accompanies the hydrometeor production during the convective process. Hence, the microphysical processes and their representation in hurricane models are of crucial importance for accurately simulating hurricane intensity and evolution. The accurate modeling of the microphysical processes becomes increasingly important when running high-resolution models that should properly reflect the convective processes in the hurricane eyewall. There are many microphysical parameterizations available today. However, evaluating their performance and selecting the most representative ones remains a challenge. Several field campaigns were focused on collecting in situ microphysical observations to help distinguish between different modeling approaches and improve on the most promising ones. However, these point measurements cannot adequately reflect the space and time correlations characteristic of the convective processes. An alternative approach to evaluating microphysical assumptions is to use multi-parameter remote sensing observations of the 3D storm structure and evolution. In doing so, we could compare modeled to retrieved geophysical parameters. The satellite retrievals, however, carry their own uncertainty. To increase the fidelity of the microphysical evaluation results, we can use instrument simulators to produce satellite observables from the model fields and compare to the observed. This presentation will illustrate how instrument simulators can be used to discriminate between different microphysical assumptions. We will compare and contrast the members of high-resolution ensemble WRF model simulations of Hurricane Rita (2005), each member reflecting different microphysical assumptions. 
We will use the geophysical model fields as input to instrument simulators to produce microwave brightness temperatures and radar reflectivity at the TRMM (TMI and PR) frequencies and polarizations. We will also simulate the surface backscattering cross-section at the QuikSCAT frequency, polarizations and viewing geometry. We will use satellite observations from TRMM and QuikSCAT to determine those parameterizations that yield a realistic forecast and those parameterizations that do not. To facilitate hurricane research, we have developed the JPL Tropical Cyclone Information System (TCIS), which includes a comprehensive set of multi-sensor observations relevant to large-scale and storm-scale processes in the atmosphere and the ocean. In this presentation, we will illustrate how the TCIS can be used for hurricane research. The work described here was performed at the Jet Propulsion Laboratory, California Institute of Technology, under contract with the National Aeronautics and Space Administration.

  14. The impact of stakeholder involvement in hospital policy decision-making: a study of the hospital's business processes.

    PubMed

    Malfait, Simon; Van Hecke, Ann; Hellings, Johan; De Bodt, Griet; Eeckloo, Kristof

    2017-02-01

In many health care systems, strategies are currently deployed to engage patients and other stakeholders in decisions affecting hospital services. In this paper, a model for stakeholder involvement is presented and evaluated in three Flemish hospitals. In the model, a stakeholder committee advises the hospital's board of directors on themes of strategic importance. The aim was to study the hospitals' internal decision processes in order to identify the impact of a stakeholder involvement committee on strategic themes in hospital decision-making. A retrospective analysis of the decision processes was conducted in three hospitals that implemented a stakeholder committee. The analysis consisted of process and outcome evaluation. Fifteen themes were discussed in the stakeholder committees, of which 11 resulted in a considerable change; none of these were on a strategic level. The theoretical model was not applied as initially developed, but was altered by each hospital. Consequently, the decision processes differed between the hospitals. Despite alteration of the model, the stakeholder committee showed a meaningful impact in all hospitals on the operational level. As a result of the differences in decision processes, three factors could be identified as facilitators for success: (1) a close interaction with the board of executives, (2) the inclusion of themes of a more practical and patient-oriented nature, and (3) the elaboration of decisions at lower echelons of the organization. To effectively influence the organization's public accountability, hospitals should involve stakeholders in the decision-making process of the organization. The model of a stakeholder committee was not applied as initially developed and did not affect the strategic decision-making processes in the involved hospitals. Results show impact only at the operational level in the participating hospitals. More research is needed connecting stakeholder involvement with hospital governance.

  15. Analysis of Memory Formation during General Anesthesia (Propofol/Remifentanil) for Elective Surgery Using the Process-dissociation Procedure.

    PubMed

    Hadzidiakos, Daniel; Horn, Nadja; Degener, Roland; Buchner, Axel; Rehberg, Benno

    2009-08-01

There have been reports of memory formation during general anesthesia. The process-dissociation procedure has been used to determine if these are controlled (explicit/conscious) or automatic (implicit/unconscious) memories. This study used the process-dissociation procedure with the original measurement model and one which corrected for guessing to determine if more accurate results were obtained in this setting. A total of 160 patients scheduled for elective surgery were enrolled. Memory for words presented during propofol and remifentanil general anesthesia was tested postoperatively by using a word-stem completion task in a process-dissociation procedure. To assign possible memory effects to different levels of anesthetic depth, the authors measured depth of anesthesia using the BIS XP monitor (Aspect Medical Systems, Norwood, MA). Word-stem completion performance showed no evidence of memory for intraoperatively presented words. Nevertheless, an evaluation of these data using the original measurement model for process-dissociation data suggested evidence of controlled (C = 0.05; 95% confidence interval [CI] 0.02-0.08) and automatic (A = 0.11; 95% CI 0.09-0.12) memory processes (P < 0.01). However, when the data were evaluated with an extended measurement model taking base rates into account adequately, no evidence for controlled (C = 0.00; 95% CI -0.04 to 0.04) or automatic (A = 0.00; 95% CI -0.02 to 0.02) memory processes was obtained. The authors report and discuss parallel findings for published data sets that were generated by using the process-dissociation procedure. Patients had no memories for auditory information presented during propofol/remifentanil anesthesia after midazolam premedication. The use of the process-dissociation procedure with the original measurement model erroneously detected memories, whereas the extended model, corrected for guessing, correctly revealed no memory.
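The original process-dissociation measurement model referred to above estimates the controlled component as C = inclusion - exclusion and the automatic component as A = exclusion / (1 - C), from word-stem completion rates under inclusion and exclusion instructions. A minimal sketch with invented completion rates (the study's extended model additionally corrects for base-rate guessing, which is not shown here):

```python
def process_dissociation(inclusion, exclusion):
    """Original process-dissociation estimates.
    inclusion:  completion rate when subjects try to use studied words
    exclusion:  completion rate when subjects try to avoid studied words"""
    c = inclusion - exclusion            # controlled (explicit) component
    a = exclusion / (1 - c) if c < 1 else float("nan")  # automatic component
    return c, a

# Invented completion rates for illustration:
c, a = process_dissociation(inclusion=0.16, exclusion=0.11)
```

As the study shows, these raw estimates can indicate spurious "memory" when base rates are ignored, which is exactly why the extended model was needed.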

  16. Quantitative image quality evaluation of MR images using perceptual difference models

    PubMed Central

    Miao, Jun; Huo, Donglai; Wilson, David L.

    2008-01-01

The authors are using a perceptual difference model (Case-PDM) to quantitatively evaluate image quality of the thousands of test images which can be created when optimizing fast magnetic resonance (MR) imaging strategies and reconstruction techniques. In this validation study, they compared human evaluation of MR images from multiple organs and from multiple image reconstruction algorithms to Case-PDM and similar models. The authors found that Case-PDM compared very favorably to human observers in double-stimulus continuous-quality scale and functional measurement theory studies over a large range of image quality. The Case-PDM threshold for nonperceptible differences in a 2-alternative forced choice study varied with the type of image under study, but was ≈1.1 for diffuse image effects, providing a rule of thumb. Ordering the image quality evaluation models, the authors found overall Case-PDM ≈ IDM (Sarnoff Corporation) ≈ SSIM [Wang et al. IEEE Trans. Image Process. 13, 600–612 (2004)] > mean squared error ≈ NR [Wang et al. (2004) (unpublished)] > DCTune (NASA) > IQM (MITRE Corporation). The authors conclude that Case-PDM is very useful in MR image evaluation but that one should probably restrict studies to similar images and similar processing, normally not a limitation in image reconstruction studies. PMID:18649487
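Of the metrics ranked above, mean squared error is the simplest baseline; perceptual models such as Case-PDM are far more involved. A minimal sketch of MSE on tiny invented grayscale "images":

```python
def mse(img_a, img_b):
    """Mean squared error between two equal-size grayscale images
    represented as nested lists of pixel intensities."""
    total, n = 0.0, 0
    for row_a, row_b in zip(img_a, img_b):
        for pa, pb in zip(row_a, row_b):
            total += (pa - pb) ** 2
            n += 1
    return total / n

# Tiny invented 2x3 reference and test "images":
ref = [[10, 20, 30], [40, 50, 60]]
test = [[12, 18, 30], [41, 50, 57]]
error = mse(ref, test)
```

Unlike Case-PDM or SSIM, MSE weights all pixel errors equally and ignores perceptual structure, which is consistent with its middling rank in the study's ordering.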

  17. Some considerations on the attractiveness of participatory processes for researchers from natural science

    NASA Astrophysics Data System (ADS)

    Barthel, Roland

    2013-04-01

    Participatory modeling and participatory scenario development have become an essential part of environmental impact assessment and planning in the field of water resources management. But even if most people agree that participation is required to solve environmental problems in a way that satisfies both environmental and societal needs, success stories are relatively rare, and many attempts to include stakeholders in the development of models are still reported to have failed. This paper proposes the hypothesis that the lack of success in participatory modeling can partly be attributed to a lack of attractiveness of participatory approaches for researchers from the natural sciences (subsequently called 'modelers'). It has to be pointed out that this discussion is mainly concerned with natural scientists in academia and not with modelers who develop models for commercial purposes or modelers employed by public agencies. The involvement of modelers and stakeholders in participatory modeling has been intensively studied in recent years. However, such analysis is rarely made from the viewpoint of the modelers themselves. Modelers usually do not see participatory modeling and scenario development as scientific targets in themselves, because the theoretical foundations of such processes usually lie far outside their own area of expertise. Thus, participatory processes are seen mainly as a means to attract funding, to facilitate access to data or (relatively rarely) as a way to develop a research model into a commercial product. The majority of modelers very likely do not spend much time reflecting on whether their new tools are helpful for solving real-world problems or whether the results are understandable and acceptable to stakeholders. They consider their task completed when the model they developed satisfies the 'scientific requirements', which are essentially different from the requirements needed to satisfy a group of stakeholders. 
Funding often stops before a newly developed model can actually be tested in a stakeholder process. Therefore, the gap between stakeholders and modelers persists or is even growing. A main reason for this probably lies in the way that the work of scientists (modelers) is evaluated. What counts is the number of journal articles produced, while applicability or societal impact is still not a measure of scientific success. A good journal article on a model requires an exemplary validation, but only very rarely would a reviewer ask whether the model was accepted by stakeholders. So why should a scientist go through a tedious stakeholder process? The stakeholder process might be a requirement of the research grant, but whether it is taken seriously can be questioned as long as stakeholder dialogues do not lead to quantifiable scientific success. In particular, for researchers in early career stages who undergo typical publication-based evaluation processes, participatory research is hardly beneficial. The discussion in this contribution is based on three pillars: (i) a comprehensive evaluation of the literature published on participatory modeling and scenario development, (ii) a case study involving the development of an integrated model for water and land use management, including an intensive stakeholder process, and (iii) unstructured personal communication - mainly with young scientists - about the attractiveness of multidisciplinary, applied research.

  18. Evaluation of computing systems using functionals of a Stochastic process

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.; Wu, L. T.

    1980-01-01

    An intermediate model was used to represent the probabilistic nature of a total system at a level which is higher than the base model and thus closer to the performance variable. A class of intermediate models, generally referred to as functionals of a Markov process, was considered. A closed-form solution of performability for the case where performance is identified with the minimum value of a functional was developed.
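    The "minimum value of a functional" performability measure can be illustrated with a Monte Carlo sketch over a discrete-time Markov reward model (a toy stand-in for the closed-form solution; the three-state chain, reward rates, and threshold below are invented for illustration):

```python
import random

def min_reward_distribution(P, reward, start, steps, trials, seed=0):
    """Sample Y = min reward rate visited by the chain over a mission of
    'steps' transitions; performability is then P(Y >= level)."""
    rng = random.Random(seed)
    mins = []
    n = len(P)
    for _ in range(trials):
        s, worst = start, reward[start]
        for _ in range(steps):
            u, cum = rng.random(), 0.0
            for nxt in range(n):           # sample next state from row P[s]
                cum += P[s][nxt]
                if u < cum:
                    s = nxt
                    break
            worst = min(worst, reward[s])
        mins.append(worst)
    return mins

# Hypothetical 3-state degradable system: full, degraded, failed
P = [[0.9, 0.08, 0.02],
     [0.0, 0.9, 0.1],
     [0.0, 0.0, 1.0]]
reward = [1.0, 0.5, 0.0]
mins = min_reward_distribution(P, reward, start=0, steps=20, trials=2000)
performability = sum(m >= 0.5 for m in mins) / len(mins)  # P(min reward >= 0.5)
```

    The paper's contribution is the closed-form counterpart of this estimate; the simulation only shows what the functional measures.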

  19. Modeling and simulation of emergent behavior in transportation infrastructure restoration

    USGS Publications Warehouse

    Ojha, Akhilesh; Corns, Steven; Shoberg, Thomas G.; Qin, Ruwen; Long, Suzanna K.

    2018-01-01

    The objective of this chapter is to create a methodology that models the emergent behavior during a disruption of the transportation system, calculates the economic losses due to such a disruption, and clarifies how an extreme event affects the road transportation network. The chapter discusses a system dynamics approach that is used to model the road transportation infrastructure system, evaluate the different factors that render road segments inoperable, and calculate the economic consequences of such inoperability. System dynamics models have been integrated with a business process simulation model to evaluate, design, and optimize the business process. The chapter also explains how different factors affect road capacity. After identifying the various factors affecting the available road capacity, a causal loop diagram (CLD) is created to visually represent the causes of a change in the available road capacity and the effects on travel costs when the available road capacity changes.
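    The stock-and-flow core of such a system dynamics model can be sketched as a single capacity stock restored toward its pre-event level (a toy illustration; the restoration rate constant and the inverse capacity-to-cost relation are assumptions, not values from the chapter):

```python
def restore_capacity(c0, c_max, rate, dt, steps):
    """Euler integration of a one-stock restoration model:
    d(capacity)/dt = rate * (c_max - capacity)."""
    c, path = c0, [c0]
    for _ in range(steps):
        c += rate * (c_max - c) * dt
        path.append(c)
    return path

# Hypothetical: an extreme event cuts a corridor to 30% of normal capacity
path = restore_capacity(c0=0.3, c_max=1.0, rate=0.15, dt=1.0, steps=40)
travel_cost = [1.0 / max(c, 1e-6) for c in path]  # cost rises as capacity falls
```

    A CLD adds the feedback structure (e.g., congestion slowing repair crews) that a single first-order stock cannot capture.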

  20. Participation versus Privacy in the Training of Group Counselors.

    ERIC Educational Resources Information Center

    Pierce, Keith A.; Baldwin, Cynthia

    1990-01-01

    Examines the process of requiring and evaluating personal growth group participation for students in counselor education programs. Discusses the key components in the dilemma of protecting privacy while evaluating competencies, including ethical practices and program alternatives to avoid evaluation. Proposes a model that will enable participation…

  1. Process compensated resonance testing modeling for damage evolution and uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Biedermann, Eric; Heffernan, Julieanne; Mayes, Alexander; Gatewood, Garrett; Jauriqui, Leanne; Goodlet, Brent; Pollock, Tresa; Torbet, Chris; Aldrin, John C.; Mazdiyasni, Siamack

    2017-02-01

    Process Compensated Resonance Testing (PCRT) is a nondestructive evaluation (NDE) method based on the fundamentals of Resonant Ultrasound Spectroscopy (RUS). PCRT is used for material characterization, defect detection, process control and life monitoring of critical gas turbine engine and aircraft components. Forward modeling and model inversion for PCRT have the potential to greatly increase the method's material characterization capability while reducing its dependence on compiling a large population of physical resonance measurements. This paper presents progress on forward modeling studies of damage mechanisms and defects common to structural materials for gas turbine engines. Finite element method (FEM) models of single crystal (SX) Ni-based superalloy Mar-M247 dog-bone samples and Ti-6Al-4V cylindrical bars were created, and FEM modal analyses calculated the resonance frequencies for the samples in their baseline condition. Then the frequency effects of superalloy creep (high-temperature plastic deformation) and macroscopic texture (preferred crystallographic orientation of grains detrimental to fatigue properties) were evaluated. A PCRT sorting module for creep damage in Mar-M247 was trained with a virtual database made entirely of modeled design points. The sorting module demonstrated successful discrimination of design points with as little as 1% creep strain in the gauge section from a population of acceptable design points with a range of material and geometric variation. The resonance frequency effects of macro-scale texture in Ti-6Al-4V were quantified with forward models of cylinder samples. FEM-based model inversion was demonstrated for Mar-M247 bulk material properties and variations in crystallographic orientation. 
PCRT uncertainty quantification (UQ) was performed using Monte Carlo studies for Mar-M247 that quantified the overall uncertainty in resonance frequencies resulting from coupled variation in geometry, material properties, crystallographic orientation and creep damage. A model calibration process was also developed that evaluates inversion fitting to differences from a designated reference sample rather than absolute property values, yielding a reduction in fit error.
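    The Monte Carlo propagation of coupled input scatter to resonance frequencies can be sketched with a closed-form surrogate in place of the FEM solve (the slender-bar frequency formula stands in for the modal analysis; the nominal values and 0.1-2% coefficients of variation are illustrative assumptions, not the paper's inputs):

```python
import math
import random
import statistics

def bar_frequency(E, rho, L):
    """First longitudinal resonance of a slender bar: f = sqrt(E/rho) / (2L)."""
    return math.sqrt(E / rho) / (2.0 * L)

def mc_frequency_uq(trials=5000, seed=1):
    """Propagate coupled material/geometry scatter to the resonance frequency."""
    rng = random.Random(seed)
    freqs = []
    for _ in range(trials):
        E = rng.gauss(114e9, 0.02 * 114e9)        # modulus, Pa (2% CoV)
        rho = rng.gauss(4430.0, 0.01 * 4430.0)    # density, kg/m^3 (1% CoV)
        L = rng.gauss(0.10, 0.001 * 0.10)         # length, m (0.1% CoV)
        freqs.append(bar_frequency(E, rho, L))
    return statistics.mean(freqs), statistics.stdev(freqs)

mean_f, std_f = mc_frequency_uq()
```

    In the paper each sample requires a full FEM modal solve, so the same loop is run over a trained surrogate of the model rather than an analytic formula.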

  2. Estimating inelastic heavy-particle-hydrogen collision data. I. Simplified model and application to potassium-hydrogen collisions

    NASA Astrophysics Data System (ADS)

    Belyaev, Andrey K.; Yakovleva, Svetlana A.

    2017-10-01

    Aims: We derive a simplified model for estimating atomic data on inelastic processes in low-energy collisions of heavy particles with hydrogen, in particular for inelastic processes with high and moderate rate coefficients. It is known that these processes are important for non-LTE modeling of cool stellar atmospheres. Methods: Rate coefficients are evaluated using the derived method, which is a simplified version of a recently proposed approach based on the asymptotic method for electronic structure calculations and the Landau-Zener model for nonadiabatic transition probability determination. Results: The rate coefficients are found to be expressed via statistical probabilities and reduced rate coefficients. It turns out that the reduced rate coefficients for mutual neutralization and ion-pair formation processes depend on single electronic bound energies of an atom, while the reduced rate coefficients for excitation and de-excitation processes depend on two electronic bound energies. The reduced rate coefficients are calculated and tabulated as functions of electronic bound energies. The derived model is applied to potassium-hydrogen collisions. For the first time, rate coefficients are evaluated for inelastic processes in K+H and K++H- collisions for all transitions from ground states up to and including ionic states. Tables with calculated data are only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/606/A147
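    The Landau-Zener building block of the approach can be sketched as follows (the single-passage formula and the standard double-passage combination are textbook results; the coupling, velocity, and slope-difference values in the test are arbitrary illustrations, not K+H data):

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J*s

def landau_zener_p(h12, velocity, dF):
    """Probability of a diabatic passage (no hop between adiabatic states)
    through one avoided crossing: P = exp(-2*pi*H12^2 / (hbar * v * |dF|)),
    with H12 the coupling (J), v the radial velocity (m/s), and dF the
    difference of diabatic potential slopes (J/m) at the crossing."""
    return math.exp(-2.0 * math.pi * h12 ** 2 / (HBAR * velocity * abs(dF)))

def transition_probability(h12, velocity, dF):
    """Net nonadiabatic transition after traversing the crossing twice
    (on the way in and out of the collision): 2 * P * (1 - P)."""
    p = landau_zener_p(h12, velocity, dF)
    return 2.0 * p * (1.0 - p)
```

    The simplified model of the paper folds such probabilities, thermally averaged, into the tabulated reduced rate coefficients.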

  3. Modeling the defrost process in complex geometries - Part 1: Development of a one-dimensional defrost model

    NASA Astrophysics Data System (ADS)

    van Buren, Simon; Hertle, Ellen; Figueiredo, Patric; Kneer, Reinhold; Rohlfs, Wilko

    2017-11-01

    Frost formation is a common, often undesired phenomenon in heat exchangers such as air coolers. Thus, air coolers have to be defrosted periodically, causing significant energy consumption. For design and optimization, prediction of defrosting by a CFD tool is desired. This paper presents a one-dimensional transient model approach suitable for use as a zero-dimensional wall-function in CFD for modeling the defrost process at the fin and tube interfaces. In accordance with previous work (e.g. [1, 2]), a multi-stage defrost model is introduced. First, the multi-stage model is implemented and validated using MATLAB. The defrost process of a one-dimensional frost segment is investigated. Fixed boundary conditions are provided at the frost interfaces. The simulation results verify the plausibility of the designed model. The evaluation of the simulated defrost process shows the expected convergent behavior of the three-stage sequence.

  4. Simulation and prediction of the thuringiensin abiotic degradation processes in aqueous solution by a radius basis function neural network model.

    PubMed

    Zhou, Jingwen; Xu, Zhenghong; Chen, Shouwen

    2013-04-01

    The thuringiensin abiotic degradation processes in aqueous solution under different conditions, with a pH range of 5.0-9.0 and a temperature range of 10-40°C, were systematically investigated by an exponential decay model and a radial basis function (RBF) neural network model, respectively. The half-lives of thuringiensin calculated by the exponential decay model ranged from 2.72 d to 16.19 d under the different conditions mentioned above. Furthermore, an RBF model with an accuracy of 0.1 and a SPREAD value of 5 was employed to model the degradation processes. The results showed that the model could simulate and predict the degradation processes well. Both the half-lives and the prediction data showed that thuringiensin was an easily degradable antibiotic, which could be an important factor in the evaluation of its safety. Copyright © 2012 Elsevier Ltd. All rights reserved.
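    The half-life arithmetic behind the exponential decay model can be reproduced with a log-linear least-squares fit (a generic first-order-decay sketch; the rate constant and sampling times below are illustrative, not one of the paper's fitted conditions):

```python
import math

def fit_half_life(times, concentrations):
    """Least-squares fit of ln(C) = ln(C0) - k*t, then t_half = ln(2) / k."""
    n = len(times)
    logs = [math.log(c) for c in concentrations]
    t_mean = sum(times) / n
    y_mean = sum(logs) / n
    slope = sum((t - t_mean) * (y - y_mean) for t, y in zip(times, logs)) / \
            sum((t - t_mean) ** 2 for t in times)
    k = -slope                       # first-order decay rate constant
    return math.log(2.0) / k

# Hypothetical sampling of a first-order decay with k = 0.1 per day:
times = [0, 2, 4, 6, 8, 10]
conc = [100.0 * math.exp(-0.1 * t) for t in times]
half_life = fit_half_life(times, conc)
```

    On this synthetic data the fit recovers t_half = ln(2)/0.1, about 6.93 d, squarely inside the paper's reported 2.72-16.19 d range.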

  5. Towards improved and more routine Earth system model evaluation in CMIP

    DOE PAGES

    Eyring, Veronika; Gleckler, Peter J.; Heinze, Christoph; ...

    2016-11-01

    The Coupled Model Intercomparison Project (CMIP) has successfully provided the climate community with a rich collection of simulation output from Earth system models (ESMs) that can be used to understand past climate changes and make projections and uncertainty estimates of the future. Confidence in ESMs can be gained because the models are based on physical principles and reproduce many important aspects of observed climate. More research is required to identify the processes that are most responsible for systematic biases and the magnitude and uncertainty of future projections so that more relevant performance tests can be developed. At the same time, there are many aspects of ESM evaluation that are well established and considered an essential part of systematic evaluation but have been implemented ad hoc with little community coordination. Given the diversity and complexity of ESM analysis, we argue that the CMIP community has reached a critical juncture at which many baseline aspects of model evaluation need to be performed much more efficiently and consistently. We provide a perspective and viewpoint on how a more systematic, open, and rapid performance assessment of the large and diverse number of models that will participate in current and future phases of CMIP can be achieved, and announce our intention to implement such a system for CMIP6. Accomplishing this could also free up valuable resources as many scientists are frequently "re-inventing the wheel" by re-writing analysis routines for well-established analysis methods. A more systematic approach for the community would be to develop and apply evaluation tools that are based on the latest scientific knowledge and observational reference, are well suited for routine use, and provide a wide range of diagnostics and performance metrics that comprehensively characterize model behaviour as soon as the output is published to the Earth System Grid Federation (ESGF). 
The CMIP infrastructure enforces data standards and conventions for model output and documentation accessible via the ESGF, additionally publishing observations (obs4MIPs) and reanalyses (ana4MIPs) for model intercomparison projects using the same data structure and organization as the ESM output. This largely facilitates routine evaluation of the ESMs, but to be able to process the data automatically alongside the ESGF, the infrastructure needs to be extended with processing capabilities at the ESGF data nodes where the evaluation tools can be executed on a routine basis. Efforts are already underway to develop community-based evaluation tools, and we encourage experts to provide additional diagnostic codes that would enhance this capability for CMIP. And, at the same time, we encourage the community to contribute observations and reanalyses for model evaluation to the obs4MIPs and ana4MIPs archives. The intention is to produce through the ESGF a widely accepted quasi-operational evaluation framework for CMIP6 that would routinely execute a series of standardized evaluation tasks. Over time, as this capability matures, we expect to produce an increasingly systematic characterization of models which, compared with early phases of CMIP, will more quickly and openly identify the strengths and weaknesses of the simulations. This will also reveal whether long-standing model errors remain evident in newer models and will assist modelling groups in improving their models. Finally, this framework will be designed to readily incorporate updates, including new observations and additional diagnostics and metrics as they become available from the research community.

  6. Modelling a model?!! Prediction of observed and calculated daily pan evaporation in New Mexico, U.S.A.

    NASA Astrophysics Data System (ADS)

    Beriro, D. J.; Abrahart, R. J.; Nathanail, C. P.

    2012-04-01

    Data-driven modelling is most commonly used to develop predictive models that will simulate natural processes. This paper, in contrast, uses Gene Expression Programming (GEP) to construct two alternative models of different pan evaporation estimations by means of symbolic regression: a simulator, a model of a real-world process developed on observed records, and an emulator, an imitator of some other model developed on predicted outputs calculated by that source model. The solutions are compared and contrasted for the purposes of determining whether any substantial differences exist between the two options. This analysis addresses recent arguments over the impact of using downloaded hydrological modelling datasets originating from different initial sources, i.e. observed or calculated. These differences can easily be overlooked by modellers, resulting in a model of a model developed on estimations derived from deterministic empirical equations and producing exceptionally high goodness-of-fit. This paper uses different lines of evidence to evaluate model output and in so doing paves the way for a new protocol in machine learning applications. Transparent modelling tools such as symbolic regression offer huge potential for explaining stochastic processes; however, the basic tenets of data quality and recourse to first principles with regard to problem understanding should not be trivialised. GEP is found to be an effective tool for the prediction of observed and calculated pan evaporation, with results supported by an understanding of the records and of the natural processes concerned, evaluated using one-at-a-time response-function sensitivity analysis. The results show that both architectures and response functions are very similar, implying that previously observed differences in goodness-of-fit can be explained by whether models are applied to observed or calculated data.
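    The one-at-a-time response-function sensitivity analysis used to probe the GEP solutions can be sketched generically (the toy two-input model below stands in for an evolved expression; none of the names or values are from the paper):

```python
def oat_sensitivity(model, base, delta=0.05):
    """One-at-a-time sensitivity: perturb each input by +/- delta
    (fractional) while holding the others fixed, and report the
    response swing per input."""
    y0 = model(**base)
    swings = {}
    for name, value in base.items():
        hi = dict(base, **{name: value * (1 + delta)})
        lo = dict(base, **{name: value * (1 - delta)})
        swings[name] = model(**hi) - model(**lo)
    return y0, swings

# Hypothetical toy response standing in for a GEP pan-evaporation model:
def toy_model(temperature, wind):
    return 2.0 * temperature + 1.0 * wind

y0, swings = oat_sensitivity(toy_model, {"temperature": 20.0, "wind": 5.0})
```

    Comparing the swing profiles of simulator and emulator is one of the "lines of evidence" the paper uses to argue the two architectures respond alike.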

  7. Formal implementation of a performance evaluation model for the face recognition system.

    PubMed

    Shin, Yong-Nyuo; Kim, Jason; Lee, Yong-Jun; Shin, Woochang; Choi, Jin-Young

    2008-01-01

    Due to its usability, practical applications, and lack of intrusiveness, face recognition technology, based on information derived from individuals' facial features, has recently been attracting considerable attention. Reported recognition rates of commercialized face recognition systems cannot be accepted as official recognition rates, as they are based on assumptions that are beneficial to the specific system and face database. Therefore, performance evaluation methods and tools are necessary to objectively measure the accuracy and performance of any face recognition system. In this paper, we propose and formalize a performance evaluation model for biometric recognition systems, implementing an evaluation tool for face recognition systems based on the proposed model. Furthermore, we performed evaluations objectively by providing guidelines for the design and implementation of a performance evaluation system, formalizing the performance test process.
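    A core computation in any such evaluation tool is deriving error rates from match scores; a minimal sketch (the scores and threshold are hypothetical, not from any benchmarked system):

```python
def error_rates(genuine_scores, impostor_scores, threshold):
    """False accept rate (FAR): impostor comparisons scoring at or above the
    threshold. False reject rate (FRR): genuine comparisons scoring below it.
    Higher score = stronger claimed match."""
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    return far, frr

# Hypothetical match scores for illustration:
genuine = [0.91, 0.85, 0.78, 0.60, 0.95]
impostor = [0.10, 0.35, 0.52, 0.48, 0.22]
far, frr = error_rates(genuine, impostor, threshold=0.5)
```

    Sweeping the threshold trades FAR against FRR, which is why a formal test process, rather than a vendor-chosen operating point, is needed for comparable recognition rates.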

  8. Evolution of evaluation criteria in the College of American Pathologists Surveys.

    PubMed

    Ross, J W

    1988-04-01

    This review of the evolution of evaluation criteria in the College of American Pathologists Survey and of the theoretical grounds proposed for evaluation criteria explores the complex nature of the evaluation process. Survey professionals balance multiple variables to seek relevant and meaningful evaluations. These include the state of the art, the reliability of target values, the nature of available control materials, the perceived medical "nonusefulness" of the extremes of performance (good or poor), the extent of laboratory services provided, and the availability of scientific data and theory by which clinically relevant criteria of medical usefulness may be established. The evaluation process has consistently sought peer consensus to stimulate improvement in the state of the art, to increase medical usefulness, and to monitor the state of the art. Recent factors that are likely to promote a change from peer-group evaluation to fixed-criteria evaluation are the high degree of proficiency in the state of the art for many analytes, accurate target values, increased knowledge of biologic variation, and the availability of statistical modeling techniques simulating biologic and diagnostic processes as well as analytic processes.

  9. An empirical model of water quality for use in rapid management strategy evaluation in Southeast Queensland, Australia.

    PubMed

    de la Mare, William; Ellis, Nick; Pascual, Ricardo; Tickell, Sharon

    2012-04-01

    Simulation models have been widely adopted in fisheries for management strategy evaluation (MSE). However, in catchment management of water quality, MSE is hampered by the complexity of both decision space and the hydrological process models. Empirical models based on monitoring data provide a feasible alternative to process models; they run much faster and, by conditioning on data, they can simulate realistic responses to management actions. Using 10 years of water quality indicators from Queensland, Australia, we built an empirical model suitable for rapid MSE that reproduces the water quality variables' mean and covariance structure, adjusts the expected indicators through local management effects, and propagates effects downstream by capturing inter-site regression relationships. Empirical models enable managers to search the space of possible strategies using rapid assessment. They provide not only realistic responses in water quality indicators but also variability in those indicators, allowing managers to assess strategies in an uncertain world. Copyright © 2012 Elsevier Ltd. All rights reserved.
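    Reproducing a monitored mean and covariance structure, as the empirical model does, can be sketched for two indicators via a Cholesky factor (a generic two-variable sketch; the indicator statistics below are invented placeholders, not Southeast Queensland monitoring values):

```python
import math
import random

def cholesky2(cov):
    """Lower-triangular factor of a 2x2 covariance matrix (cov = L L^T)."""
    l11 = math.sqrt(cov[0][0])
    l21 = cov[1][0] / l11
    l22 = math.sqrt(cov[1][1] - l21 ** 2)
    return [[l11, 0.0], [l21, l22]]

def simulate_indicators(mean, cov, n, seed=0):
    """Draw correlated indicator pairs (e.g. nutrient load, turbidity) that
    reproduce the target mean and covariance structure."""
    L = cholesky2(cov)
    rng = random.Random(seed)
    draws = []
    for _ in range(n):
        z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
        draws.append((mean[0] + L[0][0] * z1,
                      mean[1] + L[1][0] * z1 + L[1][1] * z2))
    return draws

# Hypothetical indicator statistics for illustration:
samples = simulate_indicators(mean=[1.2, 8.0],
                              cov=[[0.04, 0.03], [0.03, 0.09]], n=1000)
```

    The paper's model additionally shifts the expected values for local management effects and propagates them downstream through inter-site regressions; this sketch covers only the covariance-preserving simulation step.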

  10. Effect of warning placement on the information processing of college students reading an OTC drug facts panel.

    PubMed

    Bhansali, Archita H; Sangani, Darshan S; Mhatre, Shivani K; Sansgiry, Sujit S

    2018-01-01

    To compare three over-the-counter (OTC) Drug Facts panel versions for information processing optimization among college students. University of Houston students (N = 210) participated in a cross-sectional survey from January to May 2010. A current FDA label was compared to two experimental labels developed using the theory of CHREST to test information processing by re-positioning the warning information within the Drug Facts panel. Congruency was defined as placing like information together. Information processing was evaluated using the OTC medication Label Evaluation Process Model (LEPM): label comprehension, ease-of-use, attitude toward the product, product evaluation, and purchase intention. The experimental label with chunked congruent information (uses-directions-other information-warnings) was rated significantly higher than the current FDA label and had the best average scores among the LEPM information processing variables. If replications uphold these findings, the FDA label design might be revised to improve information processing.

  11. Managing fear in public health campaigns: a theory-based formative evaluation process.

    PubMed

    Cho, Hyunyi; Witte, Kim

    2005-10-01

    The HIV/AIDS infection rate of Ethiopia is one of the world's highest. Prevention campaigns should systematically incorporate and respond to the at-risk population's existing beliefs, emotions, and perceived barriers in the message design process to effectively promote behavior change. However, guidelines for conducting formative evaluation that are grounded in proven risk communication theory and empirical data analysis techniques are hard to find. This article provides a five-step formative evaluation process that translates theory and research for developing effective messages for behavior change. Guided by the extended parallel process model, the five-step process helps message designers manage the public's fear surrounding issues such as HIV/AIDS. An entertainment education project that used the process to design HIV/AIDS prevention messages for Ethiopian urban youth is reported. Data were collected in five urban regions of Ethiopia and analyzed according to the process to develop key messages for a 26-week radio soap opera.

  12. Watershed-scale evaluation of the Water Erosion Prediction Project (WEPP) model in the Lake Tahoe basin

    Treesearch

    Erin S. Brooks; Mariana Dobre; William J. Elliot; Joan Q. Wu; Jan Boll

    2016-01-01

    Forest managers need methods to evaluate the impacts of management at the watershed scale. The Water Erosion Prediction Project (WEPP) has the ability to model disturbed forested hillslopes, but has difficulty addressing some of the critical processes that are important at a watershed scale, including baseflow and water yield. In order to apply WEPP to...

  13. Wildfire potential evaluation during a drought event with a regional climate model and NDVI

    Treesearch

    Y. Liu; J. Stanturf; S. Goodrick

    2010-01-01

    Regional climate modeling is a technique for simulating high-resolution physical processes in the atmosphere, soil and vegetation. It can be used to evaluate wildfire potential by either providing meteorological conditions for computation of fire indices or predicting soil moisture as a direct measure of fire potential. This study examines these roles using a regional...

  14. Consistent Chemical Mechanism from Collaborative Data Processing

    DOE PAGES

    Slavinskaya, Nadezda; Starcke, Jan-Hendrik; Abbasi, Mehdi; ...

    2016-04-01

    The numerical tool of the Process Informatics Model (PrIMe) is a mathematically rigorous and numerically efficient approach for the analysis and optimization of chemical systems. It handles heterogeneous data and is scalable to a large number of parameters. The Bound-to-Bound Data Collaboration module of the automated data-centric infrastructure of PrIMe was used for the systematic uncertainty and data consistency analyses of the H2/CO reaction model (73/17) and 94 experimental targets (ignition delay times). An empirical rule for evaluation of the shock tube experimental data is proposed. The initial results demonstrate clear benefits of the PrIMe methods for evaluating kinetic data quality and data consistency and for developing predictive kinetic models.

  15. A preliminary evaluation of an F100 engine parameter estimation process using flight data

    NASA Technical Reports Server (NTRS)

    Maine, Trindel A.; Gilyard, Glenn B.; Lambert, Heather H.

    1990-01-01

    The parameter estimation algorithm developed for the F100 engine is described. The algorithm is a two-step process. The first step consists of a Kalman filter estimation of five deterioration parameters, which model the off-nominal behavior of the engine during flight. The second step is based on a simplified steady-state model of the compact engine model (CEM). In this step, the control vector in the CEM is augmented by the deterioration parameters estimated in the first step. The results of an evaluation made using flight data from the F-15 aircraft are presented, indicating that the algorithm can provide reasonable estimates of engine variables for an advanced propulsion control law development.
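    The Kalman estimation in the first step can be illustrated for a single constant deterioration parameter (a textbook scalar filter, not the F100 implementation; the measurement values and noise variance are hypothetical):

```python
def kalman_constant(measurements, meas_var, x0=0.0, p0=1.0):
    """Scalar Kalman filter for a constant parameter observed through noisy
    measurements, e.g. one engine deterioration parameter; no process noise,
    so the state prediction step is the identity."""
    x, p = x0, p0
    for z in measurements:
        k = p / (p + meas_var)      # Kalman gain
        x = x + k * (z - x)         # measurement update
        p = (1.0 - k) * p           # covariance update
    return x, p

# Hypothetical deterioration-parameter measurements around a true value of 0.8:
est, var = kalman_constant([0.75, 0.83, 0.79, 0.82, 0.81], meas_var=0.01)
```

    The flight algorithm tracks five such parameters jointly and then feeds them into the steady-state compact engine model as augmented control inputs.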

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Szoka de Valladares, M.R.; Mack, S.

    The DOE Hydrogen Program needs to develop criteria as part of a systematic evaluation process for proposal identification, evaluation, and selection. The H Scan component of this process provides a framework in which project proposers can fully describe their candidate technology system and its components. The H Scan complements traditional methods of capturing cost and technical information. It consists of a special set of survey forms designed to elicit information so expert reviewers can assess the proposal relative to DOE-specified selection criteria. The Analytic Hierarchy Process (AHP) component of the decision process assembles the management-defined evaluation and selection criteria into a coherent multi-level decision construct by which projects can be evaluated in pair-wise comparisons. The AHP model will reflect management's objectives, and it will assist in the ranking of individual projects based on the extent to which each contributes to those objectives. This paper contains a detailed description of the products and activities associated with the planning and evaluation process: the objectives or criteria, the H Scan, and the Analytic Hierarchy Process (AHP).
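    The AHP ranking step can be sketched with the row geometric-mean approximation to the principal eigenvector of a pairwise-comparison matrix (the three criteria and the judgments in the matrix below are invented for illustration, not DOE's):

```python
def prod(xs):
    p = 1.0
    for x in xs:
        p *= x
    return p

def ahp_priorities(pairwise):
    """Approximate AHP priority vector by the row geometric-mean method:
    normalize the geometric mean of each row of the pairwise matrix."""
    n = len(pairwise)
    gmeans = [prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Hypothetical 3-criterion matrix (cost vs. technical merit vs. program fit);
# entry [i][j] is how strongly criterion i outweighs criterion j (1-9 scale):
pairwise = [[1.0,     3.0, 5.0],
            [1.0 / 3, 1.0, 2.0],
            [1.0 / 5, 0.5, 1.0]]
weights = ahp_priorities(pairwise)
```

    Each proposal is then scored against every criterion the same way, and the criterion weights combine those scores into a single ranking.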

  18. A systematic review of Markov models evaluating multicomponent disease management programs in diabetes.

    PubMed

    Kirsch, Florian

    2015-01-01

    Diabetes is the most expensive chronic disease; therefore, disease management programs (DMPs) were introduced. The aim of this review is to determine whether Markov models are adequate to evaluate the cost-effectiveness of complex interventions such as DMPs. Additionally, the quality of the models was evaluated using the Philips and Caro quality appraisals. The five reviewed models incorporated the DMP into the model differently: two models integrated effectiveness rates derived from one clinical trial/meta-analysis and three models combined interventions from different sources into a DMP. The results range from cost savings and a QALY gain to costs of US$85,087 per QALY. Spearman's rank coefficient indicates no correlation between the quality appraisals. With restrictions on the data selection process, Markov models are adequate to determine the cost-effectiveness of DMPs; however, to allow prioritization of medical services, more flexibility in the models is necessary to enable the evaluation of single additional interventions.
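    A minimal Markov cohort model of the kind reviewed can be sketched as follows (the states, transition probabilities, cycle costs, and utilities are invented placeholders; an ICER would compare two such runs, DMP versus usual care):

```python
def markov_cohort(trans, cost, utility, cycles, discount=0.03):
    """Discrete-time Markov cohort model: returns discounted total cost and
    QALYs per patient. The cohort starts entirely in state 0; trans[i][j]
    is the per-cycle probability of moving from state i to state j."""
    n = len(trans)
    dist = [1.0] + [0.0] * (n - 1)
    total_cost = total_qaly = 0.0
    for t in range(cycles):
        d = 1.0 / (1.0 + discount) ** t          # discount factor for cycle t
        total_cost += d * sum(p * c for p, c in zip(dist, cost))
        total_qaly += d * sum(p * u for p, u in zip(dist, utility))
        dist = [sum(dist[i] * trans[i][j] for i in range(n)) for j in range(n)]
    return total_cost, total_qaly

# Hypothetical 3-state diabetes model: no complication, complication, dead
trans = [[0.90, 0.08, 0.02],
         [0.00, 0.92, 0.08],
         [0.00, 0.00, 1.00]]
cost_dmp, qaly_dmp = markov_cohort(trans, cost=[1200.0, 5200.0, 0.0],
                                   utility=[0.85, 0.65, 0.0], cycles=20)
```

    The review's central criticism maps directly onto this structure: a multicomponent DMP enters only through the transition probabilities, so single added interventions cannot be evaluated without more flexible state or effect definitions.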

  19. Development and Application of a Life Cycle-Based Model to Evaluate Greenhouse Gas Emissions of Oil Sands Upgrading Technologies.

    PubMed

    Pacheco, Diana M; Bergerson, Joule A; Alvarez-Majmutov, Anton; Chen, Jinwen; MacLean, Heather L

    2016-12-20

    A life cycle-based model, OSTUM (Oil Sands Technologies for Upgrading Model), which evaluates the energy intensity and greenhouse gas (GHG) emissions of current oil sands upgrading technologies, is developed. Upgrading converts oil sands bitumen into high quality synthetic crude oil (SCO), a refinery feedstock. OSTUM's novel attributes include the following: the breadth of technologies and upgrading operations options that can be analyzed, energy intensity and GHG emissions being estimated at the process unit level, it not being dependent on a proprietary process simulator, and use of publicly available data. OSTUM is applied to a hypothetical, but realistic, upgrading operation based on delayed coking, the most common upgrading technology, resulting in emissions of 328 kg CO2e/m3 SCO. The primary contributor to upgrading emissions (45%) is the use of natural gas for hydrogen production through steam methane reforming, followed by the use of natural gas as fuel in the rest of the process units' heaters (39%). OSTUM's results are in agreement with those of a process simulation model developed by CanmetENERGY, other literature, and confidential data of a commercial upgrading operation. For the application of the model, emissions are found to be most sensitive to the amount of natural gas utilized as feedstock by the steam methane reformer. OSTUM is capable of evaluating the impact of different technologies, feedstock qualities, operating conditions, and fuel mixes on upgrading emissions, and its life cycle perspective allows easy incorporation of results into well-to-wheel analyses.
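    The process-unit-level accounting that OSTUM performs can be sketched as a simple inventory roll-up (the unit split below is back-calculated to match the abstract's 328 kg CO2e/m3 total and 45%/39% shares, and is illustrative only):

```python
def upgrading_emissions(process_units):
    """Sum process-unit GHG contributions (kg CO2e per m^3 SCO) and report
    each unit's percentage share of the total."""
    total = sum(process_units.values())
    shares = {name: 100.0 * v / total for name, v in process_units.items()}
    return total, shares

# Illustrative split consistent with the abstract's figures:
units = {"SMR hydrogen production": 147.6,        # ~45% of 328
         "process heaters (natural gas)": 127.9,  # ~39% of 328
         "other units": 52.5}
total, shares = upgrading_emissions(units)
```

    Keeping the inventory at the unit level is what lets the model swap in alternative technologies or fuel mixes and immediately see the effect on the total.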

  20. Evaluating CONUS-Scale Runoff Simulation across the National Water Model WRF-Hydro Implementation to Disentangle Regional Controls on Streamflow Generation and Model Error Contribution

    NASA Astrophysics Data System (ADS)

    Dugger, A. L.; Rafieeinasab, A.; Gochis, D.; Yu, W.; McCreight, J. L.; Karsten, L. R.; Pan, L.; Zhang, Y.; Sampson, K. M.; Cosgrove, B.

    2016-12-01

    Evaluation of physically-based hydrologic models applied across large regions can provide insight into dominant controls on runoff generation and how these controls vary based on climatic, biological, and geophysical setting. To make this leap, however, we need to combine knowledge of regional forcing skill, model parameter and physics assumptions, and hydrologic theory. If we can successfully do this, we also gain information on how well our current approximations of these dominant physical processes are represented in continental-scale models. In this study, we apply this diagnostic approach to a 5-year retrospective implementation of the WRF-Hydro community model configured for the U.S. National Weather Service's National Water Model (NWM). The NWM is a water prediction model in operation over the contiguous U.S. as of summer 2016, providing real-time estimates and forecasts out to 30 days of streamflow across 2.7 million stream reaches as well as distributed snowpack, soil moisture, and evapotranspiration at 1-km resolution. The WRF-Hydro system permits not only the standard simulation of vertical energy and water fluxes common in continental-scale models, but augments these processes with lateral redistribution of surface and subsurface water, simple groundwater dynamics, and channel routing. We evaluate 5 years of NLDAS-2 precipitation forcing and WRF-Hydro streamflow and evapotranspiration simulation across the contiguous U.S. at a range of spatial (gage, basin, ecoregion) and temporal (hourly, daily, monthly) scales and look for consistencies and inconsistencies in performance in terms of bias, timing, and extremes. Leveraging results from other CONUS-scale hydrologic evaluation studies, we translate our performance metrics into a matrix of likely dominant process controls and error sources (forcings, parameter estimates, and model physics). We test our hypotheses in a series of controlled model experiments on a subset of representative basins from distinct "problem" environments (Southeast U.S. Coastal Plain, Central and Coastal Texas, Northern Plains, and Arid Southwest). The results from these longer-term model diagnostics will inform future improvements in forcing bias correction, parameter calibration, and physics developments in the National Water Model.
